Visual Servo Tracking Control via a Lyapunov-Based Approach

Permanent Link: http://ufdc.ufl.edu/UFE0021789/00001

Material Information

Title: Visual Servo Tracking Control via a Lyapunov-Based Approach
Physical Description: 1 online resource (171 p.)
Language: English
Creator: Hu, Guoqiang
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2007

Subjects

Subjects / Keywords: adaptive, autonomous, control, nonlinear, robotics, robust, tracking, visual
Mechanical and Aerospace Engineering -- Dissertations, Academic -- UF
Genre: Mechanical Engineering thesis, Ph.D.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, territorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

Notes

Abstract: Recent advances in image processing, computational technology, and control theory are enabling visual servo control to become more prevalent in robotics and autonomous systems applications. In this dissertation, visual servo control algorithms and architectures are developed that exploit the visual feedback from a camera system to achieve a tracking or regulation control objective for a rigid-body object (e.g., the end-effector of a robot manipulator, a satellite, or an autonomous vehicle) identified by a patch of feature points. The first two chapters present the introduction and background information for this dissertation. In the third chapter, a new visual servo tracking control method for a rigid-body object is developed by exploiting a combination of homography techniques, a quaternion parameterization, adaptive control techniques, and nonlinear Lyapunov-based control methods. The desired trajectory to be tracked is represented by a sequence of images (e.g., a video), which can be taken online or offline by a camera. The controller is singularity-free because it uses homography techniques together with the quaternion parameterization. In the fourth chapter, a new collaborative visual servo control method is developed to enable a rigid-body object to track a desired trajectory. In contrast to typical camera-to-hand and camera-in-hand visual servo control configurations, the proposed controller obtains its feedback signals from a moving on-board camera viewing a moving object. This collaborative method relaxes the field-of-view restriction and enables the controlled object to perform large-area motion. In the fifth chapter, a visual servo controller is developed that yields an asymptotic tracking result for the fully nonlinear camera-in-hand central catadioptric camera system. A panoramic field of view is obtained by using the central catadioptric camera. In the sixth chapter, a robust visual servo control method is developed to achieve a regulation control objective in the presence of intrinsic camera calibration uncertainties. A quaternion-based estimate of the rotation error signal is developed and used in the controller development. The similarity relationship between the estimated and actual rotation matrices is used to construct the relationship between the estimated and actual quaternions. A Lyapunov-based stability analysis is provided that indicates a unique controller can be developed to achieve the regulation result despite a sign ambiguity in the developed quaternion estimate. In the seventh chapter, a new combined robust and adaptive visual servo control method is developed to asymptotically regulate the feature points in an image to the desired locations while also regulating the pose of the control object, without calibrating the camera. These dual objectives are achieved by using a homography-based approach that exploits both image-space and reconstructed Euclidean information in the feedback loop. The robust rotation controller accommodates the time-varying uncertainties in the rotation error system, and the adaptive translation controller compensates for the unknown calibration parameters in the translation error system. Chapter 8 presents the conclusions of this dissertation.
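General Note: The homography and quaternion constructions referenced in the abstract follow standard formulations from the visual servoing literature. A minimal sketch is given below in textbook notation (the symbols R, x_f, n*, d*, and q are the standard ones and may differ from the dissertation's exact notation):

% Euclidean homography H relating a current feature point \bar{m}_i to
% its reference counterpart \bar{m}_i^*: R is the rotation, x_f the
% translation, n^* the unit normal of the feature-point plane, and d^*
% the distance from the reference camera frame to that plane.
\[
  \bar{m}_i \;=\; \Bigl(R + \tfrac{x_f}{d^{*}}\, n^{*T}\Bigr)\,\bar{m}_i^{*} \;=\; H\,\bar{m}_i^{*}
\]
% Unit quaternion q = (q_0, q_v^T)^T with q_0^2 + q_v^T q_v = 1 gives a
% singularity-free parameterization of the rotation (the sign of the
% cross-product term varies with convention):
\[
  R(q) \;=\; \bigl(q_0^{2} - q_v^{T} q_v\bigr) I_3 \;+\; 2\, q_v q_v^{T} \;+\; 2\, q_0\, [q_v]_{\times}
\]
% Because every term is quadratic in q, R(q) = R(-q); a quaternion
% recovered from an estimated rotation matrix is therefore determined
% only up to sign, which is the sign ambiguity that the stability
% analysis described for the sixth chapter must accommodate.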
General Note: In the series University of Florida Digital Collections.
General Note: Includes vita.
Bibliography: Includes bibliographical references.
Source of Description: Description based on online resource; title from PDF title page.
Source of Description: This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Statement of Responsibility: by Guoqiang Hu.
Thesis: Thesis (Ph.D.)--University of Florida, 2007.
Local: Adviser: Dixon, Warren E.

Record Information

Source Institution: UFRGP
Rights Management: Applicable rights reserved.
Classification: lcc - LD1780 2007
System ID: UFE0021789:00001

1706 F20110209_AACBMV hu_g_Page_117.txt
a715549195171839e94a4eba9b591f36
3a0515d5ff355d6498a9fb85ec762f193c709bbc
54917 F20110209_AACBNJ hu_g_Page_026.pro
d4441ccf8606e0203902383c9e8790dd
32facc3d7584e7877c027b35b26d35548dfd9ffd
64644 F20110209_AACCQM hu_g_Page_045.jpg
e46a5af744fd60d23f171d80878516b0
8950769d24f1d271959852d35f3525d8399a3b3d
32144 F20110209_AACCPY hu_g_Page_028.QC.jpg
2e606f7bcf2206fea221ad343b4286af
03216bcbaa69baa142e7c7ebe2e0d64f00ffcaf6
31190 F20110209_AACBMW hu_g_Page_170.QC.jpg
4221b72f67d734196d8738e71db64229
b8ab06027022f3a4f1afccef5cbe3d6987e1a38e
F20110209_AACBNK hu_g_Page_109.tif
f7b9acf3fa8b2a368213db5d308e033d
69a7645a4113bab43a82faa03001c431be1ee3d7
17137 F20110209_AACCRB hu_g_Page_065.QC.jpg
e9006222fc1744e90283f7d3abcf64c6
4cd5c770dafee4a485484d8e286de55ababa11cd
22891 F20110209_AACCQN hu_g_Page_047.QC.jpg
56adf8998fb58e7359a107192b612c3d
5c8129e487a8a2e4d7f3ffca81bad84ead4f4074
70171 F20110209_AACCPZ hu_g_Page_031.jpg
c1b799ba19bfb926d7bf3bc3a754550a
ba3dc5a8ca57676711f451b8555885754ca74ac3
21216 F20110209_AACBMX hu_g_Page_134.QC.jpg
abc9a3a0507502863b7b243aed4943a5
f4e0b5cc52aeaedc0a705a46cfb3d9c51b7d9250
55253 F20110209_AACBNL hu_g_Page_023.pro
a0d7c9f116ebb026fa0c85853272999e
7b7ebeb10dfdcb7fecd9a7f191029114b74285d2
47750 F20110209_AACCRC hu_g_Page_066.jpg
36fa19fbc014c3dccc2bb0349c11bee6
46d5bd95d8bb917326ca277985d642ac6b0094f4
79972 F20110209_AACCQO hu_g_Page_048.jpg
123990d9e0d3c0a6e5fb00bc08fc6559
84ea8851bc98a81f7ffe478d55612d50d72cb63c
104399 F20110209_AACBMY hu_g_Page_008.jp2
2f642b02cbe340cdf3a503db6e089b15
d1f977bf4267afcf7e3b82ec698172315be2795c
71559 F20110209_AACBOA hu_g_Page_054.jp2
eac975f595ba2fc6f0b5bf4bad92b96f
b419f8c05dd27ea4b959452032fcb9468ebb8538
F20110209_AACBNM hu_g_Page_123.tif
1c6b46a4742f8f11c7d99de5e7ddf425
0c5bfec175a26ea46addfc35975e8b40d8f0c29b
52737 F20110209_AACCRD hu_g_Page_067.jpg
d0664c3040f205be41dc0bedeb758429
94bd16e3fb9935160b7c1673a24218e0f2d32245
26106 F20110209_AACCQP hu_g_Page_048.QC.jpg
6f4e46097f38def436b53c646079c9c5
486236fc06c478af164a77578674b496de49b006
8395 F20110209_AACBMZ hu_g_Page_019thm.jpg
2ad2d526375a69adf9ae4943606fcfd7
9fc16c28e7803118f742a36f74a65c16c6abf971
8313 F20110209_AACBOB hu_g_Page_165thm.jpg
85c7bc4f435ae95827aab8865c51a959
ef53b549d2acfdc3b705d77b42792c11dec97637
48638 F20110209_AACBNN hu_g_Page_109.jpg
d1c005720ece8fb6f4c8de3567a8d2c5
81591a2c661735d39d21ffface05db66d2d4d579
73641 F20110209_AACCRE hu_g_Page_068.jpg
62921af1a87cd8f1952639edeae1e0f6
095db3de4a1147adb70d7eb9b708df252e9b2960
15349 F20110209_AACCQQ hu_g_Page_051.QC.jpg
2b5d3f67847624d2d85ae129863edbf9
9cec6e0ec370253038164cc5d8aa19b0bb7c9d87
28113 F20110209_AACBOC hu_g_Page_049.QC.jpg
142b9b0c26e352cdd153bd8dc40c77c9
a364f32c83d9c5ce932e7f3435d420ffb6f3a75f
1467 F20110209_AACBNO hu_g_Page_111.txt
f9ce1eda910c63720cae8dcf91967c82
2a01853ed423867d68c2304864e2112110db6fca
20289 F20110209_AACCRF hu_g_Page_068.QC.jpg
cd17752b9ca4dbaf692d6a598ca754e4
c78b0adc6ad2bf4c2f925d2d7b9733a16176f90d
70314 F20110209_AACCQR hu_g_Page_053.jpg
d9f2bdd0eda8407180f8fe9d901360aa
f6f47d3b09130208e33fb2957254c77fc4c0443c
79926 F20110209_AACBOD hu_g_Page_046.jp2
532399e081cf4ca0544541174d74ea2b
9361b5178551913965b78de9e2827d4c1000fed0
52645 F20110209_AACBNP hu_g_Page_170.pro
9075d74ef352a10963d9e694c6d7f810
32d2d318117e289978696f4c84543016e9847b05
56873 F20110209_AACCRG hu_g_Page_069.jpg
1905faa43fd1cc71df8b7dc90eda522f
10d6a7c17d6ff219955eb43579a96fa41334bd6d
22826 F20110209_AACCQS hu_g_Page_053.QC.jpg
95ecb8cc7d970418b2945dc71e7bf786
69c4f8bb0c13bb72162b8d1a2eae78709607eb46
22087 F20110209_AACBOE hu_g_Page_062.QC.jpg
f6614d99930ff376e2bdc1adf6f508cc
afcfd28162c75de263263d61d465e69a7d97711c
2327 F20110209_AACBNQ hu_g_Page_167.txt
6d812480826d4ca4d10dba944766b8a2
bf422833f80bfd76dee740b7e50b6fd48599eeec
14045 F20110209_AACCRH hu_g_Page_070.QC.jpg
fc82c7829fbde6dbabd57105a54e9eda
b617a2862207d5d4f5be8c701122aff059424284
23856 F20110209_AACCQT hu_g_Page_054.QC.jpg
3ac7398ffe28d921ce66b652b08bab62
1f5ab7c9ef31477eb14af7f2c47e0ccb42909521
31002 F20110209_AACBOF hu_g_Page_131.QC.jpg
011b69046393ad13ce193f08080aa4c9
accafe8a33293bf8e1cdd3a82c875a187d0da5ba
53295 F20110209_AACBNR hu_g_Page_014.pro
81e85c04c9f6b04773f986b32ed06bb8
e15eb9a3b996b9f61860ce1027a1bc7cfe4a2775
15296 F20110209_AACCRI hu_g_Page_071.QC.jpg
b9448f9af9b010d55c991fa4b3d15301
2ff425d88e46256dd9d0654fa2df4acff70c9ebb
105552 F20110209_AACCQU hu_g_Page_055.jpg
4c43be5094061d6d951a94a1418577f4
66ffc3bd221883c493927e6243d94cb2e47efe62
F20110209_AACBOG hu_g_Page_034.tif
f44ca5018bbabaeaef8884a2d3159dbf
4ea084159db0ffe8c04521cf80790415e76ee49d
31395 F20110209_AACBNS hu_g_Page_039.QC.jpg
0309217cdbf43f8b11a0f42d7b4c9b5f
4d39b4ab75a6a08b651f5141b75e25a71680fd31
45606 F20110209_AACCRJ hu_g_Page_073.jpg
b1dffb77d94d488bcccf89f4e146f8f1
537fb2715b9585edaae04d0ac807f044288339c5
32080 F20110209_AACCQV hu_g_Page_058.QC.jpg
9868a10a6b9cf944c6ade5df112ba04e
fe41e9f61f5658f00802b6410d99ba54ef23d8d0
85315 F20110209_AACBOH hu_g_Page_043.jpg
30f32599e1db3e19f11c6ba4f2c38673
d228dfb4e61e56e937cdaecd04d5663a17475385
F20110209_AACBNT hu_g_Page_145.tif
068bd307e1cd6fd5003ba63f19c12050
ca51e4f195f8f99f693037a3a56c49b2cd2da622
14927 F20110209_AACCRK hu_g_Page_073.QC.jpg
a3021eaa1c8678eaaad0d5ed57cfb637
0e6c489fae33b0257310491a7005cf6d5ad74d6f
33348 F20110209_AACCQW hu_g_Page_060.QC.jpg
590f5757b49ee89f743ac451aebb448b
f3c7c286cb336b6c3158948e53dc47498ddc017a
67380 F20110209_AACBOI hu_g_Page_110.jp2
fd8861305974b3cb0c0e2332fc2da90e
dc6afe27e046513184b29396e114b565439e4ec7
61310 F20110209_AACBNU hu_g_Page_146.jpg
a5c62568206cdc2db56b53cbfdf9cc70
eef0567f665b6eca27190e53677703b2482e17f8
20981 F20110209_AACCSA hu_g_Page_086.QC.jpg
22e5ab1bacc76c8ad1d7acb0a869a48e
85a9724a5594d70b2a17c3f3f91e41b58f33b2ed
22405 F20110209_AACCRL hu_g_Page_074.jpg
db21aa19d616e497d5a922ccf9a9a447
3858368669d6b2607fd5f5d742edd280ba8a3b41
17583 F20110209_AACCQX hu_g_Page_061.QC.jpg
ce02bcbb57f55d4582dd2bef05c9bc7c
714c96cb315253c78a73bef01ed03d7011943b39
35400 F20110209_AACBOJ hu_g_Page_016.QC.jpg
86399759d2e5edce2c6eb0b76af5fdf9
1d5bb83b974cdb845fe538d3ce8ac0d2ce151b13
71983 F20110209_AACBNV hu_g_Page_111.jp2
1920483a3dee03d0ff2cb9fc0d0e1bd4
9c4dbf57bb743557c682404609a85aacbea22eb0
19165 F20110209_AACCSB hu_g_Page_087.QC.jpg
c14a05b8c48675fd18c58057a1fcdb8e
45529a08e3773ae1faf7f99307dcfa38ee663660
7323 F20110209_AACCRM hu_g_Page_074.QC.jpg
f9d01aae3e06aa44714044ea91cd29c9
9d849db4ef441ea8ef91958b0577b2fcf273fa9b
14609 F20110209_AACCQY hu_g_Page_063.QC.jpg
e87c8a83ac90f4dcd2d57b815124fba5
3684fdf39603dbd5bf6c7985b08546dc7ff00b2f
3815 F20110209_AACBOK hu_g_Page_004thm.jpg
34418211031c6c8e66d001e227dacaf1
f31694219cc4689cea0f5c21b6207644fa89b061
F20110209_AACBNW hu_g_Page_048.tif
c2823365af499e8171d83af2179ee850
99a9a23c31afcce84431090ba03a6fd9a390815e
98191 F20110209_AACCRN hu_g_Page_075.jpg
a595875add609069d0b349f1cbf993a3
3cbb9e32ebeaa59903208d43c4a81d804f29d5ac
21940 F20110209_AACCQZ hu_g_Page_064.QC.jpg
51c283c808b080b7816a214113741c64
ffc240a68daef8560c9ea388cfc8d910a403eecc
764 F20110209_AACBOL hu_g_Page_146.txt
6066d22ebd7410ec043159a15d687f40
b3bd6e8c958bded28dc74c80f61364fe46c3e080
1041 F20110209_AACBNX hu_g_Page_159.txt
d8b9f4024d0f14256ad1b0d0416816a8
8be66ba3f0403ec4c86b433bac2b1fa30358217b
76448 F20110209_AACCSC hu_g_Page_089.jpg
22e5eb02f858892175076246271771a0
336282249044d3d0d0ba0c92520a13e9b42c82b4
35249 F20110209_AACCRO hu_g_Page_076.QC.jpg
877db12d85e5ee0cbfbf1a2194573712
430da0d0074f4584800e47bff6891fb2f2f25d94
F20110209_AACBPA hu_g_Page_029.tif
0a23b18bcdecdbbab74143302c4e4b6a
09430f641c0018dae94805c0ce65cf2ac2557ce1
2079 F20110209_AACBOM hu_g_Page_001thm.jpg
56815d1612fa3f0cf715e7c9a8478439
92210bbe6249eedf6907a0a41db620a22f276796
18080 F20110209_AACBNY hu_g_Page_072.QC.jpg
f32bbb29b6fa1a6f0bf601c86a534853
67fad96e6984e07a7543e8d30acacfd06c6d8368
25091 F20110209_AACCSD hu_g_Page_089.QC.jpg
6fef4a55639fde9c02c60da605eb36f0
0c562e9d4564a9b45ec6273a59f45958e9d72e8d
83692 F20110209_AACCRP hu_g_Page_077.jpg
828f929a47bcb790eb35330114b375f9
d2318a2b0632576780d7810f5f1dab47a8f56f57
112288 F20110209_AACBPB hu_g_Page_076.jp2
b0788bcd486523c8019e6b8fee4a2793
c52f00f63bb12c37bc7e8e564f792e7249a02e40
434 F20110209_AACBON hu_g_Page_002thm.jpg
6065729c7e0e80c58fec8278d4cc298d
8961b4f37c735ba559783137183fedc91a46f3f0
99670 F20110209_AACBNZ hu_g_Page_039.jpg
eadd86c9713e4390030a3e15344bb000
ae620a840f02ed8aae0689c60d181be7b2148760
62196 F20110209_AACCSE hu_g_Page_090.jpg
c2af8bd52eb80fee63af1c239e23d993
f9eb751b51dcb0723a9088f44bda567b997e82b0
33309 F20110209_AACCRQ hu_g_Page_078.QC.jpg
96c0adb771f71ffd3657dba559cee1dd
c15900e548de562838f39d14a90eb7a23b6f169f
7087 F20110209_AACBPC hu_g_Page_048thm.jpg
7444c2376ef5929143596588ddb7ce0a
022dd1390f6cd4e052e231923a4c97e588817e18
84462 F20110209_AACBOO hu_g_Page_139.jp2
3502f773a36febb1f6979076323758a4
d4433fe6414fca3d7a9bae80f397c95e20b5574c
70059 F20110209_AACCSF hu_g_Page_092.jpg
64710926107c6c34568709da10fa0796
d43e097767f2227c691edd2c9adb5104de9c11fb
53406 F20110209_AACCRR hu_g_Page_079.jpg
82a5d0685d0b28c0f5a0a5b8b2f8b1c0
f5cefdc7f54eab979f023b5221966611d746b341
677 F20110209_AACBPD hu_g_Page_072.txt
03b7d5c4321f3d522bd488735e36b03a
8c30ba74fc62e19aa11d4a7313d7f7120f34632d
F20110209_AACBOP hu_g_Page_044.tif
7dd80eabcb6d8c233083bcea3c1b988a
ebd068bec69fe8ae1b8340cd96f6b1be7b29d0c4
24160 F20110209_AACCSG hu_g_Page_092.QC.jpg
b85bb4cff7265b63c272723ce7752dc0
1d65db3f2ff9491844b74688522cafa168028f8e
16608 F20110209_AACCRS hu_g_Page_079.QC.jpg
7a71aff7254958fdc674e11ab680dd00
d42dbb0a22439d45bb88866f02105ad5c9f182dd
58510 F20110209_AACBPE hu_g_Page_087.jpg
5eadaff2744ea0e279315d134c4e86ef
f792013f6395d7a0a13889396b1c2be68fe60c59
F20110209_AACBOQ hu_g_Page_129.tif
dbe1d2893bb80e6173d6473a63bd0a92
fb194d5f38aaa8c1d8be35e624c07576bafa7622
50046 F20110209_AACCSH hu_g_Page_093.jpg
27e89b9b71fa63e54083c5cd8af0fdab
e0fde264f8c8637a59232f8df9d3ea39f6b0d689
54709 F20110209_AACCRT hu_g_Page_080.jpg
7e3f7815674d6f58c46c6491e807bb26
3ca7934745bb91f4ca28f314e590379dfa49b3c9
134134 F20110209_AACBPF hu_g_Page_006.jpg
fcef55b8f03227c75409946f4c5d26b3
68a1d214d08452fb83a1f0c24c07eb102db3d45d
116939 F20110209_AACBOR hu_g_Page_150.jp2
b91befec91e06d81730860d03fa1ce65
0f6cb793b94e7d101024ba7edb7aaa96c9f61188
81770 F20110209_AACCSI hu_g_Page_094.jpg
ae6a30bbe91db1ce9e1d1d50af09b168
e9cc264c9c3ddc886dfae070c59cb0881dc9a15f
18835 F20110209_AACCRU hu_g_Page_080.QC.jpg
a838df1c0d6d8539cd9239aa2938c655
02bc249bc5d652cbc6ac56aa7533683a4d900185
35934 F20110209_AACBPG hu_g_Page_046.pro
f68ea2fea2f2ce5616189c1d55cbe1a4
a4d55ae29df790bc2188f77311947f3b3428753c
20362 F20110209_AACBOS hu_g_Page_090.QC.jpg
9949fe2ea8d83cb4750a703679a4ece1
942ad18eb22c67edbb22c80718ec95a09bd8ca92
16210 F20110209_AACCSJ hu_g_Page_095.QC.jpg
25c360e3de7af9be0eab1af2c73f307b
0bd74678dc6b5b0084d1d2ec9522a5abe6fb262f
92717 F20110209_AACCRV hu_g_Page_081.jpg
b3197ac851beb28585aa3673107fcd36
efe503e2e74c9bd7ad6e7be466aa68eceab92ed3
F20110209_AACBPH hu_g_Page_028.tif
058d03fb6ee8959f2e8b5be8faf21350
0702668881202477bb55a1597a5ef899bdb47dea
4368 F20110209_AACBOT hu_g_Page_142thm.jpg
d068858d4bf2f1a86e3a70657bb2b09d
b4189f389496187b9ef91756b47a37bb52aa44c6
7689 F20110209_AACCSK hu_g_Page_097.QC.jpg
711c79828b785969493ab611dde04ae6
3816129fe5133c082873d1ec62b940ea1b469a5c
77092 F20110209_AACCRW hu_g_Page_082.jpg
1c07f23f79db04b4fad95e332fcd2675
c3b577e1ccff6139a2e8c6a02616404e8323bf0c
5643 F20110209_AACBPI hu_g_Page_111thm.jpg
934434079a71343ba0a7dc1c6409af7f
e7abb7bec8b5abda3b339f957fefef3e0aed1130
68498 F20110209_AACBOU hu_g_Page_054.jpg
631748cf53cc60a465d03dfe1602954b
2866c6e87be0105da9e261eba5fb88bbdf902d2a
62707 F20110209_AACCTA hu_g_Page_110.jpg
d8e4bb1625461232961dc4f102e99db1
aa8f99b234b4d7b7f85134da5caa94483b891ff7
85585 F20110209_AACCSL hu_g_Page_098.jpg
f3ba82cef0252651f795de6502f3c8cf
d8cdc4ba9e8ec3095cf8ccc13c6eee1e1846dae4
25239 F20110209_AACCRX hu_g_Page_082.QC.jpg
ad51ebc68c01b25bb12082cbfd51b0b3
3e0cbc3ba269a61a9ed61f5cfa30a382ff4fe876
57766 F20110209_AACBPJ hu_g_Page_038.jp2
31a803344a2439df409d744bcc876813
108502af738b3e1c94b0cb3c4d518da53d52c6f5
49237 F20110209_AACBOV hu_g_Page_149.pro
b7b44dc5ca6bfacbb24f532b1076d330
cd6955911d9763b3a0d9eb39f9668b779ded4be4
68432 F20110209_AACCTB hu_g_Page_111.jpg
fbff8190a82ba010d30512c43e38a53b
4d797ec37ab378f7bfd2dbe9b4c3908167535d12
27119 F20110209_AACCSM hu_g_Page_098.QC.jpg
84bd15cede3f50ba5ae6e05c77b7940f
b527dd9094fa2bd33f6d02b5889eada69459bb7f
22474 F20110209_AACCRY hu_g_Page_083.QC.jpg
b25be6861100261b10b97dd97fad7f0a
7a7281598c73310de5fe519479178d01fb095a8c
6983 F20110209_AACBOW hu_g_Page_049thm.jpg
48db79f7224c136f6068937ac7a57320
888d0f62ba32ef1734b7df5217982ecf146f3ff9
120378 F20110209_AACBPK hu_g_Page_018.jp2
2ac6be44d3463118da52dbe0c60a81ad
a8c17af7b53f507d1f135d831e737fdfe4c5b1de
28936 F20110209_AACCTC hu_g_Page_112.QC.jpg
019dff6b6847b6d60ae91d4227f99941
8f323854348ba20351af47304504c6b2e7203678
79595 F20110209_AACCSN hu_g_Page_099.jpg
e3ba9769a364b28c826b82440f9e3f54
1860e0f335c0b936f008d8bf0a1a1e998fdae979
25901 F20110209_AACCRZ hu_g_Page_084.QC.jpg
8d25aa1391f36abb4056b03c7c53c08b
29732785c4a23b51f5853034408dc93c13008db8
589258 F20110209_AACBOX hu_g_Page_095.jp2
81186a78355926d494d4d5db558eddcc
da7bc6948edce54453e4233c4fef3b27f4fcbe31
36683 F20110209_AACBQA hu_g_Page_124.pro
008b38601a161d6bb1cd9c675ae30803
c26c6e0721a817a52e406ad1bc764074ed8704b1
99863 F20110209_AACBPL hu_g_Page_149.jpg
6c0dd457b48ffc892ddccd8140996132
d8b44da249434ede7e0b08f7575a95d91293bd39
24664 F20110209_AACCSO hu_g_Page_101.QC.jpg
148737b240d65965bcb158339a6c2057
22eed2ee034069e812c4bea785a132aa486fad17
F20110209_AACBOY hu_g_Page_110.tif
2d3a9ac7b8517ac32b0e8b7e49ec896d
70886f78e991d68438265539aa68c36788c4ac12
148 F20110209_AACBPM hu_g_Page_008.txt
185d415a990acc06e60705696cb9434e
96a8fd47e75149918389a71d9ab8a4d18e1a76ac
65950 F20110209_AACCTD hu_g_Page_114.jpg
820895ada86c9186b8786fbdca6d8cf7
2be523b6180539bd7a3b2680cc617d3154389417
54264 F20110209_AACCSP hu_g_Page_102.jpg
e80df154f54a63a5ff761309d0514f12
fd8b0b51afdbf568dc1e5b4b77c2e67ed1df016e
F20110209_AACBOZ hu_g_Page_101.tif
5e17a6b98e8c43e118820cab3fae1558
f16092ee44729df38f6dcc8a277e424c0155e60c
13571 F20110209_AACBQB hu_g_Page_156.jp2
22d245d365da87c2a89fa4bc4e6798f6
cf6ce67374277bd7a50bc7b4037a8ae352debfe5
1008100 F20110209_AACBPN hu_g_Page_062.jp2
5afd35cb42b52f123c7fa3e7a13bf165
be98e97e1c2805395a5cc5062fda5c30d1650daf
24509 F20110209_AACCTE hu_g_Page_115.QC.jpg
0414dd4fe6d2c74054edc84c49f080a2
dc02512f57818d4f4d5fcaf4cf4a63a4ccf64168
16842 F20110209_AACCSQ hu_g_Page_102.QC.jpg
c921895932b34949ac124302c2b34bed
feffba95ccf40938c6db2dd3b146144eaf0463cc
1614 F20110209_AACBQC hu_g_Page_050.txt
880d68c0d322ef7ac2e8005b01f9c7c0
1ce9c46db71f9e1421a8afd8499addbce55a358e
6295 F20110209_AACBPO hu_g_Page_092thm.jpg
cbad1e9871efdd96d04f8c28e7c49202
a39af98dd1f0a9bf3033f7c2129aa09c8ffa6790
87275 F20110209_AACCTF hu_g_Page_116.jpg
5962e9ab86d873749d4406b7cd20b56c
749f2d2ca74c1542957419127e7e9c99b9c605ae
63339 F20110209_AACCSR hu_g_Page_103.jpg
9632c588d08fcdea12c118d4b49ff2af
806178863d78cadf9084f1216f3c569e821d2b09
32993 F20110209_AACBQD hu_g_Page_099.pro
9554cd3852132728166eb71670ef4786
2cdd65ecca29c328249d9a1810ee873b575ec6cf
100453 F20110209_AACBPP hu_g_Page_030.jp2
aa34d4c1ac634ed0f7b726e727c5e998
c96a66e2b0472fb860970e970ef562a362e4c041
29530 F20110209_AACCTG hu_g_Page_116.QC.jpg
67ce6116320676cd4b954edc2f44355e
cab487860c6fac7e127c9c695f844919384057f2
21177 F20110209_AACCSS hu_g_Page_103.QC.jpg
44235c226fea86f72c3b34578a6f810c
58530765e1112e3e846789eb414d406dff9983db
114010 F20110209_AACBQE hu_g_Page_170.jp2
abf1a2bbb7e64f1e63e353337cbc9e74
ca3e5ea0ff50e02ff43707c115ca7de65e41d183
45645 F20110209_AACBPQ hu_g_Page_085.pro
b4833b09b462928ca73415272bd377c8
aea1fc8dce2fd2c31d6c859fc2604ad06a35e804
23113 F20110209_AACCTH hu_g_Page_117.QC.jpg
ed84f7470320aaa683b661084869b348
37937463689b7a9c9c52884d230c100359dd8f34
77192 F20110209_AACCST hu_g_Page_104.jpg
103d0213fdc8816328fdc1e9058f33e4
56b1a77ac22a9246674dd516188305dbddaad169
4061 F20110209_AACBQF hu_g_Page_029thm.jpg
cb21bdeadc7fb4bbf0a0df12f40687a3
c41c88d0871114573391000d9419a62c2b415ae3
21441 F20110209_AACBPR hu_g_Page_097.jpg
d9adc710fb512b587050b85a15449d1e
27d6fb434096f6c557d7a45520582c25df0d8167
46759 F20110209_AACCTI hu_g_Page_119.jpg
67a6f378e98bab5f6793893ed40797cf
ac0db286249c113c0de96f47fa51a58bd62aa230
24797 F20110209_AACCSU hu_g_Page_104.QC.jpg
1a9bb1d65be7d0eb4efc4b50725d363e
eff0f3be6a98f6337ef2d950632b478331f7234d
96443 F20110209_AACBQG hu_g_Page_085.jp2
19b82dea8114f31cfe49a7ca2bc613f2
3b71a14cbb16c31cec399b258e3493637c1188ff
7616 F20110209_AACBPS hu_g_Page_001.pro
8d73cb5c247e06545df781805ad43efa
c7ef5c9f508a0f8b7ed213379ad98241ca1947c1
14615 F20110209_AACCTJ hu_g_Page_119.QC.jpg
346ecca815f2b04c4831c26b77c8d522
f4295bf12674b4456f644f5734b48fb748b00be6
81330 F20110209_AACCSV hu_g_Page_106.jpg
d4818b80627f10b205a321d0287f209d
1ed3bac2728155f79bf76cca0fb0afbeb38e1186
25992 F20110209_AACBQH hu_g_Page_106.QC.jpg
32ddfd877d74210d662a406e38c2e198
2a7b74b1f2c5f4d968e0bd40e3400a07f288adc9
22395 F20110209_AACBPT hu_g_Page_107.QC.jpg
784cb1e4b11268c093382976dc3d14cf
29d47a074486a2c6ee83c39aac1be2fee66af844
72577 F20110209_AACCTK hu_g_Page_120.jpg
3ea3a043b95ad5a3a3a36d8144e50e1f
476453a0a4af2467c30726f3e5de9060e91e6ca3
70347 F20110209_AACCSW hu_g_Page_107.jpg
471b29a0d097edae13f1e565ed4846d2
fb460345fdc6ab9eb7b728634be9edfea5a3a92b
74044 F20110209_AACBQI hu_g_Page_117.jp2
e158f73e5845304c30a17686b627b674
55405746c1daeb53e3bd7ae3232f8bd917e0aadf
F20110209_AACBPU hu_g_Page_054.tif
011e6df49d0ea00cadf80217ffb8634b
4dcd997021a6ee4c4d9f8dd7b735f46a2be5d49a
87579 F20110209_AACCUA hu_g_Page_141.jpg
b9d2c83d57f8df0fe88d306b755ab9ec
39dea26b788b6b2ee9c39319b72dc4bb72db64f0
23467 F20110209_AACCTL hu_g_Page_120.QC.jpg
f46debdf01a270938d6ef76c973af070
df723e5cdcf19494c4b43271816edb95d6b25669
66805 F20110209_AACCSX hu_g_Page_108.jpg
a3a1d54d103a4c430fdd6a5ad77590b9
7988dcac51671e5bb7e8e4d5818cc9c609c444d1
F20110209_AACBQJ hu_g_Page_086.tif
8f4a7538bcc3183eb0250fb2794372f5
e3134f8d4df6865f279deef4922438ee93abab9d
F20110209_AACBPV hu_g_Page_017.tif
2c797deac21e51e54ed9880671c43cd6
05c7c508f0d1bf79d2eb4bd32ade26758e43625a
45851 F20110209_AACCUB hu_g_Page_142.jpg
3428ab5ae421ecbfa3ca88f9f6d2993e
59c760cb5388016a5e87442897a1ed1b4d948982
20332 F20110209_AACCTM hu_g_Page_121.QC.jpg
ea5600eca771da4f265fe713888cf522
19e18f84fd46a8f7a3251b83e0b9fe4a2f7fcdfd
21250 F20110209_AACCSY hu_g_Page_108.QC.jpg
d3b0d7b0cdfd9a84358cc028ca557c98
2cc20ddd071dc9930862dc8f0d315befc23cfea8
46245 F20110209_AACBQK hu_g_Page_154.jpg
8a5c3a1e0954f835f530e6706dc2e673
ccee157d203c4d64591507daae941c89418ede39
2096 F20110209_AACBPW hu_g_Page_014.txt
957d9c3e9d9e1434f8516504986403e0
0e064c190c3279016bd3a148a0c281cbca4cd199
16585 F20110209_AACCUC hu_g_Page_142.QC.jpg
d90a4c7120b06a87fd758d2f14d27263
a95dd70486c0ffb3f9120b002d79d3fde21d0dc2
59747 F20110209_AACCTN hu_g_Page_122.jpg
be52fdaec7f42fe19d0f8ccb6834f29d
d1efec56873f92e47b1abd8041dbb4344a54bb08
16814 F20110209_AACCSZ hu_g_Page_109.QC.jpg
94950c457e722f74ab71ae129b748d8f
75cfeeff805e62b04c5d990d6d30dea703fdcd1b
100603 F20110209_AACBRA hu_g_Page_028.jpg
9302715b4f6f154634059e2d62e79034
0b0321d61d10cbed8e2d0de57b5968cb15387b51
436 F20110209_AACBQL hu_g_Page_056.txt
728c3326eca81e5ddddb17463a27ac71
f31c265514343f9e9aa6ef0778205101df5330a2
1969 F20110209_AACBPX hu_g_Page_113.txt
776e0ac7cd370d8a377013d6ee89697e
3aa4fff5841a087a7ac0f4e000e569b2aae12f9e
58318 F20110209_AACCUD hu_g_Page_143.jpg
55dda83e18ddbb89723d786a4dae5cc5
0d727b332ef9f33c8e258e4ed814ea1807107ef3
60266 F20110209_AACCTO hu_g_Page_123.jpg
fdd0acc54c7fd3f0ed49f0d63e0c9899
2ee2c71ddb1e11ea979937b295a1e1ed112e6c24
F20110209_AACBRB hu_g_Page_125.tif
e4cee190b9a6c6d7c4adea3abaca3524
db00553c00e73f02331ff00950e6d46f18843a47
22919 F20110209_AACBQM hu_g_Page_001.jp2
3fd84b060cce2a5fc2268d905a2b285d
9908a29b35b48edaa1cc94130919e0f143c2b89f
1300 F20110209_AACBPY hu_g_Page_119.txt
a6cee6108d6404a1c7055df8e8363804
52606d2c8eae39b736c2f3fa553819a2dea943b7
77686 F20110209_AACCTP hu_g_Page_124.jpg
317de67d4c4b6401909443791e867afa
71f964551589b34f8c788dacf762ce726d26e068
33070 F20110209_AACBQN hu_g_Page_127.jpg
91dee2eb151ec1d9940773daafaa89c0
e4bc6843855c1fdb2f4059bd34e2583ac9c250af
739162 F20110209_AACBPZ hu_g_Page_147.jp2
e6119ae5846aa21215834a1d36210158
d279329de80b318bcff4fcc738b9a561460844ce
72430 F20110209_AACCUE hu_g_Page_144.jpg
783de052197737cb61ac9258bc29c931
896302b80243352bae4bc244b8e4402080004729
22308 F20110209_AACCTQ hu_g_Page_125.QC.jpg
e77885139f38890c412b7c40c867c923
c73d1a4276d1555159f13e72c9cb764cdb8e89e4
6673 F20110209_AACBRC hu_g_Page_098thm.jpg
29ca565096aa8eb45c03294093c945ea
dc719e425d30bcb033f61ea1411bf892246704bb
393630 F20110209_AACBQO hu_g_Page_130.jp2
04fffa314e00e73c58a237161e7783df
911e943ec3a0104ff7a2ed425e6c36a25f9f577f
7306 F20110209_AACDAA hu_g_Page_149thm.jpg
afa9513928b0436c2232205a4e15c898
49b1ee6c9221a5588732d93b6a6df598b1e77073
22643 F20110209_AACCUF hu_g_Page_144.QC.jpg
950c748a8d2104682de615b28d96f57d
36aec665e2df579520439f2700da992f22148cec
23743 F20110209_AACCTR hu_g_Page_126.QC.jpg
68555dc47e7306b640e4d08ea835e497
73bb009823c9ca8ea8f62f41cfe3989eaa3274a0
5561 F20110209_AACBRD hu_g_Page_062thm.jpg
461285c154d4233c49d416df79ffff52
9c0a9bfb8827129bdf53d028aae9eb29271f6f92
1059 F20110209_AACBQP hu_g_Page_061.txt
059bde518277c5022cd50fbb6e515ab5
b59ee5e1617f2b6a4813bbbe7833da1e67c2095b
5180 F20110209_AACDAB hu_g_Page_151thm.jpg
80ed1eddc86aeb73734204e82df2d922
938e3748fe7b8eb5b926c77cf46c31ea2c54e1b5
68468 F20110209_AACCUG hu_g_Page_147.jpg
4b4df2d242b8199ba40114076116f849
5fdd8b0b3c85a41b4f1847e24001eac10e1a1227
11464 F20110209_AACCTS hu_g_Page_127.QC.jpg
a5e9c5787b13b8147c4f36dad5168223
a58c9ffafa51175a3c75f77688ee351ea05b50a1
6551 F20110209_AACBRE hu_g_Page_106thm.jpg
0e451864712fed080665c689378e98c4
75705c60107187c66d0b4fde7de4c7a0ae62d832
23227 F20110209_AACBQQ hu_g_Page_094.QC.jpg
012a1664c87c79467f2005e8519c7b10
fc7eb1ee7ac9f92b7fee562a4a388f3869d27902
5139 F20110209_AACDAC hu_g_Page_152thm.jpg
1cc07eb888e4f89aee927d916110bc3c
4915c16099db37b221c7f301bc8da4e6dc76200a
44859 F20110209_AACCUH hu_g_Page_148.jpg
f9901dc9dd5170a6bd6ffd3014412514
876907a20e52398ec2c4eb50971523bc485aea50
71150 F20110209_AACCTT hu_g_Page_129.jpg
b19919c98b80be6e65d2a9ddf2c26b66
d1348ae5552dead2081f7ef83965f8312c359209
79446 F20110209_AACBRF hu_g_Page_124.jp2
1ca20590f3c852b5d60ffc46977c9ea2
f3a2b083b978fa9598c4b8a69d467a4684c20a37
120961 F20110209_AACBQR hu_g_Page_165.jpg
401a12e4bc3500d2b28c756d58187a18
5ad1fda62e0f92d1115c7408ec8f91bba284396b
2852 F20110209_AACDAD hu_g_Page_153thm.jpg
07f1436ea333e6f40b3b066f8d8fae18
aa2bbba86c9eaf3520f48ea93f8c31a7de373179
32254 F20110209_AACCUI hu_g_Page_149.QC.jpg
c5dc3a3ea0ce7131a864d060e3a4c1d8
bd0897bfefcfa67cd678278d08b0e84c67a0826c
43503 F20110209_AACCTU hu_g_Page_130.jpg
1ab79fd50f6cd502ecb3fa07b8b2681d
5d666ac9b267b14f255783e5a4404f3ea96c2384
758518 F20110209_AACBRG hu_g_Page_056.jp2
4007d30c102152d761c6d15ae93a3a0c
bbc5dd3f56586a3c2e2b34bc2e256a402293a3d4
20456 F20110209_AACBQS hu_g_Page_119.pro
00ff201b673ecd9fb907c81c8d4960fb
f018a9771806153454e38d1c70ee18dc852ffdc2
4673 F20110209_AACDAE hu_g_Page_154thm.jpg
0616548d9c826b9c5b9f5b0cc01f3610
5648e37e41571e49f2e0b88bbf7ef179c0bc441b
113402 F20110209_AACCUJ hu_g_Page_150.jpg
731df9010735110df41cfdef5b342248
ac82e5955e833b649d850ebb591cc7d3289226cf
F20110209_AACCTV hu_g_Page_130.QC.jpg
d086d434ba8c69449609e6b098d24e3b
d6d26cd3b76fb893c36edb04e6e688ce89aec6c2
25749 F20110209_AACBRH hu_g_Page_061.pro
a15d62f0cbbe5675a418ac293471ea6a
6b761a12e176c08b987871d390261e14abe5fb1d
20175 F20110209_AACBQT hu_g_Page_056.QC.jpg
0173912cafbda182a1c4488b8fa656a0
47f66325d2ac0d4b3885a511456ab18b60c40908
4251 F20110209_AACDAF hu_g_Page_155thm.jpg
c31269e05d8ae9943e1bbf16ec3055ad
42cdfa36076b591eb965d7eeedd23793a95eb288
36305 F20110209_AACCUK hu_g_Page_150.QC.jpg
ef440fd953bc78c7a637e8988bb33bf7
259e3e11a3ec214179f969d536a47bc77efab2b3
68840 F20110209_AACCTW hu_g_Page_132.jpg
b04745b5fe4d3b3383d88e325cf2c584
da75f2ef82438a458297fd4234c79e84a400b389
4579 F20110209_AACBRI hu_g_Page_037thm.jpg
0f747db231c0f130f58487755c2e8f6f
becde8aca12592a8f85fcc359fd2ed9e1ecc49c5
1051970 F20110209_AACBQU hu_g_Page_058.jp2
f62938e286bde1ad51ca4dbee620fd3f
142117543c094c349242a7e3b2422382a6340dc8
1349 F20110209_AACDAG hu_g_Page_156thm.jpg
aff47a458a0f8b55990c51ecd2573cd5
6eec6bd69ac8205300f7137ddf3cf3b3106cd36e
127929 F20110209_AACCVA hu_g_Page_166.jpg
884cd3c8279d7816eccd4e0a9b736aac
40db2c73f0ff09523d6836a858add24ab30d2c4a
70172 F20110209_AACCUL hu_g_Page_151.jpg
599fbaacf0af1009c8b5076651e5e475
abb5e5b5ade458b900da1268ac5c6a6de9c3f0d0
22841 F20110209_AACCTX hu_g_Page_135.QC.jpg
dc687c46d948d56118951c80ef75eac4
1753d296a1f9f37dba63751480843c9526f4ec8a
4420 F20110209_AACBRJ hu_g_Page_130thm.jpg
463b52a1e48a681b375ad56648bf9dca
dca25d7db9f8bf12ce8ad95c51de023e9c25dcda
104127 F20110209_AACBQV hu_g_Page_078.jp2
077e7b155aad778812d3a30e89c8df64
15327c82302aff7f65a247429bd73f0045d30a45
4057 F20110209_AACDAH hu_g_Page_159thm.jpg
c1d8cce2211c859e243e3a7c18c1d6f2
86fcd667b4acad412846c566cfc6cfd2682de5a3
36060 F20110209_AACCVB hu_g_Page_169.QC.jpg
5a6770f060360670125e05c35a39ef41
60436b5e4ae0665d695b6c19c12576762f9b3d14
52128 F20110209_AACCUM hu_g_Page_152.jpg
a5fd89ec620809ff441188d0ad71f23b
02600f8bf98bbb299a6ebcc9b7372bfeebc76e01
62485 F20110209_AACCTY hu_g_Page_136.jpg
29a913b70ff2ff528a5735f0c3f46228
0ddc9e7431339ff8174a12bf89bd66f45d695ed0
83636 F20110209_AACBRK hu_g_Page_084.jp2
801ee306f5bf5f5e59a1c7c4e1839173
fff355da437d1a2a3a145bc27181d621dad18940
F20110209_AACBQW hu_g_Page_032.tif
d2fc152466e3b88c5b45c11105f823a2
5b9ec57e09670770c88858496a6d82af59bd209b
2480 F20110209_AACDAI hu_g_Page_160thm.jpg
2b5db3813726b7947003134e305f76bd
114646cc9ecdb74f4e4d7aba12d45077e93adbb7
108966 F20110209_AACCVC hu_g_Page_170.jpg
5af1fcd15da3626ec19ab9beeff0800b
ba0c2699650b1278ff46ba23375b44c941ea0b2a
33300 F20110209_AACCUN hu_g_Page_153.jpg
088a3b9d83ed665ba1bec81a58a75a94
a84655ea5c8a7f39fdd1ade5f6ed995d3ff51fe1
68703 F20110209_AACCTZ hu_g_Page_140.jpg
281687d8bb3b4b96cd2aad29258613c7
5fc082004c5400d5a383d04862db51f0941a8a30
6051 F20110209_AACBRL hu_g_Page_144thm.jpg
f10ca157dd08caaf86255bb9fdbc398e
7c48e7c6f757aea28f3ae23ea0ccf65b9b51d726
28062 F20110209_AACBQX hu_g_Page_123.pro
1bb1ac3ba1a0b49daa2de5259786d1a0
665d0ca56cbfedbd4a3ca8b756abd01ef90157d7
35269 F20110209_AACBSA hu_g_Page_035.pro
b05c7db5f8f22524f43df0bd51b096ee
d75450f6e4911a01860629fd0a1949ab7b4dbb3f
7351 F20110209_AACDAJ hu_g_Page_163thm.jpg
7923f7a100751257a5a27d007d117c69
6ff2d5fe080f89da8e8ff52ea37732db36aab45a
30054 F20110209_AACCVD hu_g_Page_171.jpg
df46ab8976edf397ed6e22eff6a1bc8a
1017bd21eb2a6db84b88d08ab9a7b81c7ac6bb08
13694 F20110209_AACCUO hu_g_Page_155.QC.jpg
e605fd58e4e9555dc96be37d1c8a522f
22490635fd82020133fee9aaa83e5dc24fb2af2e
20758 F20110209_AACBRM hu_g_Page_138.QC.jpg
cba4f948d622d0774d02537dd059e113
e4d9db0aa4fdf6de6a6845991b9cd8cd97d7d729
72522 F20110209_AACBQY hu_g_Page_135.jp2
fe9db25cf518179562ae0703391f55f0
957ffdd0985c3d5eec8f5351c5b6c124ccb2f9ab
1737 F20110209_AACBSB hu_g_Page_108.txt
f64ea223cd37516943a473577031c625
0eb51963b7a53978f50283935650cc1c7e8b4689
8617 F20110209_AACDAK hu_g_Page_164thm.jpg
593336bfe5e772e30c24502e58a56988
1523fe6692f21fb0707124342d63d40d83285494
9984 F20110209_AACCVE hu_g_Page_171.QC.jpg
c192daaf64bd554fd40e92e91f14cbd6
7f9a1dcbfed47f0d619154a876c60024bdc3b667
11545 F20110209_AACCUP hu_g_Page_156.jpg
e750c0017760406aeb64b82d44853109
a872dc012f36d940b016add878869e95da53c89b
79854 F20110209_AACBRN hu_g_Page_115.jp2
ae48de0ea4795ad2db8ede93719e27f3
62cf4ffbe7aad10d6d010a9961240da71350a18c
77298 F20110209_AACBQZ hu_g_Page_101.jpg
922ec03cadf88464eb2415ed6f152184
af31de455611a8c0be6c493aa1416471e73b4cb2
F20110209_AACBSC hu_g_Page_065.tif
f6b9ac1e6294e02cce5ab54a6c228283
b0687c8b126a9cf20c63cf28106b706ef572f4ab
8586 F20110209_AACDAL hu_g_Page_166thm.jpg
9d1c32aa5e4366ed8bf4a3f40aa039f2
8eea6614754825dcd61cc262b9ee43385855c95d
3940 F20110209_AACCUQ hu_g_Page_156.QC.jpg
b694353b93a90a998d3681b340e257c8
fe118847af3f394bbc296737b033c9894930eae7
F20110209_AACBRO hu_g_Page_100.tif
8df084d0e5647a9facc65d041baad76f
f733c5899e475faa32631120cecb4a755c8e8488
8722 F20110209_AACDAM hu_g_Page_169thm.jpg
264fd7f14312617a2e9f2ad5e3fd0320
02e387a0272be8e51eeb36274ca98068adf39a39
1051938 F20110209_AACCVF hu_g_Page_007.jp2
69cf7d6900c548a81f23feea6ca7bfab
c23e514867a8e4f3a983234aa1d8b6f1e79b2816
36582 F20110209_AACCUR hu_g_Page_157.jpg
2afcd880bad7fbdc009f316b63b82935
afe225a59ecff58c769eb6a95ec0f755a2728399
4674 F20110209_AACBRP hu_g_Page_102thm.jpg
01f66d5cc4a28f65a043e9406f017211
3d2cbb8496a7598c9c854eab2e585626fcbf7882
3498 F20110209_AACBSD hu_g_Page_127thm.jpg
2cae218eb0524d496b5e3dcb418d47af
b30115a5d33d0c337dde029ebbb790d81f91a2e5
2580 F20110209_AACDAN hu_g_Page_171thm.jpg
2a4557c40e75fecc31a9c8a8fea82000
314a127fd4c5cc12ac4b2fc404054ac798331df7
F20110209_AACCVG hu_g_Page_011.jp2
936f7eb33954d97ce17026743a7aca6a
d7646d34c99581db4856ed4bf0b9a8540983c248
61844 F20110209_AACCUS hu_g_Page_158.jpg
74105f4722f5432fb33ae31f5e6ce3dd
14fe7fecc6bb3e2cec1e5d5bc57a76764c4cface
2445 F20110209_AACBRQ hu_g_Page_168.txt
f87f4c88c67c126c6f5df28bef11d174
4ce6c1e61ccfd397a2fa9b7522eccb9bfd27c276
76004 F20110209_AACBSE hu_g_Page_115.jpg
311b0c58615cffb20280d7ee19eca484
66362ff5a96aa65924e1ba606b854cc9ee7b1a46
196501 F20110209_AACDAO UFE0021789_00001.mets FULL
b015c0ff691bf1ef6caeca804265ae3f
eb3d069c201b79b6dd59acbf60e9edabfe20d843
1051977 F20110209_AACCVH hu_g_Page_012.jp2
6dc1003cd251d09be92159e5994dd004
08f6219e50e5bc7dc389ec6106a57dfc6702651c
36734 F20110209_AACCUT hu_g_Page_159.jpg
e73d1032b4bbf3a566fb26205c83fa07
6c7b4c64a56cb6c0993c87a02a1bb7c92aa34fbf
6621 F20110209_AACBRR hu_g_Page_094thm.jpg
a126e550449ac4cadfac63edbf3cb7df
41f23591966da26c052959c7ca62f15703ebc9d4
30940 F20110209_AACBSF hu_g_Page_114.pro
c7e46937b5424f18063ec9fd5ddf7e10
ad4f5489e0a117356c6498ae7cb5df8873daeb78
118213 F20110209_AACCVI hu_g_Page_017.jp2
a4f4a6ba1d43261c94af7146e683a1bf
564574344c8ecdf1ab98db4097168928cb8efce3
57174 F20110209_AACCUU hu_g_Page_162.jpg
788572ff8d12b5ba8d42167fc9391bad
ede93ca674bfd5e9a8c16662491fbe4401047583
55699 F20110209_AACBRS hu_g_Page_025.pro
bb43f7f6fca105a01e5dcbf1a019008f
089206d1c62def64e287beebef48b01a75bff492
F20110209_AACBSG hu_g_Page_156.tif
cc8444d1137574ee9ea2c454ec18f131
b20aee1b7d255ff3f0aefa7d0f6b3e127c30f227
113655 F20110209_AACCVJ hu_g_Page_022.jp2
e71919b23694a0fcdcb36c5ccf9c5188
342369fdc24be86c54f76bd0ee88926e0eab896b
18574 F20110209_AACCUV hu_g_Page_162.QC.jpg
aaab40a23b590869a7b6df87f065bd7b
106b1b2b54eceb547165e6db8750f867726ed2f7
59500 F20110209_AACBRT hu_g_Page_091.jpg
74df319a1e453463dce26517f36ddce9
07f3a8824b54a3d7b7637ad108ac41ec990c1f05
1149 F20110209_AACBSH hu_g_Page_079.txt
663bb60edf6e53be486c307dcee711a8
d8ec8e286dc582b5b684b4ff723b1a802375bde2
117539 F20110209_AACCVK hu_g_Page_025.jp2
30a7f38999f33718dc60b1d8f20a5e47
15d6a73734811d26917ef831ce330c61475b313b
106364 F20110209_AACCUW hu_g_Page_163.jpg
cd4d14e67bb2d36652345ef094969c75
dca8b22906de3384408900ca13980e6dd694ae68
4891 F20110209_AACBRU hu_g_Page_109thm.jpg
c9213f5a97b58c36b11dcf3f3a336117
7928012695619f5882e186289ccdfc2f4dc6c8f5
40783 F20110209_AACBSI hu_g_Page_159.jp2
231dae4810464f0693be9b4f22a56a81
8b99a32f4d816c508afb3cb7c47d8215b5834c8f
882443 F20110209_AACCWA hu_g_Page_068.jp2
27253dacc3dbd328fae9baea54dfa92d
b4f81e10cb92cb087d4d79a5b9d24fd9b930db63
116091 F20110209_AACCVL hu_g_Page_026.jp2
5488012b68df6b35d23709427ba376a2
5840bca64627ed24e0fb1725e245ca951c887610
127734 F20110209_AACCUX hu_g_Page_164.jpg
c37907549d896010c79bde20bab6aa0e
1ec5ea9d9b09a37c52f46302b4d60ee7d674faf6
79271 F20110209_AACBRV hu_g_Page_064.jpg
58dea9b998f56133a11c431e107c3287
a1e682f2683c8f6213fae0db41a1e2da0c3f19f3
21324 F20110209_AACBSJ hu_g_Page_042.QC.jpg
07f5cbc9a02c4d9494a93016a95127a9
4af06d2146c971ab8479f8b23470a08d42d83929
632871 F20110209_AACCWB hu_g_Page_069.jp2
0aeb717760f5282db8f34134339a6b93
5feb0fc17f071619dd509130a582230291d6352a
54931 F20110209_AACCVM hu_g_Page_029.jp2
1e2d9d29baf44fe83064dbfd9929878c
5eef2d55d22e304d0b495abbebaf1153cf5185e8
36089 F20110209_AACCUY hu_g_Page_164.QC.jpg
dc21e1cb83ab9549fce020ef14d2c1fe
ebd0ec8831bd155e5a507728c13caa14b39ca7e8
1096 F20110209_AACBRW hu_g_Page_068.txt
b978d68dd7f8b577b11bd7c3bb63884d
cd4d8d978ae406f3ccdb0760dea7e6d924adad19
15711 F20110209_AACBSK hu_g_Page_154.QC.jpg
2e4f1753591f0174616f34152ea609d8
da8465765e13af8762ce61a5558777c852483894
101566 F20110209_AACCWC hu_g_Page_075.jp2
2a5661212cc51d83c710e1a16b6eca3f
59712fa3bd9298e7c8ccaf54c4fabb8b5c28b0e1
68234 F20110209_AACCVN hu_g_Page_034.jp2
4fc905d89b74be5e50b0cd4a7162504a
6aad9aedefd64ec6560535c1bd6abdb5976bbef9
34480 F20110209_AACCUZ hu_g_Page_165.QC.jpg
88e80cc77d8872b8677c73632856d185
a4e8b8bf1bda89a3f962fc27e79c3a8334e2656e
31718 F20110209_AACBRX hu_g_Page_160.jpg
9a159f6b4c6ba8bdca50866abd954aae
5fee3291a9c558c634409e6660bfbf2d944e5c3f
8022 F20110209_AACBTA hu_g_Page_076thm.jpg
acd94a52e48331eeae114f340cdf1511
bf40efcbe34fe9a762d8ba83e43634eabdb379bb
19332 F20110209_AACBSL hu_g_Page_158.QC.jpg
3783198e0a9ccaee1b67e2a98d1ff2f7
f7ff3b8548ae8c1bea4f95233fcbb2455db1171f
80380 F20110209_AACCWD hu_g_Page_082.jp2
761ecb9d26f46214314b953a33cbbe8c
e4a0b9a671b14397f0f056dda4f1609d7a476198
75175 F20110209_AACCVO hu_g_Page_035.jp2
e719e8ccfa21bac8bbb8d381be4b93db
a8880a62e09d0cc3b26f310988f5413bcea67ad2
21394 F20110209_AACBRY hu_g_Page_045.QC.jpg
127da636601e6bbba1f19a7bf81e2e79
b7e10a801a4266d978c9da4a861ce9c6f38af4c6
466 F20110209_AACBTB hu_g_Page_001.txt
23d9e3d03ad518f5bb839e870906bb92
353a8186b0a6209f4fc32409a3f8b6af4fa8d556
36950 F20110209_AACBSM hu_g_Page_006.QC.jpg
f75660839a4e603fda3971e09d48d643
0d1748ab672dd6a0032418a3f034baef079a6ea7
68956 F20110209_AACCWE hu_g_Page_083.jp2
629f4bd4b49582b55b47f3ab897c9b3a
42a43aaef3a202a65a6a8bf04c8405e8ff130553
84819 F20110209_AACCVP hu_g_Page_048.jp2
706d26908fa91f1db4ef4bb7af5cbd45
d47cbd9e0effc7105dcf6a68cbdb73c5a46315f7
32783 F20110209_AACBRZ hu_g_Page_133.pro
8292f7cf18bf5d5e36c81845fa443b8d
57a38332c68b7d5181ef48f2940c36b883bc96d0
91733 F20110209_AACBTC hu_g_Page_112.jpg
a9c5244271e8b0055c7dbd057d7dc8cc
ca8515e1718384f7b04945d38bb580f0d0414b7b
94685 F20110209_AACBSN hu_g_Page_081.jp2
db03bfa0bdb9d53b1bff439ed516abfc
0543128b9da7b4407ea6b679660e9bd6c4a08f7e
60055 F20110209_AACCWF hu_g_Page_087.jp2
23f8ccf7a97c2ca06fc8ab4d4479e218
05bf4f393de783956bad11e1cdd9fbf5a7614a18
89793 F20110209_AACCVQ hu_g_Page_049.jp2
011917a81121ff8bd26be1edeae01ea4
b97f77dd7bb5cce89b896984c7ba7d5d83f6a15a
57840 F20110209_AACBTD hu_g_Page_088.jpg
98679039bc0dd763f1a3766f8ff7fac2
6ae99573fb7baf734f3a1a1907f4b5036453cd43
18306 F20110209_AACBSO hu_g_Page_146.QC.jpg
cd69be99fe7cdb7cdc91e91e495d01cc
c1c86e490cdba3693fee3f29ee313363bb5bbf7e
72485 F20110209_AACCVR hu_g_Page_050.jp2
f2f1b7dc41cfed2188daccd8ff8d3c0d
be8c3700fcf3b6ca5ab2a2ac6d6e746a378a2b14
1546 F20110209_AACBSP hu_g_Page_053.txt
aa2e243886097a816c00fb4f205f0f22
effe04f9a3adf91e295683a5f0e2c390240c06a3
61983 F20110209_AACCWG hu_g_Page_088.jp2
fac64fc625e0619320414f050f3cdb6a
44747652f76b108117a1ea13b6041ef4c708dc54
48824 F20110209_AACCVS hu_g_Page_051.jp2
533ee7332d4a2b01c6eb5560ed8f64ea
cd0329e1990448ad1b6d8d2f4b24ed6d4966c9f0
F20110209_AACBTE hu_g_Page_131.tif
f39db130b41525be703645f8be6170da
1f623e10e468da05c725a83f28fd9880a8cc905d
7150 F20110209_AACBSQ hu_g_Page_116thm.jpg
00e745523d0560413202b5e458940f2a
6d0e0b57730d05a271f92cb8841448f5d06b5efb
60463 F20110209_AACCWH hu_g_Page_091.jp2
34f993ed2aa8d5edc1182d491e56b8c4
b703bc20344504f2ed05790110161869b20dafc4
70371 F20110209_AACCVT hu_g_Page_052.jp2
96a3fe85b2bf38109e13c38bc873fc0d
242fd7599e445081844e889cb083249d29288d12
48620 F20110209_AACBTF hu_g_Page_015.pro
795588254a1e63557d4b7832655faee3
31579b652bfaa82aae74bbded6f5ad78b4c3b709
F20110209_AACBSR hu_g_Page_153.tif
689f4f3238d288faaa169a9b2aaf0163
505a3f7744cb70d30f3651d23570728a2b3ca57d
71143 F20110209_AACCWI hu_g_Page_092.jp2
fcacc2104b3c7bb6680d8fda9dbea270
492556455021887add0830435cabdeb77f7d135d
70047 F20110209_AACCVU hu_g_Page_053.jp2
2357648561c8c779af5edd1263002686
a9af4bf4419f1c7776812e777c78d0d9245acfe2
66339 F20110209_AACBTG hu_g_Page_086.jpg
44a1e34bbd32ded766290c2a628d30b4
e9e6a3dbdd872ce1eacce19101ad5a5eb689a5ce
54727 F20110209_AACBSS hu_g_Page_029.jpg
d674666774ca3feb9242483a66c41cdc
d49a3fd09e4161d9c1843fec3c462a472856d7c0
488178 F20110209_AACCWJ hu_g_Page_093.jp2
e4344eb7400d727e92f329a197434480
9696427e6a9243dc9e759a38e6a02e8d1bb6f84c
110119 F20110209_AACCVV hu_g_Page_055.jp2
13f3f1f362cfd16ac2f3cce805f8a115
66de48b7ab18a2236e6474dd971bbe1b79118232
45423 F20110209_AACBTH hu_g_Page_118.pro
144f34a6c436b9bf14f5df7a24cb6c3d
0a9ffaec85aafc94a60fddbbdfcd8559974fb89f
1262 F20110209_AACBST hu_g_Page_109.txt
d86a7586df83f9d43c50e5f1d1b6d7f5
fe7296cf681cc57aec7cf242daa89858c333f6d6
1033763 F20110209_AACCWK hu_g_Page_094.jp2
2209db9048122ac4735cb65e804ee031
e435d7f2d99159778306a59bf76485962a169410
71537 F20110209_AACCVW hu_g_Page_059.jp2
6ae81d3fe76a6ea405aad77bf4bf9f6a
84be3ae8e812778f5d6950ad83e4d2a048c5bb58
F20110209_AACBTI hu_g_Page_124.tif
b929e2ec3afc0abc033f46a7eacf8891
3312f45bb3716a69d34f7623cf5173a293822088
F20110209_AACBSU hu_g_Page_147.tif
6833d5b1e0182dd2b3d75da21ab4090f
de0de729be15dbbad508b02dc6161b2ccd38836a
57799 F20110209_AACCXA hu_g_Page_143.jp2
16210e6fb8b757e8065016c2c835b86e
55e0734d3480d5f879771473a9d2a71bb294bcef
412140 F20110209_AACCWL hu_g_Page_096.jp2
209f3bd09373ffb434584f549240dfec
f3629ee577df097456969fb75471eaf15cf7fb45
104786 F20110209_AACCVX hu_g_Page_060.jp2
af809a2bf92cc39e17503987b0f8cfc1
089417c841913600c745dd3b105848610bc55c19
802708 F20110209_AACBTJ hu_g_Page_129.jp2
09d2b2a615c51f2b17841d70367374fe
0f4d676acd7b56c055e841a35f7bb91cf248c242
4484 F20110209_AACBSV hu_g_Page_100thm.jpg
a4babc1f6c0f0f6a07750123c4a9131a
9118485b80f05d7fd8daea8d67c025b358af8f86
74041 F20110209_AACCXB hu_g_Page_144.jp2
1e946453c708e6eeeb428a8896e43ff2
4fc52004b5dc5a7ad5e0d0b3dbef9c49f89593c7
88027 F20110209_AACCWM hu_g_Page_098.jp2
9e37dd180e326db55a7515db06906181
7fac9b675462465e3938c1251dfb0642953b3914
58044 F20110209_AACCVY hu_g_Page_061.jp2
9a42fb01080e8d9858e3aae34877500c
6157b04f3bbfe63e7cfa0c692cabf26bd54b76d6
961 F20110209_AACBTK hu_g_Page_094.txt
00191cff75d771e06bb5cd0ddd0408a3
cbe1564a6087e0365b1c3758e4f2f992296a1733
670 F20110209_AACBSW hu_g_Page_071.txt
23f1234a0723fc53e8e2eb75e5ce17c5
2a4dcff9d6b2be1924075face443f05f34f18500
104240 F20110209_AACCXC hu_g_Page_149.jp2
1c7b16e4856074b5d30504b8ab2cfef4
f42e25e5f35ac9ca26aa6a523de388891fad0ff0
57489 F20110209_AACCWN hu_g_Page_102.jp2
54b887a2c351414f74479aab73fb5697
0773d83a36c30265ec3292a88d38242cb399a950
539634 F20110209_AACCVZ hu_g_Page_065.jp2
5c819570e19c69006e9be0f3ac0a9583
5492cc5115b57fb55cbdff45ce57657e42d93c48
F20110209_AACBUA hu_g_Page_075.tif
49454a1f77bedb9ed2074ddcc8d4c6a1
e73d3c797abc3963c154390714111d844cac928d
943 F20110209_AACBTL hu_g_Page_003thm.jpg
d33a4bf6ee553f1992492761a064c4c5
f7577ec87b3f539fa6f431fb3096ce2748811809
19988 F20110209_AACBSX hu_g_Page_123.QC.jpg
df2805fe57114f2e510a68e5c0a7e5ca
869ea9228056521d93733008051da6c12113b449
73670 F20110209_AACCXD hu_g_Page_151.jp2
dbaa618fb846d285cc849c8ca8c43470
c9e8a9920a6dce378db6fc977000c876b8aad5d8
67965 F20110209_AACCWO hu_g_Page_103.jp2
d3da541090c4002ae2c5711b67b5bf98
48eb0055c43f5c71294d6e2ea1088df530c4b7fe
401235 F20110209_AACBUB hu_g_Page_128.jp2
1206f9ab075f1ec4346a9c3076f1d6d3
3fadcb027501dc59d3c64124872afed2a48d3eb5
131713 F20110209_AACBTM hu_g_Page_164.jp2
6edf8d767a46b5ad1487dec430d33103
e18431bf637d2e48c90b367c9fb4d6036b1b2509
7600 F20110209_AACBSY hu_g_Page_036thm.jpg
e34d387bf4e7892f58a7236b8a2b26ed
6c678c7cea4f197406660bf215d54b5dd3a24c52
54484 F20110209_AACCXE hu_g_Page_152.jp2
e7ab90c9f4428f8be6093b6289cd2d04
b7bdc4235056871dc84321734085e49f2e102571
79844 F20110209_AACCWP hu_g_Page_104.jp2
fcabbc74a7ceb5c5705bf615788a2108
8fa1e74af108a657d2d9e6c39ca3720134659092
91935 F20110209_AACBUC hu_g_Page_085.jpg
52a80f02c5ef19e89b93f98d2b370253
d4f7c02d460f38bfdb8861c00e88099ff8e7c817
69518 F20110209_AACBTN hu_g_Page_041.jpg
9f873ce64fa6b2bac8d3aefbf2705e8f
4dfc8a878a3e9fca651ae6c553aea6f457caf186
F20110209_AACBSZ hu_g_Page_038.tif
238cabbac8a3bbef33636893382539ae
3ce0da29b7001ae5dea378303ec14cbe19198473
35452 F20110209_AACCXF hu_g_Page_153.jp2
84cd035840a8e627d76c2356fba3b524
d4bb73331e5233f39cf0fea5b80fd524ca70b1f2
108325 F20110209_AACCWQ hu_g_Page_105.jp2
ab7f3aef35c603412f6ffe66d7ea3406
da04fad55ebbe8a92c8d8ed139cd005ba88aba25
F20110209_AACBUD hu_g_Page_163.tif
d8e2b946ee37913c92969ae193f2baba
7904ec5c4915a2aecd96c1fea6564e08c4c1d612
26277 F20110209_AACBTO hu_g_Page_091.pro
ae9a752ddbeb13137f16cc9de088d2c3
42328fdce77976dd0c0046f6d7884de9a3ef4966
62287 F20110209_AACCXG hu_g_Page_158.jp2
e4a9ab9cd7fd14a0d7ae4f366481159c
5169fb8e941f986888bb7a4b89e3964060c90418
84156 F20110209_AACCWR hu_g_Page_106.jp2
b1f6d29c771ba64ee6c9e2ae0cfbb8b2
e8b5da498e9ed9addfef97581d2bfb3c12124008
116257 F20110209_AACBUE hu_g_Page_020.jpg
cd5f21134174c2a73cfe9df029b621de
f8b4e63f0d1e01308a7a2d6072f0909cefa66a4e
F20110209_AACBTP hu_g_Page_111.tif
c2f8c77ac3bfed660018620d35d43805
46e3df9f790ff3fb0a402a37a5cc231baf8a0639
51720 F20110209_AACCWS hu_g_Page_109.jp2
028c5658d927225700b81dd4e2990e1c
0f051c39146f5288ef03ab94ebb7c1eec6895974
45339 F20110209_AACBTQ hu_g_Page_081.pro
d9666f999b638e32f9a59d493a7e5b31
3311fdfd7e2135d2459d9e219511f5c3d1e45893
123840 F20110209_AACCXH hu_g_Page_167.jp2
11ae90e4e6d91f40ae15d530c31847a9
53189e17e3796d69c29d6b5f943e20d0eb5a9adc
68092 F20110209_AACCWT hu_g_Page_114.jp2
32af4f22ca99f088495ce18f5c6c4d4d
e3824c10ca3466822f76f6ca68322fca7ef5cc55
71020 F20110209_AACBUF hu_g_Page_045.jp2
29f36cf6f3c6ce6169a2bb01c5c93113
07fb7f621927a1cd8804e7b4e997fc271b978910
4917 F20110209_AACBTR hu_g_Page_073thm.jpg
87844773cfcdd238ef60d95c2d60cc1d
7f8a918339c11f493e67f8026b4f2ec080b79b4f
17147 F20110209_AACCAA hu_g_Page_067.QC.jpg
fd094a671ec104c3a4a570361b749b2f
f3aaba0034bb783de5c4d62d9bc87f9a67002922
130120 F20110209_AACCXI hu_g_Page_169.jp2
11dce9d634b6ff4d180201b51a2c9c53
73e6804c283f8dbc6561bf39d38b63902bdad141
73951 F20110209_AACCWU hu_g_Page_120.jp2
1594fd929999179e35dd4ba1d532ecf9
a7aa9c5c4d165417607a7e87ff1d51bbcad237fb
67436 F20110209_AACBUG hu_g_Page_133.jpg
656df28a2a45564b02320f33ca3ccaa7
7314e3aa0d1a9094935dc48eb6b5579ecf28f033
81426 F20110209_AACBTS hu_g_Page_139.jpg
6239cbfdf541de9d08bdb64ec4d00582
29dd59295daaabecee0d3b1822a92a46ccefc5c2
101417 F20110209_AACCAB hu_g_Page_012.jpg
c4e63b21757abbfd92c948b697dc21d2
b598d74c2a14e20ead964ef83476981ad41daf75
31519 F20110209_AACCXJ hu_g_Page_171.jp2
281181040942e0f30ea7a80e8380e6e0
886463711786bc477093a406d4fb8e42c7e5cdc1
65949 F20110209_AACCWV hu_g_Page_121.jp2
c5bc2766f09d9de0b14ee82383e4a96c
89b7ee81993371256580b4bfa821249c66a2a197
47598 F20110209_AACBTT hu_g_Page_071.jpg
c16bb79e3838924f16ebf084b44a1702
850a2e76ffb92b68aa7a23ecac73031a05e6e6cc
35351 F20110209_AACCAC hu_g_Page_077.pro
d8242e4ab42ca12db8b7aab7316179c4
96c1ff5b95caffe9d906bd3e0e66696b3534244b
2236 F20110209_AACBUH hu_g_Page_020.txt
7a2edcd2ccf97afd739aff4542c86f8a
8e565051bb0001b0ca6f094c4336403c82a11d3a
6629 F20110209_AACCXK hu_g_Page_005thm.jpg
5c32d9dbc511f5f6e519092bd2e2a1cb
8d16f74f8e0aeb35ba0bd644e4b65006aef1621f
63569 F20110209_AACCWW hu_g_Page_123.jp2
509e5a213caea43729f554c55e5442e4
c8e0bdc4849615efca1060f48915a0bc3fe5cf61
F20110209_AACBTU hu_g_Page_009.tif
024736ea0e949f51a4739787fae2cfef
9f7d1d719249f76fe61e3bff1174be2509f4bce3
F20110209_AACCAD hu_g_Page_085.tif
67e9a931446f69d76a316528a0ef8065
0f3b1387a2ac52dde3f9047760ace0d8a5ab3186
10066 F20110209_AACBUI hu_g_Page_153.QC.jpg
06c162de63b6783897a9c772e3a29384
e7d1477affbbf4a2cfe46d53f07a61f7ae5cf322
7931 F20110209_AACCYA hu_g_Page_040thm.jpg
4faf8b52b278e42eac20e8d616879134
d7cee9eac06e071812d781e8886c010a9154f92d
7478 F20110209_AACCXL hu_g_Page_006thm.jpg
a3bd46d7af40e5c6c57d913bdbdeaad9
6fa50674154430273f92cbd5219c3dc546f68e49
69618 F20110209_AACCWX hu_g_Page_132.jp2
1a3c329a1d196955ce9c2500f002af53
290f4172bed05668992e1648f92140d03688e7f4
2487 F20110209_AACBTV hu_g_Page_164.txt
4ec19f7e0a7e5b349979fa2f412b024e
6eb8cb831dec74d06032fb319ac07e175ee53905
2516 F20110209_AACCAE hu_g_Page_166.txt
3eaf7aff7871dfd47d836e0cb4efbebb
b1e763b9e5859b1490599e48b9498c999ad8500d
551 F20110209_AACBUJ hu_g_Page_073.txt
04c3db14898a02737ee47821760220ac
89cd7041ac1eea6f9276922a5d94bbf7eff7e8af
6352 F20110209_AACCYB hu_g_Page_044thm.jpg
b5a74b5a6b56fd24e8d7bcddad03cf3a
4beb0d804bc9a669e07ef5d1535bbe4e061c5cb4
5866 F20110209_AACCXM hu_g_Page_007thm.jpg
d79c31943bc84ad878a1be1100f20659
05ada22942726656de9088c1f190e7e3d1cba193
63392 F20110209_AACCWY hu_g_Page_136.jp2
7a42c0dd6a6533a3c8c495ee62c3af7f
9821e0c0d55c61fe52d4e80233f700f0a1497550
49272 F20110209_AACBTW hu_g_Page_060.pro
8ebe52c0b445815bde064e9b954b649c
fc478a7d9aaeff6f2f7927231144ec30d4f22880
117807 F20110209_AACCAF hu_g_Page_019.jpg
6a1d1c69abf0de4726f73af4f5b210ef
2fb641f5d09b05deefb9ccfcdec1645af4060c49
42124 F20110209_AACBUK hu_g_Page_013.pro
4d9f055a674dd1906c0558c9fb047704
63ee69351394320a38c31e728330efe235d76990
6802 F20110209_AACCYC hu_g_Page_046thm.jpg
d8e93f1d0b268afa35475e4f4eec3f4b
20bb69f7570ea1f09d95987a950b1b37f99edc3a
1257 F20110209_AACCXN hu_g_Page_008thm.jpg
b0721a52be5a04e802fc107ca0dbc03c
e51d96c1d248db8a8111465c0bf0b5ad163171a9
74647 F20110209_AACCWZ hu_g_Page_137.jp2
42e54d07ea71d6a5480b48dac836f0c2
9c93655725c4b776e84be80ee884d1d222b38322
F20110209_AACBTX hu_g_Page_021.txt
1884090b079fb56e709b97f8b3214508
b8f59a3d86bb71e3da4ed8e244f57588d8843327
102098 F20110209_AACCAG hu_g_Page_060.jpg
08b6a041b41a1f010ec65f224556b787
d26d1394306053be0e1a4073ee002747a102f29e
65160 F20110209_AACBVA hu_g_Page_083.jpg
d6c66bf65f0973910c4c3c356d12cbe3
53baab3f23f688db737ec58c871f16eb18dad992
66043 F20110209_AACBUL hu_g_Page_034.jpg
22a663f6ab2e06beccfd687c973e97fb
748a8a2f1357b754580795f767a46d8c617d835f
4690 F20110209_AACCYD hu_g_Page_051thm.jpg
d5f70ad1049766fb30c1a3e53f29382f
94a5ae63b10aca7c89de4aa25e62c5c7ee76f4eb
8291 F20110209_AACCXO hu_g_Page_009thm.jpg
512a1ffb3f264fef81f3bac10b559dad
0321ba75216ebe002eeb74a5576f24b87abb706c
84424 F20110209_AACBTY hu_g_Page_049.jpg
89cfb61d16fe46aa6f77489e32e8268a
cf6c8bae65456e2d36358a63611219fe5c82b9d5
6464 F20110209_AACCAH hu_g_Page_128.pro
4486d1e429a65f5b1ccf3f4d2419f812
2360af4b16a23b73d0bb67ed5b22fc21b7654ecd
22018 F20110209_AACBVB hu_g_Page_111.QC.jpg
f632c715199e66bdb177e8dc437a7c88
55a63aad296f5ef16fbe1d47aea174fce18667cb
5922 F20110209_AACBUM hu_g_Page_047thm.jpg
26dfc29b6b12749aeb5ada493f22111e
91681ed45d06c4720a39e72e8740c91ac6032c30
7785 F20110209_AACCYE hu_g_Page_055thm.jpg
718770f70b09d39a819b08020c70d447
b129c7e367ee8f417b4711547c467beeeb5d6f34
7959 F20110209_AACCXP hu_g_Page_014thm.jpg
0c1fc1ad1246d81f6ebf8857463af9f9
b676eb2dcf7092805dc3a23adab7d04baecfaa53
F20110209_AACBTZ hu_g_Page_037.tif
64ee74200d54e1865e70c8eae905728e
fcedeb570dda8dd5d3c1c884ffb971dc8bbbae3b
6116 F20110209_AACCAI hu_g_Page_137thm.jpg
fbb2a359f6c5eb221f13256e6106596d
d723744f5b8de54ae6cc6c124e88bf60c2ab13b5
2083 F20110209_AACBVC hu_g_Page_105.txt
1c1840889dce9d06964d6b124e5c3da8
4e967ff16cb4918ad9467f693862fbd024c969bf
28989 F20110209_AACBUN hu_g_Page_007.QC.jpg
12a6c89df7305bf659995122ab8fb5ce
2fece5f06501bdd23b19f65ecec2cd2d8d039555
6292 F20110209_AACCYF hu_g_Page_056thm.jpg
1e8be2b6a909a2912fe9226ed101bb3d
e4b40fd0222f21343d6166ece7de15cc0ba77c49
8003 F20110209_AACCXQ hu_g_Page_017thm.jpg
c44e5b7be898c0819c36cb1ad5cf11ab
6dd56acb8773d955f302bd54f873c72e368f2d2d
110769 F20110209_AACCAJ hu_g_Page_163.jp2
f27b870d01367e775539960916fdabcc
4e4eae188208ccae906117b6bbe5c2920ceba1cb
10010 F20110209_AACBVD hu_g_Page_160.QC.jpg
44f90cb2f1cf66af48e39aaadd7349eb
1d0507e7eb3e9582f8f8e160d0c9824a6247afe6
F20110209_AACBUO hu_g_Page_094.tif
f5cd5914857c971b790790cb7693906a
93feabc24870c3e46886e63d060242225bcadc86
8449 F20110209_AACCYG hu_g_Page_057thm.jpg
095f6b82142c9f8238661c70fd96899d
786cbef908e8e16af6a9054bbec46daa4037fbcc
8345 F20110209_AACCXR hu_g_Page_020thm.jpg
399808ac3fa211a5709ffa2a20f61e7d
77d2a6a860423d94fbe1e4fc794edd53af470d47
1934 F20110209_AACCAK hu_g_Page_028.txt
a23f7646c099f82414859fcaa57f3406
87c1b5ecda1a9476cfad5a94da57fc299af98a72
93525 F20110209_AACBVE hu_g_Page_113.jp2
7792643508e6f4fa6ebf8852d435715a
caac5597b21b7c4be793ef133e3454aac81f6f5b
73027 F20110209_AACBUP hu_g_Page_107.jp2
365a8713aef7d374c7829be44db16b47
219cb0909071f6c38b23bd18b6898959f96959d6
8262 F20110209_AACCYH hu_g_Page_058thm.jpg
404eb544261167f1d9b103780a01e927
8f6c506ce59bb5b473776bf190dd03af5da683f7
7964 F20110209_AACCXS hu_g_Page_021thm.jpg
435b31b2aa01129c9b20e50afe8cccc2
203a1eb1bf286704f0da12b6518ec15919fc196f
1639 F20110209_AACCAL hu_g_Page_034.txt
398d2ba8d2f1983123621294d6ffe6aa
aab474fb2f4ece0884408563769691963e816de2
44732 F20110209_AACBVF hu_g_Page_063.jpg
430ca39e2da1ced7de88018ee9bef21b
0b51dbae007d26cafb81e059532258824e7c7066
F20110209_AACBUQ hu_g_Page_144.tif
aada9310fe836c764d259dc50dea5b84
5be7ad8117754588ed46c323e472445daac223f9
8263 F20110209_AACCXT hu_g_Page_023thm.jpg
45ad1702c00ca55f7d47e52481c913c0
049c8c69bb25a3b3308090b4146d18d123585294
31759 F20110209_AACCBA hu_g_Page_015.QC.jpg
f1b5537d8bd7154416f4555d407dfba8
253aa79fdd76093acde298fe289db1af1c8258cc
21781 F20110209_AACCAM hu_g_Page_034.QC.jpg
fc891e978d5759f5a856f52baf637230
a53f750a30c251de8b0cf15ce4543128995546de
19291 F20110209_AACBUR hu_g_Page_094.pro
1a16dbe4605b95c0f6414c90eea80e61
3a782a0720017cc1909e9b744de8930208507f79
5110 F20110209_AACCYI hu_g_Page_065thm.jpg
8e6daea6c0d93afdec21c924724933a3
VISUAL SERVO TRACKING CONTROL VIA A LYAPUNOV-BASED APPROACH

By

GUOQIANG HU

A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2007


© 2007 Guoqiang Hu


To my mother Huanyin Zhang, my father Huade Hu, and my wife Yi Feng for their
endless love and support

ACKNOWLEDGMENTS

Thank you to my advisor, Dr. Warren Dixon, for his guidance and

encouragement, which will benefit me for a lifetime. As an advisor, he kept polishing my

methodology and skills in resolving problems and formulating new problems. As a

mentor, he helped me develop professional skills and gave me opportunities to get

exposed to a professional working environment.

Thanks to my committee members Dr. Thomas Burks, Dr. Carl Crane III, Dr.

Seth Hutchinson, and Dr. Rick Lind, for the time and help that they provided.

Thanks to Dr. Nick Gans and Sid Mehta for all the insightful discussions.

Thanks to Dr. Joe Kehoe for his suggestions in formatting my defense presentation

and dissertation. Finally, thanks to my NCR lab fellows for their friendship during

the past three years of joy.















TABLE OF CONTENTS

                                                                          page

ACKNOWLEDGMENTS ............................................................ iv

LIST OF TABLES ........................................................... viii

LIST OF FIGURES ............................................................ ix

ABSTRACT ................................................................. xiii

CHAPTER

1 INTRODUCTION .............................................................. 1

    1.1 Motivation ........................................................... 1
    1.2 Problem Statement .................................................... 2
    1.3 Literature Review .................................................... 8
        1.3.1 Basic Visual Servo Control Approaches .......................... 8
        1.3.2 Visual Servo Control Approaches to Enlarge the FOV ............. 9
        1.3.3 Robust and Adaptive Visual Servo Control ...................... 11
    1.4 Contributions ....................................................... 13

2 BACKGROUND AND PRELIMINARY DEVELOPMENT ................................... 16

    2.1 Geometric Model ..................................................... 16
    2.2 Euclidean Reconstruction ............................................ 19
    2.3 Unit Quaternion Representation of the Rotation Matrix ............... 21

3 LYAPUNOV-BASED VISUAL SERVO TRACKING CONTROL VIA A QUATERNION
  FORMULATION .............................................................. 25

    3.1 Introduction ........................................................ 25
    3.2 Control Objective ................................................... 26
    3.3 Control Development ................................................. 28
        3.3.1 Open-Loop Error System ........................................ 28
        3.3.2 Closed-Loop Error System ...................................... 30
        3.3.3 Stability Analysis ............................................ 32
    3.4 Camera-To-Hand Extension ............................................ 33
        3.4.1 Model Development ............................................. 33
        3.4.2 Control Formulation ........................................... 35
    3.5 Simulation Results .................................................. 38
    3.6 Experiment Results .................................................. 41
        3.6.1 Experiment Configurations ..................................... 41
        3.6.2 Experiment for Tracking ....................................... 45
        3.6.3 Experiment for Regulation ..................................... 47

4 COLLABORATIVE VISUAL SERVO TRACKING CONTROL VIA A DAISY-CHAINING
  APPROACH ................................................................. 61

    4.1 Introduction ........................................................ 61
    4.2 Problem Scenario .................................................... 62
    4.3 Geometric Model ..................................................... 64
    4.4 Euclidean Reconstruction ............................................ 68
    4.5 Control Objective ................................................... 71
    4.6 Control Development ................................................. 73
        4.6.1 Open-Loop Error System ........................................ 73
        4.6.2 Closed-Loop Error System ...................................... 74
        4.6.3 Stability Analysis ............................................ 75
    4.7 Simulation Results .................................................. 76

5 ADAPTIVE VISUAL SERVO TRACKING CONTROL USING A CENTRAL CATADIOPTRIC
  CAMERA ................................................................... 84

    5.1 Introduction ........................................................ 84
    5.2 Geometric Model ..................................................... 85
    5.3 Euclidean Reconstruction ............................................ 90
    5.4 Control Objective ................................................... 91
    5.5 Control Development ................................................. 94
        5.5.1 Open-Loop Error System ........................................ 94
        5.5.2 Closed-Loop Error System ...................................... 95
        5.5.3 Stability Analysis ............................................ 96

6 VISUAL SERVO CONTROL IN THE PRESENCE OF CAMERA CALIBRATION ERROR ......... 98

    6.1 Introduction ........................................................ 98
    6.2 Feedback Control Measurements ....................................... 99
    6.3 Control Objective .................................................. 101
    6.4 Quaternion Estimation .............................................. 103
        6.4.1 Estimate Development ......................................... 103
        6.4.2 Estimate Relationships ....................................... 104
    6.5 Control Development ................................................ 106
        6.5.1 Rotation Control ............................................. 106
        6.5.2 Translation Control .......................................... 106
    6.6 Stability Analysis ................................................. 107
    6.7 Simulation Results ................................................. 110

7 COMBINED ROBUST AND ADAPTIVE HOMOGRAPHY-BASED VISUAL SERVO CONTROL
  VIA AN UNCALIBRATED CAMERA .............................................. 117

    7.1 Introduction ....................................................... 117
    7.2 Camera Geometry and Assumptions .................................... 118
    7.3 Open-Loop Error System ............................................. 120
        7.3.1 Rotation Error System ........................................ 120
        7.3.2 Translation Error System ..................................... 121
    7.4 Control Development ................................................ 123
        7.4.1 Rotation Control Development and Stability Analysis .......... 123
        7.4.2 Translation Control Development and Stability Analysis ....... 125
    7.5 Simulation Results ................................................. 128

8 CONCLUSIONS ............................................................. 135

APPENDIX

A UNIT NORM PROPERTY FOR THE QUATERNION ERROR ............................. 138

B ONE PROPERTY OF UNIT QUATERNIONS ........................................ 140

C OPEN-LOOP TRANSLATION ERROR SYSTEM ...................................... 141

D PROPERTY ON MATRIX NORM ................................................. 143

E COMPUTATION OF DEPTH RATIOS ............................................. 144

F INEQUALITY DEVELOPMENT .................................................. 147

G LINEAR PARAMETERIZATION OF TRANSLATION ERROR SYSTEM ..................... 148

REFERENCES ................................................................ 149

BIOGRAPHICAL SKETCH ....................................................... 157

LIST OF TABLES

Table                                                                     page

4-1  Coordinate frames relationships ....................................... 65

LIST OF FIGURES

Figure                                                                    page

2-1  Coordinate frame relationships between a camera viewing a planar patch at
     different spatiotemporal instances. The coordinate frames F, F*, and Fd
     are attached to the current, reference, and desired locations,
     respectively. ....................................................... 17

2-2  Coordinate frame relationships between a camera viewing a planar patch at
     different spatiotemporal instances. The coordinate frames F and F* are
     attached to the current and reference locations, respectively. ....... 18

3-1  Coordinate frame relationships between a fixed camera and the planes
     defined by the current, desired, and reference feature points (i.e., F,
     Fd, and F*). ........................................................ 33

3-2  Block diagram of the experiment. ..................................... 42

3-3  The Sony XCD-710CR color firewire camera pointed at the virtual
     environment. ........................................................ 42

3-4  Virtual reality environment example: a virtual recreation of the US
     Army's urban warfare training ground at Fort Benning. ............... 44

3-5  Desired image-space coordinates of the four feature points (i.e., pd(t))
     in the tracking Matlab simulation shown in a 3D graph. In the figure, "0"
     denotes the initial image-space positions of the 4 feature points in the
     desired trajectory, and "*" denotes the corresponding final positions of
     the feature points. ................................................. 48

3-6  Current image-space coordinates of the four feature points (i.e., p(t))
     in the tracking Matlab simulation shown in a 3D graph. In the figure, "0"
     denotes the initial image-space positions of the 4 feature points, and
     "*" denotes the corresponding final positions of the feature points. . 48

3-7  Translation error e(t) in the tracking Matlab simulation. ............ 49

3-8  Rotation quaternion error q(t) in the tracking Matlab simulation. .... 49

3-9  Pixel coordinate pd(t) of the four feature points in a sequence of
     desired images in the tracking Matlab simulation. The upper figure is for
     the ud(t) component and the bottom figure is for the vd(t) component. . 50

3-10 Pixel coordinate p(t) of the current pose of the four feature points in
     the tracking Matlab simulation. The upper figure is for the u(t)
     component and the bottom figure is for the v(t) component. ........... 50

3-11 Tracking error p(t) - pd(t) (in pixels) of the four feature points in
     the tracking Matlab simulation. The upper figure is for the u(t) - ud(t)
     component and the bottom figure is for the v(t) - vd(t) component. ... 51

3-12 Linear camera velocity input vc(t) in the tracking Matlab simulation. . 51

3-13 Angular camera velocity input wc(t) in the tracking Matlab simulation.  52

3-14 Adaptive on-line estimate of z* in the tracking Matlab simulation. .... 52

3-15 Translation error e(t) in the tracking experiment. ................... 53

3-16 Rotation quaternion error q(t) in the tracking experiment. ........... 53

3-17 Pixel coordinate pd(t) of the four feature points in a sequence of
     desired images in the tracking experiment. The upper figure is for the
     ud(t) component and the bottom figure is for the vd(t) component. .... 54

3-18 Pixel coordinate p(t) of the current pose of the four feature points in
     the tracking experiment. The upper figure is for the u(t) component and
     the bottom figure is for the v(t) component. ......................... 54

3-19 Tracking error p(t) - pd(t) (in pixels) of the four feature points in
     the tracking experiment. The upper figure is for the u(t) - ud(t)
     component and the bottom figure is for the v(t) - vd(t) component. ... 55

3-20 Linear camera velocity input vc(t) in the tracking experiment. ....... 55

3-21 Angular camera velocity input wc(t) in the tracking experiment. ...... 56

3-22 Adaptive on-line estimate of z* in the tracking experiment. .......... 56

3-23 Translation error e(t) in the regulation experiment. ................. 57

3-24 Rotation quaternion error q(t) in the regulation experiment. ......... 57

3-25 Pixel coordinate p(t) (in pixels) of the current pose of the four
     feature points in the regulation experiment. The upper figure is for the
     u(t) component and the bottom figure is for the v(t) component. ...... 58

3-26 Regulation error p(t) - p* (in pixels) of the four feature points in the
     regulation experiment. The upper figure is for the u(t) - u* component
     and the bottom figure is for the v(t) - v* component. ................ 58

3-27 Linear camera velocity input vc(t) in the regulation experiment. ..... 59

3-28 Angular camera velocity input wc(t) in the regulation experiment. .... 59

3-29 Adaptive on-line estimate of z* in the regulation experiment. ........ 60

4-1  Geometric model ...................................................... 63

4-2  This figure shows the initial positions of the cameras and the feature
     point planes. The initial positions of the cameras attached to I and IR
     are denoted by "O". The feature points on the planes F, F*, and Fd are
     denoted by ".". The origins of the coordinate frames F, F*, and Fd are
     denoted by "*". ..................................................... 79

4-3  Pixel coordinate prd(t) of the four feature points on the plane Fd in a
     sequence of desired images taken by the camera attached to IR. The upper
     figure is for the urd(t) component and the bottom figure is for the
     vrd(t) component. ................................................... 80

4-4  Pixel coordinate p*(t) of the four feature points on the plane F* in a
     sequence of reference images taken by the moving camera attached to I.
     The upper figure is for the u*(t) component and the bottom figure is for
     the v*(t) component. ................................................ 80

4-5  Pixel coordinate p(t) of the four feature points on the plane F in a
     sequence of images taken by the moving camera attached to I. The upper
     figure is for the u(t) component and the bottom figure is for the v(t)
     component. .......................................................... 81

4-6  Translation error e(t). .............................................. 81

4-7  Rotation quaternion error q(t). ...................................... 82

4-8  Linear camera velocity input vc(t). .................................. 82

4-9  Angular camera velocity input wc(t). ................................. 83

5-1  Central catadioptric projection relationship. ........................ 85

5-2  Projection model of the central catadioptric camera. ................. 86

5-3  Camera relationships represented in homography. ...................... 88

6-1  Unitless translation error between m1(t) and m1*. .................... 113

6-2  Quaternion rotation error. ........................................... 114

6-3  Quaternion rotation error for comparison with different sign. ........ 114

6-4  Image-space error in pixels between pi(t) and pi*. In the figure, "0"
     denotes the initial positions of the 4 feature points in the image, and
     "*" denotes the corresponding final positions of the feature points. . 115

6-5  Image-space error in pixels between pi(t) and pi* shown in a 3D graph.
     In the figure, "0" denotes the initial positions of the 4 feature points
     in the image, and "*" denotes the corresponding final positions of the
     feature points. ..................................................... 115

6-6  Linear camera velocity control input. ................................ 116

6-7  Angular camera velocity control input. ............................... 116

7-1  Unitless translation error e(t). ..................................... 131

7-2  Quaternion rotation error q(t). ...................................... 131

7-3  Pixel coordinate p(t) (in pixels) of the current pose of the four
     feature points in the simulation. The upper figure is for the u(t)
     component and the bottom figure is for the v(t) component. ........... 132

7-4  Regulation error p(t) - p* (in pixels) of the four feature points in the
     simulation. The upper figure is for the u(t) - u* component and the
     bottom figure is for the v(t) - v* component. ........................ 132

7-5  Image-space error in pixels between p(t) and p*. In the figure, "0"
     denotes the initial positions of the 4 feature points in the image, and
     "*" denotes the corresponding final positions of the feature points. . 133

7-6  Image-space error in pixels between p(t) and p* shown in a 3D graph. In
     the figure, "0" denotes the initial positions of the 4 feature points in
     the image, and "*" denotes the corresponding final positions of the
     feature points. ..................................................... 133

7-7  Linear camera velocity control input vc(t). .......................... 134

7-8  Angular camera velocity control input wc(t). ......................... 134















Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy

VISUAL SERVO TRACKING CONTROL VIA A LYAPUNOV-BASED
APPROACH

By

Guoqiang Hu

December 2007

Chair: Dr. Warren E. Dixon
Major: Mechanical Engineering

Recent advances in image processing, computational technology and control

theory are enabling visual servo control to become more prevalent in robotics

and autonomous systems applications. In this dissertation, visual servo control

algorithms and architectures are developed that exploit the visual feedback from a

camera system to achieve a tracking or regulation control objective for a rigid-body

object (e.g., the end-effector of a robot manipulator, a satellite, an autonomous

vehicle) identified by a patch of feature points.

The first two chapters present the introduction and background information for

this dissertation. In the third chapter, a new visual servo tracking control method

for a rigid-body object is developed by exploiting a combination of homography

techniques, a quaternion parameterization, adaptive control techniques, and

nonlinear Lyapunov-based control methods. The desired trajectory to be tracked

is represented by a sequence of images (e.g., a video), which can be taken online

or offline by a camera. This controller is singularity-free by using the homography

techniques and the quaternion parameterization. In the fourth chapter, a new

collaborative visual servo control method is developed to enable a rigid-body









object to track a desired trajectory. In contrast to typical camera-to-hand and

camera-in-hand visual servo control configurations, the proposed controller is

developed using a moving on-board camera viewing a moving object to obtain

feedback signals. This collaborative method weakens the field-of-view restriction

and enables the control object to perform large area motion. In the fifth chapter,

a visual servo controller is developed that yields an asymptotic tracking result for

the completely nonlinear camera-in-hand central catadioptric camera system. A

panoramic field-of-view is obtained by using the central catadioptric camera. In

the sixth chapter, a robust visual servo control method is developed to achieve a

regulation control objective in the presence of intrinsic camera calibration uncertainties.

A quaternion-based estimate for the rotation error signal is developed and used

in the controller development. The similarity relationship between the estimated

and actual rotation matrices is used to construct the relationship between the

estimated and actual quaternions. A Lyapunov-based stability analysis is provided

that indicates a unique controller can be developed to achieve the regulation result

despite a sign ambiguity in the developed quaternion estimate. In the seventh

chapter, a new combined robust and adaptive visual servo control method is

developed to asymptotically regulate the feature points in an image to the desired

locations while also regulating the pose of the control object without calibrating

the camera. These dual objectives are achieved by using a homography-based

approach that exploits both image-space and reconstructed Euclidean information

in the feedback loop. The robust rotation controller accommodates the time-

varying uncertainties in the rotation error system, and the adaptive translation

controller compensates for the unknown calibration parameters in the translation

error system. Chapter 8 presents the conclusions of this dissertation.















CHAPTER 1
INTRODUCTION

1.1 Motivation

Control systems that use information acquired from an imaging source in the

feedback loop are defined as visual servo control systems. Visual servo control has

developed into a large subset of robotics literature (see [1-4] for a review) because

of the enabling capabilities it can provide for autonomy. Recent advances in image

processing, computational technology and control theory are enabling visual servo

control to become more prevalent in autonomous systems applications (e.g., the

autonomous ground vehicles grand challenge and urban challenge sponsored by the

U.S. Defense Advanced Research Projects Agency (DARPA)). Instead of relying

solely on a global positioning system (GPS) or inertial measurement units (IMU)

for navigation and control, image-based methods are a promising approach to

provide autonomous vehicles with position and orientation (i.e., pose) information.

Specifically, rather than obtain an inertial measurement of an autonomous system,

vision systems can be used to recast the navigation and control problem in terms

of the image space. In addition to providing feedback relating the local pose of

the camera with respect to some target, an image sensor can also be used to relate

local sensor information to an inertial reference frame for global control tasks.

Visual servoing requires multidisciplinary expertise to integrate a vision system

with the controller for tasks including: selecting the proper imaging hardware;

extracting and processing images at rates amenable to closed-loop control; image

analysis and feature point extraction/tracking; and recovering/estimating necessary

state information from an image, etc. While the aforementioned tasks

are active topics of research interest in computer vision and image processing









societies, they will not be the focus of this dissertation. The development in this

dissertation is based on the assumption that images can be acquired, analyzed, and

the resulting data can be provided to the controller without restricting the control

rates.

The use of image-based feedback adds complexity and new challenges for

the control system design. The scope of this dissertation is focused on issues

associated with using reconstructed and estimated state information from a

sequence of images to develop a stable closed-loop error system. Particularly, this

dissertation focuses on the following problems: 1) how to design a visual servo

tracking controller that achieves asymptotic tracking via a quaternion formulation?

2) how to design a collaborative visual servo control scheme when both the camera

and the control object are moving? 3) how to design a visual servo controller using

a central catadioptric camera? and 4) how to design a visual servo controller that is

robust to camera calibration uncertainty?

1.2 Problem Statement

In this dissertation, visual servo control algorithms and architectures are

developed that exploit the visual feedback from a camera system to achieve a

tracking or regulation control objective for a six degrees of freedom (DOF) rigid-

body control object (e.g., the end-effector of a robot manipulator, a satellite, an

autonomous vehicle) identified by a patch of feature points. The tracking control

objective is for the control object to track a desired trajectory that is encoded by

a video obtained from a camera in either the camera-in-hand or camera-to-hand

configuration. This video can be taken online or offline by a camera. For example,

the motion of a control object can be prerecorded by a camera (for the camera-

to-hand configuration) beforehand and used as a desired trajectory, or, a video of

the reference object can be prerecorded as a desired trajectory while the camera

moves (for the camera-in-hand configuration). The regulation control objective is









for the object to go to a desired pose that is encoded by a prerecorded image. The

regulation problem can be considered as a particular case of the tracking problem.

For example, when all the images in the sequence of desired images are identical,

the tracking problem becomes a regulation problem. The dissertation will address

the following problems of interest: 1) visual servo tracking control via a quaternion

formulation; 2) collaborative visual servo tracking control using a daisy-chaining

approach; 3) visual servo tracking control using a central catadioptric camera; 4)

robust visual servo control in the presence of camera calibration uncertainty; and 5)

combined robust and adaptive visual servo control via an uncalibrated camera. The

control development in the dissertation is proven by using nonlinear Lyapunov-

based methods and is demonstrated by Matlab simulation and/or experimental

results.

1) Visual servo tracking control via a quaternion formulation.

Many of the previous visual servo controllers have only been designed to

address the regulation problem. Motivated by the need for new advancements to

meet visual servo tracking applications, previous research has concentrated on

developing different types of path planning techniques [5-9]. Recently, Chen et al.

[10] provided a new formulation of the tracking control problem. A homography-

based adaptive visual servo controller is developed to enable a robot end-effector

to track a prerecorded time-varying reference trajectory determined by a sequence

of images. The Euler angle-axis representation is used to represent the rotation

error system. Due to the computational singularity limitation of the angle axis

extraction algorithm (see Spong and Vidyasagar [11]), rotation angles of ±π were not considered. Motivated by the desire to avoid the rotation singularity

completely, an error system and visual servo tracking controller are developed in

Chapter 3 based on the quaternion formulation. A homography is constructed

from image pairs and decomposed via textbook methods (e.g., Faugeras [12] and









Hartley and Zisserman [13]) to determine the rotation matrix. Once the rotation

matrix has been determined, the corresponding unit quaternion can be obtained by

numerically robust algorithms (see Hu et al. [14] and Shuster [15]). Then an error

system is constructed in terms of the unit quaternion, which is void of singularities.

An adaptive controller is then developed and proven to make a camera track a

desired trajectory that is determined from a sequence of images. The controller

contains an adaptive feedforward term to compensate for the unknown distance

from the camera to the observed planar patch. A quaternion-based Lyapunov

function is developed to facilitate the control design and the stability analysis.

2) Collaborative visual servo tracking control using a daisy-chaining approach.

Unlike typical visual servo controllers in camera-in-hand and camera-to-hand

configurations, a unique aspect of the development for this problem is that a

moving camera (e.g., a camera mounted on an unmanned air vehicle) is used to

provide visual feedback to a moving autonomous vehicle. The control objective

is for the autonomous vehicle (identified by a planar patch of feature points) to

track the pose of a desired vehicle trajectory that is encoded by a prerecorded

video obtained from a fixed camera (e.g., a camera mounted on a satellite, a

camera mounted on a building). Several challenges must be resolved to achieve

this unexplored control objective. The relative velocity between the moving feature

point patch and the moving camera presents a significant challenge. By using

a daisy-chaining approach (e.g., [16-19]), Euclidean homography relationships

between different camera coordinate frames and feature point patch coordinate

frames are developed. These homographies are used to relate coordinate frames

attached to the moving camera, the reference object, the control object, and the

object used to record the desired trajectory. Another challenge is that for general

six DOF motion by both the camera and the planar patch, the normal to the planar patch is unknown. By decomposing the homography relationships, the normal to








the moving feature point patch can be obtained. Likewise, the distances between the moving camera, the moving planar patch, and a reference patch are unknown. By

using the depth ratios obtained from the homography decomposition, the unknown

distance is related to an unknown constant parameter. A Lyapunov-based adaptive

estimation law is designed to compensate for the unknown constant parameter.

Since the moving camera could be attached to a remotely piloted vehicle with

arbitrary rotations, another challenge is to eliminate potential singularities in

the rotation parameterization obtained from the homography decomposition.

To address this issue, homography-based visual servo control techniques (e.g.,

[10, 20-22]) are combined with quaternion-based control methods (e.g., [14,23,24]),

to eliminate singularities associated with the image Jacobian and the rotation

error system. By using the quaternion parameterization, the resulting closed-

loop rotation error system can be stabilized by a proportional rotation controller

combined with a feedforward term that is a function of the desired trajectory.

3) Visual servo tracking control using a central catadioptric camera.

Visual servo controllers require the image-space coordinates of some set of

Euclidean feature points in the control development; hence, the feature points

must remain in the camera's field-of-view (FOV). Since the FOV of conventional

perspective cameras (e.g., pinhole cameras) is restricted, keeping the feature points

in the FOV is a fundamental challenge for visual servo control algorithms. The fun-

damental nature of the FOV problem has resulted in a variety of control and path

planning methods (e.g., [6,7,25-31]). An alternative solution to the aforementioned

algorithmic approaches to resolve the FOV issue is to use advanced optics such

as omnidirectional cameras. Catadioptric cameras (one type of omnidirectional

camera) are devices which use both mirrors (reflective or catadioptric elements) and

lenses (refractive or dioptric elements) to form images [32]. Catadioptric cameras

with a single effective viewpoint are classified as central catadioptric cameras,









which are desirable because they yield pure perspective images [33]. In Chapter 5,

a visual servo control scheme is presented that yields a tracking result for a camera-

in-hand central catadioptric camera system. The tracking controller is developed

based on the relative relationships of a central catadioptric camera between the

current, reference, and desired camera poses. To find the relative camera pose rela-

tionships, homographies are computed based on the projection model of the central

catadioptric camera [33-36]. Geyer and Daniilidis [36] proposed a unifying theory

to show that all central catadioptric systems are isomorphic to projective mappings

from the sphere to a plane with a projection center on the perpendicular axis to the

plane. By constructing links between the projected coordinates on the sphere, the

homographies up to scalar multiples can be obtained. Various methods can then

be applied to decompose the Euclidean homographies to find the corresponding

rotation matrices, and depth ratios. The rotation error system is based on the

quaternion formulation which has a full-rank interaction matrix. Lyapunov-based

methods are utilized to develop the controller and to prove asymptotic tracking.

4) Robust visual servo control.

In vision-based control, exact calibration is often required so that the image-

space sensor measurements can be related to the Euclidean or joint space for

control implementation. Specifically, a camera model (e.g., the pinhole model)

is often required to relate pixel coordinates from an image to the (normalized)

Euclidean coordinates. The camera model is typically assumed to be exactly known

(i.e., the intrinsic calibration parameters are assumed to be known); however,

despite the availability of several popular calibration methods (cf. [37-43]),

camera calibration can be time consuming, requires some level of expertise,

and has inherent inaccuracies. If the calibration parameters are not exactly known,

performance degradation and potential unpredictable response from the visual

servo controller may occur. The goal of this research is to develop a visual servo









controller that is robust to uncertainty in the intrinsic calibration parameters. As in the previous

three problems, the quaternion parameterization will be used to represent the

rotation error system. Since the quaternion error cannot be measured precisely

due to the uncertain calibration, an estimated quaternion is required to develop

the controller. One of the challenges to develop a quaternion estimate is that the

estimated rotation matrix is not a true rotation matrix in general. To address this

challenge, the similarity relationship between the estimated and actual rotation

matrices is used to construct the relationship between the estimated and actual

quaternions. A Lyapunov-based stability analysis is provided that indicates a

unique controller can be developed to achieve the regulation result.

5) Combined robust and adaptive visual servo control.

This research is also motivated by the desire to compensate for uncertain cam-

era calibration. This controller has adaptive updated terms which can compensate

for the unknown calibration parameters. The open-loop error system is composed

of a rotation error system and a translation error system. One challenge is that the

rotation quaternion error is not measurable. To address this problem, an estimated

quaternion is obtained based on the image-space information and is used to develop

the controller. The transformation between the actual and estimated quaternions is an upper triangular matrix determined by the calibration parameters, whose diagonal elements are positive. This fact is exploited to design a robust high-gain

controller. Another challenge is that the unknown calibration matrix is coupled in

the translation error system. To address this problem, the translation error system

is linearly parameterized in terms of the calibration parameters. An adaptive up-

date law is used to estimate the unknown calibration parameters, and a translation

controller containing the adaptive compensation terms is used to asymptotically

regulate the translation error.









1.3 Literature Review

1.3.1 Basic Visual Servo Control Approaches

Different visual servo control methods can be divided into three main cate-

gories including: image-based, position-based, and approaches that make use of a

blend of image and position-based approaches. Image-based visual servo control

(e.g., [1, 44-47]) consists of a feedback signal that is composed of pure image-space

information (i.e., the control objective is defined in terms of an image pixel er-

ror). This approach is considered to be more robust to camera calibration and

robot kinematic errors and is more likely to keep the relevant image features in

the FOV than position-based methods because the feedback is directly obtained

from the image without the need to transfer the image-space measurement to

another space. A drawback of image-based visual servo control is that since the

controller is implemented in the robot joint space, an image-Jacobian is required

to relate the derivative of the image-space measurements to the camera's linear

and angular velocities. However, the image-Jacobian typically contains singularities

and local minima (see Chaumette [48]), and the controller stability analysis is

difficult to obtain in the presence of calibration uncertainty (see Espiau et al. [49]).

Another drawback of image-based methods is that since the controller is based

on image-feedback, the robot could be commanded along a trajectory that is not

physically possible. This issue is described as Chaumette's conundrum. Further discussion of Chaumette's conundrum is provided in Chaumette [48] and Corke

and Hutchinson [25].

Position-based visual servo control (e.g., [1, 44, 50-52]) uses reconstructed

Euclidean information in the feedback loop. For this approach, the image-Jacobian

singularity and local minima problems are avoided, and physically realizable

trajectories are generated. However, the approach is susceptible to inaccuracies

in the task-space reconstruction if the transformation is corrupted (e.g., uncertain









camera calibration). Also, since the controller does not directly use the image

features in the feedback, the commanded robot trajectory may cause the feature

points to leave the FOV. A review of these two approaches is provided in [1, 2,53].

The third class of visual servo controllers use some image-space informa-

tion combined with some reconstructed information as a means to combine

the advantages of these two approaches while avoiding their disadvantages

(e.g., [10, 20-22, 24, 25, 54-58]). One particular approach was coined 2.5D visual

servo control in [20, 21,55,56] because this class of controllers exploits two dimen-

sional image feedback and reconstructed three-dimensional feedback. This class of

controllers is also called homography-based visual servo control in [10, 22, 24, 57]

because of the underlying reliance of the construction and decomposition of a

homography.

1.3.2 Visual Servo Control Approaches to Enlarge the FOV

Visual servo controllers often require the image-space coordinates of some set

of Euclidean feature points in the control development; hence, the feature points

must remain in the camera's FOV. Since the FOV of conventional perspective

cameras (e.g., pinhole cameras) is restricted, keeping the feature points in the FOV

is a fundamental challenge for visual servo control algorithms. The fundamental

nature of the FOV problem has resulted in a variety of control and path planning

methods (e.g., [6, 7, 25-31]). Corke and Hutchinson [25] and Chesi et al. [26]

used partitioned or switching visual servoing methods to keep the object in

the FOV. In [6, 7, 27-29], potential fields (or navigation functions) are used to

ensure the visibility of all features during the control task. In Benhimane and

Malis [30], the focal length of the camera was automatically adjusted (i.e., zoom

control) to keep all features in the FOV during the control task by using an

intrinsic-free visual servoing approach developed by Malis [59]. In García-Aracil et

al. [31], a continuous controller is obtained by using a new smooth task function









with weighted features that allows visibility changes in the image features (i.e.,

some features can come in and out of the FOV) during the control task. Some

researchers have also investigated methods to enlarge the FOV [60-64]. In [60-63],

image mosaicing is used to capture multiple images of the scene as a camera moves

and the images are stitched together to obtain a larger image. In Swaminathan and

Nayar [64], multiple images are fused from multiple cameras mounted in order to

have minimally overlapping FOV.

An alternative solution to the aforementioned algorithmic approaches to

resolve the FOV issue is to use advanced optics such as omnidirectional cameras.

Catadioptric cameras (one type of omnidirectional camera) are devices which

use both mirrors (reflective or catadioptric elements) and lenses (refractive or

dioptric elements) to form images [32]. Catadioptric systems with a single effective

viewpoint are classified as central catadioptric systems, which are desirable because

they yield pure perspective images [33]. In Baker and Nayar [34], the complete

class of single-lens single-mirror catadioptric systems is derived that satisfy the

single viewpoint constraint. Recently, catadioptric systems have been investigated

to enlarge the FOV for visual servo control tasks (e.g., [35, 65-72]). Burschka

and Hager [65] addressed the visual servoing problem of mobile robots equipped

with central catadioptric cameras, in which an estimation of the feature height

to the plane of motion is required. Barreto et al. [73] developed a model-based

tracking approach of a rigid object using a central catadioptric camera. Mezouar

et al. [66] controlled a robotic system using the projection of 3D lines in the image

plane of a central catadioptric system. In [35, 65, 66], the inverse of the image

Jacobian is required in the controller development which may lead to a singularity

problem for certain configurations. Hadj-Abdelkader et al. [67] presented a 2 1/2 D visual servoing approach using omnidirectional cameras, in which the inverse

of an estimated image-Jacobian (containing potential singularities) is required in









the controller. Mariottini et al. [68, 69] developed an image-based visual servoing

strategy using epipolar geometry for a three DOF mobile robot equipped with a

central catadioptric camera. Particularly, the singularity problem in the image

Jacobian was addressed by Mariottini et al. [69] using epipolar geometry. In

Benhimane and Malis [70], a new approach to visual servo regulation for omni-

directional cameras was developed that uses the feedback of a homography directly

(without requiring a decomposition) that also does not require any measure of the

3D information on the observed scene. However, the result in [70] is restricted to be

local since the controller is developed based on a linearized open-loop error system

at the origin, and the task function is only isomorphic to a restricted region within

the omnidirectional view. In Tatsambon and Chaumette [71, 72], a new optimal

combination of visual features is proposed for visual servoing from spheres using

central catadioptric systems.

1.3.3 Robust and Adaptive Visual Servo Control

Motivated by the desire to incorporate robustness to camera calibration,

different control approaches that do not depend on exact camera calibration have

been proposed (cf. [9, 21, 74-88]). Efforts such as [74-78] have investigated the

development of methods to estimate the image and robot manipulator Jacobians.

These methods are composed of some form of recursive Jacobian estimation law

and a control law. Specifically, Hosoda and Asada [74] developed a visual servo

controller based on a weighted recursive least-squares update law to estimate

the image Jacobian. In Jagersand et al. [75], a Broyden Jacobian estimator is

applied and a nonlinear least-square optimization method is used for the visual

servo control development. Shahamiri and Jagersand [76] used a nullspace-biased

Newton-step visual servo strategy with a Broyden Jacobian estimation for online

singularity detection and avoidance in an uncalibrated visual servo control problem.

In Piepmeier and Lipkin [77] and Piepmeier et al. [78], a recursive least-squares









algorithm is implemented for Jacobian estimation, and a dynamic Gauss-Newton

method is used to minimize the squared error in the image plane.

Robust control approaches based on static best-guess estimation of the

calibration matrix have been developed to solve the uncalibrated visual servo

regulation problem (cf. [21, 82, 87, 88]). Specifically, under a set of assumptions

on the rotation and calibration matrix, a kinematic controller was developed by

Taylor and Ostrowski [82] that utilizes a constant, best-guess estimate of the

calibration parameters to achieve local set-point regulation for the six DOF visual

servo control problem. Homography-based visual servoing methods using best-guess

estimation are used by Malis and Chaumette [21] and Fang et al. [87] to achieve

asymptotic or exponential regulation with respect to both camera and hand-eye

calibration errors for the six DOF problem.

The development of traditional adaptive control methods to compensate for

uncertainty in the camera calibration matrix is inhibited because of the time-

varying uncertainty injected in the transformation from the normalization of the

Euclidean coordinates. As a result, initial adaptive control results such as [79-85]

were limited to scenarios where the optic axis of the camera was assumed to be

perpendicular to the plane formed by the feature points (i.e., the time-varying

uncertainty is reduced to a constant uncertainty) or assumed an additional sensor

(e.g., ultrasonic sensors, laser-based sensors, additional cameras) could be used to

measure the depth information.

More recent approaches exploit geometric relationships between multiple

spatiotemporal views of an object to transform the time-varying uncertainty into

known time-varying terms multiplied by an unknown constant [9,21, 86-89]. In

Ruf et al. [9], an on-line calibration algorithm was developed for position-based

visual servoing. In Liu et al. [86], an adaptive image-based visual servo controller

was developed that regulated the feature points in an image to desired locations.









One problem with methods based on the image-Jacobian is that the estimated

image-Jacobian may contain singularities. The development in [86] exploits an

additional potential force function to drive the estimated parameters away from the

values that result in a singular Jacobian matrix. In Chen et al. [89], an adaptive

homography-based controller was proposed to address problems of uncertainty

in the intrinsic camera calibration parameters and lack of depth measurements.

Specifically, an adaptive control strategy was developed from a Lyapunov-based

approach that exploits the triangular structure of the calibration matrix. To the

best of our knowledge, the result in [89] was the first result that regulates the

robot end-effector to a desired position/orientation through visual servoing by

actively compensating for the lack of depth measurements and uncertainty in

the camera intrinsic calibration matrix with regard to the six DOF regulation

problem. However, the relationship between the estimated rotation axis and the

actual rotation axis is not correctly developed. A time-varying scaling factor was

omitted which is required to relate the estimated rotation matrix and the actual

rotation matrix. Specifically, the estimated rotation matrix and the actual rotation

matrix were incorrectly related through eigenvectors that are associated with the

eigenvalue of 1. An unknown time-varying scalar is required to relate these vectors,

and the methods developed in [89] do not appear to be suitable to accommodate

for this uncertainty.

1.4 Contributions

The main contribution of this dissertation is the development of visual servo

control algorithms and architectures that exploit the visual feedback from a camera

system to achieve a tracking or regulation control objective for a rigid-body control

object (e.g., the end-effector of a robot manipulator, a satellite, an autonomous

vehicle) identified by a patch of feature points. In the process of achieving the main

contribution, the following contributions were made:









* A new adaptive homography-based visual servo control method via a quater-

nion formulation is developed that achieves asymptotic tracking control. This

control scheme is singularity-free by exploiting the homography techniques

and a quaternion parameterization. The adaptive estimation term in the pro-

posed controller compensates for the unknown depth information dynamically

while the controller achieves the asymptotic tracking result.

* A new collaborative visual servo control method is developed to enable a

rigid-body object to track a desired trajectory via a daisy-chaining multi-view

geometry. In contrast to typical camera-to-hand and camera-in-hand visual

servo control configurations, the proposed controller is developed using a

moving on-board camera viewing a moving object to obtain feedback signals.

This collaborative method weakens the FOV restriction and enables the

control object to perform large area motion.

* A visual servo controller is developed that yields an asymptotic tracking

result for the completely nonlinear six DOF camera-in-hand central cata-

dioptric camera system. A panoramic FOV is obtained by using the central

catadioptric camera.

* A robust visual servo controller is developed to achieve a regulation con-

trol objective in the presence of intrinsic camera calibration uncertainties. A

quaternion-based estimate for the rotation error signal is developed and

used in the controller development. The similarity relationship between the

estimated and actual rotation matrices is used to construct the relationship

between the estimated and actual quaternions. A Lyapunov-based stability

analysis is provided that indicates a unique controller can be developed

to achieve the regulation result despite a sign ambiguity in the developed

quaternion estimate.









* A new combined robust and adaptive visual servo control method is de-

veloped to asymptotically regulate the feature points in an image to the

desired locations while also regulating the six DOF pose of the control object

without calibrating the camera. These dual objectives are achieved by using

a homography-based approach that exploits both image-space and recon-

structed Euclidean information in the feedback loop. The robust rotation

controller that accommodates the time-varying uncertain scaling factor

is developed by exploiting the upper triangular form of the rotation error

system and the fact that the diagonal elements of the camera calibration

matrix are positive. The adaptive translation controller that compensates for

the constant unknown parameters in the translation error system is developed

by a certainty-equivalence-based adaptive control method and a nonlinear

Lyapunov-based design approach.















CHAPTER 2
BACKGROUND AND PRELIMINARY DEVELOPMENT

The purpose of this chapter is to provide some background information

pertaining to the camera geometric model, Euclidean reconstruction, and unit

quaternion parameterization approach. Sections 2.1 and 2.2 develop the notation

and framework for the camera geometric model and Euclidean reconstruction used

in Chapters 3, 6 and 7. Their extensions are also used in Chapters 4 and 5. Section

2.3 reviews the unit quaternion, a rotation representation approach that is used

throughout this dissertation.

2.1 Geometric Model

Image processing techniques can often be used to select coplanar and non-

collinear feature points within an image. However, if four coplanar feature points

are not available then the subsequent development can also exploit the classic

eight-points algorithm with no four of the eight feature points being coplanar

(see Hartley and Zisserman [13]) or the virtual parallax method (see Boufama

and Mohr [90] and Malis [55]) where the non-coplanar points are projected onto

a virtual plane. Without loss of generality, the subsequent development is based

on the assumption that an object (e.g., the end-effector of a robot manipulator,

an aircraft, a tumbling satellite, an autonomous vehicle, etc.) has four coplanar

and non-collinear feature points denoted by $O_i\ \forall i = 1, 2, 3, 4$, and the feature

points can be determined from a feature point tracking algorithm (e.g., Kanade-

Lucas-Tomasi (KLT) algorithm discussed by Shi and Tomasi [91] and Tomasi

and Kanade [92]). The plane defined by the four feature points is denoted by

7r as depicted in Figure 2-1. The coordinate frame F in Figure 2-1 is affixed

to a camera viewing the object, the stationary coordinate frame F* denotes a


























Figure 2-1: Coordinate frame relationships between a camera viewing a planar patch at different spatiotemporal instances. The coordinate frames $\mathcal{F}$, $\mathcal{F}^*$ and $\mathcal{F}_d$ are attached to the current, reference and desired locations, respectively.

reference location for the camera, and the coordinate frame Fd that is attached
to the desired location of the camera. When the desired location of the camera
is a constant, Fd can be chosen the same as F* as shown in Figure 2-2. That
is, the tracking problem becomes a more particular regulation problem for the
configuration in Figure 2-2.
The vectors mf(t), mi, mdi(t) E IR3 in Figure 2-1 are defined as


m A [ x(t) (t) z,(t) (2-1)
A[ /

mdi A Xdi(t) ydi(t) Zdi(t)

where xi(t), /, (t), z,(t) E x,, /:, z ER and Xdi(t), ydi(t), zdi(t) E R denote the
Euclidean coordinates of the feature points Oi expressed in the frames 7, F* and
Td, respectively. From standard Euclidean geometry, relationships between fi(t),





















Figure 2-2: Coordinate frame relationships between a camera viewing a planar patch at different spatiotemporal instances. The coordinate frames $\mathcal{F}$ and $\mathcal{F}^*$ are attached to the current and reference locations, respectively.

$\bar{m}_i^*$ and $\bar{m}_{di}(t)$ can be determined as

$$\bar{m}_i = x_f + R\,\bar{m}_i^* \qquad \bar{m}_{di} = x_{fd} + R_d\,\bar{m}_i^*$$ (2-2)

where $R(t), R_d(t) \in SO(3)$ denote the orientations of $\mathcal{F}^*$ with respect to $\mathcal{F}$ and $\mathcal{F}_d$, respectively, and $x_f(t), x_{fd}(t) \in \mathbb{R}^3$ denote the translation vectors from $\mathcal{F}$ to $\mathcal{F}^*$ and from $\mathcal{F}_d$ to $\mathcal{F}^*$ expressed in the coordinates of $\mathcal{F}$ and $\mathcal{F}_d$, respectively. As also illustrated in Figure 2-1, $n^* \in \mathbb{R}^3$ denotes the constant unit normal to the plane $\pi$, and the constant distance from the origin of $\mathcal{F}^*$ to $\pi$ along the unit normal is denoted by $d^* \triangleq n^{*T}\bar{m}_i^* \in \mathbb{R}$. The normalized Euclidean coordinates, denoted by










$m_i(t), m_i^*, m_{di}(t) \in \mathbb{R}^3$, are defined as

$$m_i \triangleq \frac{\bar{m}_i}{z_i} = \begin{bmatrix} \dfrac{x_i}{z_i} & \dfrac{y_i}{z_i} & 1 \end{bmatrix}^T \quad m_i^* \triangleq \frac{\bar{m}_i^*}{z_i^*} = \begin{bmatrix} \dfrac{x_i^*}{z_i^*} & \dfrac{y_i^*}{z_i^*} & 1 \end{bmatrix}^T \quad m_{di} \triangleq \frac{\bar{m}_{di}}{z_{di}} = \begin{bmatrix} \dfrac{x_{di}}{z_{di}} & \dfrac{y_{di}}{z_{di}} & 1 \end{bmatrix}^T$$ (2-3)

with the standard assumption that $z_i(t), z_i^*, z_{di}(t) > \varepsilon$, where $\varepsilon$ is an arbitrarily small positive constant. From (2-3), the relationships in (2-2) can be expressed as

$$m_i = \underbrace{\frac{z_i^*}{z_i}}_{\alpha_i} \underbrace{\left( R + \frac{x_f}{d^*} n^{*T} \right)}_{H} m_i^* \qquad m_{di} = \underbrace{\frac{z_i^*}{z_{di}}}_{\alpha_{di}} \underbrace{\left( R_d + \frac{x_{fd}}{d^*} n^{*T} \right)}_{H_d} m_i^*$$ (2-4)

where $\alpha_i(t), \alpha_{di}(t) \in \mathbb{R}$ are scaling terms (the depth ratios), and $H(t), H_d(t) \in \mathbb{R}^{3\times3}$ denote the Euclidean homographies.

2.2 Euclidean Reconstruction

Each feature point on $\pi$ has a projected pixel coordinate $p_i(t) \in \mathbb{R}^3$, $p_i^* \in \mathbb{R}^3$ and $p_{di}(t) \in \mathbb{R}^3$ in $\mathcal{F}$, $\mathcal{F}^*$ and $\mathcal{F}_d$ respectively, denoted by

$$p_i \triangleq \begin{bmatrix} u_i & v_i & 1 \end{bmatrix}^T \quad p_i^* \triangleq \begin{bmatrix} u_i^* & v_i^* & 1 \end{bmatrix}^T \quad p_{di} \triangleq \begin{bmatrix} u_{di} & v_{di} & 1 \end{bmatrix}^T$$ (2-5)

where $u_i(t), v_i(t), u_i^*, v_i^*, u_{di}(t), v_{di}(t) \in \mathbb{R}$. The projected pixel coordinates $p_i(t)$, $p_i^*$ and $p_{di}(t)$ are related to the normalized task-space coordinates $m_i(t)$, $m_i^*$ and $m_{di}(t)$ by the following global invertible transformation (i.e., the pinhole camera model)

$$p_i = A m_i \qquad p_i^* = A m_i^* \qquad p_{di} = A m_{di},$$ (2-6)

where $A \in \mathbb{R}^{3\times3}$ is a constant, upper triangular, and invertible intrinsic camera calibration matrix that is explicitly defined as [13]

$$A \triangleq \begin{bmatrix} \alpha & -\alpha\cot\phi & u_0 \\ 0 & \dfrac{\beta}{\sin\phi} & v_0 \\ 0 & 0 & 1 \end{bmatrix}.$$ (2-7)

In (2-7), $u_0, v_0 \in \mathbb{R}$ denote the pixel coordinates of the principal point (i.e., the image center that is defined as the frame buffer coordinates of the intersection of the optical axis with the image plane), $\alpha, \beta \in \mathbb{R}$ represent the product of the camera scaling factors and the focal length, and $\phi \in \mathbb{R}$ is the skew angle between the camera axes.
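The following sketch (not from the dissertation; the calibration values are illustrative assumptions) shows the pinhole map in (2-6)-(2-7) in Python: pixel coordinates are obtained from normalized coordinates through $A$, and the map is inverted with a linear solve since $A$ is invertible.

```python
# A minimal sketch of (2-6)-(2-7), assuming illustrative calibration values.
import numpy as np

def calibration_matrix(alpha, beta, phi, u0, v0):
    """Intrinsic camera calibration matrix A from (2-7)."""
    return np.array([[alpha, -alpha / np.tan(phi), u0],
                     [0.0,    beta / np.sin(phi),  v0],
                     [0.0,    0.0,                 1.0]])

# Hypothetical calibration parameters for illustration only.
A = calibration_matrix(alpha=800.0, beta=800.0, phi=np.pi / 2, u0=640.0, v0=384.0)

m = np.array([0.1, -0.05, 1.0])   # normalized Euclidean coordinates m_i as in (2-3)
p = A @ m                         # pixel coordinates p_i = A m_i from (2-6)
m_back = np.linalg.solve(A, p)    # invert the global invertible map: m_i = A^{-1} p_i
```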

Based on (2-6), the Euclidean relationship in (2-4) can be expressed in terms of the image coordinates as

$$p_i = \alpha_i \underbrace{\left( A H A^{-1} \right)}_{G} p_i^* \qquad p_{di} = \alpha_{di} \underbrace{\left( A H_d A^{-1} \right)}_{G_d} p_i^*.$$ (2-8)

By using the feature point pairs $(p_i^*, p_i(t))$ and $(p_i^*, p_{di}(t))$, the projective homography up to a scalar multiple (i.e., $G$ and $G_d$) can be determined (see Chen et al. [10]). Various methods can then be applied (e.g., see Faugeras and Lustman [93] and Zhang and Hanson [94]) to decompose the Euclidean homographies to obtain the rotation matrices $R(t)$, $R_d(t)$ and the depth ratios $\alpha_i(t)$, $\alpha_{di}(t)$.
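As a hedged illustration of how $G$ in (2-8) can be determined from feature point pairs, the sketch below uses a standard linear least-squares (DLT-style) formulation; it is not the specific method of Chen et al. [10], and at least four point correspondences are assumed.

```python
# A minimal DLT-style sketch for estimating G in (2-8) up to scale.
import numpy as np

def estimate_homography(p_star, p):
    """p_star, p: (n, 3) arrays of homogeneous pixel coordinates, n >= 4.
    Returns G up to a scalar multiple, since p_i ~ alpha_i * G * p_i^*."""
    rows = []
    for ps, pc in zip(p_star, p):
        u, v = pc[0] / pc[2], pc[1] / pc[2]
        # Each pair contributes two equations linear in the entries of G.
        rows.append(np.hstack([ps, np.zeros(3), -u * ps]))
        rows.append(np.hstack([np.zeros(3), ps, -v * ps]))
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 3)   # null-space direction = stacked rows of G
```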









2.3 Unit Quaternion Representation of the Rotation Matrix

For a given rotation matrix, several different representations (e.g., Euler angle-axis, direction cosines matrix, Euler angles, unit quaternion (or Euler parameters), etc.) can be utilized to develop the error system. In previous homography-based visual servo control literature, the Euler angle-axis representation has been used to describe the rotation matrix. In the angle-axis parameters $(\varphi, k)$, $\varphi(t) \in \mathbb{R}$ represents a rotation angle about a suitable unit vector $k(t) \in \mathbb{R}^3$. The parameters $(\varphi, k)$ can be easily calculated (e.g., using the algorithm shown in Spong and Vidyasagar [11]).

Given unit vector $k(t)$ and angle $\varphi(t)$, the rotation matrix $R(t) = e^{k^\times \varphi}$ can be calculated using the Rodrigues formula

$$R = e^{k^\times \varphi} = I_3 + k^\times \sin(\varphi) + \left(k^\times\right)^2 \left(1 - \cos(\varphi)\right),$$ (2-9)

where $I_3$ is the $3 \times 3$ identity matrix, and the notation $k^\times(t)$ denotes the following skew-symmetric form of the vector $k(t)$:

$$k^\times = \begin{bmatrix} 0 & -k_3 & k_2 \\ k_3 & 0 & -k_1 \\ -k_2 & k_1 & 0 \end{bmatrix} \quad \forall k = \begin{bmatrix} k_1 & k_2 & k_3 \end{bmatrix}^T.$$ (2-10)
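A minimal sketch of (2-9) and (2-10) in Python, assuming $k$ has already been normalized to unit length:

```python
# Rodrigues formula (2-9) with the skew-symmetric operator (2-10).
import numpy as np

def skew(k):
    """Skew-symmetric matrix k^x from (2-10)."""
    return np.array([[0.0, -k[2], k[1]],
                     [k[2], 0.0, -k[0]],
                     [-k[1], k[0], 0.0]])

def rodrigues(k, phi):
    """Rotation matrix from (2-9); k is assumed to be a unit vector."""
    kx = skew(k)
    return np.eye(3) + kx * np.sin(phi) + kx @ kx * (1.0 - np.cos(phi))
```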

The unit quaternion is a four dimensional vector which can be defined as [23]

$$q \triangleq \begin{bmatrix} q_0 & q_v^T \end{bmatrix}^T.$$ (2-11)

In (2-11), $q_v(t) \triangleq \begin{bmatrix} q_{v1}(t) & q_{v2}(t) & q_{v3}(t) \end{bmatrix}^T$, and $q_0(t), q_{vi}(t) \in \mathbb{R}\ \forall i = 1, 2, 3$. The unit quaternion must also satisfy the following nonlinear constraint

$$q^T q = 1.$$ (2-12)








This parameterization facilitates the subsequent problem formulation, control
development, and stability analysis since the unit quaternion provides a globally
nonsingular parameterization of the rotation matrix.
Given $(\varphi, k)$, the unit quaternion vector $q(t)$ can be constructed as

$$q(t) = \begin{bmatrix} q_0(t) \\ q_v(t) \end{bmatrix} = \begin{bmatrix} \cos\left(\dfrac{\varphi(t)}{2}\right) \\ k(t)\sin\left(\dfrac{\varphi(t)}{2}\right) \end{bmatrix}.$$ (2-13)

Based on (2-13), the rotation matrix in (2-9) can be expressed as

$$R(q) = I_3 + 2 q_0 q_v^\times + 2 \left( q_v^\times \right)^2 = \left( q_0^2 - q_v^T q_v \right) I_3 + 2 q_v q_v^T + 2 q_0 q_v^\times.$$ (2-14)

The rotation matrix in (2-14) is typical in robotics literature where the moving coordinate system is expressed in terms of a fixed coordinate system (typically the coordinate system attached to the base frame). However, the typical representation of the rotation matrix in aerospace literature (e.g., [15]) is

$$R(q) = \left( q_0^2 - q_v^T q_v \right) I_3 + 2 q_v q_v^T - 2 q_0 q_v^\times.$$ (2-15)

The difference is due to the fact that the rotation matrix in (2-15) (which is used in the current dissertation) relates the moving coordinate frame $\mathcal{F}$ to the fixed coordinate frame $\mathcal{F}^*$ with the corresponding states expressed in $\mathcal{F}$. The rotation matrix in (2-15) can be expanded as

$$R(q) = \begin{bmatrix} q_0^2 + q_{v1}^2 - q_{v2}^2 - q_{v3}^2 & 2(q_{v1}q_{v2} + q_{v3}q_0) & 2(q_{v1}q_{v3} - q_{v2}q_0) \\ 2(q_{v1}q_{v2} - q_{v3}q_0) & q_0^2 - q_{v1}^2 + q_{v2}^2 - q_{v3}^2 & 2(q_{v2}q_{v3} + q_{v1}q_0) \\ 2(q_{v1}q_{v3} + q_{v2}q_0) & 2(q_{v2}q_{v3} - q_{v1}q_0) & q_0^2 - q_{v1}^2 - q_{v2}^2 + q_{v3}^2 \end{bmatrix}.$$ (2-16)

From (2-16), various approaches could be used to determine $q_0(t)$ and $q_v(t)$; however, numerical significance of the resulting computations can be lost if $q_0(t)$ is close to zero [15]. Shuster [15] developed a method to determine $q_0(t)$ and $q_v(t)$ that provides robustness against such computational issues. Specifically, the









diagonal terms of $R(q)$ can be obtained from (2-12) and (2-16) as

$$R_{11} = 1 - 2\left( q_{v2}^2 + q_{v3}^2 \right)$$ (2-17)
$$R_{22} = 1 - 2\left( q_{v1}^2 + q_{v3}^2 \right)$$ (2-18)
$$R_{33} = 1 - 2\left( q_{v1}^2 + q_{v2}^2 \right).$$ (2-19)

By utilizing (2-12) and (2-17)-(2-19), the following expressions can be developed:

$$q_0^2 = \frac{R_{11} + R_{22} + R_{33} + 1}{4} \qquad q_{v1}^2 = \frac{R_{11} - R_{22} - R_{33} + 1}{4}$$
$$q_{v2}^2 = \frac{R_{22} - R_{11} - R_{33} + 1}{4} \qquad q_{v3}^2 = \frac{R_{33} - R_{11} - R_{22} + 1}{4}$$ (2-20)

where $q_0(t)$ is restricted to be non-negative without loss of generality (this restriction enables the minimum rotation to be obtained). As stated in [15], the greatest numerical accuracy for computing $q_0(t)$ and $q_v(t)$ is obtained by using the element in (2-20) with the largest value and then computing the remaining terms respectively. For example, if $q_0^2(t)$ has the maximum value in (2-20), then the greatest numerical accuracy can be obtained by computing $q_0(t)$ and $q_v(t)$ as

$$q_0 = \frac{\sqrt{R_{11} + R_{22} + R_{33} + 1}}{2} \qquad q_{v1} = \frac{R_{23} - R_{32}}{4 q_0} \qquad q_{v2} = \frac{R_{31} - R_{13}}{4 q_0} \qquad q_{v3} = \frac{R_{12} - R_{21}}{4 q_0}.$$ (2-21)










Likewise, if $q_{v1}^2(t)$ has the maximum value in (2-20), then the greatest numerical accuracy can be obtained by computing $q_0(t)$ and $q_v(t)$ as

$$q_{v1} = \frac{\sqrt{R_{11} - R_{22} - R_{33} + 1}}{2} \qquad q_0 = \frac{R_{23} - R_{32}}{4 q_{v1}} \qquad q_{v2} = \frac{R_{12} + R_{21}}{4 q_{v1}} \qquad q_{v3} = \frac{R_{13} + R_{31}}{4 q_{v1}}$$ (2-22)

where the sign of $q_{v1}(t)$ is selected so that $q_0(t) \geq 0$. If $q_{v2}^2(t)$ is the maximum, then

$$q_{v2} = \frac{\sqrt{R_{22} - R_{11} - R_{33} + 1}}{2} \qquad q_0 = \frac{R_{31} - R_{13}}{4 q_{v2}} \qquad q_{v1} = \frac{R_{12} + R_{21}}{4 q_{v2}} \qquad q_{v3} = \frac{R_{23} + R_{32}}{4 q_{v2}}$$ (2-23)

or if $q_{v3}^2(t)$ is the maximum, then

$$q_{v3} = \frac{\sqrt{R_{33} - R_{11} - R_{22} + 1}}{2} \qquad q_0 = \frac{R_{12} - R_{21}}{4 q_{v3}} \qquad q_{v1} = \frac{R_{13} + R_{31}}{4 q_{v3}} \qquad q_{v2} = \frac{R_{23} + R_{32}}{4 q_{v3}}$$ (2-24)

where the sign of $q_{v2}(t)$ or $q_{v3}(t)$ is selected so that $q_0(t) \geq 0$.

The expressions in (2-21)-(2-24) indicate that given the rotation matrix R(t)

from the homography decomposition, the unit quaternion vector can be determined

that represents the rotation without introducing a singularity. The expressions in

(2-21)-(2-24) will be utilized in the subsequent control development and stability

analysis.
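The selection logic in (2-20)-(2-24) translates directly into code. The following sketch (an illustrative implementation, not the dissertation's) computes the largest squared component first and recovers the remaining components from the off-diagonal elements of $R$, flipping the overall sign at the end so that $q_0 \geq 0$:

```python
# A minimal sketch of the numerically robust extraction in (2-20)-(2-24).
import numpy as np

def quaternion_from_rotation(R):
    t = np.trace(R)
    cand = 0.25 * np.array([t + 1.0,                              # q0^2
                            R[0, 0] - R[1, 1] - R[2, 2] + 1.0,    # qv1^2
                            R[1, 1] - R[0, 0] - R[2, 2] + 1.0,    # qv2^2
                            R[2, 2] - R[0, 0] - R[1, 1] + 1.0])   # qv3^2
    i = int(np.argmax(cand))
    q = np.empty(4)                       # q = [q0, qv1, qv2, qv3]
    q[i] = np.sqrt(cand[i])
    d = 4.0 * q[i]
    if i == 0:      # q0 largest, as in (2-21)
        q[1] = (R[1, 2] - R[2, 1]) / d
        q[2] = (R[2, 0] - R[0, 2]) / d
        q[3] = (R[0, 1] - R[1, 0]) / d
    elif i == 1:    # qv1 largest, as in (2-22)
        q[0] = (R[1, 2] - R[2, 1]) / d
        q[2] = (R[0, 1] + R[1, 0]) / d
        q[3] = (R[0, 2] + R[2, 0]) / d
    elif i == 2:    # qv2 largest, as in (2-23)
        q[0] = (R[2, 0] - R[0, 2]) / d
        q[1] = (R[0, 1] + R[1, 0]) / d
        q[3] = (R[1, 2] + R[2, 1]) / d
    else:           # qv3 largest, as in (2-24)
        q[0] = (R[0, 1] - R[1, 0]) / d
        q[1] = (R[0, 2] + R[2, 0]) / d
        q[2] = (R[1, 2] + R[2, 1]) / d
    return q if q[0] >= 0.0 else -q       # select the sign so that q0 >= 0
```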















CHAPTER 3
LYAPUNOV-BASED VISUAL SERVO TRACKING CONTROL VIA A
QUATERNION FORMULATION

3.1 Introduction

Previous visual servo controllers typically only address the regulation problem.

Motivated by the need for new advancements to meet visual servo tracking appli-

cations, previous research has concentrated on developing different types of path

planning techniques [5-9]. Recently, Chen et al. [10] developed a new formulation

of the tracking control problem. The homography-based adaptive visual servo

controller in [10] is developed to enable an actuated object to track a prerecorded

time-varying desired trajectory determined by a sequence of images, where the

Euler angle-axis representation is used to represent the rotation error system. Due

to the computational singularity limitation of the angle axis extraction algorithm

(see Spong and Vidyasagar [11]), rotation angles of ±π were not considered.

This chapter considers the previously unexamined problem of six DOF visual

servo tracking control with a nonsingular rotation parameterization. A homography

is constructed from image pairs and decomposed via textbook methods (e.g.,

Faugeras [12] and Hartley and Zisserman [13]) to obtain the rotation matrix. Once

the rotation matrix has been determined, the corresponding unit quaternion can

be determined from globally nonsingular and numerically robust algorithms (e.g.,

Hu et al. [14] and Shuster [15]). An error system is constructed in terms of the

unit quaternion. An adaptive controller is then developed and proven to enable

a camera (attached to a rigid-body object) to track a desired trajectory that is

determined from a sequence of images. These images can be taken online or offline

by a camera. For example, a sequence of images of the reference object can be









prerecorded as the camera moves (a camera-in-hand configuration), and these

images can be used as a desired trajectory in a later real-time tracking control.

The camera is attached to a rigid-body object (e.g., the end-effector of a robot

manipulator, a satellite, an autonomous vehicle, etc.) that can be identified by a

planar patch of feature points. The controller contains an adaptive feedforward

term to compensate for the unknown distance from the camera to the observed

features. A quaternion-based Lyapunov function is developed to facilitate the

control design and the stability analysis.

The remainder of this chapter is organized as follows. In Section 3.2, the

control objective is formulated in terms of unit quaternion representation. In

Section 3.3, the controller is developed, and closed-loop stability analysis is given

based on Lyapunov-based methods. In Section 3.4, the control development is

extended to the camera-to-hand configuration. In Sections 3.5 and 3.6, Matlab

simulations and tracking experiments that were performed in a virtual-reality

test-bed for unmanned systems at the University of Florida are used to show the

performance of the proposed visual servo tracking controller.

3.2 Control Objective

The control objective is for a camera to track a desired trajectory that is

determined by a sequence of images. This objective is based on the assumption

that the linear and angular velocities of the camera are control inputs that can be

independently controlled (i.e., unconstrained motion) and that the camera is cali-

brated (i.e., A is known). The signals in (2-5) are the only required measurements

to develop the controller.

One of the outcomes of the homography decomposition is the rotation matrices

R(t) and Rd(t). From these rotation matrices, several different representations can

be utilized to develop the error system. In previous homography-based visual servo

control literature, the Euler angle-axis representation has been used to describe the









rotation matrix. In this chapter, the unit quaternion parameterization will be used

to describe the rotation matrix. This parameterization facilitates the subsequent

problem formulation, control development, and stability analysis since the unit

quaternion provides a global nonsingular parameterization of the corresponding

rotation matrices. Section 2.3 provides background, definitions and development

related to the unit quaternion.

Given the rotation matrices $R(t)$ and $R_d(t)$, the corresponding unit quaternions $q(t)$ and $q_d(t)$ can be calculated by using the numerically robust method (see [14] and [15]) based on the corresponding relationships

$$R(q) = \left( q_0^2 - q_v^T q_v \right) I_3 + 2 q_v q_v^T - 2 q_0 q_v^\times$$ (3-1)
$$R_d(q_d) = \left( q_{0d}^2 - q_{vd}^T q_{vd} \right) I_3 + 2 q_{vd} q_{vd}^T - 2 q_{0d} q_{vd}^\times$$ (3-2)

where $I_3$ is the $3 \times 3$ identity matrix, and the notation $q_v^\times(t)$ denotes the skew-symmetric form of the vector $q_v(t)$ as in (2-10).

To quantify the error between the actual and desired camera orientations, the mismatch between rotation matrices $R(t)$ and $R_d(t)$ is defined as

$$\tilde{R} = R R_d^T.$$ (3-3)

Based on (3-1)-(3-3),

$$\tilde{R} = \left( \tilde{q}_0^2 - \tilde{q}_v^T \tilde{q}_v \right) I_3 + 2 \tilde{q}_v \tilde{q}_v^T - 2 \tilde{q}_0 \tilde{q}_v^\times$$ (3-4)

where the error quaternion $(\tilde{q}_0(t), \tilde{q}_v^T(t))^T$ is defined as

$$\tilde{q}_0 = q_0 q_{0d} + q_v^T q_{vd}$$
$$\tilde{q}_v = q_{0d} q_v - q_0 q_{vd} + q_v^\times q_{vd}.$$ (3-5)









The definition of $\tilde{q}_0(t)$ and $\tilde{q}_v(t)$ in (3-5) makes $(\tilde{q}_0(t), \tilde{q}_v^T(t))^T$ a unit quaternion based on the fact that $q(t)$ and $q_d(t)$ are two unit quaternions (see Appendix A).
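A minimal sketch of the error quaternion computation in (3-5), assuming $q$ and $q_d$ have been extracted from $R(t)$ and $R_d(t)$ as in Section 2.3:

```python
# Error quaternion (3-5), with q = [q0, qv] and qd = [q0d, qvd] as 4-vectors.
import numpy as np

def skew(v):
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def error_quaternion(q, qd):
    """Returns (q0_tilde, qv_tilde); the result is again a unit quaternion
    because q and qd are unit quaternions."""
    q0, qv = q[0], q[1:]
    q0d, qvd = qd[0], qd[1:]
    q0_t = q0 * q0d + qv @ qvd
    qv_t = q0d * qv - q0 * qvd + skew(qv) @ qvd
    return q0_t, qv_t
```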

The translation error, denoted by $e(t) \in \mathbb{R}^3$, is defined as

$$e = p_e - p_{ed},$$ (3-6)

where $p_e(t), p_{ed}(t) \in \mathbb{R}^3$ are defined as

$$p_e \triangleq \begin{bmatrix} u_i & v_i & -\ln(\alpha_i) \end{bmatrix}^T \qquad p_{ed} \triangleq \begin{bmatrix} u_{di} & v_{di} & -\ln(\alpha_{di}) \end{bmatrix}^T,$$ (3-7)

where $i \in \{1, \dots, 4\}$. In the Euclidean-space (see Figure 2-1), the tracking objective can be quantified as

$$\tilde{R}(t) \to I_3 \quad \text{as} \quad t \to \infty$$ (3-8)

and

$$\|e(t)\| \to 0 \quad \text{as} \quad t \to \infty.$$ (3-9)

Since $\tilde{q}(t)$ is a unit quaternion, (3-4), (3-5) and (3-8) can be used to quantify the rotation tracking objective as the desire to regulate $\tilde{q}_v(t)$ as

$$\|\tilde{q}_v(t)\| \to 0 \quad \text{as} \quad t \to \infty.$$ (3-10)

The subsequent section will target the control development based on the objectives in (3-9) and (3-10).

3.3 Control Development

3.3.1 Open-Loop Error System

The actual angular velocity of the camera expressed in $\mathcal{F}$ is defined as $\omega_c(t) \in \mathbb{R}^3$, the desired angular velocity of the camera expressed in $\mathcal{F}_d$ is defined as $\omega_{cd}(t) \in \mathbb{R}^3$, and the relative angular velocity of the camera with respect to $\mathcal{F}_d$ expressed in $\mathcal{F}$ is defined as $\tilde{\omega}_c(t) \in \mathbb{R}^3$ where

$$\tilde{\omega}_c = \omega_c - \tilde{R}\,\omega_{cd}.$$ (3-11)

The camera angular velocities can be related to the time derivatives of $q(t)$ and $q_d(t)$ as [23]

$$\begin{bmatrix} \dot{q}_0 \\ \dot{q}_v \end{bmatrix} = \frac{1}{2} \begin{bmatrix} -q_v^T \\ q_0 I_3 + q_v^\times \end{bmatrix} \omega_c$$ (3-12)

and

$$\begin{bmatrix} \dot{q}_{0d} \\ \dot{q}_{vd} \end{bmatrix} = \frac{1}{2} \begin{bmatrix} -q_{vd}^T \\ q_{0d} I_3 + q_{vd}^\times \end{bmatrix} \omega_{cd},$$ (3-13)

respectively.
As stated in Remark 3 in Chen et al. [10], a sufficiently smooth function can be used to fit the sequence of feature points to generate the desired trajectory $p_{di}(t)$; hence, it is assumed that $p_{ed}(t)$ and $\dot{p}_{ed}(t)$ are bounded functions of time. In practice, the a priori developed smooth functions $\alpha_{di}(t)$, $R_d(t)$, and $p_{ed}(t)$ can be constructed as bounded functions with bounded time derivatives. Based on the assumption that $R_d(t)$ is a bounded first-order differentiable function with a bounded derivative, the algorithm for computing quaternions in [14] can be used to conclude that $(q_{0d}(t), q_{vd}^T(t))$ is a bounded first-order differentiable function with a bounded derivative; hence, $(q_{0d}(t), q_{vd}^T(t))$ and $(\dot{q}_{0d}(t), \dot{q}_{vd}^T(t))$ are bounded. In the subsequent tracking control development, the desired signals $\dot{p}_{ed}(t)$ and $q_{vd}(t)$ will be used as feedforward control terms. To avoid the computational singularity in the desired angle $\theta_d(t)$, the desired trajectory in [10] was generated by carefully choosing the smooth function such that the workspace is limited to $(-\pi, \pi)$. Unlike [10], the use of the quaternion alleviates the restriction on the desired trajectory $p_{di}(t)$.

From (3-13), the signal $\omega_{cd}(t)$ can be calculated as

$$\omega_{cd} = 2 \left( q_{0d}\,\dot{q}_{vd} - q_{vd}\,\dot{q}_{0d} \right) - 2\,q_{vd}^\times\,\dot{q}_{vd},$$ (3-14)









where $(q_{0d}(t), q_{vd}^T(t))^T$ and $(\dot{q}_{0d}(t), \dot{q}_{vd}^T(t))^T$ are bounded, so $\omega_{cd}(t)$ is also bounded.

Based on (3-4), (3-5), (3-12) and (3-13), the open-loop rotation error system can be developed as

$$\begin{bmatrix} \dot{\tilde{q}}_0 \\ \dot{\tilde{q}}_v \end{bmatrix} = \frac{1}{2} \begin{bmatrix} -\tilde{q}_v^T \\ \tilde{q}_0 I_3 + \tilde{q}_v^\times \end{bmatrix} \tilde{\omega}_c$$ (3-15)

where $\tilde{q}(t) = (\tilde{q}_0(t), \tilde{q}_v^T(t))^T$.

By using (2-5), (2-6), (3-6), (3-11), and the fact that [95]

$$\dot{\bar{m}}_i = -v_c + \bar{m}_i^\times \omega_c,$$ (3-16)

where $v_c(t) \in \mathbb{R}^3$ denotes the actual linear velocity of the camera expressed in $\mathcal{F}$, the open-loop translation error system can be derived as [10]

$$z_i^* \dot{e} = -\alpha_i L_v v_c + z_i^* \left( L_v m_i^\times \omega_c - \dot{p}_{ed} \right)$$ (3-17)

where $L_v(t) \in \mathbb{R}^{3\times3}$ is defined as

$$L_v \triangleq A - \begin{bmatrix} 0 & 0 & u_i \\ 0 & 0 & v_i \\ 0 & 0 & 0 \end{bmatrix} = \begin{bmatrix} \alpha & -\alpha\cot\phi & u_0 - u_i \\ 0 & \dfrac{\beta}{\sin\phi} & v_0 - v_i \\ 0 & 0 & 1 \end{bmatrix}.$$ (3-18)

The auxiliary term $L_v(t)$ is an invertible upper triangular matrix.

3.3.2 Closed-Loop Error System

Based on the open-loop rotation error system in (3-15) and the subsequent Lyapunov-based stability analysis, the angular velocity controller is designed as

$$\omega_c = -K_\omega \left( I_3 + \tilde{q}_v^\times \right)^{-1} \tilde{q}_v + \tilde{R}\,\omega_{cd} = -K_\omega \tilde{q}_v + \tilde{R}\,\omega_{cd},$$ (3-19)

where $K_\omega \in \mathbb{R}^{3\times3}$ denotes a diagonal matrix of positive constant control gains. See Appendix B for the proof that $(I_3 + \tilde{q}_v^\times)^{-1}\tilde{q}_v = \tilde{q}_v$. Based on (3-11), (3-15) and









(3-19), the rotation closed-loop error system can be determined as

$$\dot{\tilde{q}}_0 = \frac{1}{2}\tilde{q}_v^T K_\omega \tilde{q}_v \qquad \dot{\tilde{q}}_v = -\frac{1}{2}\left( \tilde{q}_0 I_3 + \tilde{q}_v^\times \right) K_\omega \tilde{q}_v.$$ (3-20)
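The rotation control law is straightforward to implement once the error quaternion and $\tilde{R}$ are available. The following sketch is illustrative only; the inputs are assumed to be produced by the homography decomposition and a numerical differentiation of $q_d(t)$. It evaluates the feedforward term (3-14) and the controller (3-19):

```python
# A minimal sketch of the angular velocity controller (3-19) with the
# desired feedforward signal (3-14).
import numpy as np

def skew(v):
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def desired_angular_velocity(qd, qd_dot):
    """omega_cd from (3-14), using qd = [q0d, qvd]."""
    q0d, qvd = qd[0], qd[1:]
    q0d_dot, qvd_dot = qd_dot[0], qd_dot[1:]
    return 2.0 * (q0d * qvd_dot - qvd * q0d_dot) - 2.0 * skew(qvd) @ qvd_dot

def rotation_control(qv_tilde, R_tilde, omega_cd, K_omega):
    """Angular velocity input omega_c from (3-19)."""
    return -K_omega @ qv_tilde + R_tilde @ omega_cd
```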


The contribution of this chapter is the development of the quaternion-based

rotation tracking controller. Several other homography-based translation controllers

could be combined with the developed rotation controller. For completeness, the

following development illustrates how the translation controller and adaptive

update law in [10] can be used to complete the six DOF tracking result.

Based on (3-17), the translation control input $v_c(t)$ is designed as

$$v_c = \frac{1}{\alpha_i} L_v^{-1} \left( K_v e + \hat{z}_i^* \left( L_v m_i^\times \omega_c - \dot{p}_{ed} \right) \right)$$ (3-21)

where $K_v \in \mathbb{R}^{3\times3}$ denotes a diagonal matrix of positive constant control gains. In (3-21), the parameter estimate $\hat{z}_i^*(t) \in \mathbb{R}$ for the unknown constant $z_i^*$ is defined as

$$\dot{\hat{z}}_i^* = \gamma\, e^T \left( L_v m_i^\times \omega_c - \dot{p}_{ed} \right)$$ (3-22)

where $\gamma \in \mathbb{R}$ denotes a positive constant adaptation gain. The controller in (3-21) does not exhibit a singularity since $L_v(t)$ is invertible and $\alpha_i(t) > 0$. From (3-17) and (3-21), the translation closed-loop error system can be listed as

$$z_i^* \dot{e} = -K_v e + \left( L_v m_i^\times \omega_c - \dot{p}_{ed} \right) \tilde{z}_i^*,$$ (3-23)

where $\tilde{z}_i^*(t) \in \mathbb{R}$ denotes the following parameter estimation error:

$$\tilde{z}_i^* = z_i^* - \hat{z}_i^*.$$ (3-24)
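A minimal sketch of (3-21) and (3-22), under the reconstruction of the error system given above; `z_hat` is the scalar estimate $\hat{z}_i^*$, and a simple Euler step is assumed for the update law:

```python
# Translation controller (3-21) and adaptive depth update (3-22).
import numpy as np

def skew(v):
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def translation_control(e, m_i, alpha_i, L_v, omega_c, ped_dot, z_hat, K_v):
    """Linear velocity input v_c from (3-21); no singularity arises since
    L_v is invertible and alpha_i > 0."""
    w = L_v @ skew(m_i) @ omega_c - ped_dot        # feedforward regressor term
    return np.linalg.solve(L_v, K_v @ e + z_hat * w) / alpha_i

def depth_estimate_update(z_hat, e, m_i, L_v, omega_c, ped_dot, gamma, dt):
    """One Euler step of the adaptive update law (3-22)."""
    w = L_v @ skew(m_i) @ omega_c - ped_dot
    return z_hat + dt * gamma * (e @ w)
```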








3.3.3 Stability Analysis
Theorem 3.1: The controller given in (3-19) and (3-21), along with the adaptive update law in (3-22), ensures global asymptotic tracking in the sense that

$$\|\tilde{q}_v(t)\| \to 0, \quad \|e(t)\| \to 0, \quad \text{as} \quad t \to \infty.$$ (3-25)

Proof: Let $V(t) \in \mathbb{R}$ denote the following differentiable non-negative function (i.e., a Lyapunov candidate):

$$V = \tilde{q}_v^T \tilde{q}_v + (1 - \tilde{q}_0)^2 + \frac{z_i^*}{2} e^T e + \frac{1}{2\gamma} \tilde{z}_i^{*2}.$$ (3-26)

The time-derivative of $V(t)$ can be determined as

$$\dot{V} = 2\tilde{q}_v^T \dot{\tilde{q}}_v + 2(1 - \tilde{q}_0)(-\dot{\tilde{q}}_0) + z_i^* e^T \dot{e} + \frac{1}{\gamma}\tilde{z}_i^* \dot{\tilde{z}}_i^*$$
$$= -\tilde{q}_v^T \left( \tilde{q}_0 I_3 + \tilde{q}_v^\times \right) K_\omega \tilde{q}_v - (1 - \tilde{q}_0)\tilde{q}_v^T K_\omega \tilde{q}_v + e^T\left( -K_v e + \left( L_v m_i^\times \omega_c - \dot{p}_{ed} \right)\tilde{z}_i^* \right) - \tilde{z}_i^* e^T \left( L_v m_i^\times \omega_c - \dot{p}_{ed} \right)$$
$$= -\tilde{q}_v^T \left( \tilde{q}_0 I_3 + \tilde{q}_v^\times + (1 - \tilde{q}_0) I_3 \right) K_\omega \tilde{q}_v - e^T K_v e$$
$$= -\tilde{q}_v^T K_\omega \tilde{q}_v - e^T K_v e,$$ (3-27)

where (3-20) and (3-22)-(3-24) were utilized. It can be seen from (3-27) that $\dot{V}(t)$ is negative semi-definite.
Based on (3-26) and (3-27), $e(t), \tilde{q}_v(t), \tilde{q}_0(t), \tilde{z}_i^*(t) \in \mathcal{L}_\infty$ and $e(t), \tilde{q}_v(t) \in \mathcal{L}_2$. Since $\tilde{z}_i^*(t) \in \mathcal{L}_\infty$, it is clear from (3-24) that $\hat{z}_i^*(t) \in \mathcal{L}_\infty$. Based on the fact that $e(t) \in \mathcal{L}_\infty$, (2-3), (2-6), (3-6) and (3-7) can be used to prove that $m_i(t) \in \mathcal{L}_\infty$. Since $m_i(t) \in \mathcal{L}_\infty$, (3-18) implies that $L_v(t), L_v^{-1}(t) \in \mathcal{L}_\infty$. Based on the fact that $\tilde{q}_v(t), \tilde{q}_0(t) \in \mathcal{L}_\infty$, (3-19) can be used to prove that $\tilde{\omega}_c(t) \in \mathcal{L}_\infty$. Since $\tilde{\omega}_c(t) \in \mathcal{L}_\infty$ and $\omega_{cd}(t)$ is a bounded function, (3-11) can be used to conclude that $\omega_c(t) \in \mathcal{L}_\infty$. Since $\hat{z}_i^*(t), e(t), \tilde{q}_v(t), m_i(t), L_v(t), L_v^{-1}(t) \in \mathcal{L}_\infty$ and $\dot{p}_{ed}(t)$ is assumed to be bounded, (3-11) and (3-21) can be utilized to prove that $v_c(t) \in \mathcal{L}_\infty$. From the previous results, (3-11)-(3-17) can be used to prove that $\dot{e}(t), \dot{\tilde{q}}_v(t) \in \mathcal{L}_\infty$. Since $e(t), \tilde{q}_v(t) \in \mathcal{L}_\infty \cap \mathcal{L}_2$ and $\dot{e}(t), \dot{\tilde{q}}_v(t) \in \mathcal{L}_\infty$, Barbalat's Lemma [96] can be used to conclude the result given in (3-25).

3.4 Camera-To-Hand Extension

Figure 3-1: Coordinate frame relationships between a fixed camera and the planes defined by the current, desired, and reference feature points (i.e., $\pi$, $\pi_d$, and $\pi^*$).


3.4.1 Model Development

For the fixed camera problem, consider the fixed plane $\pi^*$ that is defined by a reference image of the object. In addition, consider the actual and desired motion of the planes $\pi$ and $\pi_d$ (see Figure 3-1). To develop a relationship between the planes, an inertial coordinate system, denoted by $\mathcal{I}$, is defined where the origin coincides with the center of a fixed camera. The Euclidean coordinates of the feature points on $\pi$, $\pi^*$ and $\pi_d$ can be expressed in terms of $\mathcal{I}$, respectively, as

$$\bar{m}_i(t) \triangleq \begin{bmatrix} x_i(t) & y_i(t) & z_i(t) \end{bmatrix}^T \quad \bar{m}_i^* \triangleq \begin{bmatrix} x_i^* & y_i^* & z_i^* \end{bmatrix}^T \quad \bar{m}_{di}(t) \triangleq \begin{bmatrix} x_{di}(t) & y_{di}(t) & z_{di}(t) \end{bmatrix}^T$$ (3-28)

under the standard assumption that the distances from the origin of $\mathcal{I}$ to the feature points remain positive (i.e., $z_i(t), z_i^*, z_{di}(t) > \varepsilon$ where $\varepsilon$ denotes an arbitrarily small positive constant). Orthogonal coordinate systems $\mathcal{F}$, $\mathcal{F}^*$, and $\mathcal{F}_d$ are attached to the planes $\pi$, $\pi^*$, and $\pi_d$, respectively. To relate the coordinate systems, let $R(t), R^*, R_d(t) \in SO(3)$ denote the orientations of $\mathcal{F}$, $\mathcal{F}^*$ and $\mathcal{F}_d$ with respect to $\mathcal{I}$, respectively, and let $x_f(t), x_f^*, x_{fd}(t) \in \mathbb{R}^3$ denote the respective translation vectors expressed in $\mathcal{I}$. As also illustrated in Figure 3-1, $n^* \in \mathbb{R}^3$ denotes the constant unit normal to the plane $\pi^*$ expressed in $\mathcal{I}$, and $s_i \in \mathbb{R}^3$ denotes the constant coordinates of the $i$-th feature point expressed in the corresponding coordinate frames $\mathcal{F}$, $\mathcal{F}^*$, and $\mathcal{F}_d$.

From the geometry between the coordinate frames depicted in Figure 3-1, the following relationships can be developed

$$\bar{m}_i = \bar{x}_f + \bar{R}\,\bar{m}_i^* \qquad \bar{m}_{di} = \bar{x}_{fd} + \bar{R}_d\,\bar{m}_i^*,$$ (3-29)

where $\bar{R}(t), \bar{R}_d(t) \in SO(3)$ and $\bar{x}_f(t), \bar{x}_{fd}(t) \in \mathbb{R}^3$ denote new rotation and translation variables, respectively, defined as

$$\bar{R} = R\left( R^* \right)^T \qquad \bar{R}_d = R_d\left( R^* \right)^T$$ (3-30)
$$\bar{x}_f = x_f - \bar{R}\,x_f^* \qquad \bar{x}_{fd} = x_{fd} - \bar{R}_d\,x_f^*.$$

Similar to the camera-in-hand configuration, the relationships in (3-29) can be expressed as

$$m_i = \frac{z_i^*}{z_i}\left( \bar{R} + \frac{\bar{x}_f}{d^*} n^{*T} \right) m_i^* \qquad m_{di} = \frac{z_i^*}{z_{di}}\left( \bar{R}_d + \frac{\bar{x}_{fd}}{d^*} n^{*T} \right) m_i^*.$$









The rotation matrices $\bar{R}(t)$, $\bar{R}_d(t)$ and the depth ratios $\alpha_i(t)$ and $\alpha_{di}(t)$ can be obtained as described in Section 2.2. The constant rotation matrix $R^*$ can be obtained a priori using various methods (e.g., a second camera, Euclidean measurements) [10]. Based on (3-30), $R(t)$ and $R_d(t)$ can then be determined.

The orientations of $\mathcal{F}^*$ with respect to $\mathcal{F}$ and $\mathcal{F}_d$ can be expressed as $R^T(t)R^*$ and $R_d^T(t)R^*$, respectively. To quantify the error between the actual and desired plane orientations, the mismatch between the rotation matrices $R^T(t)R^*$ and $R_d^T(t)R^*$ is defined as

$$\tilde{R} = R^T R^* \left[ R_d^T R^* \right]^T = R^T R_d.$$ (3-31)

Similar to the development for the camera-in-hand configuration, the following expression can be obtained:

$$\tilde{R} = \left( \tilde{q}_0^2 - \tilde{q}_v^T \tilde{q}_v \right) I_3 + 2\tilde{q}_v\tilde{q}_v^T - 2\tilde{q}_0\tilde{q}_v^\times$$ (3-32)

where the error quaternion $(\tilde{q}_0(t), \tilde{q}_v^T(t))^T$ is defined as

$$\tilde{q}_0 = q_0 q_{0d} + q_v^T q_{vd}$$
$$\tilde{q}_v = q_{0d} q_v - q_0 q_{vd} + q_v^\times q_{vd},$$ (3-33)

where $(q_0(t), q_v^T(t))^T$ and $(q_{0d}(t), q_{vd}^T(t))^T$ are unit quaternions computed from the rotation matrices $R^T(t)R^*$ and $R_d^T(t)R^*$ following the method given in [14].

3.4.2 Control Formulation

By expressing the translation vectors, angular velocity and linear velocity in the body-fixed coordinate frame $\mathcal{F}$, and by defining the rotation matrices $R^T(t)R^*$ and $R_d^T(t)R^*$, the control objective in this section can be formulated in the same manner as done for the camera-in-hand configuration problem. So, the rotation and translation errors can be described the same as those for the camera-in-hand configuration problem. The actual angular velocity of the object expressed in $\mathcal{F}$ is defined as $\omega_e(t) \in \mathbb{R}^3$, the desired angular velocity of the object expressed in $\mathcal{F}_d$ is defined as $\omega_{ed}(t) \in \mathbb{R}^3$, and the relative angular velocity of the object with respect to $\mathcal{F}_d$ expressed in $\mathcal{F}$ is defined as $\tilde{\omega}_e(t) \in \mathbb{R}^3$ where

$$\tilde{\omega}_e = \omega_e - \tilde{R}\,\omega_{ed},$$ (3-34)

where $\omega_{ed}(t)$ has the same form as $\omega_{cd}(t)$ in (3-14). The open-loop rotation error system is

$$\begin{bmatrix} \dot{\tilde{q}}_0 \\ \dot{\tilde{q}}_v \end{bmatrix} = \frac{1}{2} \begin{bmatrix} -\tilde{q}_v^T \\ \tilde{q}_0 I_3 + \tilde{q}_v^\times \end{bmatrix} \tilde{\omega}_e$$ (3-35)

and the translation error system is [10]

$$z_i^* \dot{e} = \alpha_i L_v \bar{R} \left( v_e + \omega_e^\times s_i \right) - z_i^* \dot{p}_{ed},$$ (3-36)


where $v_e(t) \in \mathbb{R}^3$ denotes the linear velocity of the object expressed in $\mathcal{F}$.

Based on the open-loop error systems (3-35) and (3-36), and the subsequent stability analysis, the angular and linear velocity control inputs for the object are defined as

$$\omega_e = -K_\omega \tilde{q}_v + \tilde{R}\,\omega_{ed}$$ (3-37)
$$v_e = \frac{1}{\alpha_i} \bar{R}^T L_v^{-1} \left( -K_v e + \hat{z}_i^* \dot{p}_{ed} \right) - \omega_e^\times \hat{s}_i.$$ (3-38)

In (3-37) and (3-38), $K_\omega, K_v \in \mathbb{R}^{3\times3}$ denote diagonal matrices of positive constant control gains, and the parameter estimates $\hat{z}_i^*(t) \in \mathbb{R}$ and $\hat{s}_i(t) \in \mathbb{R}^3$ for the unknown constants $z_i^*$ and $s_i$ are generated according to the following adaptive update laws

$$\dot{\hat{z}}_i^* = -\gamma\, e^T \dot{p}_{ed}$$ (3-39)
$$\dot{\hat{s}}_i = -\alpha_i \Gamma\, \omega_e^\times \bar{R}^T L_v^T e$$ (3-40)

where $\Gamma \in \mathbb{R}^{3\times3}$ denotes a positive constant diagonal adaptation gain matrix.
where F E I 3x3 denotes a positive constant diagonal adaptation gain matrix.









From (3-35) and (3-37), the rotation closed-loop error system can be determined as

$$\dot{\tilde{q}}_0 = \frac{1}{2}\tilde{q}_v^T K_\omega \tilde{q}_v \qquad \dot{\tilde{q}}_v = -\frac{1}{2}\left( \tilde{q}_0 I_3 + \tilde{q}_v^\times \right) K_\omega \tilde{q}_v.$$ (3-41)

Based on (3-36) and (3-38), the translation closed-loop error system is given as

$$z_i^* \dot{e} = -K_v e - \tilde{z}_i^* \dot{p}_{ed} + \alpha_i L_v \bar{R}\,\omega_e^\times \tilde{s}_i,$$ (3-42)

where the parameter estimation error signals $\tilde{z}_i^*(t) \in \mathbb{R}$ and $\tilde{s}_i(t) \in \mathbb{R}^3$ are defined as

$$\tilde{z}_i^* = z_i^* - \hat{z}_i^* \qquad \tilde{s}_i = s_i - \hat{s}_i.$$ (3-43)

Theorem 3.2: The controller given in (3-37) and (3-38), along with the adaptive update laws in (3-39) and (3-40), ensures global asymptotic tracking in the sense that

$$\|\tilde{q}_v(t)\| \to 0, \quad \|e(t)\| \to 0, \quad \text{as} \quad t \to \infty.$$ (3-44)

Proof: Let $V(t) \in \mathbb{R}$ denote the following differentiable non-negative function (i.e., a Lyapunov candidate):

$$V = \tilde{q}_v^T \tilde{q}_v + (1 - \tilde{q}_0)^2 + \frac{z_i^*}{2} e^T e + \frac{1}{2\gamma}\tilde{z}_i^{*2} + \frac{1}{2}\tilde{s}_i^T \Gamma^{-1} \tilde{s}_i.$$ (3-45)

The time-derivative of $V(t)$ can be determined as

$$\dot{V} = 2\tilde{q}_v^T \dot{\tilde{q}}_v + 2(1 - \tilde{q}_0)(-\dot{\tilde{q}}_0) + z_i^* e^T \dot{e} + \frac{1}{\gamma}\tilde{z}_i^* \dot{\tilde{z}}_i^* + \tilde{s}_i^T \Gamma^{-1} \dot{\tilde{s}}_i$$
$$= -\tilde{q}_v^T \left( \tilde{q}_0 I_3 + \tilde{q}_v^\times \right) K_\omega \tilde{q}_v - (1 - \tilde{q}_0)\tilde{q}_v^T K_\omega \tilde{q}_v + e^T\left( -K_v e - \tilde{z}_i^* \dot{p}_{ed} + \alpha_i L_v \bar{R}\,\omega_e^\times \tilde{s}_i \right) + \tilde{z}_i^* e^T \dot{p}_{ed} - \alpha_i \tilde{s}_i^T \omega_e^\times \bar{R}^T L_v^T e$$
$$= -\tilde{q}_v^T K_\omega \tilde{q}_v - e^T K_v e + \alpha_i e^T L_v \bar{R}\,\omega_e^\times \tilde{s}_i + \alpha_i \tilde{s}_i^T \left( \omega_e^\times \right)^T \bar{R}^T L_v^T e$$
$$= -\tilde{q}_v^T K_\omega \tilde{q}_v - e^T K_v e,$$ (3-46)

where (3-34)-(3-43) were utilized. From (3-45) and (3-46), signal chasing arguments described in the previous section can be used to conclude that the control inputs and all the closed-loop signals are bounded. Barbalat's Lemma [96] can then be used to prove the result given in (3-44).
3.5 Simulation Results

A numerical simulation was performed to illustrate the performance of the
tracking controller given in (3-19), (3-21), and the adaptive update law in (3-22).
In this simulation, the developed tracking controller aims to enable the control object to track the desired trajectory encoded by a sequence of images and rotate more than 360 degrees (see Figure 3-5).








The camera is assumed to view an object with four coplanar feature points
with the following Euclidean coordinates (in [m]):

$$O_1 = \begin{bmatrix} 0.15 & 0.15 & 0 \end{bmatrix}^T \qquad O_2 = \begin{bmatrix} 0.15 & -0.15 & 0 \end{bmatrix}^T$$
$$O_3 = \begin{bmatrix} -0.15 & 0.15 & 0 \end{bmatrix}^T \qquad O_4 = \begin{bmatrix} -0.15 & -0.15 & 0 \end{bmatrix}^T.$$ (3-47)

The time-varying desired image trajectory was generated by the kinematics of the feature point plane where the desired linear and angular velocities were selected as

$$v_{cd} = \begin{bmatrix} 0.1\sin(t) & 0.1\sin(t) & 0 \end{bmatrix}^T \mathrm{[m/s]} \qquad \omega_{cd} = \begin{bmatrix} 0 & 0 & 1.5 \end{bmatrix}^T \mathrm{[rad/s]}.$$

The initial and desired image-space coordinates were artificially generated. For this example, consider an orthogonal coordinate frame $\mathcal{I}$ with the z-axis opposite to $n^*$ (see Figure 2-1) with the x-axis and y-axis on the plane $\pi$. The rotation matrices $R_1$ between $\mathcal{F}$ and $\mathcal{I}$, and $R_2$ between $\mathcal{F}^*$ and $\mathcal{I}$ were set as

$$R_1 = R_x(120°)\,R_y(-20°)\,R_z(-80°)$$ (3-48)
$$R_2 = R_x(160°)\,R_y(30°)\,R_z(30°),$$ (3-49)

where $R_x(\cdot)$, $R_y(\cdot)$ and $R_z(\cdot) \in SO(3)$ denote rotations of angle $(\cdot)$ (in degrees) about the x-axis, y-axis and z-axis, respectively. The translation vectors $x_{f1}$ between $\mathcal{F}$ and $\mathcal{I}$ (expressed in $\mathcal{F}$) and $x_{f2}$ between $\mathcal{F}^*$ and $\mathcal{I}$ (expressed in $\mathcal{F}^*$), respectively, were selected as

$$x_{f1} = \begin{bmatrix} 0.5 & 0.5 & 4.0 \end{bmatrix}^T$$ (3-50)
$$x_{f2} = \begin{bmatrix} 1.0 & 1.0 & 4.5 \end{bmatrix}^T.$$ (3-51)

The initial rotation matrix $R_3$ and translation vector $x_{f3}$ between $\mathcal{F}_d$ and $\mathcal{I}$ were set as

$$R_3 = R_x(240°)\,R_y(-90°)\,R_z(-30°) \qquad x_{f3} = \begin{bmatrix} 0.5 & 1 & 5.0 \end{bmatrix}^T.$$ (3-52)

The initial (i.e., $p_i(0)$) and reference (i.e., $p_i^*$) image-space coordinates of the four feature points in (3-47) were computed as (in pixels)

$$p_1(0) = \begin{bmatrix} 907.91 & 716.04 & 1 \end{bmatrix}^T \qquad p_2(0) = \begin{bmatrix} 791.93 & 728.95 & 1 \end{bmatrix}^T$$
$$p_3(0) = \begin{bmatrix} 762.84 & 694.88 & 1 \end{bmatrix}^T \qquad p_4(0) = \begin{bmatrix} 871.02 & 683.25 & 1 \end{bmatrix}^T$$
$$p_1^* = \begin{bmatrix} 985.70 & 792.70 & 1 \end{bmatrix}^T \qquad p_2^* = \begin{bmatrix} 1043.4 & 881.20 & 1 \end{bmatrix}^T$$
$$p_3^* = \begin{bmatrix} 980.90 & 921.90 & 1 \end{bmatrix}^T \qquad p_4^* = \begin{bmatrix} 922.00 & 829.00 & 1 \end{bmatrix}^T.$$

The initial (i.e., $p_{di}(0)$) image-space coordinates of the four feature points in (3-47) for generating the desired trajectory were computed as (in pixels)

$$p_{d1}(0) = \begin{bmatrix} 824.61 & 853.91 & 1 \end{bmatrix}^T \qquad p_{d2}(0) = \begin{bmatrix} 770.36 & 878.49 & 1 \end{bmatrix}^T$$
$$p_{d3}(0) = \begin{bmatrix} 766.59 & 790.50 & 1 \end{bmatrix}^T \qquad p_{d4}(0) = \begin{bmatrix} 819.03 & 762.69 & 1 \end{bmatrix}^T.$$

The control gain $K_\omega$ in (3-19), the control gain $K_v$ in (3-21), and the adaptation gain $\gamma$ in (3-22) were selected as

$$K_\omega = \mathrm{diag}\{3, 3, 3\} \qquad K_v = \mathrm{diag}\{15, 15, 15\} \qquad \gamma = 0.0002.$$
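To illustrate the closed-loop behavior, the sketch below simulates only the rotation subsystem with the gains above; it is a simplified stand-in for the dissertation's Matlab simulation (the initial quaternion is a hypothetical value, and the translation subsystem is omitted):

```python
# Rotation-only closed-loop simulation: qd driven by omega_cd = [0, 0, 1.5] rad/s,
# q driven by the controller (3-19), both through the kinematics (3-12)/(3-13).
import numpy as np

def skew(v):
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def q_dot(q, omega):
    """Unit quaternion kinematics (3-12)/(3-13)."""
    q0, qv = q[0], q[1:]
    return 0.5 * np.hstack([-qv @ omega, (q0 * np.eye(3) + skew(qv)) @ omega])

def rot(q):
    """R(q) from (2-15)."""
    q0, qv = q[0], q[1:]
    return ((q0**2 - qv @ qv) * np.eye(3) + 2.0 * np.outer(qv, qv)
            - 2.0 * q0 * skew(qv))

K_omega = np.diag([3.0, 3.0, 3.0])                 # gain used in the simulation
omega_cd = np.array([0.0, 0.0, 1.5])
q = np.array([0.9, 0.1, 0.3, 0.3])                 # hypothetical initial attitude
qd = np.array([1.0, 0.0, 0.0, 0.0])
dt = 0.001
for _ in range(10000):                             # 10 s of simulated time
    q0d, qvd = qd[0], qd[1:]
    qvt = q0d * q[1:] - q[0] * qvd + skew(q[1:]) @ qvd   # error quaternion (3-5)
    R_tilde = rot(q) @ rot(qd).T                         # rotation mismatch (3-3)
    omega_c = -K_omega @ qvt + R_tilde @ omega_cd        # controller (3-19)
    q += dt * q_dot(q, omega_c); q /= np.linalg.norm(q)
    qd += dt * q_dot(qd, omega_cd); qd /= np.linalg.norm(qd)
print("final ||qv_tilde||:", np.linalg.norm(qvt))
```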


The desired and current image-space trajectories of the feature point plane are
shown in Figure 3-5 and Figure 3-6, respectively. The feature point plane rotates
more than 360 degrees as shown in these two figures. The resulting translation









and rotation errors are plotted in Figure 3-7 and Figure 3-8, respectively. The

errors go to zero asymptotically. The desired image-space trajectory (i.e., pdi(t))

and the current image-space trajectory (i.e., pi(t)) are shown in Figure 3-9 and

Figure 3-10, respectively. The tracking error between the current and desired

image-space trajectory is shown in Figure 3-11. Figures 3-7 through 3-11 show that

the current trajectory tracks the desired trajectory asymptotically. The translation

and rotation control inputs are shown in Figure 3-12 and Figure 3-13, respectively.

The parameter estimate for $z_i^*$ is shown in Figure 3-14.

3.6 Experiment Results

Simulations verified the performance of the tracking controller given in (3-19),

(3-21), and the adaptive update law in (3-22). Experiments were then performed

to test robustness and performance in the presence of signal noise, measurement

error, calibration error, etc. The experiments were performed in a test-bed at the

University of Florida for simulation, design and implementation of vision-based

control systems. For the test-bed, a 3D environment can be projected onto large

monitors or screens and viewed by a physical camera. Communication between the

camera and control processing computers and the environment rendering computers

allows closed-loop control of the virtual scene.

3.6.1 Experiment Configurations

A block diagram describing the experimental test-bed is provided in Figure

3-2. The test-bed is based on a virtual environment generated by a virtual reality

simulator, composed of five workstations and a database server running virtual

reality software. This allows multiple instances of the virtual environment to run at

the same time. In this way, camera views can be rigidly connected in a mosaic for

a large FOV. Alternatively, multiple independent camera views can pursue their own

tasks, such as coordinated control of multiple vehicles. The virtual reality simulator












Figure 3-2: Block diagram of the experiment.


Figure 3-3: The Sony XCD-710CR color firewire camera pointed at the virtual
environment.









in the experiment is currently capable of displaying three simultaneous displays. A

picture of the displays can be seen in Figure 3-3.

The virtual reality simulator utilizes MultiGen-Paradigm's Vega Prime, an

OpenGL-based, commercial software package for Microsoft Windows. The virtual

environment in the experiment is a recreation of the U.S. Army's urban warfare

training ground at Fort Benning, Georgia. The environment has a dense polygon

count, detailed textures, high frame rate, and the effects of soft shadows, resulting

in very realistic images. A scene from the Fort Benning environment can be seen in
Figure 3-4.

The visual sensor in the experiment is a Sony XCD-710CR color firewire

camera with a resolution of 1280 x 768 pixels and fitted with a 12.5mm lens. The

camera captures the images on the large screens as shown in Figure 3-3. The

images are processed in a vision processing workstation. An application written
in C++ acquires images from the camera and processes the images to locate and
track the feature points (the initial feature points were chosen manually; the
application then identifies and tracks the feature points on its own). The C++

application generates the current and desired pixel coordinates, which can be used

to formulate the control command.

In addition to the image processing application, a control command generation

application programmed in Matlab also runs in this workstation. The Matlab appli-

cation communicates data with the image processing application (written in C++)

via shared memory buffers. The Matlab application reads the current and desired

pixel coordinates from the shared memory buffers, and writes linear and angular

camera velocity input into the shared memory buffers. The C++ application writes

the current and desired pixel coordinates into the shared memory, and reads cam-

era velocity inputs from the shared memory buffer. The linear and angular camera
velocity control inputs are sent from the vision processing workstation to the virtual









reality simulator via a TCP socket connection. This development makes extensive

use of Intel's Open Source Computer Vision (OpenCV) Library (see Bradski [97])

and the GNU Scientific Library (GSL) (see Galassi et al. [98]).
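For concreteness, a minimal sketch of the workstation side of that TCP link is shown below (Python, not the C++/Matlab stack used in the test-bed; the host, port, and packet layout of six doubles are illustrative assumptions, since the dissertation only states that velocity commands are sent over a TCP socket):

    import socket
    import struct

    def send_velocity(sock, v_c, w_c):
        # Pack three linear and three angular velocity components as doubles.
        sock.sendall(struct.pack("<6d", *v_c, *w_c))

    # Usage sketch (hypothetical endpoint):
    # sock = socket.create_connection(("simulator-host", 5005))
    # send_velocity(sock, (0.1, 0.0, 0.0), (0.0, 0.0, 0.05))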
















Figure 3-4: Virtual reality environment example: a virtual recreation of the US
Army's urban warfare training ground at Fort Benning.


Algorithms, such as the homography decomposition, are implemented as if
the virtual environment is a true 3D scene which the physical camera is viewing.
Of course, the camera does not look at the 3D scene directly. The camera views
consist of a 3D scene that is projected onto a 2D plane (the screen) and then
projected onto the image plane. That is, the projective homography needed for
control exists between the on-screen current image and the on-screen goal image,
but what are available are the camera views of the on-screen images. Thus, there
exists an additional transformation between the points on the screen and the
points in the camera image. A screen-camera calibration matrix G_cs can be used
to describe this transformation.

Every point on the screen corresponds to only one point in the image. Thus,

the constant matrix Gcs is a homography and can be determined through a

calibration procedure, and effectively replaces the standard calibration of the









physical camera. In the experiment, this matrix was determined to be

    G_cs = [ 0.9141   0.0039   -90.9065
             -0.0375  0.9358   -50.7003
             0        0          1 ].

In addition to Gcs, the camera calibration matrix A, corresponding to the virtual

camera within Vega Prime, is still required. This matrix can be determined from

the settings of the virtual reality program. In this experiment, A was determined to

be

    A = [ 1545.1  0       640
          0       1545.1  512
          0       0       1 ].
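Conceptually, a screen point and its physical-camera image are related by the homography G_cs, so the two matrices above can be chained to recover the normalized coordinates the controller needs. A minimal Python/NumPy sketch (the mapping direction of G_cs, i.e., screen pixels to physical-camera pixels, is a labeling assumption):

    import numpy as np

    G_cs = np.array([[0.9141, 0.0039, -90.9065],
                     [-0.0375, 0.9358, -50.7003],
                     [0.0, 0.0, 1.0]])
    A = np.array([[1545.1, 0.0, 640.0],
                  [0.0, 1545.1, 512.0],
                  [0.0, 0.0, 1.0]])

    def physical_to_normalized(p_cam):
        """Map a homogeneous pixel measured by the physical camera to the
        normalized coordinates of the virtual camera: undo G_cs, then undo A."""
        p_screen = np.linalg.solve(G_cs, p_cam)
        p_screen = p_screen / p_screen[2]
        m = np.linalg.solve(A, p_screen)
        return m / m[2]

    # e.g. physical_to_normalized(np.array([512.0, 384.0, 1.0]))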

3.6.2 Experiment for Tracking

The desired trajectory is in the format of a prerecorded video (a sequence of
images). As the viewpoint in Vega Prime moves (which can be implemented
by chosen velocity functions or manually), the images captured by the
camera change. The pixel coordinates of the feature points in the images are
recorded as the desired trajectory. The control objective in this tracking
experiment is to send control commands to the virtual reality simulator such
that the current pose of the feature points tracks the desired pose. The constant
reference image was taken as the first image in the sequence.

The control gains K_ω in (3-19) and K_v in (3-21) and the adaptation gain γ in
(3-22) were selected as

    K_ω = diag{0.5, 0.5, 0.5}    K_v = diag{0.1, 0.1, 1.5}    γ = 0.005.









During the experiment, the images from the camera are processed at a frame
rate of approximately 20 frames per second.

The resulting translation and rotation errors are plotted in Figure 3-15 and

Figure 3-16. The desired image-space trajectory (i.e., pdi(t)) is shown in Figure

3-17, and the current image-space trajectory (i.e., pi(t)) is shown in Figure 3-18.

The tracking error between the current and desired image-space trajectories is

shown in Figure 3-19. The translation and rotation control outputs are shown in

Figure 3-20 and Figure 3-21, respectively. The parameter estimate for z* is shown

in Figure 3-22.

In the tracking control, the steady-state tracking error is approximately
[ 15 pixel  10 pixel  0.01 ]^T. This steady-state error is caused by the image
noise and the camera calibration error in the test-bed. To find the tracking
control error, two homographies are computed: one between the reference image
and the current image, and one between the reference image and the desired
image. Due to the image noise and camera calibration error, error is introduced
into both homographies, so the tracking error obtained from the mismatch between
the two homographies carries a larger error. Also, the image noise and calibration
error introduce error into the derivative of the desired pixel coordinates, which
is used as a feedforward term in the tracking controller. Furthermore,
communication between the controller and virtual reality system occurs via a TCP
socket, introducing some amount of latency into the system. Note that this pixel
error represents less than 1.5% of the image dimensions.

In the following regulation experiment, the derivative of the desired pixel
coordinates is equal to zero, and only one homography is computed between
the current image and the desired set image. The influence of the image noise and
calibration error is therefore greatly weakened.









3.6.3 Experiment for Regulation

When the desired pose is constant, the tracking problem becomes a regulation
problem. The control objective in this regulation experiment is to send control
commands to the virtual reality simulator such that the current pose of the feature
points is regulated to the desired set pose.

In the experiment, the control gains K_ω in (3-19) and K_v in (3-21) and the
adaptation gain γ in (3-22) were selected as

    K_ω = diag{0.4, 0.4, 0.9}    K_v = diag{0.5, 0.25, 0.25}    γ = 0.005.


The resulting translation and rotation errors are plotted in Figure 3-23 and

Figure 3-24, respectively. The errors go to zero asymptotically. The current

image-space trajectory (i.e., pi(t)) is shown in Figure 3-25. The regulation error

between the current and desired set image-space pose is shown in Figure 3-26. The

translation and rotation control outputs are shown in Figure 3-27 and Figure 3-28,

respectively. The parameter estimate for z* is shown in Figure 3-29.






















Figure 3-5: Desired image-space coordinates of the four feature points (i.e., pd(t))
in the tracking Matlab simulation shown in a 3D graph. In the figure, "0" denotes
the initial image-space positions of the 4 feature points in the desired trajectory,
and "*" denotes the corresponding final positions of the feature points.
















Figure 3-6: Current image-space coordinates of the four feature points (i.e., p(t))
in the tracking Matlab simulation shown in a 3D graph. In the figure, "O" denotes
the initial image-space positions of the 4 feature points, and "*" denotes the
corresponding final positions of the feature points.



















Figure 3-7: Translation error e(t) in the tracking Matlab simulation.











Figure 3-8: Rotation quaternion error q̃(t) in the tracking Matlab simulation.
































Figure 3-9: Pixel coordinate pd(t) of the four feature points in a sequence of de-
sired images in the tracking Matlab simulation. The upper figure is for the ud(t)
component and the bottom figure is for the vd(t) component.




Figure 3-10: Pixel coordinate p(t) of the current pose of the four feature points in
the tracking Matlab simulation. The upper figure is for the u(t) component and the
bottom figure is for the v(t) component.
























Figure 3-11: Tracking error p(t) - p_d(t) (in pixels) of the four feature points in the
tracking Matlab simulation. The upper figure is for the u(t) - u_d(t) component and
the bottom figure is for the v(t) - v_d(t) component.




Figure 3-12: Linear camera velocity input v_c(t) in the tracking Matlab simulation.



















Figure 3-13: Angular camera velocity input ω_c(t) in the tracking Matlab simulation.

















Figure 3-14: Adaptive on-line estimate of z* in the tracking Matlab simulation.






























Figure 3-15: Translation error e(t) in the tracking experiment.




Figure 3-16: Rotation quaternion error q̃(t) in the tracking experiment.



















Figure 3-17: Pixel coordinate p_d(t) of the four feature points in a sequence of
desired images in the tracking experiment. The upper figure is for the u_d(t)
component and the bottom figure is for the v_d(t) component.



Figure 3-18: Pixel coordinate p(t) of the current pose of the four feature points
in the tracking experiment. The upper figure is for the u(t) component and the
bottom figure is for the v(t) component.




Figure 3-19: Tracking error p(t) - p_d(t) (in pixels) of the four feature points in the
tracking experiment. The upper figure is for the u(t) - u_d(t) component and the
bottom figure is for the v(t) - v_d(t) component.




Figure 3-20: Linear camera velocity input v_c(t) in the tracking experiment.



















Figure 3-21: Angular camera velocity input ω_c(t) in the tracking experiment.




Figure 3-22: Adaptive on-line estimate of z* in the tracking experiment.











































Figure 3-23: Translation error e(t) in the regulation experiment.




Figure 3-24: Rotation quaternion error q(t) in the regulation experiment.




































Figure 3-25: Pixel coordinate p(t) (in pixels) of the current pose of the four feature
points in the regulation experiment. The upper figure is for the u(t) component
and the bottom figure is for the v(t) component.




Figure 3-26: Regulation error p(t) - p* (in pixels) of the four feature points in the
regulation experiment. The upper figure is for the u(t) - u* component and the
bottom figure is for the v(t) - v* component.
































Figure 3-27: Linear camera velocity input v_c(t) in the regulation experiment.










Figure 3-28: Angular camera velocity input ω_c(t) in the regulation experiment.




Figure 3-29: Adaptive on-line estimate of z* in the regulation experiment.















CHAPTER 4
COLLABORATIVE VISUAL SERVO TRACKING CONTROL VIA A
DAISY-CHAINING APPROACH

4.1 Introduction

In this chapter, a collaborative trajectory tracking problem is considered for

a six DOF rigid-body object (e.g., an autonomous vehicle) identified by a planar

patch of feature points. Unlike typical visual servo controllers that require either

the camera or the target to remain stationary, a unique aspect of the development

in this chapter is that a moving monocular camera (e.g., a camera mounted on

an unmanned air vehicle (UAV)) is used to provide feedback to a moving control

object. The control objective is for the object to track a desired trajectory that

is encoded by a prerecorded video obtained from a fixed camera (e.g., a camera

mounted on a satellite, a camera mounted on a building).

Several challenges must be resolved to achieve this unexplored control ob-

jective. The relative velocity between the moving planar patch of feature points

and the moving camera presents a significant challenge. By using a daisy-chaining

approach (e.g., [16-19]), Euclidean homography relationships between different

camera coordinate frames and feature point patch coordinate frames are developed.

These homographies are used to relate coordinate frames attached to the moving

camera, the reference object, the control object, and the object used to record

the desired trajectory. Another challenge is that for general six DOF motion by

both the camera and the control object, the normal to the planar patch associated

with the object is unknown. By decomposing the homography relationships, the

normal to the planar patch can be obtained. Likewise, the distance between the

moving camera, the moving control object, and the reference object are unknown.









By using the depth ratios obtained from the homography decomposition, the

unknown time-varying distance is related to an unknown constant parameter. A

Lyapunov-based adaptive estimation law is designed to compensate for the un-

known constant parameter. Since the moving camera could be attached to a remotely
piloted vehicle undergoing arbitrary rotations, a rotation parameterization is required
that is valid over a large (possibly unbounded) domain. Additionally, since this work is

motivated by problems in the aerospace community, homography-based visual servo

control techniques (e.g., [10, 20, 22]) are combined with quaternion-based control

methods (e.g., [14, 23, 24]) to facilitate large rotations. By using the quaternion

parameterization, the resulting closed-loop rotation error system can be stabilized

by a proportional rotation controller combined with a feedforward term that is a

function of the desired trajectory.

4.2 Problem Scenario

Over the past decade, a variety of visual servo controllers have been developed
for both camera-to-hand and camera-in-hand configurations (e.g., see [1, 99-

101]). For visual servo control applications that exploit either of these camera

configurations, either the object or the camera is required to remain stationary.

In contrast to typical camera-to-hand or camera-in-hand visual servo control

configurations, a moving airborne monocular camera (e.g., a camera attached to

a remote controlled aircraft, a camera mounted on a satellite) is used by Mehta

et al. [18, 19] to provide pose measurements of a moving sensorless unmanned

ground vehicle (UGV) relative to a goal configuration. The results in [18, 19]

are restricted to three DOF, and the rotation error system is encoded by Euler

angle-axis parameterization.

Consider a stationary coordinate frame I_R that is attached to a camera and
a time-varying coordinate frame F_d that is attached to some object (e.g., an
autonomous vehicle) as depicted in Figure 4-1. The object is identified in an image











Figure 4-1: Geometric model.


by a collection of feature points that are assumed (without loss of generality) to
be coplanar and non-collinear (i.e., a planar patch of feature points). The camera
attached to I_R a priori records a series of snapshots (i.e., a video) of the motion
of the object attached to F_d until it comes to rest (or the video stops recording).
A stationary coordinate frame F* is attached to a reference object identified by
another planar patch of feature points that are assumed to be visible in every frame
of the video recorded by the camera. For example, the camera attached to I_R is on-
board a "stationary" satellite that takes a series of snapshots of the relative motion
of F_d with respect to F*. Therefore, the desired motion of F_d can be encoded as
a series of relative translations and rotations with respect to the stationary frame
F* a priori. Spline functions or filter algorithms can be used to generate a smooth
desired feature point trajectory [10].

Consider a time-varying coordinate frame I that is attached to a camera (e.g.,
a camera attached to a remote controlled aircraft) and a time-varying coordinate
frame F that is attached to the control object as depicted in Figure 4-1. The
camera attached to I captures snapshots of the planar patches associated with F
and F*, respectively. The a priori motion of F_d represents the desired trajectory
of the coordinate system F, where F and F_d are attached to the same object
but at different points in time. The camera attached to I_R is a different camera
(with different calibration parameters) than the camera attached to I. The problem
considered in this chapter is to develop a kinematic controller for the object
attached to F so that the time-varying rotation and translation of F converge to
the desired time-varying rotation and translation of F_d, where the motion of F is
determined from the time-varying overhead camera attached to I.

4.3 Geometric Model

The relationships between the coordinate systems are as follows (see Table 4-
1): R(t), R*(t), R_r(t), R'(t), R_rd(t), R*_r ∈ SO(3) denote the rotations from F to I,
F* to I, I to I_R, F to I_R, F_d to I_R, and F* to I_R, respectively; x_f(t), x*_f(t) ∈ R³
denote the respective time-varying translations from F to I and from F* to I
with coordinates expressed in I; and x_fr(t), x'_f(t), x_frd(t), x*_fr ∈ R³ denote the
respective translations from I to I_R, F to I_R, F_d to I_R, and from F* to
I_R with coordinates expressed in I_R. From Figure 4-1, the translation x'_f(t) and
the rotation R'(t) can be expressed as

    x'_f = x*_fr + R*_r R*^T (x_f - x*_f)    R' = R*_r R*^T R.    (4-1)
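To make the frame chaining in (4-1) concrete, the following minimal numerical check (Python/NumPy; the sample angles are arbitrary and planar only for brevity) confirms that the daisy-chained composition reproduces the direct F-to-I_R rotation even though the I-to-I_R rotation R_r is never used explicitly:

    import numpy as np

    def Rz(deg):
        c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
        return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

    R_r = Rz(25.0)             # I to I_R (sample value)
    R = Rz(40.0)               # F to I (sample value)
    R_star = Rz(-10.0)         # F* to I (sample value)
    R_star_r = R_r @ R_star    # F* to I_R, by construction

    # Daisy-chained rotation of (4-1):
    R_prime = R_star_r @ R_star.T @ R
    assert np.allclose(R_prime, R_r @ R)   # agrees with the direct F -> I_R map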



As illustrated in Figure 4-1, π, π_d and π* denote the planar patches of
feature points associated with F, F_d, and F*, respectively. s_1i ∈ R³ ∀i =
1, 2, ..., n (n ≥ 4) denotes the constant Euclidean coordinates of the i-th fea-
ture point in F (and also F_d), and s_2i ∈ R³ ∀i = 1, 2, ..., n denotes the constant
Euclidean coordinates of the i-th feature point in F*. From the geometry between










    Motion                  Frames
    R(t), x_f(t)            F to I in I
    R*(t), x*_f(t)          F* to I in I
    R_r(t), x_fr(t)         I to I_R
    R'(t), x'_f(t)          F to I_R in I_R
    R*_r, x*_fr             F* to I_R in I_R
    R_rd(t), x_frd(t)       F_d to I_R in I_R

Table 4-1: Coordinate frame relationships


the coordinate frames depicted in Figure 4-1, the following relationships can be
developed:

    m̄_i = x_f + R s_1i        m̄*_i = x*_f + R* s_2i    (4-2)
    m̄_rdi = x_frd + R_rd s_1i    m̄'_ri = x'_f + R' s_1i    (4-3)
    m̄*_ri = x*_fr + R*_r s_2i.    (4-4)

In (4-2)-(4-4), m̄_i(t), m̄*_i(t) ∈ R³ denote the Euclidean coordinates of the feature
points on π and π*, respectively, expressed in I as

    m̄_i(t) ≜ [ x_i(t)  y_i(t)  z_i(t) ]^T    (4-5)
    m̄*_i(t) ≜ [ x*_i(t)  y*_i(t)  z*_i(t) ]^T,    (4-6)

and m̄'_ri(t), m̄_rdi(t) ∈ R³ denote the actual and desired time-varying Euclidean coordi-
nates, respectively, of the feature points on π expressed in I_R as

    m̄'_ri(t) ≜ [ x'_i(t)  y'_i(t)  z'_i(t) ]^T    (4-7)
    m̄_rdi(t) ≜ [ x_rdi(t)  y_rdi(t)  z_rdi(t) ]^T,    (4-8)









and m̄*_ri ∈ R³ denotes the constant Euclidean coordinates of the feature points on
the planar patch π* expressed in I_R as

    m̄*_ri ≜ [ x*_ri  y*_ri  z*_ri ]^T.    (4-9)


After some algebraic manipulation, the expressions in (4-2)-(4-4) can be rewritten
as

    m̄*_i = x_n + R_n m̄_i    (4-10)
    m̄_i = x̄_f + R̄ m̄*_i    m̄_rdi = x̄_frd + R̄_rd m̄*_ri    (4-11)
    m̄'_ri = x_fr + R_r m̄_i    m̄*_ri = x_fr + R_r m̄*_i,    (4-12)

where R_n(t), R̄(t), R̄_rd(t), R_r(t) ∈ SO(3) and x_n(t), x̄_f(t), x̄_frd(t), x_fr(t) ∈ R³ are
new rotational and translational variables, respectively, defined as

    R_n = R* R^T    R̄ = R R*^T    (4-13)
    R̄_rd = R_rd R*_r^T    R_r = R*_r R*^T

and

    x_n = x*_f - R_n (x_f - R (s_2i - s_1i))    (4-14)
    x̄_f = x_f - R̄ (x*_f + R* (s_2i - s_1i))    (4-15)
    x̄_frd = x_frd - R̄_rd (x*_fr + R*_r (s_2i - s_1i))    (4-16)
    x_fr = x'_f - R_r x_f = x*_fr - R_r x*_f.    (4-17)

Note that R_n(t), R̄(t) and R̄_rd(t) in (4-13) are the rotation matrices between F
and F*, F* and F, and F* and F_d, respectively, but x_n(t), x̄_f(t) and x̄_frd(t) in
(4-14)-(4-16) are not the translation vectors between the corresponding coordinate









frames. However, this will not affect the following controller design because only

the rotation matrices will be used in the controller development.

To facilitate the development of a relationship between the actual Euclidean

translation of F to the Euclidean translation that is reconstructed from the image

information, the following projective relationships are developed:

    d(t) = n^T m̄_i    d*(t) = n*^T m̄*_i    d*_r = n*_r^T m̄*_ri,    (4-18)

where d(t) ∈ R represents the distance from the origin of I to π along the unit
normal (expressed in I) to π denoted as n(t) ∈ R³, d*(t) ∈ R represents the
distance from the origin of I to π* along the unit normal (expressed in I) to π*
denoted as n*(t) ∈ R³, and d*_r ∈ R represents the distance from the origin of I_R
to π* along the unit normal (expressed in I_R) to π* denoted as n*_r ∈ R³, where
n*(t) = R_r^T(t) n*_r. In (4-18), d(t), d*(t), d*_r > ε for some positive constant ε ∈ R.
Based on (4-18), the relationships in (4-10)-(4-12) can be expressed as

    m̄*_i = (R_n + (x_n / d) n^T) m̄_i    (4-19)
    m̄_i = (R̄ + (x̄_f / d*) n*^T) m̄*_i    (4-20)
    m̄_rdi = (R̄_rd + (x̄_frd / d*_r) n*_r^T) m̄*_ri    (4-21)
    m̄'_ri = (R_r + (x_fr / d) n^T) m̄_i    (4-22)
    m̄*_ri = (R_r + (x_fr / d*) n*^T) m̄*_i.    (4-23)

As in Chen et al. [10], the subsequent development requires that the constant
rotation matrix R*_r be known. The constant rotation matrix R*_r can be obtained a
priori using various methods (e.g., a second camera, Euclidean measurements). The
subsequent development also assumes that the difference between the Euclidean
distances (s_2i - s_1i) is a constant ∀i = 1, ..., n. While there are many practical
applications that satisfy this assumption (e.g., a simple scenario is that the objects








attached to F and F* are the same object), the assumption is generally restrictive

and is the focus of future research. As described by Hu et al. [17], this assumption

can be avoided by using the geometric reconstruction approach [102] under an

alternative assumption that the distance between two feature points is precisely

known.

4.4 Euclidean Reconstruction

The relationships given by (4-19)-(4-23) provide a means to quantify a

translation and rotation error between the different coordinate systems. Since the

pose of T, Td, and T* cannot be directly measured, a Euclidean reconstruction is

developed to obtain the pose error by comparing multiple images acquired from

the hovering monocular vision system. To facilitate the subsequent development,

the normalized Euclidean coordinates of the feature points in T and T* can be

expressed in terms of I as mi (t) R3 and m (t) E IR3, respectively, as
Aml *
n A ti mi (4-24)
Zi Zi

Similarly, the normalized Euclidean coordinates of the feature points for the

current, desired, and reference image can be expressed in terms of IZ as mi(t),

mdi (t), T~i I 3, respectively, as
'(t) r '
mn(t) m -- rndi (t) A \ n -*. (4-25)
Zi (t) ,di (t) Z1

From the expressions given in (4-20) and (4-24), the rotation and translation
between the coordinate systems F and F*, between F* and F_d, and between I and
I_R can now be related in terms of the normalized Euclidean coordinates as follows:

    m_i = α_i (R̄ + x_h n*^T) m*_i    (4-26)
    m*_i = (1/α_i) (R_n + x_nh n^T) m_i    (4-27)
    m_rdi = α_rdi (R̄_rd + x_hrd n*_r^T) m*_ri    (4-28)
    m*_ri = α_ri (R_r + x_hr n*^T) m*_i,    (4-29)

where α_i(t), α_rdi(t), α_ri(t) ∈ R denote depth ratios defined as

    α_i = z*_i / z_i    α_rdi = z*_ri / z_rdi    α_ri = z*_i / z*_ri,

and x_h(t), x_nh(t), x_hrd(t), x_hr(t) ∈ R³ denote scaled translation vectors that are
defined as

    x_h = x̄_f / d*    x_nh = x_n / d    (4-30)
    x_hrd = x̄_frd / d*_r    x_hr = x_fr / d*.

Since the normalized Euclidean coordinates in (4-26)-(4-29) cannot be
directly measured, the following relationships (i.e., the pin-hole camera model) are
used to determine the normalized Euclidean coordinates from pixel information:

    p_i = A_1 m_i    p*_i = A_1 m*_i    (4-31)
    p_rdi = A_2 m_rdi    p*_ri = A_2 m*_ri,    (4-32)

where A_1, A_2 ∈ R^{3x3} are known, constant, and invertible intrinsic camera cali-
bration matrices of the current camera and the reference camera, respectively. In
(4-31) and (4-32), p_i(t) and p*_i(t) ∈ R³ represent the image-space coordinates of
the Euclidean feature points on π and π* expressed in terms of I as

    p_i ≜ [ u_i  v_i  1 ]^T    p*_i ≜ [ u*_i  v*_i  1 ]^T,    (4-33)

respectively, where u_i(t), v_i(t), u*_i(t), v*_i(t) ∈ R. Similarly, p_rdi(t) and p*_ri ∈ R³
represent the image-space coordinates of the Euclidean features on π_d and π*
expressed in terms of I_R as

    p_rdi ≜ [ u_rdi  v_rdi  1 ]^T    p*_ri ≜ [ u*_ri  v*_ri  1 ]^T,    (4-34)

respectively, where u_rdi(t), v_rdi(t), u*_ri, v*_ri ∈ R. By using (4-26)-(4-29) and
(4-31)-(4-34), the following relationships can be developed:

    p_i = α_i (A_1 (R̄ + x_h n*^T) A_1^{-1}) p*_i = α_i G p*_i    (4-35)
    p*_i = (1/α_i) (A_1 (R_n + x_nh n^T) A_1^{-1}) p_i = (1/α_i) G_n p_i    (4-36)
    p_rdi = α_rdi (A_2 (R̄_rd + x_hrd n*_r^T) A_2^{-1}) p*_ri = α_rdi G_rd p*_ri    (4-37)
    p*_ri = α_ri (A_2 (R_r + x_hr n*^T) A_1^{-1}) p*_i = α_ri G_r p*_i,    (4-38)

where G(t), G_n(t), G_rd(t), G_r(t) ∈ R^{3x3} denote projective homographies. Sets of
linear equations can be developed from (4-35)-(4-38) to determine the projective
homographies up to a scalar multiple. Various techniques can be used (e.g.,
see Faugeras and Lustman [93] and Zhang and Hanson [94]) to decompose the
Euclidean homographies to obtain α_i(t), α_rdi(t), α_ri(t), x_h(t), x_nh(t), x_hrd(t),
x_hr(t), R̄(t), R_n(t), R̄_rd(t), R_r(t), n*(t), n*_r, and n(t). Given that the constant rotation
matrix R*_r is assumed to be known, the expressions for R̄_rd(t) and R_r(t) in (4-
13) can be used to determine R_rd(t) and R*(t). Once R*(t) is determined, the
expression for R̄(t) in (4-13) can be used to determine R(t). Also, once R*_r, R*(t),
and R(t) have been determined, (4-1) can be used to determine R'(t). Since R_r(t),
x_hr(t), α_i(t), n*(t), n(t), m*_i(t), and m_i(t) can be determined, the following
relationship can be used to determine m'_i(t):

    m'_i = (z_i / z'_i) (R_r + α_i ((n*^T m*_i)/(n^T m_i)) x_hr n^T) m_i,    (4-39)

where the inverse of the ratio z_i(t)/z'_i(t) can be determined as

    z'_i / z_i = [ 0  0  1 ] (R_r + α_i ((n*^T m*_i)/(n^T m_i)) x_hr n^T) m_i.    (4-40)
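A short numerical sketch of this reconstruction step is given below (Python/NumPy; all inputs are placeholders for the products of the homography decompositions, and the equation forms follow the reconstruction of (4-39)-(4-40) above):

    import numpy as np

    def reconstruct_m_prime(R_r, x_hr, alpha_i, n, n_star, m_i, m_star_i):
        """Recover the normalized coordinates m'_i (expressed in I_R) from the
        homography-decomposition products, per (4-39) and (4-40)."""
        scale = alpha_i * (n_star @ m_star_i) / (n @ m_i)   # ratio d*/d
        M = R_r + scale * np.outer(x_hr, n)
        v = M @ m_i
        z_ratio = v[2]          # z'_i / z_i, eq. (4-40)
        return v / z_ratio      # m'_i, eq. (4-39)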

4.5 Control Objective

The control objective is for a six DOF rigid-body object (e.g., an autonomous

vehicle) identified by a planar patch of feature points to track a desired trajectory

that is determined by a sequence of images taken by a fixed reference camera. This

objective is based on the assumption that the linear and angular velocities of the

camera are control inputs that can be independently controlled (i.e., unconstrained

motion) and that the reference and desired cameras are calibrated (i.e., A1 and

A2 are known). The control objective can be stated as m̄'_ri(t) → m̄_rdi(t) (i.e.,
the Euclidean feature points on π track the corresponding feature points on π_d).
Equivalently, the control objective can also be stated in terms of the rotation
and translation of the object as x'_f(t) → x_frd(t) and R'(t) → R_rd(t). As stated
previously, R'(t) and R_rd(t) can be computed by decomposing the projective
homographies in (4-35)-(4-38) and using (4-1). Once these rotation matrices

have been determined, the unit quaternion parameterization is used to describe

the rotation matrix. This parameterization facilitates the subsequent problem

formulation, control development, and stability analysis since the unit quaternion

provides a global nonsingular parameterization of the corresponding rotation

matrices. See Section 2.3 for some background about the unit quaternion.









Given the rotation matrices R'(t) and R_rd(t), the corresponding unit quater-
nions q(t) and q_d(t) can be calculated by using the numerically robust method
(e.g., see [14] and [15]) based on the corresponding relationships

    R' = (q_0² - q_v^T q_v) I_3 + 2 q_v q_v^T + 2 q_0 q_v^×    (4-41)
    R_rd = (q_0d² - q_vd^T q_vd) I_3 + 2 q_vd q_vd^T + 2 q_0d q_vd^×,    (4-42)

where I_3 is the 3 x 3 identity matrix, and the notation q_v^×(t) denotes the
skew-symmetric form of the vector q_v(t) as in (2-10).

To quantify the rotation error between the feature points on π and π_d, the
error between the rotation matrices R'(t) and R_rd(t) is defined as

    R̃ = R'^T R_rd = (q̃_0² - q̃_v^T q̃_v) I_3 + 2 q̃_v q̃_v^T - 2 q̃_0 q̃_v^×,    (4-43)

where the error quaternion q̃(t) = (q̃_0(t), q̃_v^T(t))^T is defined as

    q̃_0 = q_0 q_0d + q_v^T q_vd    (4-44)
    q̃_v = q_0d q_v - q_0 q_vd + q_v^× q_vd.

Since q̃(t) is a unit quaternion, (4-43) can be used to quantify the rotation tracking
objective as

    ‖q̃_v(t)‖ → 0  ⇒  R̃(t) → I_3  as  t → ∞.    (4-45)
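Since (4-44) is just a componentwise product of the two unit quaternions, it is directly implementable. A minimal Python/NumPy sketch (scalar-first storage order is an assumption):

    import numpy as np

    def quaternion_error(q, qd):
        """Error quaternion of (4-44); q and qd are unit quaternions stored
        scalar-first as [q0, qv1, qv2, qv3]."""
        q0, qv = q[0], q[1:]
        q0d, qvd = qd[0], qd[1:]
        q0_t = q0 * q0d + qv @ qvd
        qv_t = q0d * qv - q0 * qvd + np.cross(qv, qvd)
        return np.concatenate(([q0_t], qv_t))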

The translation error, denoted by e(t) ∈ R³, is defined as [10, 20]

    e = m_e - m_ed,    (4-46)

where m_e(t), m_ed(t) ∈ R³ are defined as

    m_e = [ x'_i/z'_i   y'_i/z'_i   ln(z'_i/z*_ri) ]^T    m_ed = [ x_rdi/z_rdi   y_rdi/z_rdi   ln(z_rdi/z*_ri) ]^T.    (4-47)

In (4-47), the ratios z'_i/z*_ri and z_rdi/z*_ri can be computed as

    z'_i/z*_ri = (z'_i/z_i) (1/α_i) α_ri    z_rdi/z*_ri = 1/α_rdi.

Based on (4-45) and (4-46), the subsequent control development targets the
following objectives:

    ‖q̃_v(t)‖ → 0  and  ‖e(t)‖ → 0  as  t → ∞.    (4-48)


4.6 Control Development

4.6.1 Open-Loop Error System

From (4-43) and (4-44), the open-loop rotation error system can be developed
as

    d/dt [ q̃_0 ; q̃_v ] = (1/2) [ -q̃_v^T ; q̃_0 I_3 + q̃_v^× ] (ω_c - R̃ ω_cd),    (4-49)

where ω_cd(t) denotes the angular velocity of π_d expressed in F_d that can be
calculated as [23]

    ω_cd = 2 (q_0d (d q_vd/dt) - (d q_0d/dt) q_vd) - 2 q_vd^× (d q_vd/dt),    (4-50)

where (q_0d(t), q_vd^T(t))^T and its time derivative are assumed to be bounded; hence, ω_cd(t)
is also bounded. The open-loop translation error system can be derived as (see
Appendix C)

    z*_ri ė = -(α_ri/α_i) L'_v R' (v_c + ω_c^× s_1i) - z*_ri ṁ_ed,    (4-51)

where v_c(t), ω_c(t) ∈ R³ denote the linear and angular velocity vectors of π expressed
in F, respectively, and the auxiliary measurable term L'_v(t) ∈ R^{3x3} is defined as

    L'_v = [ 1  0  -x'_i/z'_i
             0  1  -y'_i/z'_i
             0  0  1 ].









4.6.2 Closed-Loop Error System

Based on the open-loop rotation error system in (4-49) and the subsequent
Lyapunov-based stability analysis, the angular velocity controller is designed as

    ω_c = -K_ω q̃_v + R̃ ω_cd,    (4-52)

where K_ω ∈ R^{3x3} denotes a diagonal matrix of positive constant control gains.
From (4-49) and (4-52), the rotation closed-loop error system can be determined as

    d(q̃_0)/dt = (1/2) q̃_v^T K_ω q̃_v    (4-53)
    d(q̃_v)/dt = -(1/2) (q̃_0 I_3 + q̃_v^×) K_ω q̃_v.

From (4-51), the translation control input v_c(t) is designed as

    v_c = (α_i/α_ri) R'^T L'_v^{-1} (K_v e - ẑ*_ri ṁ_ed) - ω_c^× s_1i,    (4-54)

where K_v ∈ R^{3x3} denotes a diagonal matrix of positive constant control gains. In
(4-54), the parameter estimate ẑ*_ri(t) ∈ R for the unknown constant z*_ri is
generated by the adaptive update law

    d(ẑ*_ri)/dt = -γ e^T ṁ_ed,    (4-55)

where γ ∈ R denotes a positive constant adaptation gain. By using (4-51) and
(4-54), the translation closed-loop error system is

    z*_ri ė = -K_v e - z̃*_ri ṁ_ed,    (4-56)

where z̃*_ri(t) ∈ R denotes the following parameter estimation error:

    z̃*_ri = z*_ri - ẑ*_ri.    (4-57)
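For implementation purposes, the control and adaptation laws reduce to a few matrix operations per frame. A minimal Python/NumPy sketch (gains, time step, and variable names are illustrative; the equation forms follow the reconstruction of (4-52), (4-54), and (4-55) given above):

    import numpy as np

    def control_inputs(qv_t, R_tilde, w_cd, e, med_dot, z_hat,
                       Lv_p, R_p, alpha_i, alpha_ri, s1i, Kw, Kv):
        """One evaluation of (4-52) and (4-54)."""
        w_c = -Kw @ qv_t + R_tilde @ w_cd                                # (4-52)
        v_c = (alpha_i / alpha_ri) * R_p.T @ np.linalg.solve(
                  Lv_p, Kv @ e - z_hat * med_dot) - np.cross(w_c, s1i)   # (4-54)
        return v_c, w_c

    def step_estimate(z_hat, e, med_dot, gamma, dt):
        """Forward-Euler step of the adaptive law (4-55)."""
        return z_hat + dt * (-gamma * (e @ med_dot))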








4.6.3 Stability Analysis

Theorem 4.1: The controller given in (4-52) and (4-54), along with the
adaptive update law in (4-55), ensures asymptotic tracking in the sense that

    ‖q̃_v(t)‖ → 0,  ‖e(t)‖ → 0,  as  t → ∞.    (4-58)

Proof: Let V(t) ∈ R denote the following differentiable non-negative function
(i.e., a Lyapunov candidate):

    V = q̃_v^T q̃_v + (1 - q̃_0)² + (1/2) e^T e + (1/(2γ)) z̃*_ri².    (4-59)

The time-derivative of V(t) can be determined as

    V̇ = -q̃_v^T K_ω q̃_0 q̃_v - (1 - q̃_0) q̃_v^T K_ω q̃_v + e^T (-K_v e - z̃*_ri ṁ_ed) + z̃*_ri e^T ṁ_ed
       = -q̃_v^T (q̃_0 I_3 + (1 - q̃_0) I_3) K_ω q̃_v - e^T K_v e
       = -q̃_v^T K_ω q̃_v - e^T K_v e,    (4-60)

where (4-53) and (4-55)-(4-57) were utilized. Based on (4-59) and (4-60),
e(t), q̃_v(t), q̃_0(t), z̃*_ri(t) ∈ L_∞ and e(t), q̃_v(t) ∈ L_2. Since z̃*_ri(t) ∈ L_∞, it is
clear from (4-57) that ẑ*_ri(t) ∈ L_∞. Based on the fact that e(t) ∈ L_∞, (4-46)
and (4-47) can be used to prove that m'_i(t) ∈ L_∞, and then L'_v(t), L'_v^{-1}(t) ∈ L_∞.
Based on the fact that q̃_v(t) ∈ L_∞ and ω_cd(t) is a bounded function, (4-52) can be
used to conclude that ω_c(t) ∈ L_∞. Since ẑ*_ri(t), e(t), m'_i(t), L'_v(t), L'_v^{-1}(t) ∈ L_∞
and ṁ_ed(t) is bounded, (4-54) can be utilized to prove that v_c(t) ∈ L_∞. From the
previous results, (4-49)-(4-51) can be used to prove that ė(t), d(q̃_v)/dt ∈ L_∞. Since
e(t), q̃_v(t) ∈ L_∞ ∩ L_2 and ė(t), d(q̃_v)/dt ∈ L_∞, Barbalat's Lemma [96] can be used to
conclude the result given in (4-58).








4.7 Simulation Results

A numerical simulation was performed to illustrate the performance of the
tracking controller given in (4-52) and (4-54) and the adaptive update law in (4-55).
The camera calibration parameters were chosen as

    A_1 = A_2 = [ 1545.1  0       640
                  0       1545.1  512
                  0       0       1 ].

The origins of the coordinate frames F, F* and F_d, and the four coplanar feature
points on the planes π, π* and π_d, are chosen such that the feature points have the
same Euclidean coordinates (in [m]) in F, F* and F_d:

    O_1 = [ 0  0.15  0 ]^T       O_2 = [ 0.15  0.15  0 ]^T
    O_3 = [ 0.15  0  0 ]^T       O_4 = [ -0.15  0  0 ]^T.

The time-varying desired image trajectory was generated by the kinematics of the
feature point plane, where the desired linear and angular velocities were selected as

    v_cd = [ 0.1 sin(t)  0.1 sin(t)  0.1 sin(t) ]^T [m/s]
    ω_cd = [ 0  0  0.5 ]^T [rad/s].

The moving camera attached to I is assumed to have a linear velocity of

    [ 0.1 sin(t)  0  0 ]^T [m/s].

The initial rotation matrices R(0) between F and I, R*(0) between F* and I,
and R_rd(0) between F_d and I_R, and the constant rotation matrix R*_r between F*







and I_R, were set as

    R(0) = Rx(180°) Ry(0°) Rz(40°)
    R*(0) = Rx(180°) Ry(0°) Rz(-20°)
    R_rd(0) = Rx(180°) Ry(0°) Rz(20°)
    R*_r = Rx(180°) Ry(0°) Rz(80°).

The initial translation vectors x_f(0) between F and I (expressed in I), x*_f(0)
between F* and I (expressed in I), and x_frd(0) between F_d and I_R (expressed in
I_R), and the constant translation vector x*_fr between F* and I_R (expressed in
I_R), were selected as

    x_f(0) = [ -0.5  0.5  4.0 ]^T
    x*_f(0) = [ 1.0  1.5  3.5 ]^T
    x_frd(0) = [ 0.5  1.5  6.0 ]^T
    x*_fr = [ -1.0  1.5  4.0 ]^T.

The initial Euclidean relationship between the cameras, the reference object, the
control object, and the object that was used to generate the desired trajectory is
shown in Figure 4-2.
The initial image-space coordinates (i.e., p_i(0)) of the four feature points
attached to the plane π, expressed in I, were computed as (in pixels)

    p_1(0) = [ 409.62  660.75  1 ]^T    p_2(0) = [ 454.00  623.51  1 ]^T
    p_3(0) = [ 491.25  667.89  1 ]^T    p_4(0) = [ 402.48  742.38  1 ]^T.








The initial reference image-space coordinates (i.e., p*_i(0) and p*_ri) of the four
feature points attached to the plane π*, expressed in I and I_R, respectively, were
computed as (in pixels)

    p*_1(0) = [ 1104.1  1112.0  1 ]^T    p*_2(0) = [ 1166.3  1134.6  1 ]^T
    p*_3(0) = [ 1143.7  1196.8  1 ]^T    p*_4(0) = [ 1019.2  1151.5  1 ]^T

    p*_r1 = [ 196.7  1081.4  1 ]^T    p*_r2 = [ 206.7  1024.3  1 ]^T
    p*_r3 = [ 263.8  1034.4  1 ]^T    p*_r4 = [ 243.7  1148.5  1 ]^T.


The initial image-space coordinates (i.e., p_rdi(0)) of the four feature points attached
to the plane π_d, expressed in I_R, were computed as (in pixels)

    p_rd1(0) = [ 755.5  862.0  1 ]^T    p_rd2(0) = [ 791.8  848.8  1 ]^T
    p_rd3(0) = [ 805.1  885.1  1 ]^T    p_rd4(0) = [ 732.5  911.5  1 ]^T.

The control gains K_ω in (4-52) and K_v in (4-54) and the adaptation gain γ in
(4-55) were selected as

    K_ω = diag{1, 1, 1}    K_v = diag{5, 5, 5}    γ = 20.


The desired image-space trajectory of the feature point plane π_d, taken by the
camera attached to I_R, is shown in Figure 4-3. The current image-space trajectory
of the feature point plane π, taken by the camera attached to I, is shown in Figure
4-5. The reference image-space trajectory of the reference plane π*, taken by
the camera attached to I, is shown in Figure 4-4. The resulting translation and















Figure 4-2: This figure shows the initial positions of the cameras and the feature
point planes. The feature points on the plane π are denoted
by "O". The feature points on the planes π* and π_d are denoted by "·". The
origins of the coordinate frames F, F* and F_d are denoted by "*".


rotation tracking errors are plotted in Figure 4-6 and Figure 4-7, respectively. The

errors go to zero asymptotically. The translation and rotation control inputs are


shown in Figure 4-8 and Figure 4-9, respectively.

































Figure 4-3: Pixel coordinate p_rd(t) of the four feature points on the plane π_d in a
sequence of desired images taken by the camera attached to I_R. The upper figure is
for the u_rd(t) component and the bottom figure is for the v_rd(t) component.




Figure 4-4: Pixel coordinate p*(t) of the four feature points on the plane π* in a
sequence of reference images taken by the moving camera attached to I. The upper
figure is for the u*(t) component and the bottom figure is for the v*(t) component.






















Figure 4-5: Pixel coordinate p(t) of the four feature points on the plane π in a se-
quence of images taken by the moving camera attached to I. The upper figure is
for the u(t) component and the bottom figure is for the v(t) component.






Figure 4-6: Translation error e(t).






















Figure 4-7: Rotation quaternion error q̃(t).





Figure 4-8: Linear camera velocity input v_c(t).










































Figure 4-9: Angular camera velocity input ω_c(t).















CHAPTER 5
ADAPTIVE VISUAL SERVO TRACKING CONTROL USING A CENTRAL
CATADIOPTRIC CAMERA

5.1 Introduction

In this chapter, an adaptive homography based visual servo tracking control

scheme is presented for a camera-in-hand central catadioptric camera system. By

using the central catadioptric camera, a full panoramic FOV is obtained. The

literature review for visual servo control using central catadioptric cameras are

presented in Section 1.3.2. In this chapter, the tracking controller is developed

based on the relative relationships of a central catadioptric camera between the

current, reference, and desired camera poses. To find the relative camera pose

relationships, homographies are computed based on the projection model of the

central catadioptric camera [33-36]. As stated by Geyer and Daniilidis [36], a

unifying theory was proposed to show that all central catadioptric systems are

isomorphic to projective mappings from the sphere to a plane with a projection

center on the perpendicular to the plane. By constructing links between the

projected coordinates on the sphere, the homographies up to scalar multiples can

be obtained. Various methods can then be applied to decompose the Euclidean

homographies to find the corresponding rotation matrices, and depth ratios. The

rotation error system in this chapter is based on the quaternion formulation which

has a full-rank 4x3 interaction matrix. Lyapunov-based methods are utilized to

develop the controller and to prove asymptotic tracking.


























Figure 5-1: Central catadioptric projection relationship.


5.2 Geometric Model

A central catadioptric camera is composed of two elements: a camera and a
mirror, which are calibrated to yield a single effective viewpoint. Geyer and Dani-
ilidis [36] developed a unifying theory that explains how all central catadioptric
systems are isomorphic to projective mappings from the sphere to a plane with a
projection center on the optical axis perpendicular to the plane. For the central
catadioptric camera depicted in Figure 5-1, the coordinate frames F_c and F_m are
attached to the foci of the camera and mirror, respectively. Light rays incident to
the focal point of the mirror (i.e., the origin of F_m) are reflected into rays incident
with the focal point of the camera (i.e., the origin of F_c).

Without loss of generality, the subsequent development is based on the
assumption that the reflection of four coplanar and non-collinear Euclidean feature
points, denoted by O_i, of some stationary object is represented in the camera image
plane by image-space coordinates u_i(t), v_i(t) ∈ R ∀i = 1, 2, 3, 4. The plane defined
by the four feature points is denoted by π as depicted in Figure 5-1. The vector













Figure 5-2: Projection model of the central catadioptric camera.


m̄_i(t) ∈ R³ in Figure 5-2 is defined as m̄_i ≜ [ x_i  y_i  z_i ]^T, where x_i(t), y_i(t),
z_i(t) ∈ R denote the Euclidean coordinates of the feature point O_i expressed in the
frame F_m, which is affixed to the single effective viewpoint. The projected
coordinates of m̄_i(t) can be expressed as

    m_si = m̄_i / L_i = [ x_i/L_i   y_i/L_i   z_i/L_i ]^T,    (5-1)

where m_si(t) ∈ R³ denotes the Euclidean coordinates of O_i projected onto a unit
spherical surface expressed in F_m, and L_i(t) ∈ R is defined as

    L_i = sqrt(x_i² + y_i² + z_i²).    (5-2)









Based on the development in [36], m_si(t) can be expressed in the coordinate frame
O_c, which is attached to the reprojection center, as

    m̄_pi = [ x_i/L_i   y_i/L_i   z_i/L_i + ξ ]^T,    (5-3)

where ξ ∈ R is a known intrinsic parameter of the central catadioptric camera that
represents the distance between the single effective viewpoint and the reprojection
center (see Figure 5-2). The normalized coordinates of m̄_pi(t) in (5-3), denoted by
m_pi(t) ∈ R³, can be expressed as

    m_pi = [ (x_i/L_i)/(z_i/L_i + ξ)   (y_i/L_i)/(z_i/L_i + ξ)   1 ]^T
         = [ x_i/(z_i + ξ L_i)   y_i/(z_i + ξ L_i)   1 ]^T.    (5-4)

By using (5-4) and the following relationship:

    (x_i/L_i)² + (y_i/L_i)² + (z_i/L_i)² = 1,    (5-5)

the coordinates of the feature points on the unit spherical surface m_si(t) can be
determined as

    m_si = [ λ_i m_pix   λ_i m_piy   λ_i - ξ ]^T,
    λ_i ≜ (ξ + sqrt(1 + (1 - ξ²)(m_pix² + m_piy²))) / (m_pix² + m_piy² + 1),    (5-6)

where m_pix(t) and m_piy(t) ∈ R are the first two elements of m_pi(t). The bijective
mapping in (5-6) is unique (i.e., there is no sign ambiguity) because 0 < ξ ≤ 1 [36],
and the geometry of the central catadioptric camera given in Figure 5-2 guarantees
that the third element of m̄_pi(t) is positive; otherwise, the feature point O_i could not
be projected onto the image plane.
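The lifting in (5-4)-(5-6) is only a few arithmetic operations per point. A minimal Python/NumPy sketch (which also uses the calibration relation (5-17) introduced below; input values are placeholders):

    import numpy as np

    def lift_to_sphere(p, A, xi):
        """Lift a catadioptric image point onto the unit sphere via (5-4)-(5-6).
        p: homogeneous pixel coordinates; A: camera/mirror calibration matrix;
        xi: mirror parameter with 0 < xi <= 1."""
        m_p = np.linalg.solve(A, p)        # normalized coordinates, eq. (5-17)
        m_p = m_p / m_p[2]
        mx, my = m_p[0], m_p[1]
        r2 = mx * mx + my * my
        lam = (xi + np.sqrt(1.0 + (1.0 - xi * xi) * r2)) / (r2 + 1.0)
        return np.array([lam * mx, lam * my, lam - xi])  # unit norm by (5-5)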











reference plane 71
Oi


Figure 5-3: Camera relationships represented in homography.


As shown in Figure 5-3, the stationary coordinate frame F* denotes a con-
stant reference camera pose that is defined by a reference image, and the coordi-
nate frame F_d represents the desired time-varying camera pose trajectory defined
by a series of images (e.g., a video). The vectors m̄*_i, m̄_di(t) ∈ R³ in Figure 5-3 are
defined as

    m̄*_i ≜ [ x*_i  y*_i  z*_i ]^T    m̄_di ≜ [ x_di(t)  y_di(t)  z_di(t) ]^T,    (5-7)

where x*_i, y*_i, z*_i ∈ R and x_di(t), y_di(t), z_di(t) ∈ R denote the Euclidean coordinates
of the feature points O_i expressed in the frames F* and F_d, respectively.









The constant coordinates m̄*_i can be projected onto the unit spherical surface and
expressed in F* and O*_c, respectively, as

    m*_si = m̄*_i / L*_i = [ x*_i/L*_i   y*_i/L*_i   z*_i/L*_i ]^T
    m̄*_pi = [ x*_i/L*_i   y*_i/L*_i   z*_i/L*_i + ξ ]^T,    (5-8)

and the time-varying coordinates m̄_di(t) can be projected onto the unit spherical
surface and expressed in F_d and O_dc, respectively, as

    m_dsi = m̄_di / L_di = [ x_di/L_di   y_di/L_di   z_di/L_di ]^T
    m̄_dpi = [ x_di/L_di   y_di/L_di   z_di/L_di + ξ ]^T,

where m*_si, m̄*_pi, m_dsi(t), m̄_dpi(t) ∈ R³, and L*_i, L_di(t) ∈ R are defined as

    L*_i = sqrt(x*_i² + y*_i² + z*_i²)    L_di = sqrt(x_di² + y_di² + z_di²).    (5-9)

The normalized coordinates of m_dsi(t), denoted as m̄_dsi(t) ∈ R³, are defined as

    m̄_dsi ≜ [ x_di(t)/z_di(t)   y_di(t)/z_di(t)   1 ]^T,    (5-10)

which will be used in the controller development. The signal m̄_dsi(t) in (5-10) is
measurable because m_dsi(t) can be computed from the measurable and bounded
pixel coordinates p_di(t) using projective relationships similar to (5-6) and (5-17).
The normalized coordinates of m̄*_pi and m̄_dpi(t), denoted as m*_pi, m_dpi(t) ∈ R³,
respectively, are defined as

    m*_pi = [ x*_i/(z*_i + ξ L*_i)   y*_i/(z*_i + ξ L*_i)   1 ]^T
    m_dpi = [ x_di/(z_di + ξ L_di)   y_di/(z_di + ξ L_di)   1 ]^T.    (5-11)









From standard Euclidean geometry, the relationships between m̄_i(t), m̄_di(t)
and m̄*_i can be determined as

    m̄_i = x_f + R m̄*_i    m̄_di = x_fd + R_d m̄*_i,    (5-12)

where x_f(t), x_fd(t) ∈ R³ denote the translation vectors expressed in F and F_d,
respectively, and R(t), R_d(t) ∈ SO(3) denote the orientation of F* with respect
to F and F_d, respectively. As also illustrated in Figure 5-3, n* ∈ R³ denotes the
constant unit normal to the plane π, and the constant distance from the origin of
F* to π along the unit normal n* is denoted by d* ∈ R, defined as

    d* ≜ n*^T m̄*_i.    (5-13)

By using (5-13), the relationships in (5-12) can be expressed as

    m̄_i = H m̄*_i    m̄_di = H_d m̄*_i,    (5-14)

where H(t), H_d(t) ∈ R^{3x3} are the Euclidean homographies defined as

    H = R + (x_f/d*) n*^T    H_d = R_d + (x_fd/d*) n*^T.    (5-15)

Based on (5-1) and (5-8), the relationship between the Euclidean coordinates in
(5-14) can be expressed in terms of the unit spherical surface coordinates as

    m_si = α_i H m*_si    m_dsi = α_di H_d m*_si,    (5-16)

where α_i(t) ≜ L*_i/L_i(t) ∈ R and α_di(t) ≜ L*_i/L_di(t) ∈ R are scaling terms.
5.3 Euclidean Reconstruction

The homogeneous pixel coordinates of the feature points with respect to the
camera frames F, F* and F_d are denoted as p_i(t), p*_i and p_di(t) ∈ R³, respectively.
They can be related to the normalized coordinates m_pi(t), m*_pi and m_dpi(t) via the
following linear relationship:

    p_i = A m_pi    p*_i = A m*_pi    p_di = A m_dpi,    (5-17)

where A ∈ R^{3x3} contains the calibrated intrinsic parameters of the camera and the
mirror (see Barreto and Araujo [35]). Since the camera and mirror are calibrated
(i.e., A is known), m_pi(t), m*_pi and m_dpi(t) can be computed from the measurable
pixel coordinates p_i(t), p*_i and p_di(t) based on (5-17). The expression given in
(5-6) can then be used to compute m_si(t) from m_pi(t). Similarly, m*_si and m_dsi(t)
can be computed from m*_pi and m_dpi(t), respectively. Then, based on (5-16), a set
of 12 linearly independent equations given by the 4 feature point pairs (p*_i, p_i(t)),
with 3 independent equations per feature point, can be developed to determine
the homography up to a scalar multiple. Various methods can then be applied
(e.g., see Faugeras and Lustman [93] and Zhang and Hanson [94]) to decompose
the Euclidean homography to obtain α_i(t), H(t), R(t), x_f(t)/d*, and n*. Similarly,
α_di(t), H_d(t), R_d(t), x_fd(t)/d* can be obtained from (5-16) using (p*_i, p_di(t)). The
rotation matrices R(t), R_d(t) and the depth ratios α_i(t), α_di(t) will be used in the
subsequent control design.
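The linear equations mentioned above come from eliminating the unknown scale in (5-16) with a cross product, m_si × (H m*_si) = 0. A minimal DLT-style Python/NumPy sketch of this step (a sketch only; production implementations typically add normalization and outlier handling):

    import numpy as np

    def homography_from_sphere_points(ms_list, ms_star_list):
        """Estimate H up to scale from the spherical correspondences in (5-16),
        using m_si x (H m*_si) = 0 (three equations per feature point)."""
        rows = []
        for ms, ms_s in zip(ms_list, ms_star_list):
            sk = np.array([[0.0, -ms[2], ms[1]],
                           [ms[2], 0.0, -ms[0]],
                           [-ms[1], ms[0], 0.0]])
            # skew(ms) @ H @ ms_s = kron(skew(ms), ms_s^T) @ vec_row(H)
            rows.append(np.kron(sk, ms_s[None, :]))
        M = np.vstack(rows)
        _, _, Vt = np.linalg.svd(M)
        return Vt[-1].reshape(3, 3)   # null-space vector gives H up to scale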

5.4 Control Objective

The control objective is for a camera attached to an object (i.e., a camera-in-
hand configuration), which can be identified by a planar patch of feature points, to
track a desired trajectory that is determined from a sequence of desired images
taken during a priori camera motion. This objective is based on the assumption
that the linear and angular velocities of the camera are control inputs that can
be independently controlled (i.e., unconstrained motion) and that the camera is
calibrated (i.e., A is known). The control objective can be stated as m̄_i(t) →
m̄_di(t) (i.e., the Euclidean feature points on π track the corresponding feature points
on π_d). Equivalently, the control objective can also be stated in terms of the
rotation and translation of the object as x_f(t) → x_fd(t) and R(t) → R_d(t). As
stated previously, R(t) and R_d(t) can be computed by decomposing the projective
homographies in (5-16). Once these rotation matrices have been determined,
the unit quaternion parameterization is used to describe the rotation matrix.
This parameterization facilitates the subsequent problem formulation, control
development, and stability analysis since the unit quaternion provides a global
nonsingular parameterization of the corresponding rotation matrices.

To quantify the error between the actual and desired camera orientations, the
mismatch between the rotation matrices R(t) and R_d(t), denoted by R̃(t) ∈ R^{3x3}, is
defined as

    R̃ = R R_d^T.    (5-18)

Given the rotation matrices R(t) and R_d(t) from the homography decomposition,
the corresponding unit quaternions q(t) and q_d(t) can be computed by using the
numerically robust method (e.g., see [14] and [15]) as

    R(q) = (q_0² - q_v^T q_v) I_3 + 2 q_v q_v^T - 2 q_0 q_v^×    (5-19)
    R_d(q_d) = (q_0d² - q_vd^T q_vd) I_3 + 2 q_vd q_vd^T - 2 q_0d q_vd^×,    (5-20)

where I_3 is the 3 x 3 identity matrix, and the notation q_v^×(t) denotes the skew-
symmetric form of the vector q_v(t) as in (2-10). Based on (5-18)-(5-20), the
rotation mismatch can be expressed as

    R̃ = (q̃_0² - q̃_v^T q̃_v) I_3 + 2 q̃_v q̃_v^T - 2 q̃_0 q̃_v^×,    (5-21)

where the error quaternion (q̃_0(t), q̃_v^T(t))^T is defined as [23]

    q̃_0 = q_0 q_0d + q_v^T q_vd    (5-22)
    q̃_v = q_0d q_v - q_0 q_vd + q_v^× q_vd.









Based on (5-18) and (5-21), the rotation tracking control objective R(t) → R_d(t)
can be formulated as

    ‖q̃_v(t)‖ → 0  ⇒  R̃(t) → I_3  as  t → ∞.    (5-23)

To quantify the position mismatch between the actual and desired camera, the
translation tracking error e(t) ∈ R³ is defined as

    e ≜ m_e - m_ed = [ x_i/z_i - x_di/z_di   y_i/z_i - y_di/z_di   ln(z_i/z_di) ]^T,    (5-24)

where m_e(t), m_ed(t) ∈ R³ are defined as

    m_e ≜ [ m_e1  m_e2  m_e3 ]^T = [ x_i/z_i   y_i/z_i   ln z_i ]^T    (5-25)
    m_ed ≜ [ m_ed1  m_ed2  m_ed3 ]^T = [ x_di/z_di   y_di/z_di   ln z_di ]^T.    (5-26)

Based on (5-23) and (5-24), the subsequent control development targets the
following objectives:

    ‖q̃_v(t)‖ → 0  and  ‖e(t)‖ → 0  as  t → ∞.    (5-27)

The error signal q̃_v(t) is measurable since it can be computed from R(t)
and R_d(t) as in [24]. The first two elements of the translation error e(t) are
measurable because

    m_e1 - m_ed1 = (x_i/L_i)/(z_i/L_i) - (x_di/L_di)/(z_di/L_di)
    m_e2 - m_ed2 = (y_i/L_i)/(z_i/L_i) - (y_di/L_di)/(z_di/L_di),

where x_i(t)/L_i(t), y_i(t)/L_i(t), z_i(t)/L_i(t) can be computed from (5-6) and (5-17),
and x_di(t)/L_di(t), y_di(t)/L_di(t), z_di(t)/L_di(t) can be computed from similar
relationships. The third element of the translation error is also measurable since

    z_i/z_di = (α_di/α_i) (z_i/L_i)/(z_di/L_di),









where z_i(t)/L_i(t) can be computed from (5-6) and (5-17), z_di(t)/L_di(t) can be
computed from similar relationships, and the depth ratios α_i(t), α_di(t) can be
obtained from the homography decompositions in (5-16).
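Collecting these measurable pieces, the error e(t) can be evaluated in a few lines. A minimal Python/NumPy sketch (following the reconstructed expressions above; inputs are the sphere coordinates and depth ratios):

    import numpy as np

    def translation_error(m_s, m_ds, alpha_i, alpha_di):
        """Translation error e of (5-24) from measurable data: m_s and m_ds are
        the unit-sphere coordinates [x/L, y/L, z/L] of the current and desired
        views, and alpha_i, alpha_di come from the homography decompositions."""
        e1 = m_s[0] / m_s[2] - m_ds[0] / m_ds[2]
        e2 = m_s[1] / m_s[2] - m_ds[1] / m_ds[2]
        z_ratio = (alpha_di / alpha_i) * (m_s[2] / m_ds[2])  # z_i / z_di
        return np.array([e1, e2, np.log(z_ratio)])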

5.5 Control Development

5.5.1 Open-Loop Error System

The open-loop rotation error system can be developed as [23]

    d/dt [ q̃_0 ; q̃_v ] = (1/2) [ -q̃_v^T ; q̃_0 I_3 + q̃_v^× ] (ω_c - R̃ ω_cd),    (5-28)

where q̃(t) = (q̃_0(t), q̃_v^T(t))^T, ω_c(t) ∈ R³ denotes the camera angular velocity control
input, and ω_cd(t) ∈ R³ denotes the desired angular velocity of the camera that is
assumed to be a priori generated as a bounded and continuous function (see [10] for
a discussion regarding the development of a smooth desired trajectory from a series
of images).

Based on (5-7) and (5-10), the derivative of m̄_di(t) can be obtained as

    d(m̄_di)/dt = z*_i (d/dt)((α_di ρ_di)^{-1} m̄_dsi),    (5-29)

where ρ_di(t) ∈ R is defined as

    ρ_di ≜ (z*_i/L*_i) / (z_di/L_di).

Based on (5-29) and the fact that [22]

    d(m̄_di)/dt = -v_cd + m̄_di^× ω_cd,    (5-30)

where v_cd(t) ∈ R³ denotes the desired linear velocity of the camera expressed in F_d,
it can be obtained that

    v_cd = (z*_i/(α_di ρ_di)) m̄_dsi^× ω_cd - z*_i (d/dt)((α_di ρ_di)^{-1} m̄_dsi).    (5-31)

After differentiating both sides of (5-24) and using (5-30), (5-31) and

    d(m̄_i)/dt = -v_c + m̄_i^× ω_c,

the open-loop translation error system can be derived as

    z*_i ė = -β_i L_v v_c + z*_i L_vω ω_c - z*_i α_di ρ_di L_vd (d/dt)((α_di ρ_di)^{-1} m̄_dsi),    (5-32)

where v_c(t) ∈ R³ denotes the linear velocity input of the camera with respect to F*
expressed in F, the Jacobian-like matrices L_v(t), L_vd(t), L_vω(t) ∈ R^{3x3} are defined
as

    L_v = [ 1  0  -m_e1            L_vd = [ 1  0  -m_ed1
            0  1  -m_e2                    0  1  -m_ed2    (5-33)
            0  0  1 ]                      0  0  1 ]

    L_vω = [ m_e1 m_e2    -1 - m_e1²    m_e2
             1 + m_e2²    -m_e1 m_e2    -m_e1    (5-34)
             -m_e2        m_e1          0 ],

and β_i(t) ∈ R is defined as

    β_i ≜ z*_i / z_i = α_i (z*_i/L*_i) / (z_i/L_i).

5.5.2 Closed-Loop Error System

Based on the open-loop rotation error system in (5-28) and the subsequent
Lyapunov-based stability analysis, the angular velocity controller is designed as

    ω_c = -K_ω q̃_v + R̃ ω_cd,    (5-35)

where K_ω ∈ R^{3x3} denotes a diagonal matrix of positive constant control gains.
From (5-28) and (5-35), the rotation closed-loop error system can be determined as

    d(q̃_0)/dt = (1/2) q̃_v^T K_ω q̃_v    (5-36)
    d(q̃_v)/dt = -(1/2) (q̃_0 I_3 + q̃_v^×) K_ω q̃_v.

Based on (5-32), the translation control input v_c(t) is designed as

    v_c = (1/β_i) L_v^{-1} (K_v e + ẑ*_i (L_vω ω_c - α_di ρ_di L_vd (d/dt)((α_di ρ_di)^{-1} m̄_dsi))),    (5-37)

where K_v ∈ R^{3x3} denotes a diagonal matrix of positive constant control gains. In
(5-37), the parameter estimate ẑ*_i(t) ∈ R for the unknown constant z*_i is generated
by

    d(ẑ*_i)/dt = γ e^T (L_vω ω_c - α_di ρ_di L_vd (d/dt)((α_di ρ_di)^{-1} m̄_dsi)),    (5-38)

where γ ∈ R denotes a positive constant adaptation gain. By using (5-32) and
(5-37), the translation closed-loop error system is

    z*_i ė = -K_v e + z̃*_i (L_vω ω_c - α_di ρ_di L_vd (d/dt)((α_di ρ_di)^{-1} m̄_dsi)),    (5-39)

where z̃*_i(t) ∈ R denotes the following parameter estimation error:

    z̃*_i = z*_i - ẑ*_i.    (5-40)

5.5.3 Stability Analysis

Theorem 5.1: The controller given in (5-35) and (5-37), along with the
adaptive update law in (5-38), ensures asymptotic tracking in the sense that

    ‖q̃_v(t)‖ → 0,  ‖e(t)‖ → 0,  as  t → ∞.    (5-41)

Proof: Let V(t) ∈ R denote the following differentiable non-negative function
(i.e., a Lyapunov candidate):

    V = q̃_v^T q̃_v + (1 - q̃_0)² + (1/2) e^T e + (1/(2γ)) z̃*_i².    (5-42)

The time-derivative of V(t) can be determined as

    V̇ = -q̃_v^T (q̃_0 I_3 + q̃_v^×) K_ω q̃_v - (1 - q̃_0) q̃_v^T K_ω q̃_v
         + e^T (-K_v e + z̃*_i (L_vω ω_c - α_di ρ_di L_vd (d/dt)((α_di ρ_di)^{-1} m̄_dsi)))
         - z̃*_i e^T (L_vω ω_c - α_di ρ_di L_vd (d/dt)((α_di ρ_di)^{-1} m̄_dsi))
       = -q̃_v^T (q̃_0 I_3 + (1 - q̃_0) I_3) K_ω q̃_v - e^T K_v e
       = -q̃_v^T K_ω q̃_v - e^T K_v e,    (5-43)

where (5-36) and (5-38)-(5-40) were utilized. Based on (5-42) and (5-43),
e(t), q̃_v(t), q̃_0(t), z̃*_i(t) ∈ L_∞ and e(t), q̃_v(t) ∈ L_2. Since z̃*_i(t) ∈ L_∞, it is clear
from (5-40) that ẑ*_i(t) ∈ L_∞. Based on the fact that e(t) ∈ L_∞, (5-24) and (5-25)
can be used to prove that m_e(t) ∈ L_∞, and then L_v(t), L_v^{-1}(t), L_vω(t) ∈ L_∞. Since
q̃_v(t) ∈ L_∞ and ω_cd(t) is a bounded function, (5-35) can be used to conclude that
ω_c(t) ∈ L_∞. Since ẑ*_i(t), e(t), m_e(t), L_v(t), L_v^{-1}(t), L_vω(t) ∈ L_∞ and L_vd(t),
(d/dt)((α_di ρ_di)^{-1} m̄_dsi) are bounded, (5-37) can be utilized to prove that v_c(t) ∈ L_∞. From
the previous results, (5-28)-(5-32) can be used to prove that ė(t), d(q̃_v)/dt ∈ L_∞.
Since e(t), q̃_v(t) ∈ L_∞ ∩ L_2 and ė(t), d(q̃_v)/dt ∈ L_∞, Barbalat's Lemma [96] can be
used to conclude the result given in (5-41).















CHAPTER 6
VISUAL SERVO CONTROL IN THE PRESENCE OF CAMERA CALIBRATION
ERROR

6.1 Introduction

Hu et al. [14] introduced a new quaternion-based visual servo controller for

the rotation error system, provided the camera calibration parameters are exactly

known. Since the results by Malis and Chaumette [21] and Fang et al. [87] rely

heavily on properties of the rotation parameterization to formulate state estimates

and a measurable closed-loop error system, the research in this chapter is motivated

by the question: Can state estimates and a measurable closed-loop error system be

crafted in terms of the quaternion parameterization when the camera calibration

parameters are unknown? To answer this question, a contribution of this chapter is

the development of a quaternion-based estimate for the rotation error system that

is related to the actual rotation error, the development of a new closed-loop error

system, and a new Lyapunov-based analysis that demonstrates the stability of the

quaternion error system. One of the challenges is to develop a quaternion estimate

from an estimated rotation matrix that is not a true rotation matrix in general.

To address this challenge, the similarity relationship between the estimated and

actual rotation matrices is used (as in [21] and [87]) to construct the relationship

between the estimated and actual quaternions. A Lyapunov-based stability analysis

is provided that indicates a unique controller can be developed to achieve the

regulation result despite a sign ambiguity in the developed quaternion estimate.

Simulation results are provided in Section 6.7 that illustrate the performance of the

developed controller.









6.2 Feedback Control Measurements

The objective in this chapter is to develop a kinematic controller (i.e., the

control inputs are considered the linear and angular camera velocities) to ensure

the position/orientation of the camera coordinate frame F is regulated to the

desired position/orientation F*. The camera geometry is shown in Figure 2-2

and the corresponding Euclidean and image-space relationships are developed

in Sections 2.1 and 2.2. The only required sensor measurements for the control

development are the image coordinates of the determined feature points (i.e.,

measurement of the signals in (2-5)), where the static feature point coordinates

in the desired image are given a priori. By measuring the current image feature

points and given the desired feature points, the relationship in (2-6) can be used

to determine the normalized Euclidean coordinates of Oi provided the intrinsic

camera calibration matrix is perfectly known. Unfortunately, any uncertainty in A

will lead to a corrupted measurement of mi(t) and mQ. The computed normalized

coordinates are actually estimates, denoted by mi (t) 74 E R3, of the true values

since only a best-guess estimate of A, denoted by A E R3x3, is available in practice.

The normalized coordinate estimates can be expressed as [21]


ri = A-lpi = Ami (6-1)

A-lp = Am, (6-2)


where the calibration error matrix A E IR3x3 is defined as

All A12 A13

A = A-A = 0 A22 A23 (3)

0 0 1

where An, A12, A13, A22, A23 e R denote unknown intrinsic calibration mismatch

constants. Since mi(t) and mn can not be exactly determined, the estimates in









(6-1) and (6-2) can be substituted into (2-4) to obtain the following relationship

ai = aiHn, (6-4)

where H (t) E R3x3 denotes the estimated Euclidean homography defined as

H = AHA-1. (6-5)

Since mi(t) and im can be determined from (6-1) and (6-2), a set of twelve linear

equations can be developed from the four image point pairs, and (6-4) can be used

to solve for H (t).

As stated in [21], provided additional information is available (e.g., at least

4 vanishing points), various techniques (e.g., see Faugeras and Lustman [93] and

Zhang and Hanson [94]) can be used to decompose H(t) to obtain the estimated

rotation and translation components as

H = ARA-' + AXhn*TA-' = R + 1hj*T, (6-6)

where R (t) E IR3x3 is defined as

R = ARA-1, (6-7)

and xh (t) E R 3, n* E 1R3 denote the estimate of xh (t) and n*, respectively, defined

as

xh = iAxh n* = -A- n*,

where o I R denotes the following positive constant = A-Tn* .

For the four vanishing points (see Almansa et al. [103] for a description of how

to determine vanishing points in an image), d* = oc, so that

H = R + :-n* = R (6-8)









and therefore

H = ARA-1 = R. (6-9)

Twelve linear equations can be obtained based on (6-4) for the four vanishing

points. Assume the 3rd row 3rd column element of H(t), denoted as H33(t) e I, is

not equal to zero (w.l.o.g.). The normalized matrix H,(t) E R3x3, defined as


fH= (6-10)
H33

can be computed based on these twelve linear equations. Based on (6-9)

det () = det(A) det(R) det(A-) = 1. (6-11)

From (6-10) and (6-11),

H33 det ( ) = 1, (6-12)

and hence,
H,
H= (6-13)
det(H,)

which is equal to R(t).

6.3 Control Objective

As stated previously, the objective in this chapter is to develop a kinematic

controller to ensure the pose of the camera coordinate frame F is regulated to the

desired pose F* despite uncertainty in the intrinsic camera calibration matrix. This

objective is based on the assumption that the linear and angular velocities of the

camera are control inputs that can be independently controlled (i.e., unconstrained

motion). For example, the linear and angular camera velocities could be controlled

by the end-effector of a robotic manipulator. In addition to uncertainty in the

intrinsic camera calibration, uncertainty could also exist in the extrinsic camera

calibration (e.g., the uncertainty in the rotation and translation of the camera

with respect to the robot end-effector). The development in this chapter could









be directly modified as described in [21] and [87] to compensate for the extrinsic

calibration. Therefore, the effects of a mismatch in the extrinsic calibration are not

considered in the subsequent development for simplicity.

In the Euclidean space, the rotation control objective can be quantified as


R(t) -- 13 as t -- o. (6-14)


The subsequent development is formulated in terms of the four dimensional unit

quaternion q(t). Given the rotation matrix R (t), the corresponding unit quaternion

q (t) can be computed by using the numerically robust method presented in Section

2.3. From (2-12) and (2-15), the rotation regulation objective in (6-14) can also be

quantified as the desire to regulate q,(t) as


II q, (t) -- 0 as t -oo. (6-15)

The focus and contribution of this chapter lies in the ability to develop and

prove the stability of a quaternion-based rotation controller in the presence of

uncertainty in the camera calibration. The translation controller developed by Fang

et al. [87] is also presented and incorporated in the stability analysis to provide an

example of how the new class of quaternion-based rotation controllers can be used

in conjunction with translation controllers that are robust to camera calibration

uncertainty including (for example): the asymptotic translation controllers in [21],

and the exponential translation controllers in [87]. The translation error, denoted

by e(t) E R3, is defined as
zi
e = -ni mr, (6 16)

where i can be chosen as any number within {1, 1 4}. The translation objective

can be stated as


Ie(t) I | 0 as t oo.


(6-17)









The subsequent section will target the control development based on the objectives

in (6-15) and (6-17).

6.4 Quaternion Estimation

A method is presented in this section to develop a quaternion-based rotation

estimate that can be related to the actual rotation mismatch to facilitate the

control development.

6.4.1 Estimate Development

The unit quaternion is related to the angle-axis representation as


qo cos () q = usin( (6-18)
2/ 2/

where 0(t) and u(t) are the corresponding rotation angle and unit axis. By using

(2-12) and (2-15), the first element of the quaternion can also be expressed in

terms of the rotation matrix R(t) as

2 tr(R)+ 1
4

where qo(t) is restricted to be non-negative as

1
qo = + tr(R) (6-19)
2

without loss of generality (this restriction enables the minimum rotation to be

obtained), and tr(R) denotes the trace of R (t). Based on (6-18) and (6-19), q, (t)

can be determined as


q, = u l Cos2 ()= 1 3 tr(R), (6-20)

where the rotation axis u (t) is the unit eigenvector with respect to the eigenvalue

1 of R (t). For the quaternion vector in (6-20), the sign ambiguity can be resolved.

Specifically, (2-15) can be used to develop the following expression:


RT R = 4,,(,,,2.


(6-21)









Since the sign of qo(t) is restricted (i.e., assumed to be) positive, then a unique

solution for q (t) can be determined from (6-20) and (6-21).

Based on the similarity between R (t) and R (t) as stated in (6-7), the expres-

sions in (6-19) and (6-20) provide motivation to develop the quaternion estimate

as


qo 1 + tr(R) (6-22)
2

= u sin ( = 3 tr(R). (6-23)


In (6-22) and (6-23), R(t) is the estimated rotation matrix introduced in (6-6)

that is computed from the homography decomposition. Since R (t) is similar to

R (t) (see (6-7)), R(t) is guaranteed to have an eigenvalue of 1, where u (t) is the

unit eigenvector that can be computed from the eigenvalue of 1. Since R(t) is

not guaranteed to be a true rotation matrix (and it will not be in general), the

relationships in (2-15) and (6-21) can not be developed and used to eliminate the

sign ambiguity of the eigenvector u (t). However, the subsequent stability analysis

and simulation results indicate that the same stability result is obtained invariant

of the sign of u(t). Once the initial sign of u(t) is chosen, the same sign can be used

for subsequent computations.

6.4.2 Estimate Relationships

Based on the fact that R (t) is similar to R (t) (see (6-7)), the properties that

similar matrices have the same trace and eigenvalues can be used to relate the

quaternion estimate and the actual quaternion. Since similar matrices have the

same trace, (6-19) and (6-22) can be used to conclude that


qo = qo. (6-24)

As stated earlier, since similar matrices have the same eigenvalues, R(t) is guaran-

teed to have an eigenvalue of 1 with the associated eigenvector u (t). The following









relationships can be developed based on (6-7)

u = AR = ARA-ui. (6-25)

Premultiplying A-1 on both sides of (6-25) yields

A-lu = RA-u. (6-26)

Hence, A- 1(t) is an eigenvector with respect to the eigenvalue 1 of R (t) that can

be expressed as
A-l = u, (6-27)

where 7 IR is defined as
7= 1- (6-28)
Au

Based on (6-20), (6-23), and (6-27), the estimated quaternion vector can now be

related to the actual quaternion vector as

q, = yAq,. (6-29)

By using (2-12), (6-24), (6-28) and (6-29),

q0 + I,.2= q + 2. (6-30)
Au

Based on (6-28) and the fact that u(t) is a unit vector,

II\\u sin -
S 1 1 (6-31)


From (6-28) and (6-31),


q% + |,.12 22 Aq1.
Aq,









6.5 Control Development

6.5.1 Rotation Control

The rotation open-loop error system can be developed by taking the time

derivative of q(t) as

] I -q (6-32)
2
qv qo13 + qv

where wc (t) E R3 denotes the angular velocity of the camera with respect to F*

expressed in F. Based on the open-loop error system in (6-32) and the subsequent

stability analysis, the angular velocity controller is designed as


wc = -K & (6-33)


where K IR denotes a positive control gain. Substituting (6-33) into (6-32), the

rotation closed-loop error system can be developed as

1
qo = -KI q q, (6-34)
2
S= 2K (qol3 + qx) 4v. (6-35)


6.5.2 Translation Control

The contribution of this chapter is the rotation estimate and associated control

development. The translation controller developed in this section is provided

for completeness. As stated previously, translation controllers such as the class

developed by Malis and Chaumette [21] and Fang et al. [87] can be combined with

the developed quaternion-based rotation controller. To facilitate the subsequent

stability analysis for the six DOF problem, a translation controller proposed in [87]

is provided in this section, which is given by


vc = K6e,


(6-36)









where K IR denotes a positive control gain, and e (t) E IR3 is defined as


zi

where Ani(t) and Trncan be computed from (6-1) and (6-2), respectively, and the
zi
ratio can be computed from the decomposition of the estimated Euclidean
zi
homography in (6-4). The open-loop translation error system can be determined as

1
e = ,-v we + [mn] x c. (6-38)

After substituting (6-33) and (6-36) into (6-38), the resulting closed-loop transla-

tion error system can be determined as


e = A + [K v]x) e K [mfl]x v. (639)


6.6 Stability Analysis

As stated previously, the quaternion estimate q (t) has a sign ambiguity, but

either choice of the sign will yield the same stability result. The following analysis

is developed for the case where

qv = Aq,. (6-40)

A discussion is provided at the end of the analysis, that describes how the stability

can be proven for the case when


q = -yAqv.


Theorem 6.1: The controller given in (6-33) and (6-36) ensures asymptotic

regulation in the sense that


|q(t) | 0, ||e(t)|| 0 as t oo


(6-41)








provided K, is selected sufficiently large (see the subsequent proof), and the
following inequalities are satisfied

Amin A + AT) > Ao (6-42)

Amax{ (A+ AT) < A, (6-43)

where A0, A1 E IR are positive constants, and Amin {'} and Amax {'} denote the
minimal and maximal eigenvalues of -A + A respectively.
Proof: Let V(t) E IR denote the following differentiable non-negative function
(i.e., a Lyapunov candidate):

V= q q, + (1 qo)2 + Te. (6-44)

After cancelling common terms, V(t) can be expressed as



Vi = 2qq, 2(1 qo)qo + eT
= --Kq (qol3 + qx) Aq, ,(1 qo)q Aq,

+eT ( -KA + 7 [KwAq-] ) e -7Kwe [mf]x Aq

= -Kq ( [- (qo3 + qx) (1 qo)3] Aqv

+eCT ( -K-A + 7 [KAq] ) e- 7Ke [mn]x Aqv

= -7KqAqv K, e Ae + yeT A e x yKeT [n]< Aqv. (6-45)
zi

By using the inequality (6 42), the term yKq, Aq, satisfies

-K&q/ Aq = -7KqK (A + AT) qV
2
> Amin (A+ AT) I||q 2 > ,Ao qvl 2. (646)
I z/~' i 2 --








Since
eAe= -e (A + A e > A.0 le

the term -K, -eTAe satisfies
z.
1 1
-K, Ae < -K, Ao I12 (6-47)

Based on the property that [ 1]x = I 2 V 3 R3 (see Appendix D) and I,'.| < 1,
the term yeT ( KAq,] x e satisfies


yeT ( [KAq, ]x e = yeT ([Kq]x) e

< 7 I ..,,.| 2 < yK, ,.11 |e2 <- yK | e|2 (6-48)

From (6-31), the term -yKeT [mf x Aq, satisfies

-7KeT [ x]>< Aq, = K [m ] Aq
Aq,

< q, [mf] 21||e|| Aq
Aq2

= KI |'1 | 111q I|e|. (6-49)

By using (6-46)-(6-49), the expression in (6-45) can be upper bounded as

V < -7KAo I (, 2- (Ke + K )o 2+7K |1e 2+K |mf* |l |q, eI, (6-50)
zi

where the control gain K, is separated into two different control gains as K, =
Kj + K~2. The following inequality can be obtained after completing the squares:

K, qr 1e 1 2 1 2
K, | fll I q ll ||e|| I < irK *I || |112 + K, o 11||e||2 (6-51)
4KA,1Ao zt

From (6-51), the inequality (6-50) can be rewritten as

V < Ao i q, K, 2 A 12
K, 4A 2 Ao









Based on the definition of y(t) in (6-28), the inequalities (6-42) and (6-43), and

the assumption that rn* and zi are bounded, there exist two positive bounding

constant cl and 2 E IR satisfying the following inequalities:

zK &I |71 2 K?*^,
2 < c1 and < C2
4Ao7 Ao

the control parameter K, can be selected large enough to ensure that V (t) is

negative semi-definite as


V< K, Ao l 2- A 12. (6-52)


Based on (6-44) and (6-52), standard signal chasing arguments can be used to

conclude that the control inputs and all the closed-loop signals are bounded. The

expression in (6-52) can also be used to conclude that q,(t) and e(t) E C2. Since

qv(t), q,(t), e(t), e(t) E L and q,(t), e(t) E C2, Barbalat's Lemma [96] can be used
to prove the result given in (6-41).E

By modifying the Lyapunov function in (6-44) as


V= qq, + (1 +qo)2 + T,

the same stability analysis arguments can be used to prove Theorem 6.1 for the

case when

v = -yAq,.

Since the sign ambiguity in (6-29) does not affect the control development and

stability analysis, only the positive sign in (6-29) needs to be considered in the

future control development for convenience.

6.7 Simulation Results

Numerical simulations were performed to illustrate the performance of the

controller given in (6-33) and (6-36). The intrinsic camera calibration matrix is









given by
122.5 -3.77 100
A= 0 122.56 100

0 0 1
The best-guess estimation for A was selected as

100 -4 80
A= 0 100 110

0 0 1

The camera is assumed to view an object with four coplanar feature points with

the following Euclidean coordinates (in [m]):


01 = 0.05 0.05 0 02 0.05 -0.05 0 (6-53)

03 = -0.05 0.05 0 04 -0.05 -0.05 0

The normalized coordinates of the vanishing points were selected as


0.02 0.02 1 0.02 -0.02 1

-0.02 0.02 1] -0.02 -0.02 1

Consider an orthogonal coordinate frame I with the z-axis opposite to n* (see

Figure 2-2) with the x-axis and y-axis on the plane 7T. The rotation matrices R,

between F and I, and R2 between F* and I were set as


R, = R,(160)Ry (30) R, (-30) R2 = Rx(120)Ry (-20) R, (80),

where R,(-), Ry (.) and Rz(.) E SO(3) denote rotation of angle (degrees)

along the x-axis, y-axis and z-axis, respectively. The translation vectors xf (t) and

Xf2(t) between F and I (expressed in F-) and between F* and I (expressed in F*),








respectively, were selected as

fl = 0.5 0.5 2.5 Xf2 1.0 1.0 3.5

The initial (i.e., pi(0)) and desired (i.e., p*) image-space coordinates of the four
feature points in (6-53) were computed as (in pixels)

Pi(0)= 126.50 123.64 1 P2(0)= 124.24 127.91 1]

P(0) = 120.92 125.40 1 ] 4(0) = 123.25 121.11 1

P [ = 132.17 133.17 1 2 = 135.72 133.61 1

P3= 135.71 136.91 1 P4= 132.10 136.44 1

The initial (i.e., pvi(0)) and desired (i.e., ,i) image-space coordinates of the four
vanishing points in (6-53) were computed as (in pixels)

P (o)= 124.02 139.34 1 P2(0)= 129.02 141.61 1


r IT I ]T
Pi = 102.37 102.45 1 Pj2 = 102.53 97.55 1
[ 963T I T
Pv3 = 97.63 97.55 1 P4 = 97.47 102.45 1

The control gains K, in (6-33) and K! in (6-36) were selected as

K = 5 K = 5.

The resulting translation and rotation errors are plotted in Figure 6-1 and
Figure 6-2, respectively. The image-space pixel error (i.e., pi(t) p*) is shown in
Figure 6-4, and is also depicted in Figure 6-5 in a 3D format. The translation and
rotation control outputs are shown in Figure 6-6 and Figure 6-7, respectively. For







113



different choice of sign of the quaternion estimate, the asymptotic result can still

be achieved. In contrast to the quaternion estimate in Figure 6-2, a quaternion

estimate with different sign is shown in Figure 6-3.



0


;- -0.1 -


-0.2
0 2 4 6 8 10



S -0 .1 ......... ........ ..... .......


0 4 10
0.5

CO 0


-0.5 II
0 2 4 6 8 10
Time [sec]


Figure 6-1: Unitless translation error between ml(t) and m{.







































0 2 4 6 8
Time [sec]


Figure 6-2: Quaternion rotation error.


S0.5

0 2
0 2


4 6 8


Time [sec]


Figure 6-3: Quaternion rotation error for comparison with different sign.


0.61
0.6
















1. R


132 F


130F- .


122 -


118'
118


120 122 124 126 128
ui [pixel]


S... ..



/







/

. . . . .







130 132 134 136


Figure 6-4: Image-space error in pixies between pi(t) and p*. In the figure, "0"

denotes the initial positions of the 4 feature points in the image, and "*" denotes

the corresponding final positions of the feature points.










10 ; .

.... i I
IL



4 .. '

2,

0
140

130 13 5 140
130
120 125
120
v rnip[ I] 110 115


Figure 6-5: Image-space error in pixles between pi(t) and p7 shown in a 3D graph.

In the figure, "0" denotes the initial positions of the 4 feature points in the image,

and "*" denotes the corresponding final positions of the feature points.


p3(Q)
0


S


u [pixel]















































Time [sec]



Figure 6-6: Linear camera velocity control input.


Time [sec]



Figure 6-7: Angular camera velocity control input.


Co
2
Co




















Co
-C

S















CHAPTER 7
COMBINED ROBUST AND ADAPTIVE HOMOGRAPHY-BASED VISUAL
SERVO CONTROL VIA AN UNCALIBRATED CAMERA

7.1 Introduction

In this chapter, a new combined robust and adaptive visual servo controller

is developed to asymptotically regulate the feature points of a rigid-body object

(identified a planar patch of feature points) in an image to the desired feature

point locations while also regulating the six DOF pose of the camera (which is

affixed to the object). These dual objectives are achieved by using a homography-

based approach that exploits both image-space and reconstructed Euclidean

information in the feedback loop. In comparison to pure image-based feedback

approaches, some advantages of using a homography-based method include:

realizable Euclidean camera trajectories (see Chaumette [48] and Corke and

Hutchinson [25] for a discussion of Chaumette's Conundrum); a nonsingular image-

Jacobian; and both the camera position and orientation and the feature point

coordinates are included in the error system. Since some image-space information is

used in the feedback-loop of the developed homography-based controller, the image

features are less likely to leave the FOV in comparison with pure position-based

approaches. The developed controller is composed of the same adaptive translation

controller as in the preliminary results in Chen et al. [89] and a new robust rotation

controller. The contribution of the result is the development of the robust angular

velocity controller that accommodates for the time-varying uncertain scaling factor

by exploiting the upper triangular form of the rotation error system and the fact

that the diagonal elements of the camera calibration matrix are positive.









7.2 Camera Geometry and Assumptions

The camera geometry for this chapter is shown in Figure 2-2 and the corre-

sponding Euclidean and image-space relationships are developed in Sections 2.1 and

2.2. For convenience in the following development, the camera calibration matrix A

is rewritten as

a -a cot 0 U0 an a12 a13

A 0 o = 0 a22 a23 (71)
sin m
0 0 1 0 0 1

Based on the physical meaning of the elements of A, the diagonal calibration

elements are positive (i.e., an, a22 > 0).

The following two assumptions are made for the convenience of the controller

development. They are so reasonable such that they can be considered properties of

the considered vision system.

Assumption 1: The bounds of an and a22 are assumed to be known as


a < a < Cal < a22 -all 22

The absolute values of a12, a13, a23 are upper bounded as


|a12 < a3 <1 a1 3 la23

In (7-2) and (7-3), ( a, (an' 2 (22' C12) (013 and C23 are known positive

constants.

Assumption 2: The reference plane is within the camera's FOV and not at

infinity. That is, there exist positive constants ( and Q, such that


S< zi(t) < (7 4)









Based on (2-4)-(2-6), the homography relationship based on measurable pixel

coordinates is:

pi = aiAHA-1p. (7-5)

Since A is unknown, standard homography computation and decomposition

algorithms can't be applied to extract the rotation and translation from the

homography.

As stated in Malis and Chaumette [21], if some additional information is

known, such as four vanishing points, the rotation matrix can be obtained. For the

vanishing points (see Almansa et al. [103] for a description of how to determine

vanishing points in an image), d* = oo, so that


H = R + n*T = R. (7-6)
d*

Based on (7-6), the relationship in (7-5) can be expressed as


i =aiRp (7-7)

where R (t) E IR3x3 is defined as


R =ARA-1. (7-8)


For the four vanishing points, twelve linear equations can be obtained based on

(7-7). After normalizing R(t) by one nonzero element (e.g., R33(t) E IR which

is assumed to be the third row third column element of R(t) without loss of

generality) twelve equations can be used to solve for twelve unknowns. The twelve

unknowns are given by the eight unknown elements of the normalized R(t), denoted

by R,(t) E RI3x3 defined as

RA (7-9)
R33









and the four unknowns are given by R33(t)ai(t). From the definition of R,(t) in

(7-9), the fact that

det (R) = det(A) det(R) det(A-) = 1 (7-10)

can be used to conclude that

/33 det (Rn) = 1, (7-11)

and hence based on (7-9) and (7-11),

R= R (7-12)
det(PR)

After R (t) is obtained, the original four feature points on the reference plane can

be used to determine the depth ratio ai (t) as shown in Appendix E.

7.3 Open-Loop Error System

7.3.1 Rotation Error System

If the rotation matrix R (t) introduced in (2-4) were known, then the corre-

sponding unit quaternion q (t) A q0 (t) qT (t) can be calculated using the

numerically robust method presented presented in Section 2.3. Given R(t), the

quaternion q(t) can also be written as


qo = V1 + tr(R) (7-13)
2
1
q, = u3 tr(R), (7-14)

where u(t) E IR3 is a unit eigenvector of R(t) with respect to the eigenvalue 1. The

open-loop rotation error system for q(t) can be obtained as (see Dixon et al. [23])


S -q we, (7-15)
qv qo01 + qZ

where cj (t) E R3 defines the angular velocity of the camera expressed in F.









The quaternion q (t) given in (7-13)-(7-15) is not measurable since R(t) is

unknown. However, since R(t) can be determined as described in (7-12), the same

algorithm as shown in equations (7-13) and (7-14) can be used to determine a

corresponding measurable quaternion (qo(t), q(t))T as


qo = + tr(R) (7-16)
2
q = /3 -tr(R), (7-17)
2

where u(t) E IR3 is a unit eigenvector of R(t) with respect to the eigenvalue 1.

Based on (7 8), tr (R) = tr (ARA-1) = tr(R), where tr (.) denotes the trace

of a matrix. Since R(t) and R(t) are similar matrices, the relationship between

(qo(t), q(t))T and (qo(t), q(t)) can be determined as

qo = q0 q = Aq, -A Aq,, (7-18)

where y(t) E IR is a positive, unknown, time-varying scalar that satisfies the

following inequalities (see Appendix F)


< "(t) < 4,, (7-19)

where ( E IR are positive bounding constants. The inverse of the relationship

between qv(t) and q,(t) in (7-18) can be developed as

1 a12 a13 al12a23
vi -v2 qv3
a11 alla22 a11 l11a22
qv A-q -1 1 a2 3 (7-20)
W-- 2 3
7 7 022 a22
qv3

7.3.2 Translation Error System

The translation error, denoted by e(t) E IR3, is defined as


(7-21)


e(t) = p(tW pe,









where pe (t), p* IR3 are defined as


Pe ui [n -ln(ai) P= u7 v 0] (7 22)

where i {1, .. ,4}. The translation error e(t) is measurable since the first

two elements are image coordinates, and ai(t) is obtained from the homography

decomposition as described in Appendix A. The open-loop translation error system

can be obtained by taking the time derivative of e(t) and multiplying the resulting

expression by z4 as [89]

ze = -aiAevc + z Ae (A-1pi)x c, (7-23)


where v,(t) E IR3 defines the linear velocity of the camera expressed in f', and

Ae(t) E R3x3 is defined as

a11 12 a13 Ui
Ae = 0 a22 a23 -

0 0 1

To facilitate the control development, the translation error system can be linearly

parameterized as

e a11vc1 + a12c2 + Vc3 (a13 Ui) Y (Ui, Wc)

Cz 2 a22Vc2 +Vc3 (a23 ) + i c)

e3 c33 (Ui ''.c)
(7-24)

where (-) R1 i = 1, 2, 3, are known regressor vectors that do not depend

on the calibration parameters, and e IRm is a vector of constant unknown

parameters.









7.4 Control Development

7.4.1 Rotation Control Development and Stability Analysis

Based on the relationship in (7-18), the open-loop error system in (7-15), and

the subsequent stability analysis, the rotation controller is designed as


c = -klq = (k11 + 2) q i (7-25)

c2 = -.. 1 = (k,21 + k,22 + I) v2

Wc3 -kw34qv3 (kw31 + kw32 + k,33) 9v3


where ki E R, i, = 1,2,3and 7k E R, i,j = 1,2,3,j < i, are positive

constants. The expressed form of the controller in (7-25) is motivated by the use of

completing the squares in the subsequent stability analysis. In (7-25), the damping

control gains k,21, kw31, kJ32 are selected according to the following sufficient

conditions to facilitate the subsequent stability analysis
-2


1 1
31 2 1 ( a12-a23 + 2a13
k,31 > 4 1 k '-( + a13
-a11 a22
2
kw32 > kw2-2
-a22

where a1, an11, Ca22 a22, )a12, (a13 and (a23 are defined in (7-2) and (7-3), and

k11, kI22, kJ33 are feedback gains that can be selected to adjust the performance of

the rotation control system.

Proposition 7.1: Provided the sufficient gain conditions given in (7-26) are

satisfied, the controller in (7-25) ensures asymptotic regulation of the rotation error

in the sense that


I q, (t) | 0, as t c. o.


(7-27)










Proof: Let V1 (q, qo) IR denote the following non-negative function:


Vi q q, + (1


qo)2


(7-28)


Based on the open-loop error system in (7-15), the time-derivative of V (t) can be

determined as


i = 2q, 2(1 qo)qo = q wc = qvlcl + + qv2c2 + v3c3.


(7-29)


After substituting (7-20) for qv(t) and substituting (7-25) for ~c(t), the expression

in (7-29) can be simplified as


(k 11-, q+ k22 -I2 q2
a11 a22


+ k.. ::


, 12
a22


+ k,21 a1 2
a223
(7-30)


+ k -1223
\ a22


a13) qv1qv3 + k331all v3


1
[2 k22a23qv2qv3 + k.: ,,, J
a22

After completing the squares on each of the bracketed terms in (7-30), the expres-

sion in (7-30) can be written as


( 1 1 2 9
(k l- qa 1 + k,22-22 2 + k. :
a 011 a2112


S V1 -
all 1

ll
a11 [ v

+a11 (k31



a22 v


k a212 2
2a22


1
2


k1 2
4 ali


a12a23
a 22


a(12
Sa2


2


+ a~ k-21
a22 (

a13 ) v3 )

S)2
-23 ~13


+ a22 (k32


1 a2 2
4 a11a22





q]

!k2 a23 21
4 a22 q3


ll1
a1


(7 31)


11









Provided the sufficient gain conditions given in (7-26) are satisfied, then (7-31) can

be upper bounded as

1 1 2
2 < kwn11 + qk2222q, + k.. (732)
S a11 a22

Based on (7-19), the inequality in (7-32) can be further upper bounded as

11 1 2
V1 < k, n1 + kw22 qv2 + k.::: (7-33)
C( all a22

The Lyapunov function given in (7-28) and its time derivative in (7-33) can be

used to conclude that q,(t), qo(t) E CL and q,(t) E 2 (of course, q,(t), qo(t) E LC

by definition also). The expressions in (7-18) and (7-20), and the fact that

q,(t) E 2, can be used to conclude that q,(t) E L2. Since q,(t),qo(t) E C,

then R(t), R(t), q,(t) and qo(t) E L,. Hence, (7-25) can be used to conclude that

joc(t) E LC. Based on the rotation error system in (7-15), q,(t), qo(t) E C; hence,

q, (t), qo(t) are uniformly continuous. Barbalat's lemma [96] can now be used to
conclude that |Iq, (t)I -- 0 as t oo.

7.4.2 Translation Control Development and Stability Analysis

For completeness of the result, the same translation controller as in Chen et

al. [89] is provided. After some algebraic manipulation, the translation error system

in (7-24) can be rewritten as

z*
Zi-e = -aiv + Yi (a, Ui, (Jc, Vc2, Vc3) 01 (7 34)
a11
z*
e2 = -aic2 + Y2 (ai, u, wJc, Vc3) 02
a22

Z;3 = -aiVc3 + Y3 (ai, Ui, c) 03,


where 01 E R"n, 02 IRn2, and 03 6 Rn3 are vectors of constant unknown

parameters, and the known regressor vectors Yi (-) E RI"'l Y2 (*) R1lx2, and









Y3 (') IRlx3 satisfy the following equations (see Appendix G):

a12 (a13 u- i)
Y101 = -ai-vc2 aiVc3 + z1YY (Ui, ,' cJ)
a11 a11 a11
a23 + ( .
Y202 = -ai Vc3 +Z2i (Ui c --
a22 a22

Y303 = z (Ui,, Jc) 0


The control strategy is to design Vc3(t) to stabilize e3(t), and then design v2 (t) to

stabilize e2(t) given Vc3(t), and then design vzc(t) to stabilize el(t) given Vc3(t) and

vc2(t). Following this design strategy, the translation controller vc(t) is designed

as [89]


c3 = (kv33 + Y3 (ai,, Jc) 3) (7 35)
ai

Vc2 = -(kv22 + Y2 (ai, Ui, c, Vc3) 02)
tai

Vc1 =- (kvie + Yi (a, U, V2, Vc3) ,
tai

where the depth ratio ai(t) > 0 Vt. In (7-35), 01 (t) e RI1, 2 (t) R"2, 3 (t) I"3

denote adaptive estimates that are designed according to the following adaptive

update laws to cancel the respective terms in the subsequent stability analysis


=1 12 F2Y2Te2 3 = F33Te3, (7-36)

where F1 RIWxn1, R IRfF2X E 2,F3 E Rf3xn'3 are diagonal matrices of positive

constant adaptation gains. Based on (7-34) and (7-35), the closed-loop translation

error system is

z*
i 1 = -k~,el + Y1 (ai, Ui, .,c2, 3) i (7 37)
a11
,*
i_ 2 = -, + Y2 (ai, i, '. Vc3) 2
a22


Zie3 = -I, : + Y3 (ai, Ui, I .c) 03,









where 01 (t) E IR" 2 (t) E n2, 3 (t) E In3 denote the intrinsic calibration
parameter mismatch defined as


01 ) =01-01) 02 () =02- 02 t) 03 (t) =03 03 (t)

Proposition 7.2: The controller given in (7-35) along with the adaptive

update law in (7-36) ensures asymptotic regulation of the translation error system
in the sense that

|Ie(t) |-I 0 as t -oo.

Proof: Let V2(e, 01, 02, 03) E R denote the following non-negative function:


2 ai 2 i 2 2 2 2

After taking the time derivative of (7-38) and substituting for the closed-loop error

system developed in (7-37), the following simplified expression can be obtained:

V2 = -kie 1., k3e (7-39)

Based on (7-38) and (7-39), el(t), e2(t), e3(t) 6 o n 2, and 0i (t) ,i (t) i =

1, 2, 3 Loo. Based on the assumption that < zi(t) < Cz, the expression
in (7-35) can be used to conclude that Vc(t) L,. Based on the previous

stability analysis for the rotation controller, wJc(t) E L; hence, (7-37) can be
used to conclude that el(t), e2(t), 3(t) E L (i.e., e(t), e2(t), 3(t) are uniformly

continuous). Barbalat's lemma [96] can now be used to show that el (t), e2(t), 3 (t)
- 0 as t -- oc.
Based on Propositions 7.1 and 7.2, the main result can be stated as follows.

Theorem 7.1: The controller given in (7-25) and (7-35) along with the adap-
tive update law in (7-36) ensures asymptotic translation and rotation regulation in
the sense that


Iq,(t)II 0 and Ie(t)ll 0 as t oo,








provided the control gains satisfy the sufficient conditions given in (7-26).
Proof: See proofs in Propositions 7.1 and 7.2.
7.5 Simulation Results
Numerical simulations were performed to illustrate the performance of the
controller given in (7-25) and (7-35) and the adaptive law given in (7-36). The
intrinsic camera calibration matrix is given by


122.5
A= 0

0


-3.77 100
122.56 100
0 1


The camera is assumed to view an object with four coplanar feature points with
the following Euclidean coordinates (in [m]):


01= 0.1 0.1 0 T 2 = 0.1 -0.1 0

03 -0.1 0.1 0 04 = -0.1 -0.1 0

The normalized coordinates of the vanishing points were selected as

0.01 0.01 1 0.01 -0.01 1


(7


-0.01 0.01 1


40)


[,,, ,,,T








The initial (i.e., pi(O)) and desired (i.e., p7) image-space coordinates of the four
feature points in (7-40) were computed as (in pixels)

Pi(0)= 128.49 129.41 1 P2(0)= 128.80 119.61 1

P3() = 119.00 119.61 1 P4(O) = 118.70 129.41 1
S]T =T
P = 136.86 138.68 1 P = 138.07 132.17 1
[T T.
P3 = 130.97 131.33 1 P 129.84 137.82 1

The initial (i.e., pvi(0)) and desired (i.e., pi) image-space coordinates of the four
vanishing points in (7-40) were computed as (in pixels)

Pvo(0) = 91.84 123.68 1 Pv2(0) 91.70 121.16 1

Pv3(0) = 89.20 121.41 1 T P4(0) = 89.34 123.94 1

Pi = 101.19 101.23 1 PV 1= 101.26 98.77 1
S [ T T
Pi3 = 98.81 98.77 1 P4= 98.74 101.23 1

The actual value of the unknown parameter vectors 01, 02, and 03 are given by



0 = 0.0082 -0.0308 0.8163 3.5110 -0.1082 2.3386 0.0234

0.0234 0.0002 2.8648 0.0286 0.0002 0 -0.0241
T
0.0234 -0.0007 -2.4117 -0.0009 0 0.0910












2 = 0.0082 0.8159 3.5110 2.3375 0.0234 0.0002 0.0002 0 -0.0241
T
0.0287 -0.0009 -2.9544 0.0234 -0.0007 -2.4106




03 = 2.8648 0.0286 0.0287 -0.0009 -2.9544

In the simulation, the initial value of the estimated parameter vectors 01 (0), 02 (0),

and 03 (0) were selected as half of the actual value.

The diagonal control gain matrices K, in (7-25) and K, in (7-35) were

selected as

K, = diag {4, 2, 85} K = diag {5, 5, 5} .

The diagonal gain matrices F1, F2, and F3 in the adaptive law (7-36) were selected

as


F1 = 0.03 x diag {0.01, 1, 1, 1, 1, 1,0.01, 0.01, 0.0001, 1,

0.01, 0.0001, 0.0001, 0.01, 0.01, 0.01, 1, 0.01, 0.01, 1}

F2 = 0.03 x diag {0.01, 1, 1, 1, 0.01, 0.0001, 0.01,0.01, 1,0.0001,

0.0001,0.01, 0.01,0.01, 1}

F3 = 0.5 x diag {0.1, 0.1, 10, 0.1, 10} .


The resulting asymptotic translation and rotation errors are plotted in Figure

7-1 and Figure 7-2, respectively. The pixel coordinate of the four feature points

is shown in Figure 7-3. The image-space pixel error (i.e., pi(t) p*) is shown in

Figure 7-4. The image-space trajectory of the feature points is shown in Figure

7-5, and also in Figure 7-6 in a three-dimensional format, where the vertical axis

is time. The translation and rotation control outputs are shown in Figure 7-7 and

Figure 7-8, respectively.




























0

-5

-10
0 0.2 0.4 0.6 0.8 1


Time [sec]


Figure 7 1: Unitless translation error e(t).


2 3


0 1 2 3 4
Time [sec]



Figure 7-2: Quaternion rotation error q(t).


1.01

0 0 I
- 1

0.99
0 1
0.1

S0














IOU

140

130

120

110
0 2 4 6 8 1


150

140

130

120
A n I I I I


Time [sec]


Figure 7-3: Pixel coordinate p(t) (in pixels) of the current pose of the four fea-
ture points in the simulation. The upper figure is for the u(t) component and the
bottom figure is for the v(t) component.


-5
0





-15
0 2 4 6 8 11




-5



-10
-1 ii


Time [sec]


Figure 7-4: Regulation error p(t) p*(in pixels) of the four feature points in the
simulation. The upper figure is for the u(t) u*(t) component and the bottom
figure is for the v(t) v*(t) component.



































120 125 130
u [pixel]


135 140 145


Figure 7-5: Image-space error in pixies between p(t) and p*. In the figure, "0" de-
notes the initial positions of the 4 feature points in the image, and "*" denotes the
corresponding final positions of the feature points.


160


120


v [pixel]


100 110


u [pixel]


Figure 7-6: Image-space error in pixles between p(t) and p* shown in a 3D graph.
In the figure, "0" denotes the initial positions of the 4 feature points in the image,
and "*" denotes the corresponding final positions of the feature points.


135 F


120


115'
115


















0



-20
0 1 2 3 4 5
20





-20
0 1 2 3 4 5
20


0

2 0 .. .. .. .. . .. .
0 1 2 3 4 5
Time [sec]



Figure 7-7: Linear camera velocity control input Vc(t).


0.5


0


-0.5
0 1 2


0.2


0
^------ ---J~


-0.2
0 1 2


Time [sec]


Figure 7-8: Angular camera velocity control input wc(t).















CHAPTER 8
CONCLUSIONS

In this dissertation, visual servo control algorithms and architectures are

developed that exploit the visual feedback from a camera system to achieve a

tracking or regulation control objective for a rigid-body object (e.g., the end-

effector of a robot manipulator, a satellite, an autonomous vehicle) identified by a

patch of feature points. These algorithms and architectures can be used widely in

the navigation and control applications in robotics and autonomous systems. The

visual servo control problem in this dissertation were separated into five parts: 1)

visual servo tracking control via a quaternion formulation; 2) collaborative visual

servo tracking control using a daisy-chaining approach; 3) visual servo tracking

control using a central catadioptric camera; 4) robust visual servo control in

presence of camera calibration uncertainty; and 5) combined robust and adaptive

visual servo control via an uncalibrated camera.

An adaptive visual servo tracking control method via a quaternion formulation

is first developed that achieves asymptotic tracking of a rigid-body object to a

desired trajectory determined by a sequence of images. By developing the error

systems and controllers based on a homography decomposition, the singularity

associated with the typical image-Jacobian is eliminated. By utilizing the quater-

nion formulation, a singularity-free error system is obtained. A homography-based

rotation and translation controller is proven to yield the tracking result through

a Lyapunov-based stability analysis. Based on the result for the camera-in-hand

configuration problem, a camera-to-hand extension is given to enable a rigid-

body object to track a desired trajectory. Simulation and experiments results are

provided to show the performance of the proposed visual servo controllers.









In order to enable large area motion and weaken the FOV restriction, a

collaborative visual servo method is then developed to enable the control object to

track a desired trajectory. In contrast to typical camera-to-hand and camera-in-

hand visual servo control configurations, the proposed controller is developed using

a moving on-board camera viewing a moving object to obtain feedback signals.

A daisy-chaining method is used to develop homography relationship between

different camera coordinate frames and feature point patch coordinate frames to

facilitate collaboration between different agents (e.g., cameras, reference objects,

and control object). Lyapunov-based methods are used to prove the asymptotic

tracking result. Simulation results are provided to show the performance of

the proposed visual servo controller. To enlarge the FOV, an alternative visual

servo controller is developed that yields an asymptotic tracking result using a

central catadioptric camera. A panoramic FOV is obtained by using the central

catadioptric camera.

To study the robust visual servo control problem, a visual servo controller

is developed to regulate a camera (attached to a rigid-body object) to a desired

pose in presence of intrinsic camera calibration uncertainties. A quaternion-based

estimate for the rotation error system is developed that is related to the actual

rotation error. The similarity relationship between the estimated and actual

rotation matrices is used to construct the relationship between the estimated and

actual quaternions. A Lyapunov-based stability analysis is provided that indicates

a unique controller can be developed to achieve the regulation result despite a sign

ambiguity in the developed quaternion estimate. Simulation results are provided to

illustrate the performance of the developed controller.

A new combined robust and adaptive visual servo controller is then developed

to asymptotically regulate the feature points in an image to the desired feature

point locations while also regulating the six DOF pose of the camera. These dual









objectives are achieved by using a homography-based approach that exploits both

image-space and reconstructed Euclidean information in the feedback loop. In

comparison to pure image-based feedback approaches, some advantages of using

a homography-based method include: realizable Euclidean camera trajectories;

a nonsingular image-Jacobian; and both the camera pose and the feature point

coordinates are included in the error system. Since some image-space information is

used in the feedback-loop of the developed homography-based controller, the image

features are less likely to leave the FOV in comparison with pure position-based

approaches. The robust rotation controller that accommodates for the time-

varying uncertain scaling factor is developed by exploiting the upper triangular

form of the rotation error system and the fact that the diagonal elements of the

camera calibration matrix are positive. The adaptive translation controller that

compensates for the constant unknown parameters in the translation error system

is developed by a certainty-equivalence-based adaptive control method and a

nonlinear Lyapunov-based design approach. Simulation results are provided to

show the performance of the proposed visual servo controller.
















APPENDIX A
UNIT NORM PROPERTY FOR THE QUATERNION ERROR

Property: The quaternion error (qo(t), q (t))T defined in (3-5) has a unit

norm given that q(t) and qd(t) are two unit quaternions.

Proof: The quaternion components in (3-5) can be expanded as




qv1 qvdl

o4 = 0qoqd + qv2 qvd2 = 1vqvd + "'l. .,' + :'/..,: + 0qod (A-1)

qv3 Wvd3

and


qvdl qvi 0 -qv3 Wv2 qvdl

qv = qo qvd2 Od qv2 + qv3 0 -qv qvd2 (A-2)

qvd3 qv3 -qv2 1v 0 qvd3

qoqvdl + qv2qvd3 qv3qvd2 qvlq0d

qoqvd2 qvlqvd3 + /. :'/, qv2qod

qoqvd3 + qvlqvd2 qv2qvdl qv3qod

Based on (A-1) and (A-2),


2 T 4v

= (qvqvdl + qv2qvd2 + qv3qvd3 + qoqod)2
T
qoqvdl + i, /,'.,: i, :'/ ., qviqod qo0vdl + i, /,'.,: i, :'/,.,' qvlod

+ qoqvd2 qvlqvd3 + /, :'/. i- qv2qd qoqvd2 qvlqvd3 + /, :'/,., qv2qod

qoqvd3 + qvlqvd2 qv2qvdl qv3qod qoqvd3 + qvlqvd2 qv2qvdl qv3qod









Expanding the right side of the above equation gives

2 T
q0 + C4

= (qvlqdl + !,. '.,.,' + !. :., + qoq0d)2 + (qoqvdl + !. ., :'/.,' vlqod)2

+ (qoqvd2 qvlvd3 + :'/, .,- qv2qod) + (qoqvd3 + qvllvd2 1 qv3qOd)2
22 22 2 2 22 22 22 22 22
oqvdl + qoq2 + qvlqdl + qqvd3 + qv2lqd2 + qv2qvdl + qvlqvd3 + qv2qvd2
2 2 2 2 2 2 2 2 22
+ qv3qdl + qv2qvd3 + qv3qd2 + qv3qvd3 + qoqd + qvqd + qv2qd + q3qod

S+ q1 + 22 + q~d)(q + qvdi + qd2 + qd3).

Based on the fact that q(t) and qd(t) are two unit quaternions (i.e., their norms are

equal to 1),

2 + T4v = 1















APPENDIX B
ONE PROPERTY OF UNIT QUATERNIONS

Property: (13 + q)-lqv = qv.

Proof: The term 13 qv can be expanded as

1 0 0 0 -q3 qv2 1 v3

13 q = 0 1 0 q3 0 -qiv -qv3 1

0 0 1 -q2 qv1 0 q2 -qvi

Faking the inverse of the expanded matrix in (B 1) gives

-1
1 qv3 -qv2

(13 q,) = -q33 1 qv

v2 -qvl 1


qvl + 1
1

ql + q2 + q3 + 1 qv3 + qvlqv2
-v2 + qvq v3
W2 + qvlqv3


-qv3 + qvlqv2

q22 + 1

qv~ + 1, 2"1,


-qv2

qv (B-1)

1











qv2 + qv qv3

-qvi + qv2qv3

qW3 +1

(B-2)


Multiplying qv on both sides of (B-2) gives


(13 q,)- v


qv3 (qv2 + qvlqv3) + qv2 (-qv3 + qvlqv2) + qvi (4q1 + 1)

qi (qv3 + qvlqv2) + qv3 (-qv1 + "1. :) + v2 (q2 + 1)

qv2 (qvi + qv2qv3) + qvi (-qv2 + qvlqv3) + q3 (qO, + 1)
qv1 + q2 + q3 +1


r
















APPENDIX C
OPEN-LOOP TRANSLATION ERROR SYSTEM

The translation error was defined as


e = e m- med,


where meT(t) and med(t) are the extended normalized coordinates of mri (t) and

mrdi (t), respectively, and were defined as
rT T
me = n( md = rdin(
Zi ri di Zrdi ri

Differentiating me(t) gives
1 /
me = ,L'mi, (C-1)
zi

where
xi
10 -^
z

LV = 0 1 -
zy.
Zi
00 1

The derivative of mi (t) is given by

.!
i
mi Y R' v, + C s (C 2)
.!
z,

where Vc (t) and wjc (t) are the linear and angular velocities of 7T with respect to ZR

expressed in F. Based on (C 1) and (C-2), ri2e(t) can be further written as

1I ,
me= L',R' (vc + jsi). (C 3)
zi,






142





From (C-3), the translation error system can be obtained as



S= -L', (vc + ~w s) ed,
zi

and it can be further written as


7*
z,* = iL'R (vc + ,Si) ziLed-
zi













APPENDIX D
PROPERTY ON MATRIX NORM
Property: I[]x 2 Ill VE C f3.
Proof: The spectral norm A | 2 of a real matrix is the square root of the
largest eigenvalue of the matrix multiplied by its transpose, i.e.,

A|2 \= max- {ATA}.

For any given vector R IR3,

0 3 2
[] = 0 -0 &
-2 1 0

The Euclidean norm of [(]x is given by


I x 2 max x L x

2+32 + -
212 + 13



= /62 + + 2.


The norm of ( is given by



Hence, I[]Xl = Il .














APPENDIX E
COMPUTATION OF DEPTH RATIOS

Based on (7-5) and (7-6), the following expression can be obtained:


p ai(R 1 xhn*T) pi,


(E-1)


where R(t) is defined in (7-8), and xh(t) E IR3 and n*(t) E IR3 are defined as


xh = Ax
d*


*T = n*TA-


Using the four corresponding reference feature points, the expression in (E-1) can

be written as


pi = aiGp*=-,: p


(E-2)


where g33 (t) E IR is the (assumed w.l.o.g.) positive third row third column element

of G(t), and G,(t) e IR3x3 is defined as G(t)/g33(t). Based on (E-2), twelve linear

equations can be obtained for the four corresponding reference feature points. A set

of twelve linear equations can then be developed to solve for ai(t)g33(t) and G,(t).

To determine the scalar g33 (t), the following equation can be used:


g33Gn = R + h*T


(E-3)


provided that R (t) is obtained using the four vanishing points, where

r11 r12 r13 9n11 9nl12 n13
R = r21 r22 23 Gn = n21 gn22 9n23

r31 r32 r33 n31 gn32 1

Xh = a4h h2 .h3 t= n1 n2 3 ]











To this end, let


X1 = Xh1hl X2 = Xhl2

3X4 = 1.'1 X = I _

X7 = X8 :'.I- a = X :'-*

X 2 3 3
Hi = [2 = '*
nl nl
St f r 1

then the following relationships can be obtained:


X2 = 1-iX

X3 = [2X1


X5 = 41X4

X6 = 12X4


I *

Y *
-- :x1'*











72 1X7
llX7.


The elements of the matrix g33(t)G,(t) in (E-3) can be rewritten as


r12 + 1ixi


r22 +/I1X4 = '1::'/ I I


r32 +/ 1X7 = ::'/ :


The expressions in (E-4) can be combined as


9nl2 /_i + X)
9n11
9 (Fi + x).
gnl


'I : : I 1


r11 + xl

F13 + [2X1



r2i + 34

r23 + [2x4



r31 + X7

r33 + p12X7


'I : :'/ I I




' : : '/






'I : 'I/ l


933-


(E4)






(E-5)


(E-6)


r12 + [i1X 1


r13 + [2iX1


(E-7)










Similarly, the expressions in (E-5) can be combined as

-n22 /-
r22 1+ t1X4 = n22 (21 + 4) (E8)
.n21
gn23
r23 + /2x4 = (r2 (21 + X4).
9n21

Since rj(t), gnij(t) Vi,j E {1, 2} and r3 (t),23 (t),ngi3(t) and gn23(t) are

known, (E-7) and (E-8) provide four equations that can be solved to determine the

four unknowns (i.e., 1, /2, x (t), x4(t)). After solving for these four unknowns,

(E-4) and (E-5) can be used to solve for g33 (t), and then G (t) can be obtained.

Given G (t), (E-2) can be used to solve for ai (t).














APPENDIX F
INEQUALITY DEVELOPMENT

Property: There exist two positive constants ( and ( such that the scaling

factor 7(t) satisfies the inequality

< -(t) < C,. (F 1)

Proof: Since

lAqvll'

the square of 7(t) is given by

2 q^q ^_
T2 T (F-2)
(Aq)T Aq, qTATAq,

Based on the fact that A is of full rank, the symmetric matrix ATA is positive

definite. Hence, the Rayleigh-Ritz theorem can be used to conclude that

AXin(A A)q q, < q ATAq, < Amax(ATA)qTq,, (F 3)

where Amin(ATA) and Amax(ATA) denote the minimal and maximal eigenvalues of

ATA, respectively. From (F-2) and (F-3), it can concluded that

1 < =2 qqv < 1
max(ATA) qTATAq, Amin(ATA)


X 1 1n
Amax(ATA) 7 Amin(ATA)


















APPENDIX G
LINEAR PARAMETERIZATION OF TRANSLATION ERROR SYSTEM

The regressor vectors Y1 (') e I1x20, 2 () t I1x15, (.) 1 XI 5 and the

constant unknown parameter vectors 01 E IR20xl, 2 E IR15xl, 3 IR5~ x are given by


1 = aOiUiVc3


UiWJcl i --.


- Wc3 c2 Ui


1 [ i a12 a13
1 =, ,? I*
Small Zi a11zi 11a
a23 1 a23 1
a11 22 a11 22 022 022
2 2
a12 a12 12a23 a12a
a11 11a 22 a1la22

2 = .''. -ai Vc3 C cl


-iVc2 -CiVc3 -c3 c2 -cl cl


-U J(Jc2 U, ... 2


WJ Wc2]


1 a23
a22 a22


-Ui(c2


a12 a13a23 a13
a11 a11a22 a11a22
1 a12 a12a23 a13a22
al1 a1l22 a1la22


a1322



cl -


--' ..'2cUi 2'-.-'2c ---'2c (2cUi

[ 1 a23 a23 a2
a22z a22z* a2 2


a12 012a23 013a22


2
alla22 a11a22


a12

O11022


Y3 IJ2cUi


03 = Zi


2
lla22


a13 a12a13

a11 al a22

-2' .... i v2~Jc


- _. ^2c


3 1 1
2 2 a1
2 a22 all


a12a13a23 a13a22

i a '

-Jc3 :Ui -W c3


a12
a11a22


a12a23 a13a22

S 11022


2 T-
a23 a12a23 a12a23 ~ I :"12'"1':
a11a22 a11a a2 11022


- t' (2c --. i cl ]


012a23 a13a22

O11022


I' I















REFERENCES


[1] S. Hutchinson, G. Hager, and P. Corke, "A tutorial on visual servo control,"
IEEE Trans. Robot. Automat., vol. 12, no. 5, pp. 651-670, 1996.

[2] F. Chaumette and S. Hutchinson, "Visual servo control part I: Basic ap-
proaches," IEEE Robot. Automat. Mag., vol. 13, no. 4, pp. 82-90, 2006.

[3] "Visual servo control part II: Advanced approaches," IEEE Robot.
Automat. Mag., vol. 14, no. 1, pp. 109-118, 2006.

[4] G. Hu, N. Gans, and W. E. Dixon, Coimple.ritf and Nonlinearity in Au-
tonomous Robotics, Encyclopedia of CoIple.rxitfi and System Science.
Springer, to appear 2008, ch. Adaptive Visual Servo Control.

[5] N. J. Cowan and D. Koditschek, "Planar image-based visual serving as a
navigation problem," in Proc. IEEE Int. Conf. Robot. Automat., 2000, pp.
1720-1725.

[6] N. J. Cowan, J. D. Weingarten, and D. E. Koditschek, "Visual serving via
navigation functions," IEEE Trans. Robot. Automat., vol. 18, pp. 521-533,
2002.

[7] Y. Mezouar and F. Chaumette, "Path planning for robust image-based
control," IEEE Trans. Robot. Automat., vol. 18, no. 4, pp. 534-549, 2002.

[8] E. Rimon and D. E. Koditschek, "Exact robot navigation using artificial
potential functions," IEEE Trans. Robot. Automat., vol. 8, pp. 501-518, 1992.

[9] A. Ruf, M. Tonko, R. Horaud, and H.-H. Nagel, "Visual tracking of an end-
effector by adaptive kinematic prediction," in Proc. IEEE/RSJ Int. Conf.
Intell. Robots Syst., 1997, pp. 893-898.

[10] J. Chen, D. M. Dawson, W. E. Dixon, and A. Behal, "Adaptive homography-
based visual servo tracking for a fixed camera configuration with a camera-in-
hand extension," IEEE Trans. Contr. Syst. Technol., vol. 13, no. 5, pp. 814
825, 2005.

[11] M. Spong and M. Vidyasagar, Robot Dynamics and Control. New York:
John Wiley & Sons Inc., 1989.

[12] 0. Faugeras, Three-Dimensional Computer Vision: A Geometric Viewpoint.
Cambridge, Massachusetts: MIT Press, 1993.









[13] R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision.
New York: NY: Cambridge University Press, 2000.

[14] G. Hu, W. E. Dixon, S. Gupta, and N. Fitz-coy, "A quaternion formulation
for homography-based visual servo control," in Proc. IEEE Int. Conf. Robot.
Automat., 2006, pp. 2391-2396.

[15] M. Shuster, "A survey of attitude representations," J. Astronautical Sciences,
vol. 41, no. 4, pp. 439-518, 1993.

[16] G. Hu, S. Mehta, N. Gans, and W. E. Dixon, "Daisy chaining based visual
servo control part I: Adaptive quaternion-based tracking control," in IEEE
Multi-Conf. Syst. Control, 2007, pp. 1474-1479.

[17] G. Hu, N. Gans, S. Mehta, and W. E. Dixon, "Daisy chaining based visual
servo control part II: Extensions, applications and open problems," in IEEE
Multi-Conf. Syst. Control, 2007, pp. 729-734.

[18] S. Mehta, W. E. Dixon, D. MacArthur, and C. D. Crane, "Visual servo
control of an unmanned ground vehicle via a moving airborne monocular
camera," in Proc. American Control Conf., 2006, pp. 5276-5211.

[19] S. Mehta, G. Hu, N. Gans, and W. E. Dixon, "Adaptive vision-based
collaborative tracking control of an ugv via a moving airborne camera: A
daisy chaining approach," in Proc. IEEE Conf. Decision Control, 2006, pp.
3867-3872.

[20] E. Malis, F. Chaumette, and S. Bodet, "2 1/2 D visual servingg" IEEE
Trans. Robot. Automat., vol. 15, no. 2, pp. 238-250, 1999.

[21] E. Malis and F. Chaumette, "Theoretical improvements in the stability
analysis of a new class of model-free visual serving methods," IEEE Trans.
Robot. Automat., vol. 18, no. 2, pp. 176-186, 2002.

[22] Y. Fang, W. E. Dixon, D. M. Dawson, and P. Chawda, "Homography-based
visual serving of wheeled mobile robots," IEEE Trans. Syst., Man, C.i, I -,
Part B: C;I., i,. vol. 35, no. 5, pp. 1041-1050, 2005.

[23] W. E. Dixon, A. Behal, D. M. Dawson, and S. Nagarkatti, Nonlinear Control
of Engineering Systems: A L!lIiniio'-Based Approach. Birkhauser Boston,
2003.

[24] G. Hu, S. Gupta, N. Fitz-coy, and W. E. Dixon, "Lyapunov-based visual
servo tracking control via a quaternion formulation," in Proc. IEEE Conf.
Decision Control, 2006, pp. 3861-3866.

[25] P. Corke and S. Hutchinson, "A new partitioned approach to image-based
visual servo control," IEEE Trans. Robot. Automat., vol. 17, no. 4, pp.
507-515, 2001.









[26] G. Chesi, K. Hashimoto, D. Prattichizzo, and A. Vicino, "Keeping features in
the field of view in eye-in-hand visual serving: A switching approach," IEEE
Trans. Robot., vol. 20, no. 5, pp. 908-913, 2004.

[27] Y. Mezouar and F. Chaumette, "Optimal camera trajectory with image based
control," Int. J. Robot. Res., vol. 22, no. 10, pp. 781-804, 2003.

[28] H. Zhang and J. Ostrowski, "Visual motion planning for mobile robots,"
IEEE Trans. Robot. Automat., vol. 18, no. 2, pp. 199-208, 2002.

[29] J. Chen, D. M. Dawson, W. E. Dixon, and V. Chitrakaran, "Navigation
function based visual servo control," Automatica, vol. 43, pp. 1165-1177,
2007, to appear.

[30] S. Benhimane and E. Malis, "Vision-based control with respect to planar
and nonplanar objects using a zooming camera," in Proc. IEEE Int. Conf.
Advanced Robotics, 2003, pp. 991-996.

[31] N. Garcka-Aracil, E. Malis, R. Aracil-Santonja, and C. Perez-Vidal, "Contin-
uous visual serving despite the changes of visibility in image features," IEEE
Trans. Robot., vol. 21, no. 6, pp. 1214-1220, 2005.

[32] E. Hecht and A. Zadac, Optics, 3rd ed. Addison-Wesley, 1997.

[33] C. Geyer and K. Daniilidis, "Catadioptric projective geometry," Int. J.
Computer Vision, vol. 45, no. 3, pp. 223-243, 2001.

[34] S. Baker and S. Nayar, "A theory of single-viewpoint catadioptric image
formation," Int. J. Computer Vision, vol. 35, no. 2, pp. 175-196, 1999.

[35] J. Barreto and H. Araujo, "Geometric properties of central catadioptric line
images," in Proc. European Conf. Computer Vision, 2002, pp. 237-251.

[36] C. Geyer and K. Daniilidis, "A unifying theory for central panoramic systems
and practical implications," in Proc. European Conf. Computer Vision, 2000,
pp. 445-461.

[37] R. Y. Tsai, "A versatile camera calibration technique for high-accuracy 3D
machine vision metrology using off-the-shelf TV cameras and lenses," IEEE
J. Robot. Automat., vol. 3, no. 4, pp. 323-344, 1987.

[38] R. Y. Tsai, Synopsis of recent progress on camera calibration for 3D machine
vision. Cambridge, MA, USA: MIT Press, 1989.

[39] L. Robert, "Camera calibration without feature extraction," Computer Vision
and Image Understanding, vol. 63, no. 2, pp. 314-325, 1996.

[40] J. Heikkilä and O. Silvén, "A four-step camera calibration procedure with
implicit image correction," in Proc. IEEE Int. Conf. Computer Vision
Pattern Recognition, 1997, pp. 1106-1112.

[41] T. A. Clarke and J. G. Fryer, "The development of camera calibration
methods and models," Photogrammetric Record, vol. 16, no. 91, pp. 51-66,
1998.

[42] P. F. Sturm and S. J. Maybank, "On plane-based camera calibration: A
general algorithm, singularities, applications," in Proc. IEEE Int. Conf.
Computer Vision Pattern Recognition, 1999, pp. 432-437.

[43] Z. Zhang, "Flexible camera calibration by viewing a plane from unknown
orientations," in Proc. IEEE Int. Conf. Computer Vision, 1999, pp. 666-673.

[44] L. E. Weiss, A. C. Sanderson, and C. P. Neuman, "Dynamic sensor-based
control of robots with visual feedback," IEEE J. Robot. and Automat., vol.
RA-3, no. 5, pp. 404-417, 1987.

[45] J. Feddema and O. Mitchell, "Vision-guided servoing with feature-based
trajectory generation," IEEE Trans. Robot. Automat., vol. 5, no. 5, pp.
691-700, 1989.

[46] K. Hashimoto, T. Kimoto, T. Ebine, and H. Kimura, "Manipulator control
with image-based visual servo," in Proc. IEEE Int. Conf. Robot. Automat.,
1991, pp. 2267-2272.

[47] B. Espiau, F. Chaumette, and P. Rives, "A new approach to visual servoing
in robotics," IEEE Trans. Robot. Automat., vol. 8, no. 3, pp. 313-326, 1992.

[48] F. Chaumette, "Potential problems of stability and convergence in image-
based and position-based visual servoing," in The Confluence of Vision and
Control, ser. LNCIS Series, D. Kriegman, G. Hager, and A. Morse, Eds.
Berlin, Germany: Springer-Verlag, 1998, vol. 237, pp. 66-78.

[49] B. Espiau, "Effect of camera calibration errors on visual servoing in robotics,"
in The 3rd Int. Symp. Experimental Robotics, 1993, pp. 182-192.

[50] W. J. Wilson, C. W. Hulls, and G. S. Bell, "Relative end-effector control us-
ing Cartesian position based visual servoing," IEEE Trans. Robot. Automat.,
vol. 12, no. 5, pp. 684-696, 1996.

[51] P. Martinet, J. Gallice, and D. Khadraoui, "Vision based control law using
3D visual features," in Proc. World Automation Congress, vol. 3, 1996, pp.
497-502.

[52] N. Daucher, M. Dhome, J. Lapresté, and G. Rives, "Speed command of a
robotic system by monocular pose estimate," in Proc. IEEE/RSJ Int. Conf.
Intell. Robots Syst., 1997, pp. 55-62.

[53] K. Hashimoto, "A review on vision-based control of robot manipulators,"
Advanced Robotics, vol. 17, no. 10, pp. 969-991, 2003.

[54] K. Deguchi, "Optimal motion control for image-based visual servoing by
decoupling translation and rotation," in Proc. IEEE/RSJ Int. Conf. Intell.
Robots Syst., 1998, pp. 705-711.

[55] E. Malis and F. Chaumette, "2 1/2 D visual servoing with respect to un-
known objects through a new estimation scheme of camera displacement,"
Int. J. Computer Vision, vol. 37, no. 1, pp. 79-97, 2000.

[56] F. Chaumette and E. Malis, "2 1/2 D visual servoing: a possible solution to
improve image-based and position-based visual servoings," in Proc. IEEE Int.
Conf. Robot. Automat., 2000, pp. 630-635.

[57] J. Chen, W. E. Dixon, D. M. Dawson, and M. McIntyre, "Homography-based
visual servo tracking control of a wheeled mobile robot," IEEE Trans. Robot.,
vol. 22, no. 2, pp. 406-415, 2006.

[58] N. Gans and S. Hutchinson, "Stable visual servoing through hybrid switched-
system control," IEEE Trans. Robot., vol. 23, no. 3, pp. 530-540, 2007.

[59] E. Malis, "Visual servoing invariant to changes in camera intrinsic parame-
ters," in Proc. IEEE Int. Conf. Computer Vision, 2001, pp. 704-709.

[60] Y. Y. Schechner and S. Nayar, "Generalized mosaicing: High dynamic range
in a wide field of view," Int. J. Computer Vision, vol. 53, no. 3, pp. 245-267,
2003.

[61] S. Hsu, H. S. Sawhney, and R. Kumar, "Automated mosaics via topology
inference," IEEE Computer Graphics and Applications, vol. 22, no. 2, pp.
44-54, 2002.

[62] M. Irani, P. Anandan, J. Bergen, R. Kumar, and S. Hsu, "Efficient represen-
tations of video sequences and their application," Signal Processing: Image
Communication, vol. 8, pp. 327-351, 1996.

[63] A. Smolic and T. Wiegand, "High-resolution image mosaicing," in Proc.
IEEE Int. Conf. Image Processing, 2001, pp. 872-875.

[64] R. Swaminathan and S. Nayar, "Non-metric calibration of wide-angle lenses
and polycameras," in Proc. IEEE Int. Conf. Computer Vision Pattern
Recognition, 2000, pp. 413-419.

[65] D. Burschka and G. Hager, "Vision-based control of mobile robots," in Proc.
IEEE Int. Conf. Robot. Automat., 2001, pp. 1707-1713.

[66] Y. Mezouar, H. H. Abdelkader, P. Martinet, and F. Chaumette, "Central
catadioptric visual servoing from 3D straight lines," in Proc. IEEE/RSJ Int.
Conf. Intell. Robots Syst., 2004, pp. 343-349.

[67] H. Hadj-Abdelkader, Y. Mezouar, N. Andreff, and P. Martinet, "2 1/2 D
visual servoing with central catadioptric cameras," in Proc. IEEE/RSJ Int.
Conf. Intell. Robots Syst., 2005, pp. 3572-3577.

[68] G. Mariottini, E. Alunno, J. Piazzi, and D. Prattichizzo, "Epipole-based
visual servoing with central catadioptric camera," in Proc. IEEE Int. Conf.
Robot. Automat., 2005, pp. 3516-3521.

[69] G. Mariottini, D. Prattichizzo, and G. Oriolo, "Image-based visual servoing
for nonholonomic mobile robots with central catadioptric camera," in Proc.
IEEE Int. Conf. Robot. Automat., 2006, pp. 538-544.

[70] S. Benhimane and E. Malis, "A new approach to vision-based robot control
with omni-directional cameras," in Proc. IEEE Int. Conf. Robot. Automat.,
2006, pp. 526-531.

[71] R. Tatsambon and F. Chaumette, "Visual servoing from spheres using a
spherical projection model," in Proc. IEEE Int. Conf. Robot. Automat., 2007,
pp. 2080-2085.

[72] R. Tatsambon and F. Chaumette, "Visual servoing from spheres with
paracatadioptric cameras," in Int. Conf. Advanced Robotics, August 2007.

[73] J. P. Barreto, F. Martin, and R. Horaud, "Visual servoing/tracking using
central catadioptric images," in Proc. Int. Symp. Experimental Robotics, 2002,
pp. 863-869.

[74] K. Hosoda and M. Asada, "Versatile visual servoing without knowledge of
true Jacobian," in Proc. IEEE/RSJ Int. Conf. Intell. Robots Syst., 1994, pp.
186-193.

[75] M. Jagersand, O. Fuentes, and R. Nelson, "Experimental evaluation of
uncalibrated visual servoing for precision manipulation," in Proc. IEEE Int.
Conf. Robot. Automat., 1997, pp. 2874-2880.

[76] M. Shahamiri and M. Jagersand, "Uncalibrated visual servoing using a biased
Newton method for on-line singularity detection and avoidance," in Proc.
IEEE/RSJ Int. Conf. Intell. Robots Syst., 2005, pp. 3953-3958.

[77] J. A. Piepmeier and H. Lipkin, "Uncalibrated eye-in-hand visual servoing,"
Int. J. Robot. Res., vol. 22, pp. 805-819, 2003.

[78] J. A. Piepmeier, G. V. McMurray, and H. Lipkin, "Uncalibrated dynamic
visual servoing," IEEE Trans. Robot. Automat., vol. 24, no. 3, pp. 143-147,
2004.

[79] R. Kelly, "Robust asymptotically stable visual servoing of planar manipula-
tor," IEEE Trans. Robot. Automat., vol. 12, no. 5, pp. 759-766, 1996.

[80] B. Bishop and M. W. Spong, "Adaptive calibration and control of 2D
monocular visual servo system," in Proc. IFAC Symp. Robot Control, 1997,
pp. 525-530.

[81] L. Hsu and P. L. S. Aquino, "Adaptive visual tracking with uncertain
manipulator dynamics and uncalibrated camera," in Proc. IEEE Conf.
Decision Control, 1999, pp. 1248-1253.

[82] C. J. Taylor and J. P. Ostrowski, "Robust vision-based pose control," in Proc.
IEEE Int. Conf. Robot. Automat., 2000, pp. 2734-2740.

[83] E. Zergeroglu, D. M. Dawson, M. de Queiroz, and A. Behal, "Vision-based
nonlinear tracking controllers in the presence of parametric uncertainty,"
IEEE/ASME Trans. Mechatr., vol. 6, no. 3, pp. 322-337, 2001.

[84] W. E. Dixon, D. M. Dawson, E. Zergeroglu, and A. Behal, "Adaptive
tracking control of a wheeled mobile robot via an uncalibrated camera
system," IEEE Trans. Syst., Man, CI1, ,, Part B: C:i11, vol. 31, no. 3,
pp. 341 352, 2001.

[85] A. Astolfi, L. Hsu, M. Netto, and R. Ortega, "Two solutions to the adaptive
visual servoing problem," IEEE Trans. Robot. Automat., vol. 18, no. 3, pp.
387-392, 2002.

[86] Y. Liu, H. Wang, C. Wang, and K. Lam, "Uncalibrated visual servoing of
robots using a depth-independent interaction matrix," IEEE Trans. Robot.,
vol. 22, no. 4, pp. 804-817, 2006.

[87] Y. Fang, W. E. Dixon, D. M. Dawson, and J. Chen, "An exponential class
of model-free visual servoing controllers in the presence of uncertain camera
calibration," Int. J. Robot. Automat., vol. 21, 2006.

[88] G. Hu, N. Gans, and W. E. Dixon, "Quaternion-based visual servo control in
the presence of camera calibration error," in IEEE Multi-Conf. Syst. Control,
2007, pp. 1492-1497.

[89] J. Chen, A. Behal, D. M. Dawson, and W. E. Dixon, "Adaptive visual
servoing in the presence of intrinsic calibration uncertainty," in Proc. IEEE
Conf. Decision Control, 2003, pp. 5396-5401.

[90] B. Boufama and R. Mohr, "Epipole and fundamental matrix estimation
using virtual parallax," in Proc. IEEE Int. Conf. Computer Vision, 1995, pp.
1030-1036.

[91] J. Shi and C. Tomasi, "Good features to track," in Proc. IEEE Int. Conf.
Computer Vision Pattern Recognition, 1994, pp. 593-600.


[92] C. Tomasi and T. Kanade, "Detection and tracking of point features,"
Carnegie Mellon University, Tech. Rep., 1991.

[93] O. Faugeras and F. Lustman, "Motion and structure from motion in a
piecewise planar environment," Int. J. Pattern Recognition and Artificial
Intelligence, vol. 2, no. 3, pp. 485-508, 1988.

[94] Z. Zhang and A. R. Hanson, "Scaled Euclidean 3D reconstruction based on
externally uncalibrated cameras," in IEEE Symp. Computer Vision, 1995, pp.
37-42.

[95] Y. Fang, A. Behal, W. E. Dixon, and D. M. Dawson, "Adaptive 2.5D visual
servoing of kinematically redundant robot manipulators," in Proc. IEEE
Conf. Decision Control, 2002, pp. 2860-2865.

[96] J.-J. E. Slotine and W. Li, Applied Nonlinear Control. Englewood Cliffs, NJ:
Prentice Hall, Inc., 1991.

[97] G. Bradski, "The OpenCV library," Dr. Dobb's Journal of Software Tools,
vol. 25, pp. 120, 122-125, 2000.

[98] M. Galassi, J. Davies, J. Theiler, B. Gough, G. Jungman, M. Booth, and
F. Rossi, GNU Scientific Library: Reference Manual, Network Theory Ltd.,
Bristol, UK, 2005.

[99] P. K. Allen, A. Timcenko, B. Yoshimi, and P. Michelman, "Automated
tracking and grasping of a moving object with a robotic hand-eye system,"
IEEE Trans. Robot. Automat., vol. 9, no. 2, pp. 152-165, 1993.

[100] G. D. Hager, W.-C. Chang, and A. S. Morse, "Robot hand-eye coordination
based on stereo vision," IEEE Contr. Syst. Mag., vol. 15, no. 1, pp. 30-39,
1995.

[101] S. Wijesoma, D. Wolfe, and R. Richards, "Eye-to-hand coordination for
vision-guided robot control applications," Int. J. Robot. Res., vol. 12, no. 1,
pp. 65-78, 1993.

[102] N. Gans, G. Hu, and W. E. Dixon, Complexity and Nonlinearity in Au-
tonomous Robotics, Encyclopedia of Complexity and System Science.
Springer, to appear 2008, ch. Image-Based State Estimation.

[103] A. Almansa, A. Desolneux, and S. Vamech, "Vanishing point detection
without any a priori information," IEEE Trans. Pattern Anal. Machine
Intell., vol. 25, no. 4, pp. 502-507, 2003.

BIOGRAPHICAL SKETCH

Guoqiang Hu received his bachelor's degree in automation engineering from the
University of Science and Technology of China (USTC) in 2002, and he received
his Master of Philosophy in Automation and Computer-aided Engineering from
The Chinese University of Hong Kong (CUHK) in 2004.

Currently, he is pursuing his Ph.D. degree in the nonlinear control and robotics
group at the University of Florida (UF) under the supervision of Dr. Warren
Dixon.


PAGE 37

23 diagonaltermsof ( ) canbeobtainedfrom(2)and(2)as 11=1 2 2 2+ 2 3 (2) 22=1 2 2 1+ 2 3 (2) 33=1 2 2 1+ 2 2 (2) Byutilizing(2)and(2)-(2),the followingexpressionscanbedeveloped: 2 0= 11+ 22+ 33+1 4 (2) 2 1= 11 22 33+1 4 2 2= 22 11 33+1 4 2 3= 33 11 22+1 4 where 0( ) isrestrictedtobenon-negativewithoutlossofgenerality(thisrestrictionenablestheminimumrotationtobeobtained).Asstatedin[15],thegreatest numericalaccuracyforcomputing 0( ) and ( ) isobtainedbyusingtheelement in(220)withthelargestvalueandthencomputingtheremainingtermsrespectively.Forexample,if 2 0( ) hasthemaximumvaluein(20)thenthegreatest numericalaccuracycanbeobtainedbycomputing 0( ) and ( ) as 0= r 11+ 22+ 33+1 4 1= 23 32 4 0 2= 31 13 4 0 3= 12 21 4 0 (2)

PAGE 38

24 Likewise,if 2 1( ) hasthemaximumvaluein(2)thenthegreatestnumerical accuracycanbeobtainedbycomputing 0( ) and ( ) as 0= 23 32 4 1 1= r 11 22 33+1 4 2= 12+ 21 4 1 3= 13+ 31 4 1 (2) wherethesignof 1( ) isselectedsothat 0( ) 0 .If 2 2( ) isthemaximum,then 0= 31 13 4 2 1= 12+ 21 4 2 2= r 22 11 33+1 4 3= 23+ 32 4 2 (2) orif 2 3( ) isthemaximum,then 0= 12 21 4 3 1= 13+ 31 4 3 2= 23+ 32 4 3 3= r 33 11 22+1 4 (2) wherethesignof 2( ) or 3( ) isselectedsothat 0( ) 0 Theexpressionsin(21)-(24)indicatethatgiventherotationmatrix ( ) fromthehomographydecomposition,theunitquaternionvectorcanbedetermined thatrepresentstherotationwithoutintroducingasingularity.Theexpressionsin (21)-(24)willbeutilizedinthesubsequentcontroldevelopmentandstability analysis.

PAGE 39

CHAPTER3 LYAPUNOV-BASEDVISUALSERVOTRACKINGCONTROLVIAA QUATERNIONFORMULATION 3.1Introduction Previousvisualservocontrollerstypicallyonlyaddresstheregulationproblem. Motivatedbytheneedfornewadvancementstomeetvisualservotrackingapplications,previousresearchhasconcentratedondevelopingdi erenttypesofpath planningtechniques[5].Recently,Chenetal.[10]developedanewformulation ofthetrackingcontrolproblem.Thehomography-basedadaptivevisualservo controllerin[10]isdevelopedtoenableanactuatedobjecttotrackaprerecorded time-varyingdesiredtrajectorydeterminedbyasequenceofimages,wherethe Eulerangle-axisrepresentationisusedtorepresenttherotationerrorsystem.Due tothecomputationalsingularitylimitationoftheangleaxisextractionalgorithm (seeSpongandVidyasagar[11]),rotationanglesof werenotconsidered. ThischapterconsidersthepreviouslyunexaminedproblemofsixDOFvisual servotrackingcontrolwithanonsingular rotationparameterization.Ahomography isconstructedfromimagepairsanddecomposedviatextbookmethods(e.g., Faugeras[12]andHartleyandZisserman[1 3])toobtaintherotationmatrix.Once therotationmatrixhasbeendetermined,thecorrespondingunitquaternioncan bedeterminedfromgloballynonsingularandnumericallyrobustalgorithms(e.g., Huetal.[14]andShuster[15]).Anerrorsystemisconstructedintermsofthe unitquaternion.Anadaptivecontrolleristhendevelopedandproventoenable acamera(attachedtoarigid-bodyobject)totrackadesiredtrajectorythatis determinedfromasequenceofimages.Theseimagescanbetakenonlineoro ine byacamera.Forexample,asequenceofimagesofthereferenceobjectcanbe 25

PAGE 40

26 prerecordedasthecameramoves(acamera-in-handcon guration),andthese imagescanbeusedasadesiredtrajectoryinalaterreal-timetrackingcontrol. Thecameraisattachedtoarigid-bodyobject(e.g.,theend-e ectorofarobot manipulator,asatellite,anautonomousvehicle,etc.)thatcanbeidenti edbya planarpatchoffeaturepoints.Thecontrollercontainsanadaptivefeedforward termtocompensatefortheunknowndistancefromthecameratotheobserved features.Aquaternion-basedLyapunovfunctionisdevelopedtofacilitatethe controldesignandthestabilityanalysis. Theremainderofthischapterisorganizedasfollows.InSection3.2,the controlobjectiveisformulatedintermsofunitquaternionrepresentation.In Section3.3,thecontrollerisdeveloped,andclosed-loopstabilityanalysisisgiven basedonLyapunov-basedmethods.InSection3.4,thecontroldevelopmentis extendedtothecamera-to-handcon guration.InSections3.5and3.6,Matlab simulationsandtrackingexperimentsthatwereperformedinavirtual-reality test-bedforunmannedsystemsattheUniversityofFloridaareusedtoshowthe performanceoftheproposedvisualservotrackingcontroller. 3.2ControlObjective Thecontrolobjectiveisforacameratotrackadesiredtrajectorythatis determinedbyasequenceofimages.Thisobjectiveisbasedontheassumption thatthelinearandangularvelocitiesofthecameraarecontrolinputsthatcanbe independentlycontrolled(i.e.,unconstrainedmotion)andthatthecameraiscalibrated(i.e., isknown).Thesignalsin(25)aretheonlyrequiredmeasurements todevelopthecontroller. Oneoftheoutcomesofthehomographydecompositionistherotationmatrices ( ) and ( ) .Fromtheserotationmatrices,severaldi erentrepresentationscan beutilizedtodeveloptheerrorsystem.In previoushomography-basedvisualservo controlliterature,theEulerangle-axisrepresentationhasbeenusedtodescribethe

PAGE 41

27 rotationmatrix.Inthischapter,theunitquaternionparameterizationwillbeused todescribetherotationmatrix.Thisparameterizationfacilitatesthesubsequent problemformulation,controldevelopment,andstabilityanalysissincetheunit quaternionprovidesaglobalnonsingularparameterizationofthecorresponding rotationmatrices.Section2.3providesbackground,de nitionsanddevelopment relatedtotheunitquaternion. Giventherotationmatrices ( ) and ( ) ,thecorrespondingunitquaternions ( ) and ( ) canbecalculatedbyusingthenumericallyrobustmethod (see[14]and[15])basedonthecorrespondingrelationships ( )= 2 0 3+2 2 0 (31) ( )= 2 0 3+2 2 0 (32) where 3isthe 3 3 identitymatrix,andthenotation ( ) denotestheskewsymmetricformofthevector ( ) asin(2). Toquantifytheerrorbetweentheactualanddesiredcameraorientations,the mismatchbetweenrotationmatrices ( ) and ( ) isde nedas = (33) Basedon(31)-(3), = 2 0 3+2 2 0 (34) wheretheerrorquaternion ( 0( ) ( ))isde nedas 0= 00 + (35) = 0 0+

PAGE 42

28 Thede nitionof 0( ) and ( ) in(35)makes ( 0( ) ( ))aunitquaternionbasedonthefactthat ( ) and ( ) aretwounitquaternions(seeAppendix A). Thetranslationerror,denotedby ( ) R3,isde nedas = (36) where ( ) ( ) R3arede nedas = ln( ) = ln( ) (37) where { 1 4 } IntheEuclidean-space(seeFigure2),thetrackingobjectivecanbequantiedas ( ) 3as (38) and k ( ) k 0 as (39) Since ( ) isaunitquaternion,(34),(3)and(38)canbeusedtoquantifythe rotationtrackingobjectiveasthedesiretoregulate ( ) as k ( ) k 0 as (3) Thesubsequentsectionwilltargetthecontroldevelopmentbasedontheobjectives in(39)and(30). 3.3ControlDevelopment 3.3.1Open-LoopErrorSystem Theactualangularvelocityofthecameraexpressedin F isde nedas ( ) R3,thedesiredangularvelocityofthecameraexpressedin Fisde nedas ( ) R3,andtherelativeangularvelocityofthecamerawithrespectto F

PAGE 43

29 expressedin F isde nedas ( ) R3where = (3) Thecameraangularvelocitiescanberelatedtothetimederivativesof ( ) and ( ) as[23] 0 = 1 2 03+ (3) and 0 = 1 2 0 3+ (3) respectively. AsstatedinRemark3inChenetal.[10],asu cientlysmoothfunctioncan beusedto tthesequenceoffeaturepointstogeneratethedesiredtrajectory ( ) ;hence,itisassumedthat ( ) and ( ) areboundedfunctionsoftime. Inpractice,theaprioridevelopedsmoothfunctions ( ) ( ) ,and ( ) can beconstructedasboundedfunctionswithboundedtimederivatives.Basedon theassumptionthat ( ) isabounded rstorderdi erentiablefunctionwitha boundedderivative,thealgorithmforcomputingquaternionsin[14]canbeusedto concludethat 0 ( ) ( ) arebounded rstorderdi erentiablefunctionswith aboundedderivative;hence, 0 ( ) ( ) and 0 ( ) ( ) arebounded.In thesubsequenttrackingcontroldevelopment,thedesiredsignals ( ) and ( ) willbeusedasfeedforwardcontrolterms.Toavoidthecomputationalsingularity in ( ) ,thedesiredtrajectoryin[10]wasgeneratedbycarefullychoosingthe smoothfunctionsuchthattheworkspaceislimitedto ( ) .Unlike[10],theuse ofthequaternionalleviatestherestrictiononthedesiredtrajectory ( ) From(3),thesignal ( ) canbecalculatedas =2( 0 0 ) 2 (3)

PAGE 44

30 where 0 ( ) ( ) 0 ( ) ( ) arebounded,so ( ) isalsobounded. Basedon(34),(3),(312)and(313),theopen-looprotationerrorsystemcan bedevelopedas = 1 2 03+ (3) where ( )=( 0( ) ( )). Byusing(2),(2),(3),(3),andthefactthat[95] = + (3) where ( ) R3denotestheactuallinearvelocityofthecameraexpressedin F theopen-looptranslationerrorsystemcanbederivedas[10] = +( ) (3) where ( ) R3 3arede nedas = 00 000 0000 10 01 001 (3) Theauxiliaryterm ( ) isaninvertibleuppertriangularmatrix. 3.3.2Closed-LoopErrorSystem Basedontheopen-looprotationerrorsystemin(315)andthesubsequent Lyapunov-basedstabilityanalysis,theangularvelocitycontrollerisdesignedas = ( 3+ ) 1 + = + (3) where R3 3denotesadiagonalmatrixofpositiveconstantcontrolgains.See AppendixBfortheproofthat ( 3+ ) 1 = .Basedon(3),(3)and

PAGE 45

31 (39),therotationclosed-loope rrorsystemcanbedeterminedas 0= 1 2 (3) = 1 2 03+ Thecontributionofthischapteristhedevelopmentofthequaternion-based rotationtrackingcontroller.Severalothe rhomography-basedtranslationcontrollers couldbecombinedwiththedevelopedrotationcontroller.Forcompleteness,the followingdevelopmentillustrateshowthetranslationcontrollerandadaptive updatelawin[10]canbeusedtocompletethesixDOFtrackingresult. Basedon(317),thetranslationcontrolinput ( ) isdesignedas = 1 1 + (3) where R3 3denotesadiagonalmatrixofpositiveconstantcontrolgains.In (31),theparameterestimate ( ) R fortheunknownconstant isde nedas = (3) where R denotesapositiveconstantadapta tiongain.Thecontrollerin(3) doesnotexhibitasingularitysince ( ) isinvertibleand ( ) 0 .From(3) and(31),thetranslationclosed-looperrorsystemcanbelistedas = +( ) (3) where ( ) R denotesthefollowingpara meterestimationerror: = (3)

PAGE 46

32 3.3.3StabilityAnalysis Theorem3.1 :Thecontrollergivenin(3)and(3),alongwiththe adaptiveupdatelawin(3)ensuresglobalasymptotictrackinginthesensethat k ( ) k 0 k ( ) k 0 (3) Proof :Let ( ) R denotethefollowingdi erentiablenon-negativefunction (i.e.,aLyapunovcandidate): = +(1 0)2+ 2 + 1 2 2 (3) Thetime-derivativeof ( ) canbedeterminedas =2 +2(1 0)( 0)+ + 1 = 03+ (1 0) + ( +( ) ) = 03+ +(1 0) 3 = (3) where(320)and(322)-(324)wereutilized.Itcanbeseenfrom(37)that ( ) isnegativesemi-de nite. Basedon(3)and(3), ( ) ( ) 0( ) ( ) Land ( ) ( ) L2. Since ( ) L,itisclearfrom(3)that ( ) L.Basedonthefactthat ( ) L,(2),(2),(3)and(3)canbeusedtoprovethat ( ) L. Since ( ) L,(3)impliesthat ( ) 1 ( ) L.Basedonthefactthat ( ) 0( ) L,(3)canbeusedtoprovethat ( ) L.Since ( ) Land ( ) isaboundedfunction,(311)canbeusedtoconcludethat ( ) L. Since ( ) ( ) ( ) ( ) ( ) 1 ( ) Land ( ) isassumedtobe

PAGE 47

33 bounded,(3)and(3)canbeutilizedtoprovethat ( ) L.Fromthe previousresults,(3)-(3 17)canbeusedtoprovethat ( ) ( ) L.Since ( ) ( ) L L2,and ( ) ( ) L,BarbalatsLemma[96]canbeusedto concludetheresultgivenin(3). 3.4Camera-To-HandExtension ( xf, R )( xf* R )( xfd, Rd)d* n* sisisimdimi mi Fixed camerapdp*p ( xf, R )( xf* R )( xfd, Rd)d* n* sisisidi mi mi i i Fixed camerapdp*p F F I F d ( xf, R )( xf* R )( xfd, Rd)d* n* sisisimdimi mi mi Fixed camerapdp*p ( xf, R )( xf* R )( xfd, Rd)d* n* sisisidi mi mi i i Fixed camerapdp*p F F I F d Figure3:Coordinateframerelationshipsbetweena xedcameraandtheplanes de nedbythecurrent,desired,andreferencefeaturepoints(i.e., ,and ). 3.4.1ModelDevelopment Forthe xedcameraproblem,considerthe xedplane thatisde nedbya referenceimageoftheobject.Inaddition,considertheactualanddesiredmotion oftheplanes and (seeFigure3).Todeveloparelationshipbetweenthe planes,aninertialcoordinatesystem,denotedbyI,isde nedwheretheorigin coincideswiththecenterofa xedcamera.TheEuclideancoordinatesofthe

PAGE 48

34 featurepointson and canbeexpressedintermsofI,respectively,as ( ) ( ) ( ) ( ) (3) ( ) ( ) ( ) ( ) underthestandardassumptionthatthedistancesfromtheoriginofItothe featurepointsremainspositive(i.e., ( ) ( ) where denotesan arbitrarilysmallpositiveconstant).OrthogonalcoordinatesystemsF,F,andFareattachedtotheplanes ,and ,respectively.Torelatethecoordinate systems,let ( ) ( ) (3) denotetheorientationsofF,FandFwithrespecttoI,respectively,andlet ( ) ( )R3denotetherespective translationvectorsexpressedinI.AsalsoillustratedinFigure31, R3denotestheconstantunitnormaltotheplane expressedinI,and R3denotestheconstantcoordinatesofthe featurepointexpressedinthe correspondingcoordinateframesF,F,andF. FromthegeometrybetweenthecoordinateframesdepictedinFigure3,the followingrelationshipscanbedeveloped = + = + (3) where ( ) ( ) (3) and ( ) ( )R3denotenewrotationand translationvariables,respectively,de nedas = ( ) = ( ) = = (3) Similartothecamera-in-handcon guration,therelationshipsin(3)canbe expressedas = + = +

PAGE 49

35 Therotationmatrices ( ) ( ) andthedepthratios ( ) and ( ) can beobtainedasdescribedinSection2.2.Theconstantrotationmatrix can beobtainedapriorusingvariousmethods(e.g.,asecondcamera,Euclidean measurements)[10].Basedon(3), ( ) and ( ) canbedetermined. TheorientationsofFwithrespecttoFandFcanbeexpressedas ( ) and ( ) respectively.Toquantifytheerrorbetweentheactualanddesired planeorientations,themismatchbetweenrotationmatrices ( ) and ( ) isde nedas = = (3) Similartothedevelopmentforthecamera-in-handcon guration,thefollowing expressioncanbeobtained: = 2 0 3+2 2 0 (3) wheretheerrorquaternion ( 0( ) ( ))isde nedas 0= 00 + (3) = 0 0+ where ( 0( ) ( ))and ( 0 ( ) ( ))areunitquaternionscomputedfromthe rotationmatrices ( ) and ( ) followingthemethodgivenin[14]. 3.4.2ControlFormulation Byexpressingthetranslationvectors,angularvelocityandlinearvelocityin thebodyxedcoordinateframeF,andbyde ningrotationmatrices ( ) and ( ) ,thecontrolobjectiveinthissectioncanbeformulatedinthesame mannerasdoneforthecamera-in-handcon gurationproblem.So,therotation andtranslationerrorscanbedescribedthesameasthoseforthecamera-in-hand con gurationproblem.TheactualangularvelocityoftheobjectexpressedinFis

PAGE 50

36 de nedas ( )R3,thedesiredangularvelocityoftheobjectexpressedinFis de nedas ( )R3,andtherelativeangularvelocityoftheobjectwithrespect toFexpressedinFisde nedas ( )R3where = (3) where ( ) hasthesameformas ( ) in(3).Theopen-looprotationerror systemis = 1 2 03+ (3) andthetranslationerrorsystemis[10] = + (3) where ( )R3denotesthelinearvelocityoftheobjectexpressedinF. Basedontheopen-looperrorsystems(3)and(3),andthesubsequent stabilityanalysis,theangularandlinearcameravelocitycontrolinputsforthe objectarede nedas = + (3) =1 1 ( ) .(3) In(3)and(3), R3 3denotediagonalmatricesofpositiveconstant controlgains,theparameterestimates ( )R ( )R3fortheunknown constants ( ) and ( ) aregeneratedaccordingtothefollowingadaptiveupdate laws = (3) = (3) where R3 3denotesapositiveconstantdiagonaladaptationgainmatrix.

PAGE 51

37 From(3)and(3),therotationclosed-looperrorsystemcanbedeterminedas 0= 1 2 (3) =1 2 03+ Basedon(336)and(338),thetranslationclosed-looperrorsystemisgivenas = + (3) wheretheparameterestimationerrorsignals ( )R and ( )R3arede ned as = = .(3) Theorem3.2 :Thecontrollergivenin(3)and(3),alongwiththe adaptiveupdatelawsin(3)and(3)ensureglobalasymptotictrackinginthe sensethatk ( )k 0 k ( )k 0 (3) Proof :Let ( )R denotethefollowingdi erentiablenon-negativede nite function(i.e.,aLyapunovcandidate): = +(1 0)2+ 2 + 1 2 2 + 1 2 1 (3)

PAGE 52

38 Thetime-derivativeof ( ) canbedeterminedas =2 +2(1 0)( 0)+ + 1 + 1 = 03+ (1 0) + ( + )+ + = 03+ +(1 0) 3 + + = + + = + + = + + = (3) where(3)-(3)wereu tilized.From(3)and(3),signalchasingargumentsdescribedintheprevioussectioncanbeusedtoconcludethatthecontrol inputsandalltheclosed-loopsignalsarebounded.BarbalatsLemma[96]canthen beusedtoprovetheresultgivenin(3). 3.5SimulationResults Anumericalsimulationwasperformedtoillustratetheperformanceofthe trackingcontrollergivenin(3),(3),andtheadaptiveupdatelawin(3). Inthissimulation,thedevelopedtrackingcontrolleraimsenablethecontrolobject totrackthedesiredtrajectoryencodedbyasequenceofimagesandrotatemore than 360(seeFig.3).

PAGE 53

39 Thecameraisassumedtoviewanobjectwithfourcoplanarfeaturepoints withthefollowingEuclideancoordinates(in[m]): 1= 0 150 150 2= 0 150 150 (3) 3= 0 150 150 4= 0 150 150 Thetime-varyingdesiredimagetrajectorywasgeneratedbythekinematicsofthe featurepointplanewherethedesiredlinearandangularvelocitieswereselectedas = 0 1sin( )0 1sin( )0 [ ] = 001 5 [ ] Theinitialanddesiredimage-spacecoordinateswerearti ciallygenerated.For thisexample,consideranorthogonalcoordinateframeIwiththe -axisopposite to (seeFigure2)withthe -axisand -axisontheplane .Therotation matrices 1betweenFandI,and 2betweenFandIweresetas 1= (120) (20) (80) (3) 2= (160) (30) (30) (3) where () () and () (3) denoterotationofangle (degrees) alongthe -axis, -axisand -axis,respectively.Thetranslationvectors 1and 2betweenFandI(expressedinF)andbetweenFandI(expressedinF), respectively,wereselectedas 1= 0 50 54 0 (3) 2= 1 01 04 5 (3)

PAGE 54

40 Theinitialrotationmatrix 3andtranslationvector 3betweenFandIwere setas 3= (240) (90) (30) 3= 0 515 0 (3) Theinitial(i.e., (0) )andreference(i.e., )image-spacecoordinatesofthefour featurepointsin(3)werecomputedas(inpixels) 1(0)= 907 91716 041 2(0)= 791 93728 951 3(0)= 762 84694 881 4(0)= 871 02683 251 1= 985 70792 701 2= 1043 4881 201 3= 980 90921 901 4= 922 00829 001 Theinitial(i.e., (0) )image-spacecoordinatesofthefourfeaturepointsin (37)forgeneratingthedesiredtrajectorywerecomputedas(inpixels) 1(0)= 824 61853 911 2(0)= 770 36878 491 3(0)= 766 59790 501 4(0)= 819 03762 691 Thecontrolgains in(3)and in(3)andadaptationgain in(3) wereselectedas = diag{3 3 3}= diag{15 15 15} =0 0002 Thedesiredandcurrentimage-spacetrajectoriesofthefeaturepointplaneare showninFigure3andFigure3,respectively.Thefeaturepointplanerotates morethan 360degreesasshowninthesetwo gures.Theresultingtranslation

PAGE 55

41 androtationerrorsareplottedinFigure3andFigure3,respectively.The errorsgotozeroasymptotically.Thed esiredimage-spacetrajectory(i.e., ( )) andthecurrentimage-spacetrajectory(i.e., ( )) areshowninFigure3and Figure30,respectively.Thetrackingerrorbetweenthecurrentanddesired image-spacetrajectoryisshowninFigure 3.TheFigures3-3showthat thecurrenttrajectorytracksthedesiredtrajectoryasymptotically.Thetranslation androtationcontrolinputsareshowninFi gure3andFigure3,respectively. Theparameterestimatefor 1isshowninFigure3. 3.6ExperimentResults Simulationsveri edtheperformanceofthetrackingcontrollergivenin(319), (31),andtheadaptiveupdatelawin(322).Experimentswerethenperformed totestrobustnessandperformanceinthepresenceofsignalnoise,measurement error,calibrationerror,etc.Theexperimentswereperformedinatest-bedatthe UniversityofFloridaforsimulation,designandimplementationofvision-based controlsystems.Forthetest-bed,a3Denvironmentcanbeprojectedontolarge monitorsorscreensandviewedbyaphysicalcamera.Communicationbetweenthe cameraandcontrolprocessingcomputersandtheenvironmentrenderingcomputers allowsclosed-loopcontrolofthevirtualscene. 3.6.1ExperimentCon gurations Ablockdiagramdescribingtheexperimentaltest-bedisprovidedinFigure 32.Thetest-bedisbasedonavirtualenvironmentgeneratedbyavirtualreality simulator,composedof veworkstationsandadatabaseserverrunningvirtual realitysoftware.Thisallowsmultipleinstancesofthevirtualenvironmenttorunat thesametime.Inthisway,cameraviewscanberigidlyconnectedinamosaicfor alargeFOV.Alternately,multiple,independentcameraviewscanpursuetheirown tasks,suchascoordinatedcontrolofmultiplevehicles.Thevirtualrealitysimulator

PAGE 56

42 Figure3:Blockdiagramoftheexperiment. Figure3:TheSonyXCD-710CRcolor rewirecamerapointedatthevirtual environment.

PAGE 57

43 intheexperimentiscurrentlycapableofdisplayingthreesimultaneousdisplays.A pictureofthedisplayscanbeseeninFigure33. ThevirtualrealitysimulatorutilizesMultiGen-ParadigmsVegaPrime,an OpenGL-based,commercialsoftwarepackageforMicrosoftWindows.Thevirtual environmentintheexperimentisarecreationoftheU.S.Armysurbanwarfare traininggroundatFortBenning,Georgia.Theenvironmenthasadensepolygon count,detailedtextures,highframerate,andthee ectsofsoftshadows,resulting inveryrealisticimages.AscenefromtheFortBenningenvironmentcanbeseenin Figure3. ThevisualsensorintheexperimentisaSonyXCD-710CRcolor rewire camerawitharesolutionof 1280768 pixelsand ttedwitha12.5mmlens.The cameracapturestheimagesonthelargescreensasshowninFigure33.The imagesareprocessedinavisionprocessingworkstation.Anapplicationwritten inC++acquiresimagesfromthecameraandprocesstheimagestolocateand trackthefeaturepoints(theinitialfeaturepointswerechosenmanually,then theapplicationwillidentifyandtrackthefeaturepointsonitsown).TheC++ applicationgeneratesthecurrentanddesiredpixelcoordinates,whichcanbeused toformulatethecontrolcommand. Inadditiontotheimageprocessingapplication,acontrolcommandgeneration applicationprogrammedinMatlabalsorunsinthisworkstation.TheMatlabapplicationcommunicatesdatawiththeimageprocessingapplication(writteninC++) viasharedmemorybu ers.TheMatlabapplicationreadsthecurrentanddesired pixelcoordinatesfromthesharedmemorybu ers,andwriteslinearandangular cameravelocityinputintothesharedmemorybu ers.TheC++applicationwrites thecurrentanddesiredpixelcoordinatesintothesharedmemory,andreadscameravelocityinputfromthesharedmemorybu er.Thelinearandangularcamera velocitycontrolinputaresentfromthevisionprocessingworkstationtothevirtual

PAGE 58

44 realitysimulatorviaaTCPsocketconnection.Thisdevelopmentmakesextensive useofIntelsOpenSourceComputerVision(OpenCV)Library(seeBradski[97]) andtheGNUScienti cLibrary(GSL)(seeGalassietal.[98]). Figure3:Virtualrealityenvironmentexmaple:avirtualrecreationoftheUS ArmysurbanwarfaretraininggroundatFortBenning. Algorithms,suchastheHomographydecomposition,areimplementedasif thevirtualenvironmentisatrue3Dscenewhichthephysicalcameraisviewing. Ofcourse,thecameradoesnotlookatthe3Dscenedirectly.Thecameraviews consistofa3Dscenethatareprojectedontoa2Dplane,thenprojectedonto theimageplane.Thatis,theprojectivehomographyneededforcontrolexists betweenthe on-screen currentimageandthe on-screen goalimage,butwhat aregivenarethecameraviewsofthe on-screen images.Thus,thereexistsan additionaltransformactionbetweenthepointsonthescreenandthepointsinthe cameraimage.Ascreen-cameracalibrationmatrix canbeusedtodescribethis transformationrelationship. Everypointonthescreencorrespondstoonlyonepointintheimage.Thus, theconstantmatrix isahomographyandcanbedeterminedthrougha calibrationprocedure,ande ectivelyreplacesthestandardcalibrationofthe

PAGE 59

45 physicalcamera.Intheexperiment,thismatrixisdeterminedtobe = 0 91410 003990 90650 03750 935850 7003 001 Inadditionto ,thecameracalibrationmatrix ,correspondingtothevirtual camerawithinVegaPrime,isstillrequired.Thismatrixcanbedeterminedfrom thesettingsofthevirtualrealityprogram.Inthisexperiment, wasdeterminedto be = 1545 10640 01545 1512 001 3.6.2ExperimentforTracking Thedesiredtrajectoryisinaformatof aprerecordedvideo(asequenceof images).AstheviewpointintheVegaPrimemoves(whichcanbeimplemented bysomechosenvelocityfunctionsormanually),theimagescapturedbythe camerachanges.Thepixelcoordinatesofthefeaturespointsontheimageswill berecordedasthedesiredtrajectory.Thecontrolobjectiveinthistracking experimentistosendcontrolcommandtothevirtualrealitysimulatorsuch thatthecurrentposeofthefeaturepointstracksthedesiredpose.Theconstant referenceimagewastakenasthe rstimageinthesequence. Thecontrolgains in(3)and in(3),andadaptationgain in (32)wereselectedas = {0 1 0 1 1 5}= {0 5 0 5 0 5} =0 005

PAGE 60

46 Duringtheexperiment,theimagesfromthecameraareprocessedwithaframe rateofapproximately20frames/second. TheresultingtranslationandrotationerrorsareplottedinFigure315and Figure316.Thedesiredimage-spacetrajectory(i.e., ( )) isshowninFigure 37,andthecurrentimage-spacetrajectory(i.e., ( )) isshowninFigure318. Thetrackingerrorbetweenthecurrentanddesiredimage-spacetrajectoriesis showninFigure3.Thetranslationandrotationcontroloutputsareshownin Figure3andFigure3,respectively.Theparameterestimatefor 1isshown inFigure3. Inthetrackingcontrol,thesteady-statetrackingerrorisapproximately 15 [pixel] 10 [pixel] 0 01 .Thissteady-stateerroriscausedbytheimage noiseandthecameracalibrationerrorinthetest-bed.To ndthetracking controlerror,twohomographiesarecomputedbetweenthereferenceimageand thecurrentimageanddesiredimage,respectively.Duetotheimagenoiseand cameracalibrationerror,theerrorisinsertedtothetwohomographies.Thenthe trackingerrorobtainedfromthemismatchbetweenthetwohomographieswill havelargererror.Also,theimagenoiseandcalibrationerrorinsertserrorintothe derivativeofthedesiredpixelcoordinates,whichisusedasafeedforwardtermin thetrackingcontroller.Furthermore,communicationbetweenthecontrollerand virtualrealitysystemoccursviaaTCPsocket,introducingsomeamountoflatency intothesystem.Notethatthispixelerrorrepresentslessthan1.5%oftheimage dimensions. Inthefollowingregulationexperiment,thederivativeofthedesiredpixel coordinatesisequaltozero,andonlyonehomographyiscomputedbetween thecurrentimageanddesiredsetimage.Thein uenceoftheimagenoiseand calibrationerrorisweakenedgreatly.

PAGE 61

47 3.6.3ExperimentforRegulation Whenthedesiredposeisaconstant,thetrackingproblembecomesaregulationproblem.Thecontrolobjectiveinthisregulationexperimentistosendcontrol commandtothevirtualrealitysimulatorsuchthatthecurrentposeofthefeatures pointsisregulatedtothedesiredsetpose. Intheexperiment,thecontrolgains in(3)and in(3),and adaptationgain in(3)wereselectedas = {0 4 0 4 0 9}= {0 5 0 25 0 25} =0 005 TheresultingtranslationandrotationerrorsareplottedinFigure323and Figure34,respectively.Theerrorsgotozeroasymptotically.Thecurrent image-spacetrajectory(i.e., ( )) isshowninFigure3.Theregulationerror betweenthecurrentanddesiredsetimage-spaceposeisshowninFigure36.The translationandrotationcontroloutputsareshowninFigure3andFigure3, respectively.Theparameterestimatefor 1isshowninFigure329.

PAGE 62

48 0 500 1000 1500 0 500 1000 0 2 4 6 8 10 udi [pixel] vdi [pixel] Time [sec]Figure35:Desiredimage-spacecoordinatesofthefourfeaturepoints(i.e., ( ) ) inthetrackingMatlabsimulationshownina3Dgraph.Inthe gure,Odenotes theinitialimage-spacepositionsofthe4featurepointsinthedesiredtrajectory, and*denotesthecorresponding nalpositionsofthefeaturepoints. 0 500 1000 1500 0 500 1000 0 2 4 6 8 10 ui [pixel] vi [pixel] Time [sec]Figure3:Currentimage-spacecoordinatesofthefourfeaturepoints(i.e., ( ) ) inthetrackingMatlabsimulationshownina3Dgraph.Inthe gure,Odenotestheinitialimage-spacepositionsofthe4featurepoints,and*denotesthe corresponding nalpositionsofthefeaturepoints.

PAGE 63

49 0 2 4 6 8 10 100 0 100 200 e1 [pixel] 0 2 4 6 8 10 100 0 100 200 e2 [pixel] 0 2 4 6 8 10 0.5 0 0.5 Time [sec]e3Figure3:Translationerror ( ) inthetrackingMatlabsimulation. 0 2 4 6 8 10 0.5 1 1.5 ~ q0 0 2 4 6 8 10 0.5 0 0.5 ~ qv1 0 2 4 6 8 10 0.5 0 0.5 ~ qv2 0 2 4 6 8 10 0.5 0 0.5 Time [sec]~ qv3Figure3:Rotationquaternionerror ( ) inthetrackingMatlabsimulation.

PAGE 64

50 0 2 4 6 8 10 0 200 400 600 800 1000 1200 ud [pixel] 0 2 4 6 8 10 0 200 400 600 800 1000 Time [sec]vd [pixel]Figure3:Pixelcoordinate ( ) ofthefourfeaturepointsinasequenceofdesiredimagesinthetrackingMatlabsimulation.Theupper gureisforthe ( ) componentandthebottom gureisforthe ( ) component. 0 2 4 6 8 10 0 200 400 600 800 1000 1200 u [pixel] 0 2 4 6 8 10 0 200 400 600 800 1000 Time [sec]v [pixel]Figure3:Pixelcoordinate ( ) ofthecurrentposeofthefourfeaturepointsin thetrackingMatlabsimulation.Theupper gureisforthe ( ) componentandthe bottom gureisforthe ( ) component.

PAGE 65

51 0 2 4 6 8 10 50 0 50 100 Time [sec]u ud [pixel] 0 2 4 6 8 10 200 100 0 100 200 Time [sec]v vd [pixel]Figure3:Trackingerror ( )( ) (inpixels)ofthefourfeaturepointsinthe trackingMatlabsimulation.Theupper gureisforthe ( )( ) componentand thebottom gureisforthe ( )( ) component. 0 2 4 6 8 10 5 0 5 vc1 [m.s 1] 0 2 4 6 8 10 20 10 0 10 vc2 [m.s 1] 0 2 4 6 8 10 4 2 0 2 Time [sec]vc3 [m.s 1]Figure3:Linearcameravelocityinput ( ) inthetrackingMatlabsimulation.

PAGE 66

52 0 2 4 6 8 10 3 2 1 0 c1 [rad.s 1] 0 2 4 6 8 10 1 0.5 0 0.5 c2 [rad.s 1] 0 2 4 6 8 10 1 0 1 2 Time [sec]c3 [rad.s 1]Figure313:Angularcameravelocityinput ( ) inthetrackingMatlabsimulation. 0 2 4 6 8 10 1 2 3 4 5 6 7 8 9 Time [sec]^ z1 *[m]Figure314:Adaptiveon-lineestimateof 1inthetrackingMatlabsimulation.

PAGE 67

53 0 10 20 30 40 50 60 200 0 200 400 e1 [pixel] 0 10 20 30 40 50 60 100 0 100 e2 [pixel] 0 10 20 30 40 50 60 0.05 0 0.05 Time [sec]e3Figure3:Translationerror ( ) inthetrackingexperiment. 0 10 20 30 40 50 60 0.99 1 1.01 ~ q0 0 10 20 30 40 50 60 0.1 0 0.1 ~ qv1 0 10 20 30 40 50 60 0.1 0 0.1 ~ qv2 0 10 20 30 40 50 60 0.1 0 0.1 Time [sec]~ qv3Figure3:Rotationquaternionerror ( ) inthetrackingexperiment.

PAGE 68

54 0 10 20 30 40 50 60 200 400 600 800 1000 1200 ud [pixel] 0 10 20 30 40 50 60 200 400 600 800 1000 1200 Time [sec]vd [pixel]Figure3:Pixelcoordinate ( ) ofthefourfeaturepointsinasequenceof desiredimagesinthetrackingexperiment.Theupper gureisforthe ( ) componentandthebottom gureisforthe ( ) component. 0 10 20 30 40 50 60 200 400 600 800 1000 1200 u [pixel] 0 10 20 30 40 50 60 200 400 600 800 1000 1200 Time [sec]v [pixel]Figure3:Pixelcoordinate ( ) ofthecurrentposeofthefourfeaturepoints inthetrackingexperiment.Theupper gureisforthe ( ) componentandthe bottom gureisforthe ( ) component.

PAGE 69

55 0 10 20 30 40 50 60 200 0 200 400 Time [sec]u ud [pixel] 0 10 20 30 40 50 60 200 0 200 400 Time [sec]v vd [pixel]Figure3:Trackingerror ( )( ) (inpixels)ofthefourfeaturepointsinthe trackingexperiment.Theupper gureisforthe ( )( ) componentandthe bottom gureisforthe ( )( ) component. 0 10 20 30 40 50 60 0.1 0 0.1 vc1 [m.s 1] 0 10 20 30 40 50 60 0.05 0 0.05 vc2 [m.s 1] 0 10 20 30 40 50 60 0.02 0 0.02 Time [sec]vc3 [m.s 1]Figure3:Linearcameravelocityinput ( ) inthetrackingexperiment.

PAGE 70

56 0 10 20 30 40 50 60 0.02 0 0.02 c1 [rad.s 1] 0 10 20 30 40 50 60 0.02 0 0.02 c2 [rad.s 1] 0 10 20 30 40 50 60 0.1 0 0.1 Time [sec]c3 [rad.s 1]Figure3:Angularcameravelocityinput ( ) inthetrackingexperiment. 0 10 20 30 40 50 60 1.5 1 0.5 0 0.5 Time [sec]^ z1 *[m]Figure322:Adaptiveon-lineestimateof 1inthetrackingexperiment.

PAGE 71

57 0 5 10 15 20 200 0 200 400 e1 [pixel] 0 5 10 15 20 200 0 200 400 e2 [pixel] 0 5 10 15 20 0.1 0 0.1 Time [sec]e3Figure3:Translationerror ( ) intheregulationexperiment. 0 5 10 15 20 0.5 1 q0 0 5 10 15 20 0.5 0 0.5 qv1 0 5 10 15 20 0.5 0 0.5 qv2 0 5 10 15 20 0.5 0 0.5 Time [sec]qv3Figure3:Rotationquaternionerror ( ) intheregulationexperiment.

PAGE 72

58 0 5 10 15 20 0 200 400 600 800 1000 u [pixel] 0 5 10 15 20 0 200 400 600 800 1000 Time [sec]v [pixel]Figure3:Pixelcoordinate ( ) (inpixels)ofthecurrentposeofthefourfeature pointsintheregulationexperiment.Theupper gureisforthe ( ) component andthebottom gureisforthe ( ) component. 0 5 10 15 20 200 0 200 400 u ud [pixel] 0 5 10 15 20 200 0 200 400 Time [sec]v vd [pixel]Figure3:Regulationerror ( )(inpixels)ofthefourfeaturepointsinthe regulationexperiment.Theupper gureisforthe ( )( ) componentandthe bottom gureisforthe ( )( ) component.

PAGE 73

59 0 5 10 15 20 0.1 0 0.1 vc1 [m.s 1] 0 5 10 15 20 0.1 0 0.1 vc2 [m.s 1] 0 5 10 15 20 0.1 0 0.1 Time [sec]vc3 [m.s 1]Figure3:Linearcameravelocityinput ( ) intheregulationexperiment. 0 5 10 15 20 0.2 0 0.2 c1 [rad.s 1] 0 5 10 15 20 0.2 0 0.2 c2 [rad.s 1] 0 5 10 15 20 0.5 0 0.5 Time [sec]c3 [rad.s 1]Figure3:Angularca meravelocityinput ( ) intheregulationexperiment.

PAGE 74

60 0 5 10 15 20 1 0.8 0.6 0.4 0.2 0 0.2 Time [sec]^ z1 *[m]Figure329:Adaptiveon-lineestimateof 1intheregulationexperiment.

PAGE 75

CHAPTER4 COLLABORATIVEVISUALSERVOTRACKINGCONTROLVIAA DAISY-CHAININGAPPROACH 4.1Introduction Inthischapter,acollaborativetrajectorytrackingproblemisconsideredfor asixDOFrigid-bodyobject(e.g.,anautonomousvehicle)identi edbyaplanar patchoffeaturepoints.Unliketypicalvisualservocontrollersthatrequireeither thecameraorthetargettoremainstationary,auniqueaspectofthedevelopment inthischapteristhatamovingmonocularcamera(e.g.,acameramountedon anunmannedairvehicle(UAV))isusedtoprovidefeedbacktoamovingcontrol object.Thecontrolobjectiveisfortheobjecttotrackadesiredtrajectorythat isencodedbyaprerecordedvideoobtainedfroma xedcamera(e.g.,acamera mountedonasatellite,acameramountedonabuilding). Severalchallengesmustberesolvedtoachievethisunexploredcontrolobjective.Therelativevelocitybetweenthemovingplanarpatchoffeaturepoints andthemovingcamerapresentsasigni cantchallenge.Byusingadaisy-chaining approach(e.g.,[16] ) ,Euclideanhomographyrelationshipsbetweendi erent cameracoordinateframesandfeaturepointpatchcoordinateframesaredeveloped. Thesehomographiesareusedtorelatecoordinateframesattachedtothemoving camera,thereferenceobject,thecontrolobject,andtheobjectusedtorecord thedesiredtrajectory.AnotherchallengeisthatforgeneralsixDOFmotionby boththecameraandthecontrolobject,thenormaltotheplanarpatchassociated withtheobjectisunknown.Bydecomposingthehomographyrelationships,the normaltotheplanarpatchcanbeobtained.Likewise,thedistancebetweenthe movingcamera,themovingcontrolobject,andthereferenceobjectareunknown. 61

PAGE 76

62 Byusingthedepthratiosobtainedfro mthehomographydecomposition,the unknowntime-varyingdistanceisrelatedtoanunknownconstantparameter.A Lyapunov-basedadaptiveestimationlawisdesignedtocompensatefortheunknownconstantparameter.Themovingcameracouldbeattachedtoaremotely pilotedvehiclewitharbitraryrotations,thisrequiresaparameterizationthatis validoveralarge(possiblyunbounded)domain.Additionally,sincethisworkis motivatedbyproblemsintheaerospacecommunity,homography-basedvisualservo controltechniques(e.g.,[10,20,22])ar ecombinedwithquaternion-basedcontrol methods(e.g.,[14,23,24])tofacilitatelargerotations.Byusingthequaternion parameterization,theresultingclosed-looprotationerrorsystemcanbestabilized byaproportionalrotationcontrollercombinedwithafeedforwardtermthatisa functionofthedesiredtrajectory. 4.2ProblemScenario Overthepastdecade,avarietyofvisualservocontrollershavebeenaddressed forbothcamera-to-handandcamera-in-handcon gurations(e.g.,see[1,99 101]).Forvisualservocontrolapplicationsthatexploiteitherofthesecamera con gurations,eithertheobjectorthecameraisrequiredtoremainstationary. Incontrasttotypicalcamera-to-handorcamera-in-handvisualservocontrol con gurations,amovingairbornemonocularcamera(e.g.,acameraattachedto aremotecontrolledaircraft,acameramountedonasatellite)isusedbyMehta etal.[18,19]toprovideposemeasurementsofamovingsensorlessunmanned groundvehicle(UGV)relativetoagoalcon guration.Theresultsin[18,19] arerestrictedtothreeDOF,andtherotationerrorsystemisencodedbyEuler angle-axisparameterization. ConsiderastationarycoordinateframeIthatisattachedtoacameraand atime-varyingcoordinateframeFthatisattachedtosomeobject(e.g.,an autonomousvehicle)asdepictedinFigure41.Theobjectisidenti edinanimage

PAGE 77

63 d* dD e s i r e d P a t c h T r a j e c t o r yd*dF Fd r**s1 iC u r r en t P a t c hReference CameraC u r r e n t C a m e r aIIRFReference Objects2 is1 i(R ,x )*f*(R ,x )*f r*r(R,x )f (R x )r d f r d Figure4:Geometricmodel. byacollectionoffeaturepointsthatareassumed(withoutlossofgenerality)to becoplanarandnon-collinear(i.e.,aplanarpatchoffeaturepoints).Thecamera attachedtoIapriorirecordsaseriesofsnapshots(i.e.,avideo)ofthemotion oftheobjectattachedtoFuntilitcomestorest(orthevideostopsrecording). AstationarycoordinateframeFisattachedtoareferenceobjectidenti edby anotherplanarpatchoffeaturepointsthatareassumedtobevisibleineveryframe ofthevideorecordedbythecamera.Forexample,thecameraattachedtoIisonboardastationarysatellitethattakesaseriesofsnapshotsoftherelativemotion ofFwithrespecttoF.Therefore,thedesiredmotionofFcanbeencodedas aseriesofrelativetranslationsandrotationswithrespecttothestationaryframeFapriori.Splinefunctionsor lteralgorithmscanbeusedtogenerateasmooth desiredfeaturepointtrajectory[10]. Consideratime-varyingcoordinateframeIthatisattachedtoacamera(e.g., acameraattachedtoaremotecontrolledaircraft)andatime-varyingcoordinate

PAGE 78

64 frameFthatisattachedtothecontrolobjectasdepictedinFigure4.The cameraattachedtoIcapturessnapshotsoftheplanarpatchesassociatedwithFandF,respectively.TheapriorimotionofFrepresentsthedesiredtrajectory ofthecoordinatesystemF,whereFandFareattachedtothesameobject butatdi erentpointsintime.ThecameraattachedtoIisadi erentcamera (withdi erentcalibrationparameters)asthecameraattachedtoI.Theproblem consideredinthischapteristodevelopakinematiccontrollerfortheobject attachedtoFsothatthetime-varyingrotationandtranslationofFconvergesto thedesiredtime-varyingrotationandtranslationofF,wherethemotionofFis determinedfromthetime-varyingoverheadcameraattachedtoI. 4.3GeometricModel Therelationshipsbetweenthecoordinatesystemsareasfollows(seeTab le41): ( ) ( ) ( ) 0( ) ( ) (3) denotetherotationfromFtoI,FtoI,ItoI,FtoI,FtoI,andFtoI,respectively, ( ) ( )R3denotetherespectivetime-varyingtranslationfromFtoIandfromFtoIwithcoordinatesexpressedinI,and ( ) 0 ( ) ( ) R3denotethe respectiveconstanttranslationfromItoI,FtoI,FtoI,andfromFtoIwithcoordinatesexpressedinI.FromFigure4,thetranslation 0 ( ) and therotation 0( ) canbeexpressedas 0 = + ( ) 0= (41) AsillustratedinFigure4, and denotetheplanarpatchesof featurepointsassociatedwithF,F,andF,respectively. 1 R3 = 1 2 ( 4) denotestheconstantEuclideancoordinatesofthe -thfeaturepointinF(andalsoF),and 2 R3 =1 2 denotestheconstant Euclideancoordinatesofthe -thfeaturepointinF.Fromthegeometrybetween

PAGE 79

65 Motion Frames ( ) ( ) FtoIinI ( ) ( ) FtoIinI ( ) ( ) ItoI 0( ) 0( ) FtoIinI FtoIinI ( ) ( ) FtoIinI Table41:Coordinateframesrelationships thecoordinateframesdepictedinFigure4,thefollowingrelationshipscanbe developed = + 1 = + 1 (42) = + 2 0= 0 + 01 (43) = + 2 (44) In(42)-(4), ( ) ( )R3denotetheEuclideancoordinatesofthefeature pointson and ,respectively,expressedinIas ( ) ( ) ( ) ( ) (45) ( ) ( ) ( ) ( ) (46) 0( ) ( )R3denotetheactualanddesiredtime-varyingEuclideancoordinates,respectively,ofthefeaturepointson expressedinIas 0( ) 0( ) 0( ) 0( ) (47) ( ) ( ) ( ) ( ) (48)

PAGE 80

66 and R3denotestheconstantEuclideancoordinatesofthefeaturepointson theplanarpatch expressedinIas (49) Aftersomealgebraicmanipulation,theexpressionsin(42)-(44)canberewritten as = + (4) = + = + (4) = + 0= + (4) where ( ) ( ) ( ) ( ) (3) and ( ) ( ) ( ) ( )R3are newrotationalandtranslationalvariables,respectively,de nedas = = (4) = = and = ( ( 2 1 )) (4) = + ( 2 1 ) (4) = + ( 2 1 ) (4) = = 0 (4) Notethat ( ) ( ) and ( ) in(4)aretherotationmatricesbetweenFandF,FandF,andFandF,respectively,but ( ) ( ) and ( ) in (4)-(4)arenotthetra nslationvectorsbetweenthecorrespondingcoordinate

PAGE 81

67 frames.However,thiswillnota ectthefollowingcontrollerdesignbecauseonly therotationmatriceswillbeusedinthecontrollerdevelopment. TofacilitatethedevelopmentofarelationshipbetweentheactualEuclidean translationofFtotheEuclideantranslationtha tisreconstructedfromtheimage information,thefollowingprojectiverelationshipsaredeveloped: ( )= ( )= = (4) where ( )R representsthedistancefromtheoriginofIto alongtheunit normal(expressedinI)to denotedas ( )R3, ( )R representsthe distancefromtheoriginofIto alongtheunitnormal(expressedinI)to denotedas ( )R3,and R representsthedistancefromtheoriginofIto alongtheunitnormal(expressedinI)to denotedas R3where ( )= ( ) .In(4), ( ) ( ) forsomepositiveconstant R Basedon(418),therelationshipsin(410)-(42)canbeexpressedas = + (4) = + (4) = + (4) = + (4) 0= + (4) AsinChenetal.[10],thesubsequentdevelopmentrequiresthattheconstant rotationmatrix beknown.Theconstantrotationmatrix canbeobtaineda prioriusingvariousmethods(e.g.,asecondcamera,Euclideanmeasurements).The subsequentdevelopmentalsoassumesthatthedi erencebetweentheEuclidean distances ( 2 1 ) isaconstant =1 .Whiletherearemanypractical applicationsthatsatisfythisassumption(e.g.,asimplescenarioisthattheobjects

PAGE 82

68 attachedto and arethesameobject),theassumptionisgenerallyrestrictive andisthefocusoffutureresearch.AsdescribedbyHuetal.[17],thisassumption canbeavoidedbyusingthegeometricreconstructionapproach[102]underan alternativeassumptionthatthedistancebetweentwofeaturepointsisprecisely known. 4.4EuclideanReconstruction Therelationshipsgivenby(4)-(4)provideameanstoquantifya translationandrotationerrorbetweenthedi erentcoordinatesystems.Sincethe poseof ,and cannotbedirectlymeasured,aEuclideanreconstructionis developedtoobtaintheposeerrorbycomparingmultipleimagesacquiredfrom thehoveringmonocularvisionsystem.Tofacilitatethesubsequentdevelopment, thenormalizedEuclideancoordinatesofthefeaturepointsin and canbe expressedintermsofIas ( )R3and ( )R3,respectively,as (4) Similarly,thenormalizedEuclideancoordinatesofthefeaturepointsforthe current,desired,andreferenceimagecanbeexpressedintermsofIas 0( ) ( ) R3,respectively,as 0( ) 0( ) 0( ) ( ) ( ) ( ) (4) Fromtheexpressionsgivenin(4)and( 4),therotationandtranslation betweenthecoordinatesystemsFandF,betweenFandF,andbetweenIand

PAGE 83

69IcannowberelatedintermsofthenormalizedEuclideancoordinatesasfollows: = + (4) = 1 + (4) = + (4) = + (4) where ( ) ( ) ( )R denotedepthratiosde nedas = = = and ( ) ( ) ( ) ( )R3denotescaledtranslationvectorsthatare de nedas = = (4) = = SincethenormalizedEuclideancoo rdinatesin(4)-(4)cannotbe directlymeasured,thefollowingrelationships(i.e.,thepin-holecameramodel)are usedtodeterminethenormalizedEuclideancoordinatesfrompixelinformation = 1 = 1 (4) = 2 = 2 (4) where 12R3 3areknown,constant,andinvertibleintrinsiccameracalibrationmatricesofthecurrentcameraandt hereferencecamera,respectively.In (4)and(4), ( ) and ( )R3representtheimage-spacecoordinatesof theEuclideanfeaturepointson and expressedintermsofIas 1 1 (4)

PAGE 84

70 respectively,where ( ) ( ) ( ) ( )R .Similarly, ( ) and R3representtheimage-spacecoordinatesoftheEuclideanfeatureson and expressedintermsofIas 1 1 (4) respectively,where ( ) ( ) R .Byusing(4)-(4)and (4)-(4),thefollowingre lationshipscanbedeveloped: = 1 + 1 1 | {z } (4) = 1 1 + 1 1 | {z } (4) = 2 + 1 2 | {z } (4) = 2 + 1 1 | {z } (4) where ( ) ( ) ( ) ( )R3 3denoteprojectivehomographies.Setsof linearequationscanbedevelopedfrom(435)-(438)todeterminetheprojective homographiesuptoascalarmultiple.Varioustechniquescanbeused(e.g., seeFaugerasandLustman[93]andZhangandHanson[94])todecomposethe Euclideanhomographies,toobtain ( ) ( ) ( ) ( ) ( ) ( ) ( ) ( ) ( ) ( ) ( ) ( ) ( ) .Giventhattheconstantrotation matrix isassumedtobeknown,theexpressionsfor ( ) and ( ) in(4 13)canbeusedtodetermine ( ) and ( ) .Once ( ) isdetermined,the expressionfor ( ) in(413)canbeusedtodetermine ( ) .Also,once ( ) and ( ) havebeendetermined,(41)canbeusedtodetermine 0( ) .Since ( ) ,

PAGE 85

71 ( ) ( ) ( ) ( ) ( ) ,and ( ) canbedetermined,thefollowing relationshipcanbeusedtodetermine 0( ) : 0= 0 + (4) wheretheinverseoftheratio ( ) 0( ) canbedeterminedas 0 = 001 + (4) 4.5ControlObjective ThecontrolobjectiveisforasixDOFrigid-bodyobject(e.g.,anautonomous vehicle)identi edbyaplanarpatchoffeaturepointstotrackadesiredtrajectory thatisdeterminedbyasequenceofimagestakenbya xedreferencecamera.This objectiveisbasedontheassumptionthatthelinearandangularvelocitiesofthe cameraarecontrolinputsthatcanbeindep endentlycontrolled(i.e.,unconstrained motion)andthatthereferenceanddesiredcamerasarecalibrated(i.e., 1and 2areknown).Thecontrolobjectivecanbestatedas 0( ) ( ) (i.e., theEuclideanfeaturepointson trackthecorrespondingfeaturepointson ). Equivalently,thecontrolobjectivecanalsobestatedintermsoftherotation andtranslationoftheobjectas 0( )( ) and 0( )( ) .Asstated previously, 0( ) and ( ) canbecomputedbydecomposingtheprojective homographiesin(4)-(4)andusi ng(4).Oncetheserotationmatrices havebeendetermined,theunitquaternionparameterizationisusedtodescribe therotationmatrix.Thisparameterizationfacilitatesthesubsequentproblem formulation,controldevelopment,andstabilityanalysissincetheunitquaternion providesaglobalnonsingularparameterizationofthecorrespondingrotation matrices.SeeSection2.3forsomeback groundabouttheunitquaternion.

PAGE 86

72 Giventherotationmatrices 0( ) and ( ) ,thecorrespondingunitquaternions ( ) and ( ) canbecalculatedbyusingthenumericallyrobustmethod (e.g.,see[14]and[15])basedonthecorrespondingrelationships 0= 2 0 3+2 +2 0 (4) = 2 0 3+2 +2 0 (4) where 3isthe 33 identitymatrix,andthenotation ( ) denotesthefollowing skew-symmetricformofthevector ( ) asin(2). Toquantifytherotationerrorbetweenthefeaturepointson and ,the errorbetweenrotationmatrices 0( ) and ( ) isde nedas = 0 = 2 0 3+2 2 0 (4) wheretheerrorquaternion ( )=( 0( ) ( ))isde nedas = 0 = 00 + 0 0+ (4) Since ( ) isaunitquaternion,(443)canbeusedtoquantifytherotationtracking objectiveask ( )k 0= ( )3as (4) Thetranslationerror,denotedby ( )R3,isde nedas[10,20] = (4) where ( ) ( )R3arede nedas = 0 00 0ln( 0 ) = ln( ) (4) In(3), 0( ) ( ) and ( ) ( ) canbecomputedasbelow

PAGE 87

73 0 = 0 = 0 1 = 1 Basedon(445)and(446),thesubsequentcontroldevelopmenttargetsthe followingobjectives:k ( )k 0 andk ( )k 0 as (4) 4.6ControlDevelopment 4.6.1Open-LoopErrorSystem From(43)and(44),theopen-looprotationerrorsystemcanbedeveloped as = 1 2 03+ (4) where ( ) denotestheangularvelocityof expressedinFthatcanbe calculatedas[23] =2( 0 0 )2 (4) where 0 ( ) ( ) 0 ( ) ( ) areassumedtobebounded;hence, ( ) isalsobounded.Theopen-looptranslationerrorsystemcanbederivedas(see AppendixC) = 000 + (4) where ( ) ( )R3denotethelinearandangularvelocityvectorsof expressed inF,respectively,andtheauxiliarymeasurableterm 0( )R3 3isde nedas 0= 100 0010 0001

PAGE 88

74 4.6.2Closed-LoopErrorSystem Basedontheopen-looprotationerrorsystemin(449)andthesubsequent Lyapunov-basedstabilityanalysis,theangularvelocitycontrollerisdesignedas = + (4) where R3 3denotesadiagonalmatrixofpositiveconstantcontrolgains. From(49)and(42),therotationclosed-looperrorsystemcanbedeterminedas 0= 1 2 (4) =1 2 03+ =1 2 0 From(4),thetranslationcontrolinput ( ) isdesignedas =0 00 1 ( ) (4) where R3 3denotesadiagonalmatrixofpositiveconstantcontrolgains.In (44),theparameterestimate ( )R fortheunknownconstant isdesigned as = (4) where R denotesapositiveconstantadaptationgain.Byusing(41)and (4),thetranslationc losed-looperrorsystemis = (4) where ( )R denotesthefollowingparameterestimationerror: = (4)

PAGE 89

75 4.6.3StabilityAnalysis Theorem4.1 :Thecontrollergivenin(4)and(4),alongwiththe adaptiveupdatelawin(455)ensuresasymptotictrackinginthesensethatk ( )k 0 k ( )k 0 (4) Proof :Let ( )R denotethefollowingdi erentiablenon-negativefunction (i.e.,aLyapunovcandidate): = +(1 0)2+ 2 + 1 2 2 (4) Thetime-derivativeof ( ) canbedeterminedas = 0 (1 0) + ( )+ = ( 03+(1 0) 3) = (4) where(4)and(4)-(4)wereu tilized.Basedon(4)and(4), ( ) ( ) 0( ) ( ) Land ( ) ( ) L2.Since ( ) L,itis clearfrom(4)that ( ) L.Basedonthefactthat ( ) L,(4) and(4)canbeusedtoprovethat 0( ) L,andthen 0( ) 0 1 ( ) L. Basedonthefactthat ( ) Land ( ) isaboundedfunction,(4)canbe usedtoconcludethat ( ) L.Since ( ) ( ) 0( ) 0( ) 0 1 ( ) Land ( ) isbounded,(44)canbeutilizedtoprovethat ( ) L.Fromthe previousresults,(4)-(4 51)canbeusedtoprovethat ( ) ( ) L.Since ( ) ( ) L L2,and ( ) ( ) L,BarbalatsLemma[96]canbeusedto concludetheresultgivenin(4).

PAGE 90

76 4.7SimulationResults Anumericalsimulationwasperformedtoillustratetheperformanceofthe trackingcontrollergivenin(4),(4),andtheadaptiveupdatelawin(4). Thecameracalibrationparameterswerechosenas 1= 2= 1545 10640 01545 1512 001 TheoriginsofcoordinateframesF,FandF,andthefourcoplanarfeature pointsontheplanes and arechosensuchthatthefeaturepointshavethe sameEuclideancoordinates(in[m])inF,FandF,as 1= 00 150 2= 0 150 150 3= 0 1500 4= 0 1500 Thetime-varyingdesiredimagetrajectorywasgeneratedbythekinematicsofthe featurepointplanewherethedesiredlinearandangularvelocitieswereselectedas = 0 1sin( )0 1sin( )0 1sin( ) [ ] = 000 5 [ ] ThemovingcameraattachedtoIisassumedtohavealinearvelocityvectorof 0 1sin( )00 [ ] Theinitialrotationmatrices (0) betweenFandI, (0) betweenFandI, and (0) betweenFandI,andtheconstantrotationmatrix betweenF

PAGE 91

77 andI,weresetas (0)= (180) (0) (40) (0)= (180) (0) (20) (0)= (180) (0) (20) = (180) (0) (80) Theinitialtranslationvectors (0) betweenFandI(expressedinI), (0) betweenFandI(expressedinI),and (0) betweenFandI(expressedinI),andtheconstanttranslationvector (0) betweenFandI(expressedinI),wereselectedas (0)= 0 50 54 0 (0)= 1 01 53 5 (0)= 0 51 56 0 (0)= 1 01 54 0 TheinitialEuclideanrelationshipbetweenthecameras,thereferenceobject,the controlobject,andtheobjectthatwasusedtogeneratethedesiredtrajectoryis showninFigure42. Theinitialimage-spacecoordinates(i.e., (0) )ofthefourfeaturepoints attachedtotheplane ,expressedinI,werecomputedas(inpixels) 1(0)= 409 62660 751 2(0)= 454 00623 511 3(0)= 491 25667 891 4(0)= 402 48742 381

PAGE 92

78 Theinitialreferenceimage-spacecoordinates(i.e., (0) and )ofthefour featurepointsattachedtotheplane ,expressedinIandI,respectively,were computedas(inpixels) 1(0)= 1104 11112 01 2(0)= 1166 31134 61 3(0)= 1143 71196 81 4(0)= 1019 21151 51 1= 196 71081 41 2= 206 71024 31 3= 263 81034 41 4= 243 71148 51 Theinitialimage-spacecoordinates(i.e., (0) )ofthefourfeaturepointsattached totheplane ,expressedinI,werecomputedas(inpixels) 1(0)= 755 5862 01 2(0)= 791 8848 81 3(0)= 805 1885 11 4(0)= 732 5911 51 Thecontrolgains in(4)and in(4)andadaptationgain in (45)wereselectedas = diag{1 1 1}= diag{5 5 5} =20 Thedesiredimage-spacetrajectoryofthefeaturepointplane ,takenbythe cameraattachedtoI,isshowninFigure4.Thecurrentimage-spacetrajectory ofthefeaturepointplane ,takenbythecameraattachedtoI,isshowninFigure 45.Thereferenceimage-spacetrajectoryofthereferenceplane ,takenby thecameraattachedtoI,isshowninFigure4.Theresultingtranslationand

PAGE 93

79 2 1 0 1 2 2 1 0 1 2 2 1 0 1 2 3 4 d *z Camera I Reference Camera IR y xFigure4:This gureshowstheinitialpositionsofthecamerasandthefeature pointplanes.TheinitialpositionsofthecamerasattachedtoIandIaredenoted byO.Thefeaturepointsontheplanes and aredenotedby.The originsofthecoordinateframesF,FandFaredenotedby. rotationtrackingerrorsareplottedinFigure4andFigure4,respectively.The errorsgotozeroasymptotically.Thetran slationandrotationcontrolinputsare showninFigure4andFigure4,respectively.

PAGE 94

80 0 5 10 15 20 650 700 750 800 850 900 ud [pixel] 0 5 10 15 20 750 800 850 900 950 Time [sec]vd [pixel]Figure4:Pixelcoordinate ( ) ofthefourfeaturepointsontheplane ina sequenceofdesiredimagestakenbythecameraattachedtoI.Theupper gureis forthe ( ) componentandthebottom gureisforthe ( ) component. 0 5 10 15 20 900 1000 1100 1200 u* [pixel] 0 5 10 15 20 1050 1100 1150 1200 Time [sec]v* [pixel]Figure4:Pixelcoordinate ( ) ofthefourfeaturepointsontheplane ina sequenceofreferenceimagestakenbythemovingcameraattachedtoI.Theupper gureisforthe ( ) componentandthebottom gureisforthe ( ) component.

PAGE 95

81 0 5 10 15 20 400 600 800 1000 u [pixel] 0 5 10 15 20 600 800 1000 1200 1400 1600 Time [sec]v [pixel]Figure4:Pixelcoordinate ( ) ofthefourfeaturepointsontheplane inasequenceofimagestakenbythemovingcameraattachedtoI.Theupper gureis forthe ( ) componentandthebottom gureisforthe ( ) component. 0 5 10 15 20 0.5 0 0.5 e1 [pixel] 0 5 10 15 20 0.5 0 0.5 e2 [pixel] 0 5 10 15 20 0.5 0 0.5 Time [sec]e3Figure4:Translationerror ( ) .

PAGE 96

82 0 5 10 15 20 0 0.5 1 ~ q0 0 5 10 15 20 0.5 0 0.5 ~ qv1 0 5 10 15 20 0.5 0 0.5 ~ qv2 0 5 10 15 20 1 0.5 0 Time [sec]~ qv3Figure4:Rotationquaternionerror ( ) 0 5 10 15 20 2 1 0 1 vc1 [m.s 1] 0 5 10 15 20 4 2 0 2 vc2 [m.s 1] 0 5 10 15 20 2 1 0 1 Time [sec]vc3 [m.s 1]Figure4:Linearcameravelocityinput ( ) .

PAGE 97

83 0 5 10 15 20 0.5 0 0.5 c1 [rad.s 1] 0 5 10 15 20 0.5 0 0.5 c2 [rad.s 1] 0 5 10 15 20 0 1 2 Time [sec]c3 [rad.s 1]Figure4:Angularcameravelocityinput ( ) .

PAGE 98

CHAPTER5 ADAPTIVEVISUALSERVOTRACKINGCONTROLUSINGACENTRAL CATADIOPTRICCAMERA 5.1Introduction Inthischapter,anadaptivehomographybasedvisualservotrackingcontrol schemeispresentedforacamera-in-hand centralcatadioptriccamerasystem.By usingthecentralcatadioptriccamera,afullpanoramicFOVisobtained.The literaturereviewforvisualservocontrolusingcentralcatadioptriccamerasare presentedinSection1.3.2.Inthischapter,thetrackingcontrollerisdeveloped basedontherelativerelationshipsofacentralcatadioptriccamerabetweenthe current,reference,anddesiredcameraposes.To ndtherelativecamerapose relationships,homographiesarecomputedbasedontheprojectionmodelofthe centralcatadioptriccamera[33].A sstatedbyGeyerandDaniilidis[36],a unifyingtheorywasproposedtoshowthatallcentralcatadioptricsystemsare isomorphictoprojectivemappingsfromthespheretoaplanewithaprojection centerontheperpendiculartotheplane.Byconstructinglinksbetweenthe projectedcoordinatesonthesphere,thehomographiesuptoscalarmultiplescan beobtained.VariousmethodscanthenbeappliedtodecomposetheEuclidean homographiesto ndthecorrespondingrotationmatrices,anddepthratios.The rotationerrorsysteminthischapterisbasedonthequaternionformulationwhich hasafull-rank43interactionmatrix.Lyapunov-basedmethodsareutilizedto developthecontrollerandtoproveasymptotictracking. 84

PAGE 99

85 Y( u v )i iT V UoicF cXcZcZmimage planeYm mreference plane mirrormFX Figure5:Centralcatadioptricprojectionrelationship. 5.2GeometricModel Acentralcatadioptriccameraiscomposedoftwoelements:acameraanda mirrorwhicharecalibratedtoyieldasinglee ectiveviewpoint.GeyerandDaniilidis[36]developedaunifyingtheorythatexplainshowallcentralcatadioptric systemsareisomorphictoprojectivemappingsfromthespheretoaplanewitha projectioncenterontheopticalaxisperpendiculartotheplane.Forthecentral catadioptriccameradepictedinFigure5,thecoordinateframesFandFare attachedtothefociofthecameraandmirror,respectively.Lightraysincidentto thefocalpointofthemirror(i.e.,theoriginofF)arere ectedintoraysincident withthefocalpointofthecamera(i.e.,theoriginofF). Withoutlossofgenerality,thesubsequentdevelopmentisbasedonthe assumptionthatthere ectionoffourcoplanarandnon-collinearEuclideanfeature pointsdenotedby ofsomestationaryobjectisrepresentedinthecameraimage planebyimagespacecoordinates ( ) ( )R =1 2 3 4 .Theplanede ned bythefourfeaturepointsisdenotedby asdepictedinFigure5.Thevector

PAGE 100

86 ( u v )i iTV U image plane unitar y s p here2 mii( x y z )iiT=oiFocsimZ Y Z XFXFYFo o o Figure52:Projectionmodelofthecentralcatadioptriccamera. ( )R3inFigure52isde nedas where ( ) ( ) ( )R denotetheEuclideancoordinateofthefeaturepoints expressedintheframeFwhichisa xedtothesinglee ectiveviewpoint.The projectedcoordinateof ( ) canbeexpressedas = = (51) where ( )R3denotestheEuclideancoordinatesof projectedontoaunit sphericalsurfaceexpressedinF,and ( )R isde nedas = q 2 + 2 + 2 (52)

PAGE 101

87 Basedonthedevelopmentin[36], ( ) canbeexpressedinthecoordinateframe ,whichisattachedtothereprojectioncenter,as = + (53) where R isaknownintrinsicparameterofthecentralcatadioptriccamerathat representsthedistancebetweenthesinglee ectiveviewpointandthereprojection center(seeFigure5).Thenormalizedcoordinatesof ( ) in(5),denotedby ( )R3,canbeexpressedas = + + 1 = + + 1 (54) Byusing(54)andthefollowingrelationship: 1= 2+ 2+ 2(55) thecoordinatesofthefeaturepointsontheunitsphericalsurface ( ) canbe determinedas = + q 2 + 2 (12)+1 2 + 2 +1 + q 2 + 2 (12)+1 2 + 2 +1 + + q 2 + 2 (12)+1 2 + 2 +1 (56) where ( ) and ( )R arethe rsttwoelementsof ( ) .Thebijective mappingin(56)isunique(i.e.,thereisnosignambiguity)because 0 1 [36], andthegeometryofthecentralcatadioptriccameragiveninFigure5guarantees thatthethirdelementof ( ) ispositive;otherwise,thefeaturepoint cannot beprojectedontotheimageplane.

PAGE 102

88 mioir e f e r e n c e p l a n e F oc *o* cmsi F od cms i md s id n*H( R x )fHd( R x )fd dd*mi*mdiF* Figure53:Camerarelationshipsrepresentedinhomography. AsshowninFigure5,thestationarycoordinateframeFdenotesaconstantreferencecameraposethatisde nedbyareferenceimage,andthecoordinateframeFrepresentsthedesiredtime-varyingcameraposetrajectoryde ned byaseriesofimages(e.g.,avideo).Thevectors ( )R3inFigure5are de nedas ( ) ( ) ( ) (57) where R and ( ) ( ) ( )R denotetheEuclideancoordinates ofthefeaturepoints expressedintheframesFandF,respectively.

PAGE 103

89 Theconstantcoordinatesof canbeprojectedontoaunitsphericalsurface expressedinFandO ,respectively,as = = (58) = = + andthetime-varyingcoordinates ( ) canbeprojectedontoaunitspherical surfaceexpressedinFandO,respectively,as = = = + where ( ) ( )R3 and ( )R arede nedas = q 2 + 2 + 2 = q 2 + 2 + 2 (59) Thenormalizedcoordinatesof ( ) denotedas ( )R3isde nedas ( ) ( ) ( ) ( ) 1 (5) whichwillbeusedinthecontrollerdevelopment.Thesignal ( ) in(5)is measurablebecause ( ) canbecomputedfromthemeasurableandbounded pixelcoordinates ( ) usingsimilarprojectiverelationshipsasin(5)and(5 17).Thenormalizedcoordinatesof ( ) denotedas ( )R3, respectively,arede nedas = + + 1 (5) = + + 1

PAGE 104

90 FromstandardEuclideangeometry,therelationshipsbetween ( ) ( ) and canbedeterminedas = + = + (5) where ( ) ( )R3denotethetranslationvectorsexpressedinFandF, respectively,and ( ) ( ) (3) denotetheorientationofFwithrespect toFandF,respectively.AsalsoillustratedinFigure53, R3denotesthe constantunitnormaltotheplane ,andtheconstantdistancefromtheoriginofFto alongtheunitnormal isdenotedby R isde nedas (5) Byusing(513),therelationshipsin(52)canbeexpressedas = = (5) where ( ) ( )R3 3aretheEuclideanhomographiesde nedas = + = + (5) Basedon(51)and(5),therelationshipbetweentheEuclideancoordinatesin (54)canbeexpressedintermsoftheunitsphericalsurfacecoordinatesas = = (5) where ( ) ( )R and ( ) ( ) arescalingterms. 5.3EuclideanReconstruction Thehomogenouspixelcoordinatesofthefeaturespointswithrespecttothe cameraframesF,FandFaredenotedas ( ) and ( )R3,respectively. Theycanberelatedtothenormalizedcoordinates ( ) and ( ) viathe

PAGE 105

91 followinglinearrelationshipas = = = (5) where R3 3containsthecalibratedintrinsicparametersofthecameraandthe mirror(seeBarretoandAraujo[35]).Sin cethecameraandmirrorarecalibrated (i.e., isknown), ( ) and ( ) canbecomputedfromthemeasurable pixelcoordinates ( ) and ( ) basedon(517).Theexpressiongivenin (5)canthenbeusedtocompute ( ) from ( ) .Similarly, and ( ) canbecomputedfrom and ( ) ,respectively.Thenbasedon(516),aset of12linearlyindependentequationsgivenbythe4featurepointpairs ( ( )) with3independentequationsperfeaturepointcanbedevelopedtodetermine thehomographyuptoascalarmultiple.Variousmethodscanthenbeapplied (e.g.,seeFaugerasandLustman[93]andZhangandHanson[94])todecompose theEuclideanhomographytoobtain ( ) ( ) ( ) ( ) ,and .Similarly, ( ) ( ) ( ) ( ) canbeobtainedfrom(516)using ( ( )) .The rotationmatrices ( ) ( ) andthedepthratios ( ) ( ) willbeusedinthe subsequentcontroldesign. 5.4ControlObjective Thecontrolobjectiveisforacameraattachingtoanobject(i.e.,camera-inhandcon guration)whichcanbeidenti edbyaplanarpatchoffeaturepointsto trackadesiredtrajectorythatisdeterminedfromasequenceofdesiredimages takenduringtheaprioricameramotion.Thisobjectiveisbasedontheassumption thatthelinearandangularvelocitiesofthecameraarecontrolinputsthatcan beindependentlycontrolled(i.e.,unconstrainedmotion)andthatthecamerais calibrated(i.e., isknown).Thecontrolobjectivecanbestatedas ( ) ( ) (i.e.,Euclideanfeaturepointson trackthecorrespondingfeaturepoints on ).Equivalently,thecontrolobjectivecanalsobestatedintermsofthe

PAGE 106

92 rotationandtranslationoftheobjectas ( )( ) and ( )( ) .As statedpreviously, ( ) and ( ) canbecomputedbydecomposingtheprojective homographiesin(56).Oncetheserotationmatriceshavebeendetermined, theunitquaternionparameterizationisusedtodescribetherotationmatrix. Thisparameterizationfacilitatesthesubsequentproblemformulation,control development,andstabilityanalysissincetheunitquaternionprovidesaglobal nonsingularparameterizationofthecorrespondingrotationmatrices. Toquantifytheerrorbetweentheactualanddesiredcameraorientations,the mismatchbetweentherotationmatrices ( ) and ( ) ,denotedby ( )R3,is de nedas = (5) Giventherotationmatrices ( ) and ( ) fromthehomographydecomposition, thecorrespondingunitquaternions ( ) and ( ) canbecomputedbyusingthe numericallyrobustmethod(e.g.,see[14]and[15])as ( )= 2 0 3+2 2 0 (5) ( )= 2 0 3+2 2 0 (5) where 3isthe 33 identitymatrix,andthenotation ( ) denotestheskewsymmetricformofthevector ( ) asin(2).Basedon(5)-(5),the rotationmismatchcanbeexpressedas = 2 0 3+2 2 0 (5) wheretheerrorquaternion ( 0( ) ( ))isde nedas[23] 0= 00 + (5) = 0 0+

PAGE 107

93 Basedon(5)and(5),therotationtrackingcontrolobjective ( )( ) canbeformulatedask ( )k 0= ( )3as (5) Toquantifythepositionmismatchbetweentheactualanddesiredcamera,the translationtrackingerror ( )R3isde nedas = = ln (5) where ( ) ( )R3arede nedas 1 2 3= ln (5) 1 2 3= ln (5) Basedon(523)and(524),thesubsequentcontroldevelopmenttargetsthe followingobjectives:k ( )k 0 andk ( )k 0 as (5) Theerrorsignal ( ) ismeasurablesinceitcanbecomputedfromthe ( ) and ( ) asin[24].The rsttwoelementsofthetranslationerror ( ) are measurablebecause 1 1= 2 2= where ( ) ( ) ( ) ( ) ( ) ( ) canbecomputedfrom(5)and(5),and ( ) ( ) ( ) ( ) ( ) ( ) canbecomputedfromsimilarrelationships.Thethirdelementof thetranslationerrorisalsomeasurablesince = ( ) ( ) =

PAGE 108

94 where ( ) ( ) canbecomputedfrom(5)and(5), ( ) ( ) canbecomputedfrom similarrelationships,andthedepthratios ( ) ( ) canbeobtainedfromthe homographydecompositionsin(5). 5.5ControlDevelopment 5.5.1Open-LoopErrorSystem Theopen-looprotationerrorsystemcanbedevelopedas[23] = 1 2 03+ (5) where ( )=( 0( ) ( )), ( )R3denotesthecameraangularvelocitycontrol input,and ( )R3denotesthedesiredangularvelocityofthecamerathatis assumedtobeapriorigeneratedasaboundedandcontinuousfunction(see[10]for adiscussionregardingthedevelopmentofasmoothdesiredtrajectoryfromaseries ofimages). Basedon(5)and(5 ),thederivativeof ( ) isobtainedas = 1 (5) where ( )R isde nedas = Basedon(5)andthefactthat[22] =+ (5) where ( )R3denotesthedesiredlinearvelocityofthecameraexpressedinF, itcanbeobtainedthat = 1 (5)

PAGE 109

95 Afterdi erentiatingbothsidesof(5)andusingtheequations(5),(5) and =+ theopen-looptranslationerrorsystemcanbederivedas =+ 1 (5) where ( )R3denotesthelinearvelocityinputofthecamerawithrespecttoFexpressedinF,theJacobian-likematrices ( ) ( ) ( )R3 3arede ned as = 10 101 2001 = 10 101 2001 (5) = 1 212 1 21+ 2 2 1 2 1 2 10 (5) and ( )R isde nedas = 5.5.2Closed-LoopErrorSystem Basedontheopen-looprotationerrorsystemin(528)andthesubsequent Lyapunov-basedstabilityanalysis,theangularvelocitycontrollerisdesignedas = + (5)

PAGE 110

96 where R3 3denotesadiagonalmatrixofpositiveconstantcontrolgains. From(58)and(55),therotationclosed-looperrorsystemcanbedeterminedas 0= 1 2 (5) =1 2 03+ Basedon(5),thetranslationcontrolinput ( ) isdesignedas = 1 + 1 (5) where R3 3denotesadiagonalmatrixofpositiveconstantcontrolgains.In (57),theparameterestimate ( )R fortheunknownconstant isde nedas = 1 (5) where R denotesapositiveconstantadaptationgain.Byusing(52)and (5),thetranslationc losed-looperrorsystemis = + 1 (5) where ( )R denotesthefollowingpara meterestimationerror: = (5) 5.5.3StabilityAnalysis Theorem5.1 :Thecontrollergivenin(5)and(5),alongwiththe adaptiveupdatelawin(538)ensuresasymptotictrackinginthesensethatk ( )k 0 k ( )k 0 (5)

PAGE 111

97 Proof :Let ( )R denotethefollowingdi erentiablenon-negativefunction (i.e.,aLyapunovcandidate): = +(1 0)2+ 2 + 1 2 2 (5) Thetime-derivativeof ( ) canbedeterminedas = 03+ (1 0) + 1 1 = 03+ +(1 0) 3 = (5) where(5)and(5)-(5)wereu tilized.Basedon(5)and(5), ( ) ( ) 0( ) ( ) Land ( ) ( ) L2.Since ( ) L,itisclear from(5)that ( ) L.Basedonthefactthat ( ) L,(5)and(5) canbeusedtoprovethat ( ) L,and ( ) 1 ( ) ( ) L.Since ( ) Land ( ) isaboundedfunction,(5)canbeusedtoconcludethat ( ) L.From ( ) ( ) ( ) ( ) 1 ( ) ( ) Land ( ) 1 arebounded,(537)canbeutilizedtoprovethat ( ) L.From thepreviousresults,(5)-(5)canbeusedtoprovethat ( ) ( ) L. Since ( ) ( ) L L2,and ( ) ( ) L,BarbalatsLemma[96]canbe usedtoconcludetheresultgivenin(5).

PAGE 112

CHAPTER6 VISUALSERVOCONTROLINTHEPRESENCEOFCAMERACALIBRATION ERROR 6.1Introduction Huetal.[14]introducedanewquaternion-basedvisualservocontrollerfor therotationerrorsystem,providedthecameracalibrationparametersareexactly known.SincetheresultsbyMalisandChaumette[21]andFangetal.[87]rely heavilyonpropertiesoftherotationparam eterizationtoformulatestateestimates andameasurableclosed-looperrorsystem,theresearchinthischapterismotivated bythequestion: Canstateestimatesandameasurableclosed-looperrorsystembe craftedintermsofthequaternionparameterizationwhenthecameracalibration parametersareunknown ?Toanswerthisquestion,acontributionofthischapteris thedevelopmentofaquaternion-basedestimatefortherotationerrorsystemthat isrelatedtotheactualrotationerror,thedevelopmentofanewclosed-looperror system,andanewLyapunov-basedanalysisthatdemonstratesthestabilityofthe quaternionerrorsystem.Oneofthechallengesistodevelopaquaternionestimate fromanestimatedrotationmatrixthatisnotatruerotationmatrixingeneral. Toaddressthischallenge,thesimilarityrelationshipbetweentheestimatedand actualrotationmatricesisused(asin[21]and[87])toconstructtherelationship betweentheestimatedandactualquaternions.ALyapunov-basedstabilityanalysis isprovidedthatindicatesauniquecontrollercanbedevelopedtoachievethe regulationresultdespiteasignambiguityinthedevelopedquaternionestimate. SimulationresultsareprovidedinSection6.7thatillustratetheperformanceofthe developedcontroller. 98

PAGE 113

99 6.2FeedbackControlMeasurements Theobjectiveinthischapteristodevelopakinematiccontroller(i.e.,the controlinputsareconsideredthelinearandangularcameravelocities)toensure theposition/orientationofthecameracoordinateframeFisregulatedtothe desiredposition/orientationF.ThecamerageometryisshowninFigure2 andthecorrespondingEuclideanandimage-spacerelationshipsaredeveloped inSections2.1and2.2.Theonlyrequiredsensormeasurementsforthecontrol developmentaretheimagecoordinatesofthedeterminedfeaturepoints(i.e., measurementofthesignalsin(2)),wherethestaticfeaturepointcoordinates inthedesiredimagearegivenapriori.Bymeasuringthecurrentimagefeature pointsandgiventhedesiredfeaturepoints,therelationshipin(26)canbeused todeterminethenormalizedEuclideancoordinatesof providedtheintrinsic cameracalibrationmatrixisperfectlyknown.Unfortunately,anyuncertaintyin willleadtoacorruptedmeasurementof ( ) and .Thecomputednormalized coordinatesareactuallyestimates,denotedby ( ) R3,ofthetruevalues sinceonlyabest-guessestimateof ,denotedby R3 3,isavailableinpractice. Thenormalizedcoordinateestimatescanbeexpressedas[21] = 1= (61) = 1 = (62) wherethecalibrationerrormatrix R3 3isde nedas = 1 = 11 12 130 22 23001 (63) where 11, 12, 13, 22, 23R denoteunknownintrinsiccalibrationmismatch constants.Since ( ) and cannotbeexactlydetermined,theestimatesin

PAGE 114

100 (6)and(62)canbesubstitutedinto(24)toobtainthefollowingrelationship = (64) where ( )R3 3denotestheestimatedEuclideanhomographyde nedas = 1 (65) Since ( ) and canbedeterminedfrom(61)and(6),asetoftwelvelinear equationscanbedevelopedfromthefourimagepointpairs,and(6)canbeused tosolvefor ( ) Asstatedin[21],providedadditionalinformationisavailable(e.g.,atleast 4vanishingpoints),varioustechniques(e.g.,seeFaugerasandLustman[93]and ZhangandHanson[94])canbeusedtodecompose ( ) toobtaintheestimated rotationandtranslationcomponentsas = 1+ 1= + (66) where ( )R3 3isde nedas = 1 (67) and ( )R3, R3denotetheestimateof ( ) and ,respectively,de ned as = = 1 where R denotesthefollowingpositiveconstant = Forthefourvanishingpoints(seeAlmansaetal.[103]foradescriptionofhow todeterminevanishingpointsinanimage), =,sothat = + = (68)

PAGE 115

101 andtherefore = 1= (69) Twelvelinearequationscanbeobtainedbasedon(64)forthefourvanishing points.Assumethe3rdrow3rdcolumnelementof ( ) ,denotedas 33( )R ,is notequaltozero(w.l.o.g.).Thenormalizedmatrix ( )R3 3,de nedas = 33 (6) canbecomputedbasedonthesetwelvelinearequations.Basedon(69) det =det( )det( )det( 1)=1 (6) From(6)and(6), 3 33det =1 (6) andhence, = 3q det( ) (6) whichisequalto ( ) 6.3ControlObjective Asstatedpreviously,theobjectiveinthischapteristodevelopakinematic controllertoensuretheposeofthecameracoordinateframeFisregulatedtothe desiredposeFdespiteuncertaintyintheintrinsiccameracalibrationmatrix.This objectiveisbasedontheassumptionthatthelinearandangularvelocitiesofthe cameraarecontrolinputsthatcanbeindep endentlycontrolled(i.e.,unconstrained motion).Forexample,thelinearandangularcameravelocitiescouldbecontrolled bytheend-e ectorofaroboticmanipulator.I nadditiontouncertaintyinthe intrinsiccameracalibration,uncertaintycouldalsoexistintheextrinsiccamera calibration(e.g.,theuncertaintyintherotationandtranslationofthecamera withrespecttotherobotend-e ector).Thedevelopmentinthischaptercould

PAGE 116

102 bedirectlymodi edasdescribedin[21]and[87]tocompensatefortheextrinsic calibration.Therefore,thee ectsofamismatchintheextrinsiccalibrationarenot consideredinthesubsequentdevelopmentforsimplicity. IntheEuclideanspace,therotationcontrolobjectivecanbequanti edas ( )3as (6) Thesubsequentdevelopmentisformulatedintermsofthefourdimensionalunit quaternion ( ) .Giventherotationmatrix ( ) ,thecorrespondingunitquaternion ( ) canbecomputedbyusingthenumericallyrobustmethodpresentedinSection 2.3.From(2)and(2),therotationregulationobjectivein(6)canalsobe quanti edasthedesiretoregulate ( ) ask( )k 0 as (6) Thefocusandcontributionofthischa pterliesintheabilitytodevelopand provethestabilityofaquaternion-basedrotationcontrollerinthepresenceof uncertaintyinthecameracalibration.ThetranslationcontrollerdevelopedbyFang etal.[87]isalsopresentedandincorporatedinthestabilityanalysistoprovidean exampleofhowthenewclassofquaternion-basedrotationcontrollerscanbeused inconjunctionwithtranslationcontrollersthatarerobusttocameracalibration uncertaintyincluding(forexample):theasymptotictranslationcontrollersin[21], andtheexponentialtranslationcontrollersin[87].Thetranslationerror,denoted by ( )R3,isde nedas = (6) where canbechosenasanynumberwithin{1 4} Thetranslationobjective canbestatedask ( )k 0 as (6)

PAGE 117

103 Thesubsequentsectionwilltargetthecontroldevelopmentbasedontheobjectives in(6)and(6). 6.4QuaternionEstimation Amethodispresentedinthissectiontodevelopaquaternion-basedrotation estimatethatcanberelatedtotheactualrotationmismatchtofacilitatethe controldevelopment. 6.4.1EstimateDevelopment Theunitquaternionisrelatedtotheangle-axisrepresentationas 0=cos 2 = sin 2 (6) where ( ) and ( ) arethecorrespondingrotationangleandunitaxis.Byusing (2)and(2),the rstelementofthequaternioncanalsobeexpressedin termsoftherotationmatrix ( ) as 2 0= ( )+1 4 where 0( ) isrestrictedtobenon-negativeas 0= 1 2 p 1+ ( ) (6) withoutlossofgenerality(thisrestrictionenablestheminimumrotationtobe obtained),and ( ) denotesthetraceof ( ) .Basedon(6)and(6), ( ) canbedeterminedas = s 1cos2 2 =1 2 p 3 ( ) (6) wheretherotationaxis ( ) istheuniteigenvectorwithrespecttotheeigenvalue 1 of ( ) .Forthequaternionvectorin(60),thesignambiguitycanberesolved. Speci cally,(215)canbeusedtodevelopthefollowingexpression: =4 0 (6)

PAGE 118

104 Sincethesignof 0( ) isrestricted(i.e.,assumedtobe)positive,thenaunique solutionfor ( ) canbedeterminedfrom(6)and(6). Basedonthesimilaritybetween ( ) and ( ) asstatedin(67),theexpressionsin(6)and(6)providemotivationtodevelopthequaternionestimate as 0= 1 2 q 1+ ( ) (6) = sin 2 =1 2 q 3 ( ) (6) In(6)and(6), ( ) istheestimatedrotationmatrixintroducedin(66) thatiscomputedfromthehomographydecomposition.Since ( ) issimilarto ( ) (see(67)), ( ) isguaranteedtohaveaneigenvalueof 1 ,where ( ) isthe uniteigenvectorthatcanbecomputedfromtheeigenvalueof 1 .Since ( ) is notguaranteedtobeatruerotationmat rix(anditwillnotbeingeneral),the relationshipsin(2)and(6)cannotbedevelopedandusedtoeliminatethe signambiguityoftheeigenvector ( ) .However,thesubsequentstabilityanalysis andsimulationresultsindicatethatthesamestabilityresultisobtainedinvariant ofthesignof ( ) .Oncetheinitialsignof ( ) ischosen,thesamesigncanbeused forsubsequentcomputations. 6.4.2EstimateRelationships Basedonthefactthat ( ) issimilarto ( ) (see(6)),thepropertiesthat similarmatriceshavethesametraceandeigenvaluescanbeusedtorelatethe quaternionestimateandtheactualquaternion.Sincesimilarmatriceshavethe sametrace,(6)and(6)canbeusedtoconcludethat 0= 0 (6) Asstatedearlier,sincesimilarmatriceshavethesameeigenvalues, ( ) isguaranteedtohaveaneigenvalueof 1 withtheassociatedeigenvector ( ) .Thefollowing

PAGE 119

105 relationshipscanbedevelopedbasedon(67) = = 1 (6) Premultiplying 1onbothsidesof(6)yields 1 = 1 (6) Hence, 1 ( ) isaneigenvectorwithrespecttotheeigenvalue 1 of ( ) thatcan beexpressedas 1 = (6) where R isde nedas = 1 (6) Basedon(6),(6),and(6),the estimatedquaternionvectorcannowbe relatedtotheactualquaternionvectoras = (6) Byusing(2),(6),(6)and(6), 2 0+k k2= 2 0+ 1 2 2 (6) Basedon(6)andthefactthat ( ) isaunitvector, =kk sin 2 sin 2 =kk (6) From(6)and(6), 2 0+k k2= 2 0+kk2 2 2=1

PAGE 120

106 6.5ControlDevelopment 6.5.1RotationControl Therotationopen-looperrorsystemcanbedevelopedbytakingthetime derivativeof ( ) as 0 = 1 2 03+ (6) where ( )R3denotestheangularvelocityofthecamerawithrespecttoFexpressedinF.Basedontheopen-looperrorsystemin(6)andthesubsequent stabilityanalysis,theangularvelocitycontrollerisdesignedas = (6) where R denotesapositivecontrolgain.Substituting(633)into(62),the rotationclosed-looperrorsystemcanbedevelopedas 0= 1 2 (6) =1 2 03+ (6) 6.5.2TranslationControl Thecontributionofthischapteristherotationestimateandassociatedcontrol development.Thetranslationcontrollerdevelopedinthissectionisprovided forcompleteness.Asstatedpreviously,translationcontrollerssuchastheclass developedbyMalisandChaumette[21]andFangetal.[87]canbecombinedwith thedevelopedquaternion-basedrotationcontroller.Tofacilitatethesubsequent stabilityanalysisforthesixDOFproblem,atranslationcontrollerproposedin[87] isprovidedinthissection,whichisgivenby = (6)

PAGE 121

107 where R denotesapositivecontrolgain,and ( )R3isde nedas = (6) where ( ) and canbecomputedfrom(6)and(6),respectively,andthe ratio canbecomputedfromthedecompositionoftheestimatedEuclidean homographyin(6).Theopen-looptranslationerrorsystemcanbedeterminedas =1 +[ ].(6) Aftersubstituting(633)and(66)into(638),theresultingclosed-looptranslationerrorsystemcanbedeterminedas = 1 +[ ] [ ] (6) 6.6StabilityAnalysis Asstatedpreviously,thequaternionestimate ( ) hasasignambiguity,but eitherchoiceofthesignwillyieldthesamestabilityresult.Thefollowinganalysis isdevelopedforthecasewhere = (6) Adiscussionisprovidedattheendoftheanalysis,thatdescribeshowthestability canbeprovenforthecasewhen = Theorem6.1 :Thecontrollergivenin(6)and(6)ensuresasymptotic regulationinthesensethatk( )k 0 k ( )k 0 as (6)

PAGE 122

108 provided isselectedsu cientlylarge(seethesubsequentproof),andthe followinginequalitiesaresatis ed min 1 2 + 0(6) max 1 2 + 1 (6) where 01R arepositiveconstants,and min{}and max{}denotethe minimalandmaximaleigenvaluesof 1 2 + ,respectively. Proof :Let ( )R denotethefollowingdi erentiablenon-negativefunction (i.e.,aLyapunovcandidate): = +(10)2+ (6) Aftercancellingcommonterms, ( ) canbeexpressedas 1=2 2(10) 0+ = 03+ (10) + 1 + h i [ ] = 03+ (10) 3 + 1 + h i [ ] = 1 + h i [ ] (6) Byusingtheinequality(642),theterm satis es = 1 2 + min 1 2 + kk20kk2 (6)

PAGE 123

109 Since = 1 2 + 0kk2 theterm1 satis es1 1 0kk2 (6) Basedonthepropertythat [ ] 2=kk R3(seeAppendixD)andk k 1 theterm h i satis es h i = [ ] k k2kk2k kkk2kk2 (6) From(6),theterm[ ] satis es[ ] = kk [ ] kk [ ] 2kk = k kkkkk (6) Byusing(6)-(6),theexpress ionin(6)canbeupperboundedas 0kk21 ( 1+ 2) 0kk2+ kk2+ k kkkkk (6) wherethecontrolgain isseparatedintotwodi erentcontrolgainsas = 1+ 2.Thefollowinginequalitycanbeobtainedaftercompletingthesquares: k kkkkk 2 k k2 4 10kk2+ 1 10kk2 (6) From(61),theinequality( 6)canberewrittenas 0 1 1 k k2 4 2 0 !kk20 2 0kk2

PAGE 124

110 Basedonthede nitionof ( ) in(6),theinequalities(6)and(6),and theassumptionthat and arebounded,thereexisttwopositivebounding constant 1and 2R satisfyingthefollowinginequalities: k k2 4 2 0 1and 02thecontrolparameter canbeselectedlargeenoughtoensurethat ( ) is negativesemi-de niteas 0 1kk20 kk2 (6) Basedon(6)and(6),standards ignalchasingargumentscanbeusedto concludethatthecontrolinputsandalltheclosed-loopsignalsarebounded.The expressionin(6)canalsobeusedtoconcludethat ( ) and ( ) L2.Since ( ) ( ) ( ) ( ) Land ( ) ( ) L2,BarbalatsLemma[96]canbeused toprovetheresultgivenin(6). BymodifyingtheLyapunovfunctionin(64)as = +(1+ 0)2+ thesamestabilityanalysisargumentscanbeusedtoproveTheorem6.1forthe casewhen = Sincethesignambiguityin(6)doesnota ectthecontroldevelopmentand stabilityanalysis,onlythepositivesignin(629)needstobeconsideredinthe futurecontroldevelopmentforconvenience. 6.7SimulationResults Numericalsimulationswereperformedtoillustratetheperformanceofthe controllergivenin(633)and(66).Theintrinsiccameracalibrationmatrixis

PAGE 125

111 givenby = 122 53 77100 0122 56100 001 Thebest-guessestimationfor wasselectedas = 100480 0100110 001 Thecameraisassumedtoviewanobjectwithfourcoplanarfeaturepointswith thefollowingEuclideancoordinates(in[m]): 1= 0 050 050 2= 0 050 050 (6) 3= 0 050 050 4= 0 050 050 Thenormalizedcoordinatesofthevanishingpointswereselectedas 0 020 021 0 020 021 0 020 021 0 020 021 ConsideranorthogonalcoordinateframeIwiththe -axisoppositeto (see Figure2)withthe -axisand -axisontheplane .Therotationmatrices 1betweenFandI,and 2betweenFandIweresetas 1= (160) (30) (30) 2= (120) (20) (80) where () () and () (3) denoterotationofangle (degrees) alongthe -axis, -axisand -axis,respectively.Thetranslationvectors 1( ) and 2( ) betweenFandI(expressedinF)andbetweenFandI(expressedinF),

PAGE 126

112 respectively,wereselectedas 1= 0 50 52 5 2= 1 01 03 5 Theinitial(i.e., (0) )anddesired(i.e., )image-spacecoordinatesofthefour featurepointsin(6)werecomputedas(inpixels) 1(0)= 126 50123 641 2(0)= 124 24127 911 3(0)= 120 92125 401 4(0)= 123 25121 111 1= 132 17133 171 2= 135 72133 611 3= 135 71136 911 4= 132 10136 441 Theinitial(i.e., (0) )anddesired(i.e., )image-spacecoordinatesofthefour vanishingpointsin(6)werecomputedas(inpixels) 1(0)= 124 02139 341 2(0)= 129 02141 611 3(0)= 131 02136 541 4(0)= 126 03134 351 1= 102 37102 451 2= 102 5397 551 3= 97 6397 551 4= 97 47102 451 Thecontrolgains in(6)and in(66)wereselectedas =5 =5 TheresultingtranslationandrotationerrorsareplottedinFigure6and Figure62,respectively.Theimage-spacepixelerror(i.e., ( ) ) isshownin Figure6,andisalsodepictedinFigure6ina3Dformat.Thetranslationand rotationcontroloutputsareshowninFigure6andFigure67,respectively.For

PAGE 127

113 di erentchoiceofsignofthequaternionestimate,theasymptoticresultcanstill beachieved.IncontrasttothequaternionestimateinFigure6,aquaternion estimatewithdi erentsignisshowninFigure63. 0 2 4 6 8 10 0.2 0.1 0 e1 0 2 4 6 8 10 0.2 0.1 0 e2 0 2 4 6 8 10 0.5 0 0.5 Time [sec]e3Figure6:Unitlesstranslationerrorbetween 1( ) and 1.

PAGE 128

114 0 2 4 6 8 10 0.6 0.8 1 q0 0 2 4 6 8 10 0.2 0 0.2 qv1 0 2 4 6 8 10 0.2 0.1 0 qv2 0 2 4 6 8 10 1 0.5 0 Time [sec]qv3Figure6:Quaternionrotationerror. 0 2 4 6 8 10 0 1 2 q0 0 2 4 6 8 10 0.5 0 0.5 qv1 0 2 4 6 8 10 0.2 0 0.2 qv2 0 2 4 6 8 10 0 0.5 1 Time [sec]qv3Figure6:Quaternionrotationerrorforcomparisonwithdi erentsign.

PAGE 129

115 118 120 122 124 126 128 130 132 134 136 118 120 122 124 126 128 130 132 134 136 138 p1(0) p2(0) p3(0) p4(0) ui [pixel]vi [pixel] Figure6:Image-spaceerrorinpixlesbetween ( ) and .Inthe gure,O denotestheinitialpositionsofthe4featurepointsintheimage,and*denotes thecorresponding nalpositionsofthefeaturepoints. 115 120 125 130 135 140 110 120 130 140 0 2 4 6 8 10 ui [pixel] p4(0) p1(0) p3(0) p2(0) vi [pixel]Time [sec]Figure6:Image-spaceerrorinpixlesbetween ( ) and shownina3Dgraph. Inthe gure,Odenotestheinitialpositionsofthe4featurepointsintheimage, and*denotesthecorresponding nalpositionsofthefeaturepoints.

PAGE 130

116 0 2 4 6 8 10 1 0.5 0 vc1 [m.s 1] 0 2 4 6 8 10 1 0.5 0 vc2 [m.s 1] 0 2 4 6 8 10 2 1 0 1 Time [sec]vc3 [m.s 1]Figure6:Linearcameravelocitycontrolinput. 0 2 4 6 8 10 0.5 0 0.5 1 c1 [rad.s 1] 0 2 4 6 8 10 0 0.5 1 c2 [rad.s 1] 0 2 4 6 8 10 0 2 4 6 Time [sec]c3 [rad.s 1]Figure6:Angularcameravelocitycontrolinput.

PAGE 131

CHAPTER7 COMBINEDROBUSTANDADAPTIVEHOMOGRAPHY-BASEDVISUAL SERVOCONTROLVIAANUNCALIBRATEDCAMERA 7.1Introduction Inthischapter,anewcombinedrobustandadaptivevisualservocontroller isdevelopedtoasymptoticallyregulatethefeaturepointsofarigid-bodyobject (identi edaplanarpatchoffeaturepoints)inanimagetothedesiredfeature pointlocationswhilealsoregulatingthesixDOFposeofthecamera(whichis a xedtotheobject).Thesedualobjectivesareachievedbyusingahomographybasedapproachthatexploitsbothimage-spaceandreconstructedEuclidean informationinthefeedbackloop.Incomparisontopureimage-basedfeedback approaches,someadvantagesofusingahomography-basedmethodinclude: realizableEuclideancameratrajectories(seeChaumette[48]andCorkeand Hutchinson[25]foradiscussionofChaume ttesConundrum);anonsingularimageJacobian;andboththecamerapositionandorientationandthefeaturepoint coordinatesareincludedintheerrorsystem.Sincesomeimage-spaceinformationis usedinthefeedback-loopofthedevelopedhomography-basedcontroller,theimage featuresarelesslikelytoleavetheFOVincomparisonwithpureposition-based approaches.Thedevelopedcontrolleriscomposedofthesameadaptivetranslation controllerasinthepreliminaryresultsinChenetal.[89]andanewrobustrotation controller.Thecontributionoftheresultisthedevelopmentoftherobustangular velocitycontrollerthataccommodatesforthetime-varyinguncertainscalingfactor byexploitingtheuppertriangularformoftherotationerrorsystemandthefact thatthediagonalelementsofthecameracalibrationmatrixarepositive. 117

PAGE 132

118 7.2CameraGeometryandAssumptions ThecamerageometryforthischapterisshowninFigure22andthecorrespondingEuclideanandimage-spacerelationshipsaredevelopedinSections2.1and 2.2.Forconvenienceinthefollowingdevelopment,thecameracalibrationmatrix isrewrittenas cot 00 sin 0001 = 1112130 2223001 (71) Basedonthephysicalmeaningoftheelementsof ,thediagonalcalibration elementsarepositive(i.e., 1122 0 ). Thefollowingtwoassumptionsaremadefortheconvenienceofthecontroller development.Theyaresoreasonablesuchthattheycanbeconsideredpropertiesof theconsideredvisionsystem. Assumption1 :Theboundsof 11and 22areassumedtobeknownas 1111 11 2222 22 (72) Theabsolutevaluesof 121323areupperboundedas|12| 12|13| 13|23| 23 (73) In(7)and(7), 11 11 22 22 12 13and 23areknownpositive constants. Assumption2 :ThereferenceplaneiswithinthecamerasFOVandnotat in nity.Thatis,thereexistpositiveconstants and suchthat ( ) (74)

PAGE 133

119 Basedon(2)-(2),thehomographyrelationshipbasedonmeasurablepixel coordinatesis: = 1 (75) Since isunknown,standardhomographycomputationanddecomposition algorithmscantbeappliedtoextracttherotationandtranslationfromthe homography. AsstatedinMalisandChaumette[21],ifsomeadditionalinformationis known,suchasfourvanishingpoints,therotationmatrixcanbeobtained.Forthe vanishingpoints(seeAlmansaetal.[103]foradescriptionofhowtodetermine vanishingpointsinanimage), =,sothat = + = (76) Basedon(76),therelationshipin(75)canbeexpressedas = (77) where ( )R3 3isde nedas = 1 (78) Forthefourvanishingpoints,twelvelinearequationscanbeobtainedbasedon (7).Afternormalizing ( ) byonenonzeroelement(e.g., 33( )R which isassumedtobethethirdrowthirdcolumnelementof ( ) withoutlossof generality)twelveequationscanbeusedtosolvefortwelveunknowns.Thetwelve unknownsaregivenbytheeightunknownelementsofthenormalized ( ) ,denoted by ( )R3 3de nedas 33(79)

PAGE 134

120 andthefourunknownsaregivenby 33( ) ( ) .Fromthede nitionof ( ) in (7),thefactthat det =det( )det( )det( 1)=1 (7) canbeusedtoconcludethat 3 33det =1 (7) andhencebasedon(7)and(7), = 3p det( ) (7) After ( ) isobtained,theoriginalfourfeaturepointsonthereferenceplanecan beusedtodeterminethedepthratio ( ) asshowninAppendixE. 7.3Open-LoopErrorSystem 7.3.1RotationErrorSystem Iftherotationmatrix ( ) introducedin(24)wereknown,thenthecorrespondingunitquaternion ( ) 0( ) ( ) canbecalculatedusingthe numericallyrobustmethodpresentedpresentedinSection2.3.Given ( ) ,the quaternion ( ) canalsobewrittenas 0= 1 2 p 1+ ( ) (7) = 1 2 p 3 ( ) (7) where ( )R3isauniteigenvectorof ( ) withrespecttotheeigenvalue 1 .The open-looprotationerrorsystemfor ( ) canbeobtainedas(seeDixonetal.[23]) 0 = 1 2 03+ (7) where ( )R3de nestheangularvelocityofthecameraexpressedinF.

PAGE 135

121 Thequaternion ( ) givenin(7)-(7)isnotmeasurablesince ( ) is unknown.However,since ( ) canbedeterminedasdescribedin(712),thesame algorithmasshowninequations(7)and(7)canbeusedtodeterminea correspondingmeasurablequaternion 0( ) ( ) as 0= 1 2 q 1+ ( ) (7) = 1 2 q 3 ( ) (7) where ( )R3isauniteigenvectorof ( ) withrespecttotheeigenvalue 1 Basedon(78), = ( 1)= ( ) ,where () denotesthetrace ofamatrix.Since ( ) and ( ) aresimilarmatrices,therelationshipbetween 0( ) ( ) and 0( ) ( ) canbedeterminedas 0= 0 =kk kk, (7) where ( )R isapositive,unknown,time-varyingscalarthatsatis esthe followinginequalities(seeAppendixF) ( ) (7) where R arepositiveboundingconstants.Theinverseoftherelationship between ( ) and ( ) in(7)canbedevelopedas = 1 1 = 1 1 11 112 1122 2 13 111223 1122 31 22 223 22 3 3 (7) 7.3.2TranslationErrorSystem Thetranslationerror,denotedby ( )R3,isde nedas ( )= ( ) (7)

PAGE 136

122 where ( ) R3arede nedas = ln( ) = 0 (7) where {1 4} Thetranslationerror ( ) ismeasurablesincethe rst twoelementsareimagecoordinates,and ( ) isobtainedfromthehomography decompositionasdescribedinAppendixA.Theopen-looptranslationerrorsystem canbeobtainedbytakingthetimederivativeof ( ) andmultiplyingtheresulting expressionby as[89] =+ 1 (7) where ( )R3de nesthelinearvelocityofthecameraexpressedinF,and ( )R3 3isde nedas = 1112130 2223001 Tofacilitatethecontroldevelopment,thetranslationerrorsystemcanbelinearly parameterizedas 1 2 3 = 11 1+ 12 2+ 3( 13) 22 2+ 3( 23) 3 + 1( )_ 2( )_ 3( )_ (7) where ()R1 =1 2 3 areknownregressorvectorsthatdonotdepend onthecalibrationparameters,and Risavectorofconstantunknown parameters.

PAGE 137

123 7.4ControlDevelopment 7.4.1RotationControlDevelopmentandStabilityAnalysis Basedontherelationshipin(7),theopen-looperrorsystemin(715),and thesubsequentstabilityanalysis,therotationcontrollerisdesignedas 1= 1 1=( 11+2) 1(7) 2= 2 2=( 21+ 22+1) 2 3= 3 3=( 31+ 32+ 33) 3 where R =1 2 3 and R =1 2 3 arepositive constants.Theexpressedformofthecontrollerin(725)ismotivatedbytheuseof completingthesquaresinthesubsequentstabilityanalysis.In(725),thedamping controlgains 21 31 32areselectedaccordingtothefollowingsu cient conditionstofacilitatethesubsequentstabilityanalysis 21 1 4 2 1 2 12 11 22(7) 31 1 4 2 11 11 12 23 22+ 13!2 32 1 4 2 2 2 23 22 where 11 11 22 22 12 13and 23arede nedin(7)and(7),and 11, 22, 33arefeedbackgainsthatcanbeselectedtoadjusttheperformanceof therotationcontrolsystem. Proposition7.1 :Providedthesu cientgainconditionsgivenin(76)are satis ed,thecontrollerin(7)ensuresasymptoticregulationoftherotationerror inthesensethatk( )k 0 as (7)

PAGE 138

124 Proof :Let 1( 0)R denotethefollowingnon-negativefunction: 1, +(10)2 (7) Basedontheopen-looperrorsystemin(715),thetime-derivativeof 1( ) canbe determinedas 1=2 2(10) 0= = 1 1+ 2 2+ 3 3 (7) Aftersubstituting(7)for ( ) andsubstituting(7)for ( ) ,theexpression in(7)canbesimpli edas 1= 111 11 2 1+ 221 22 2 2+ 33 2 31 11 2 1 112 22 1 2+ 2111 22 2 2 (7)1 11 2 1+ 1 1223 2213 1 3+ 3111 2 31 22 2 2 223 2 3+ 3222 2 3 Aftercompletingthesquaresoneachofthebracketedtermsin(70),theexpressionin(7)canbewrittenas 1= 111 11 2 1+ 221 22 2 2+ 33 2 31 11" 1 112 2 22 22+ 11 22 211 4 2 12 12 1122 2 2#1 11" 1+ 1 2 1 1223 2213 32+ 11 311 4 2 11 11 1223 22132! 2 3#1 22" 21 2 223 32+ 22 321 4 2 22 23 22 2 3# (7)

PAGE 139

125 Providedthesu cientgainconditionsgivenin(7)aresatis ed,then(7)can beupperboundedas 1 111 11 2 1+ 221 22 2 2+ 33 2 3 (7) Basedon(7),theinequalityin(7)canbefurtherupperboundedas 11 111 11 2 1+ 221 22 2 2+ 33 2 3 (7) TheLyapunovfunctiongivenin(7)anditstimederivativein(7)canbe usedtoconcludethat ( ) 0( ) Land ( ) L2(ofcourse, ( ) 0( ) Lbyde nitionalso).Theexpressionsin(718)and(720),andthefactthat ( ) L2,canbeusedtoconcludethat ( ) L2 Since ( ) 0( ) L, then ( ) ( ) ( ) and 0( ) L.Hence,(75)canbeusedtoconcludethat ( ) L.Basedontherotationerrorsystemin(7), ( ) 0( ) L;hence, ( ) 0( ) areuniformlycontinuous.Barbalatslemma[96]cannowbeusedto concludethatk( )k 0 as 7.4.2TranslationControlDevelopmentandStabilityAnalysis Forcompletenessoftheresult,thesametranslationcontrollerasinChenet al.[89]isprovided.Aftersomealgebraicmanipulation,thetranslationerrorsystem in(724)canberewrittenas 11 1= 1+ 1( 2 3) 1(7) 22 2= 2+ 2( 3) 2 3= 3+ 3( ) 3 where 1R1, 2R2,and 3R3arevectorsofconstantunknown parameters,andtheknownregressorvectors 1()R1 1, 2()R1 2,and

PAGE 140

126 3()R1 3satisfythefollowingequations(seeAppendixG): 11=12 11 2( 13) 11 3+ 1( ) 1122=23 22 3+ 2( ) 2233= 3( ) Thecontrolstrategyistodesign 3( ) tostabilize 3( ) ,andthendesign 2( ) to stabilize 2( ) given 3( ) ,andthendesign 1( ) tostabilize 1( ) given 3( ) and 2( ) .Followingthisdesignstrategy,thetranslationcontroller ( ) isdesigned as[89] 3= 1 33+ 3( ) 3 (7) 2= 1 22+ 2( 3) 2 1= 1 11+ 1( 2 3) 1 wherethedepthratio ( ) 0 .In(7), 1( )R1 2( )R2 3( )R3denoteadaptiveestimatesthataredesignedaccordingtothefollowingadaptive updatelawstocanceltherespectivetermsinthesubsequentstabilityanalysis 1= 1 11 2= 2 22 3= 3 33 (7) where 1R1 1 2R2 2 3R3 3arediagonalmatricesofpositive constantadaptationgains.Basedon(7)and(7),theclosed-looptranslation errorsystemis 11 1= 11+ 1( 2 3) 1(7) 22 2= 22+ 2( 3) 2 3= 33+ 3( ) 3

PAGE 141

127 where 1( )R1 2( )R2 3( )R3denotetheintrinsiccalibration parametermismatchde nedas 1( )= 1 1( ) 2( )= 2 2( ) 3( )= 3 3( ) Proposition7.2 :Thecontrollergivenin(735)alongwiththeadaptive updatelawin(736)ensuresasymptoticregulationofthetranslationerrorsystem inthesensethatk ( )k 0 as Proof :Let 2( 1 2 3)R denotethefollowingnon-negativefunction: 2= 1 2 112 1+ 1 2 222 2+ 1 2 2 3+ 1 2 1 1 1 1+ 1 2 2 1 2 2+ 1 2 3 1 3 3 (7) Aftertakingthetimederivativeof(7)andsubstitutingfortheclosed-looperror systemdevelopedin(7),thefollowingsimpli edexpressioncanbeobtained: 2= 12 1 22 2 32 3 (7) Basedon(7)and(7), 1( ) 2( ) 3( ) L L2,and ( ) ( ) = 1 2 3 L.Basedontheassumptionthat ( ) ,theexpression in(7)canbeusedtoconcludethat ( ) L.Basedontheprevious stabilityanalysisfortherotationcontroller, ( ) L;hence,(77)canbe usedtoconcludethat 1( ) 2( ) 3( ) L(i.e., 1( ) 2( ) 3( ) areuniformly continuous).Barbalatslemma[96]cannowbeusedtoshowthat 1( ) 2( ) 3( )0 as BasedonPropositions7.1and7.2,themainresultcanbestatedasfollows. Theorem7.1 :Thecontrollergivenin(7)and(7)alongwiththeadaptiveupdatelawin(7)ensuresasymptot ictranslationandrotationregulationin thesensethatk( )k 0 andk ( )k 0 as

PAGE 142

128 providedthecontrolgainssatisfythesu cientconditionsgivenin(76). Proof :SeeproofsinPropositions7.1and7.2. 7.5SimulationResults Numericalsimulationswereperformedtoillustratetheperformanceofthe controllergivenin(7)and(7)andtheadaptivelawgivenin(7).The intrinsiccameracalibrationmatrixisgivenby = 122 53 77100 0122 56100 001 Thecameraisassumedtoviewanobjectwithfourcoplanarfeaturepointswith thefollowingEuclideancoordinates(in[m]): 1= 0 10 10 2= 0 10 10 (7) 3= 0 10 10 4= 0 10 10 Thenormalizedcoordinatesofthevanishingpointswereselectedas 0 010 011 0 010 011 0 010 011 0 010 011

PAGE 143

129 Theinitial(i.e., (0) )anddesired(i.e., )image-spacecoordinatesofthefour featurepointsin(7)werecomputedas(inpixels) 1(0)= 128 49129 411 2(0)= 128 80119 611 3(0)= 119 00119 611 4(0)= 118 70129 411 1= 136 86138 681 2= 138 07132 171 3= 130 97131 331 4= 129 84137 821 Theinitial(i.e., (0) )anddesired(i.e., )image-spacecoordinatesofthefour vanishingpointsin(7)werecomputedas(inpixels) 1(0)= 91 84123 681 2(0)= 91 70121 161 3(0)= 89 20121 411 4(0)= 89 34123 941 1= 101 19101 231 2= 101 2698 771 3= 98 8198 771 4= 98 74101 231 Theactualvalueoftheunknownparametervectors 1, 2,and 3aregivenby 1= 0 00820 03080 81633 51100 10822 33860 0234 0 02340 00022 86480 02860 000200 0241 0 02340 00072 41170 000900 0910

PAGE 144

130 2= 0 00820 81593 51102 33750 02340 00020 000200 0241 0 02870 00092 95440 02340 00072 4106 3= 2 86480 02860 02870 00092 9544 Inthesimulation,theinitialvalueoftheestimatedparametervectors 1(0) 2(0) and 3(0) wereselectedashalfoftheactualvalue. Thediagonalcontrolgainmatrices in(7)and in(7)were selectedas = {4 2 85}= {5 5 5} Thediagonalgainmatrices 1, 2,and 3intheadaptivelaw(736)wereselected as 1=0 03{0 01 1 1 1 1 1 0 01 0 01 0 0001 1 0 01 0 0001 0 0001 0 01 0 01 0 01 1 0 01 0 01 1}2=0 03{0 01 1 1 1 0 01 0 0001 0 01 0 01 1 0 0001 0 0001 0 01 0 01 0 01 1}3=0 5{0 1 0 1 10 0 1 10} TheresultingasymptotictranslationandrotationerrorsareplottedinFigure 71andFigure72,respectively.Thepixelcoordinateofthefourfeaturepoints isshowninFigure73.Theimage-spacepixelerror(i.e., ( ) ) isshownin Figure7.Theimage-spacetrajectoryofthefeaturepointsisshowninFigure 7,andalsoinFigure7inathree-dimensionalformat,wheretheverticalaxis istime.ThetranslationandrotationcontroloutputsareshowninFigure77and Figure7,respectively.

PAGE 145

131 0 0.2 0.4 0.6 0.8 1 10 5 0 5 e1 0 0.2 0.4 0.6 0.8 1 10 5 0 5 e2 0 2 4 6 8 10 0.5 0 0.5 1 Time [sec]e3Figure7:Unitlesstranslationerror ( ) 0 1 2 3 4 5 0.99 1 1.01 q0 0 1 2 3 4 5 0.1 0 0.1 qv1 0 1 2 3 4 5 0.1 0 0.1 qv2 0 1 2 3 4 5 0 0.005 0.01 Time [sec]qv3Figure7:Quaternionrotationerror ( ) .

PAGE 146

132 0 2 4 6 8 10 110 120 130 140 150 u [pixel] 0 2 4 6 8 10 110 120 130 140 150 Time [sec]v [pixel]Figure7:Pixelcoordinate ( ) (inpixels)ofthecurrentposeofthefourfeaturepointsinthesimulation.Theupper gureisforthe ( ) componentandthe bottom gureisforthe ( ) component. 0 2 4 6 8 10 15 10 5 0 5 u u* [pixel] 0 2 4 6 8 10 15 10 5 0 5 Time [sec]v v* [pixel]Figure7:Regulationerror ( )(inpixels)ofthefourfeaturepointsinthe simulation.Theupper gureisforthe ( )( ) componentandthebottom gureisforthe ( )( ) component.

PAGE 147

133 115 120 125 130 135 140 145 115 120 125 130 135 140 145 p1(0) p2(0) p3(0) p4(0) u [pixel]v [pixel] Figure7:Image-spaceerrorinpixlesbetween ( ) and .Inthe gure,Odenotestheinitialpositionsofthe4featurepointsintheimage,and*denotesthe corresponding nalpositionsofthefeaturepoints. 110 120 130 140 150 100 120 140 160 0 2 4 6 8 10 u [pixel] p2(0) p1(0) p3(0) p4(0) v [pixel]Time [sec]Figure7:Image-spaceerrorinpixlesbetween ( ) and shownina3Dgraph. Inthe gure,Odenotestheinitialpositionsofthe4featurepointsintheimage, and*denotesthecorresponding nalpositionsofthefeaturepoints.

PAGE 148

134 0 1 2 3 4 5 20 0 20 vc1 [m.s 1] 0 1 2 3 4 5 20 0 20 vc2 [m.s 1] 0 1 2 3 4 5 20 0 20 Time [sec]vc3 [m.s 1]Figure7:Linearcameravelocitycontrolinput ( ) 0 1 2 3 4 5 0.5 0 0.5 c1 [rad.s 1] 0 1 2 3 4 5 0.2 0 0.2 c2 [rad.s 1] 0 1 2 3 4 5 0.4 0.2 0 Time [sec]c3 [rad.s 1]Figure7:Angularcameravelocitycontrolinput ( ) .

PAGE 149

CHAPTER8 CONCLUSIONS Inthisdissertation,visualservocontrolalgorithmsandarchitecturesare developedthatexploitthevisualfeedbackfromacamerasystemtoachievea trackingorregulationcontrolobjectiveforarigid-bodyobject(e.g.,theende ectorofarobotmanipulator,asatellite,anautonomousvehicle)identi edbya patchoffeaturepoints.Thesealgorithmsandarchitecturescanbeusedwidelyin thenavigationandcontrolapplicationsinroboticsandautonomoussystems.The visualservocontrolprobleminthisdissertationwereseparatedinto veparts:1) visualservotrackingcontrolviaaquaternionformulation;2)collaborativevisual servotrackingcontrolusingadaisy-chainingapproach;3)visualservotracking controlusingacentralcatadioptriccamera;4)robustvisualservocontrolin presenceofcameracalibrationuncertainty;and5)combinedrobustandadaptive visualservocontrolviaanuncalibratedcamera. Anadaptivevisualservotrackingcontrolmethodviaaquaternionformulation is rstdevelopedthatachievesasymptotictrackingofarigid-bodyobjecttoa desiredtrajectorydeterminedbyasequenceofimages.Bydevelopingtheerror systemsandcontrollersbasedonahomographydecomposition,thesingularity associatedwiththetypicalimage-Jacobianiseliminated.Byutilizingthequaternionformulation,asingularity-freeerro rsystemisobtained.Ahomography-based rotationandtranslationcontrollerisproventoyieldthetrackingresultthrough aLyapunov-basedstabilityanalysis.Basedontheresultforthecamera-in-hand con gurationproblem,acamera-to-handextensionisgiventoenablearigidbodyobjecttotrackadesiredtrajectory.Simulationandexperimentsresultsare providedtoshowtheperformanceoftheproposedvisualservocontrollers. 135

PAGE 150

136 InordertoenablelargeareamotionandweakentheFOVrestriction,a collaborativevisualservomethodisthendevelopedtoenablethecontrolobjectto trackadesiredtrajectory.Incontrasttotypicalcamera-to-handandcamera-inhandvisualservocontrolcon gurations,theproposedcontrollerisdevelopedusing amovingon-boardcameraviewingamovingobjecttoobtainfeedbacksignals. Adaisy-chainingmethodisusedtodevelophomographyrelationshipbetween di erentcameracoordinateframesandfeaturepointpatchcoordinateframesto facilitatecollaborationbetweendi erentagents(e.g.,cameras,referenceobjects, andcontrolobject).Lyapunov-basedmethodsareusedtoprovetheasymptotic trackingresult.Simulationresultsareprovidedtoshowtheperformanceof theproposedvisualservocontroller.ToenlargetheFOV,analternativevisual servocontrollerisdevelopedthatyieldsanasymptotictrackingresultusinga centralcatadioptriccamera.ApanoramicFOVisobtainedbyusingthecentral catadioptriccamera. Tostudytherobustvisualservocontrolproblem,avisualservocontroller isdevelopedtoregulateacamera(attachedtoarigid-bodyobject)toadesired poseinpresenceofintrinsiccameracalibrationuncertainties.Aquaternion-based estimatefortherotationerrorsystemisdevelopedthatisrelatedtotheactual rotationerror.Thesimilarityrelationshipbetweentheestimatedandactual rotationmatricesisusedtoconstructtherelationshipbetweentheestimatedand actualquaternions.ALyapunov-basedstabilityanalysisisprovidedthatindicates auniquecontrollercanbedevelopedtoachievetheregulationresultdespiteasign ambiguityinthedevelopedquaternionestimate.Simulationresultsareprovidedto illustratetheperformanceofthedevelopedcontroller. Anewcombinedrobustandadaptivevisualservocontrolleristhendeveloped toasymptoticallyregulatethefeaturepointsinanimagetothedesiredfeature pointlocationswhilealsoregulatingthesixDOFposeofthecamera.Thesedual

PAGE 151

137 objectivesareachievedbyusingahomography-basedapproachthatexploitsboth image-spaceandreconstructedEuclideaninformationinthefeedbackloop.In comparisontopureimage-basedfeedbackapproaches,someadvantagesofusing ahomography-basedmethodinclude:realizableEuclideancameratrajectories; anonsingularimage-Jacobian;andboththecameraposeandthefeaturepoint coordinatesareincludedintheerrorsystem.Sincesomeimage-spaceinformationis usedinthefeedback-loopofthedevelopedhomography-basedcontroller,theimage featuresarelesslikelytoleavetheFOVincomparisonwithpureposition-based approaches.Therobustrotationcontrollerthataccommodatesforthetimevaryinguncertainscalingfactorisdevelopedbyexploitingtheuppertriangular formoftherotationerrorsystemandthefactthatthediagonalelementsofthe cameracalibrationmatrixarepositive.Theadaptivetranslationcontrollerthat compensatesfortheconstantunknownparametersinthetranslationerrorsystem isdevelopedbyacertainty-equivalence-basedadaptivecontrolmethodanda nonlinearLyapunov-baseddesignapproach.Simulationresultsareprovidedto showtheperformanceoftheproposedvisualservocontroller.

PAGE 152

APPENDIXA UNITNORMPROPERTYFORTHEQUATERNIONERROR Property :Thequaternionerror ( 0( ) ( ))de nedin(35)hasaunit normgiventhat ( ) and ( ) aretwounitquaternions. Proof :Thequaternioncomponentsin(3)canbeexpandedas 0= 00 + 1 2 3 1 2 3 = 1 1+ 2 2+ 3 3+ 00 (A1) and = 0 1 2 3 0 1 2 3 + 0 3 2 30 1 2 10 1 2 3 (A2) = 0 1+ 2 3 3 2 10 0 2 1 3+ 3 1 20 0 3+ 1 2 2 1 30 Basedon(A)and(A), 2 0+ =( 1 1+ 2 2+ 3 3+ 00 )2+ 0 1+ 2 3 3 2 10 0 2 1 3+ 3 1 20 0 3+ 1 2 2 1 30 0 1+ 2 3 3 2 10 0 2 1 3+ 3 1 20 0 3+ 1 2 2 1 30 138

PAGE 153

139 Expandingtherightsideoftheaboveequationgives 2 0+ =( 1 1+ 2 2+ 3 3+ 00 )2+( 0 1+ 2 3 3 2 10 )2+( 0 2 1 3+ 3 1 20 )2+( 0 3+ 1 2 2 1 30 )2= 2 02 1+ 2 02 2+ 2 12 1+ 2 02 3+ 2 12 2+ 2 22 1+ 2 12 3+ 2 22 2+ 2 32 1+ 2 22 3+ 2 32 2+ 2 32 3+ 2 02 0 + 2 12 0 + 2 22 0 + 2 32 0 =( 2 0+ 2 1+ 2 2+ 2 3)( 2 0 + 2 1+ 2 2+ 2 3) Basedonthefactthat ( ) and ( ) aretwounitquaternions(i.e.,theirnormsare equalto 1 ), 2 0+ =1

PAGE 154

APPENDIXB ONEPROPERTYOFUNITQUATERNIONS Property : ( 3+ ) 1= Proof :Theterm 3 canbeexpandedas 3 = 100 010 001 0 3 2 30 1 2 10 = 1 3 2 31 1 2 11 (B) Takingtheinverseoftheexpandedmatrixin(B)gives 3 1= 1 3 2 31 1 2 11 1= 1 2 1+ 2 2+ 2 3+1 2 1+1 3+ 1 2 2+ 1 3 3+ 1 22 2+1 1+ 2 3 2+ 1 3 1+ 2 32 3+1 (B) Multiplying onbothsidesof(B)gives 3 1= 3( 2+ 1 3)+ 2( 3+ 1 2)+ 1( 2 1+1) 1( 3+ 1 2)+ 3( 1+ 2 3)+ 2( 2 2+1) 2( 1+ 2 3)+ 1( 2+ 1 3)+ 3( 2 3+1) 2 1+ 2 2+ 2 3+1 = 140

PAGE 155

APPENDIXC OPEN-LOOPTRANSLATIONERRORSYSTEM Thetranslationerrorwasde nedas = where ( ) and ( ) aretheextendednormalizedcoordinatesof 0( ) and ( ) ,respectively,andwerede nedas = 0 00 0ln( 0 ) = ln( ) Di erentiating ( ) gives = 1 00 0 (C1) where 0= 100 0010 0001 Thederivativeof 0( ) isgivenby 0= 0 0 0 = 0 + (C2) where ( ) and ( ) arethelinearandangularvelocitiesof withrespecttoIexpressedinF.Basedon(C)and(C2), ( ) canbefurtherwrittenas = 1 000 + (C3) 141

PAGE 156

142 From(C),thetranslationerrorsystemcanbeobtainedas = 1 000 + anditcanbefurtherwrittenas = 000 +

PAGE 157

APPENDIXD PROPERTYONMATRIXNORM Property : [ ] 2=kk R3 Proof :Thespectralnormkk2ofarealmatrixisthesquarerootofthe largesteigenvalueofthematrixmultipliedbyitstranspose,i.e.,kk2= p max{} Foranygivenvector R3, [ ]= 032301210 TheEuclideannormof [ ]isgivenby [ ] 2= r maxn [ ][ ]o = v u u u u u u u t max 2 2+ 2 31213122 1+ 2 32313232 1+ 2 2 = q 2 1+ 2 2+ 2 3 Thenormof isgivenbykk= q 2 1+ 2 2+ 2 3 Hence, [ ] 2=kk. 143

PAGE 158

APPENDIXE COMPUTATIONOFDEPTHRATIOS Basedon(75)and(7),thefollowingexpressioncanbeobtained: = + (E1) where ( ) isde nedin(7),and ( )R3and ( )R3arede nedas = = 1 Usingthefourcorrespondingreferencefeaturepoints,theexpressionin(E)can bewrittenas = = 33 (E2) where 33( )R isthe(assumedw.l.o.g.)positivethirdrowthirdcolumnelement of ( ) ,and ( )R3 3isde nedas ( ) 33( ) .Basedon(E),twelvelinear equationscanbeobtainedforthefourcorrespondingreferencefeaturepoints.Aset oftwelvelinearequationscanthenbedevelopedtosolvefor ( ) 33( ) and ( ) Todeterminethescalar 33( ) ,thefollowingequationcanbeused: 33= + (E3) providedthat ( ) isobtainedusingthefourvanishingpoints,where = 11 12 13 21 22 23 31 32 33 = 11 12 13 21 22 23 31 321 = 1 2 3 = 1 2 3 144

PAGE 159

145 Tothisend,let 1= 1 12= 1 23= 1 34= 2 15= 2 26= 2 37= 3 18= 3 29= 3 31= 2 12= 3 1 thenthefollowingrelationshipscanbeobtained: 2= 115= 148= 173= 216= 249= 27 Theelementsofthematrix 33( ) ( ) in(E)canberewrittenas 11+ 1= 33 11 12+ 11= 33 12(E4) 13+ 21= 33 13 21+ 4= 33 21 22+ 14= 33 22(E5) 23+ 24= 33 23 31+ 7= 33 31 32+ 17= 33 32(E6) 33+ 27= 33 Theexpressionsin(E4)canbecombinedas 12+ 11= 12 11( 11+ 1) (E7) 13+ 21= 13 11( 11+ 1)

PAGE 160

146 Similarly,theexpressionsin(E5)canbecombinedas 22+ 14= 22 21( 21+ 4) (E8) 23+ 24= 23 21( 21+ 4) Since ( ) ( ) {1 2}and 13( ) 23( ) 13( ) and 23( ) are known,(E7)and(E)providefourequationsthatcanbesolvedtodeterminethe fourunknowns(i.e., 1, 2, 1( ) 4( ) ).Aftersolvingforthesefourunknowns, (E)and(E)canbeusedtosolvefor 33( ) ,andthen ( ) canbeobtained. Given ( ) ,(E2)canbeusedtosolvefor ( ) .

PAGE 161

APPENDIXF INEQUALITYDEVELOPMENT Property :Thereexisttwopositiveconstants and suchthatthescaling factor ( ) satis estheinequality ( ) (F1) Proof :Since =kk kk thesquareof ( ) isgivenby 2= ( )= .(F2) Basedonthefactthat isoffullrank,thesymmetricmatrix ispositive de nite.Hence,theRayleigh-Ritztheoremcanbeusedtoconcludethat min( ) max( ) (F3) where min( ) and max( ) denotetheminimalandmaximaleigenvaluesof ,respectively.From(F)and(F),itcanconcludedthat 1 max( )2= 1 min( ) s 1 max( )s 1 min( ) 147

PAGE 162

APPENDIXG LINEARPARAMETERIZATIONOFTRANSLATIONERRORSYSTEM Theregressorvectors 1()R1 202()R1 153()R1 5andthe constantunknownparametervectors 1R20 12R15 13R5 1aregivenby 1= 3 2 3 2 1 1 1 1 1 3 32 2 2 2 3 3 3 2 2 2 1= 1 11 12 11 13 11 1 12 111323 112213 112223 11221 112223 221 221 2 1112 2 112212231322 2 112212 2 112 12 2 11222 1223121322 2 112213 2 111213 2 11221213232 1322 2 11222= 3 3 1 12 12 1 3 3 32 2 2 2 2 2 2 2= 1 22 23 22 1 2 23 2 2223 2 221 2 221 1112 112212231322 11221 112212 112 2212231322 112 2223 11221223 112 22122 23132223 112 223= 2 2 2 1 1 3= 1 1112 112212231322 11221 2223 22 148

PAGE 163

REFERENCES [1]S.Hutchinson,G.Hager,andP.Corke,Atutorialonvisualservocontrol, IEEETrans.Robot.Automat. ,vol.12,no.5,pp.651,1996. [2]F.ChaumetteandS.Hutchinson,VisualservocontrolpartI:Basicapproaches, IEEERobot.Automat.Mag. ,vol.13,no.4,pp.82,2006. [3],VisualservocontrolpartII:Advancedapproaches, IEEERobot. Automat.Mag. ,vol.14,no.1,pp.109,2006. [4]G.Hu,N.Gans,andW.E.Dixon, ComplexityandNonlinearityinAutonomousRobotics,EncyclopediaofComplexityandSystemScience Springer,toappear2008,ch.AdaptiveVisualServoControl. [5]N.J.CowanandD.Koditschek,Planarimage-basedvisualservoingasa navigationproblem,in Proc.IEEEInt.Conf.Robot.Automat. ,2000,pp. 1720. [6]N.J.Cowan,J.D.Weingarten,andD.E.Koditschek,Visualservoingvia navigationfunctions, IEEETrans.Robot.Automat. ,vol.18,pp.521, 2002. [7]Y.MezouarandF.Chaumette,Pathplanningforrobustimage-based control, IEEETrans.Robot.Automat. ,vol.18,no.4,pp.534,2002. [8]E.RimonandD.E.Koditschek,Exactrobotnavigationusingarti cial potentialfunctions, IEEETrans.Robot.Automat. ,vol.8,pp.501,1992. [9]A.Ruf,M.Tonko,R.Horaud,andH.-H.Nagel,Visualtrackingofanende ectorbyadaptivekinematicprediction,in Proc.IEEE/RSJInt.Conf. Intell.RobotsSyst. ,1997,pp.893. [10]J.Chen,D.M.Dawson,W.E.Dixon,andA.Behal,Adaptivehomographybasedvisualservotrackingfora xedcameracon gurationwithacamera-inhandextension, IEEETrans.Contr.Syst.Technol. ,vol.13,no.5,pp.814 825,2005. [11]M.SpongandM.Vidyasagar, RobotDynamicsandControl .NewYork: JohnWiley&SonsInc.,1989. [12]O.Faugeras, Three-DimensionalComputerVision:AGeometricViewpoint Cambridge,Massachusetts:MITPress,1993. 149

PAGE 164

150 [13]R.HartleyandA.Zisserman, MultipleViewGeometryinComputerVision NewYork:NY:CambridgeUniversityPress,2000. [14]G.Hu,W.E.Dixon,S.Gupta,andN.Fitz-coy,Aquaternionformulation forhomography-basedvisualservocontrol,in Proc.IEEEInt.Conf.Robot. Automat. ,2006,pp.2391. [15]M.Shuster,Asurveyofattituderepresentations, J.AstronauticalSciences vol.41,no.4,pp.439,1993. [16]G.Hu,S.Mehta,N.Gans,andW.E.Dixon,Daisychainingbasedvisual servocontrolpartI:Adaptivequaternion-basedtrackingcontrol,in IEEE Multi-Conf.Syst.Control ,2007,pp.1474. [17]G.Hu,N.Gans,S.Mehta,andW.E.Dixon,Daisychainingbasedvisual servocontrolpartII:Extensions,applicationsandopenproblems,in IEEE Multi-Conf.Syst.Control ,2007,pp.729. [18]S.Mehta,W.E.Dixon,D.MacArthur,andC.D.Crane,Visualservo controlofanunmannedgroundvehicleviaamovingairbornemonocular camera,in Proc.AmericanControlConf. ,2006,pp.5276. [19]S.Mehta,G.Hu,N.Gans,andW.E.Dixon,Adaptivevision-based collaborativetrackingcontrolofanugvviaamovingairbornecamera:A daisychainingapproach,in Proc.IEEEConf.DecisionControl ,2006,pp. 3867. [20]E.Malis,F.Chaumette,andS.Bodet,21/2Dvisualservoing, IEEE Trans.Robot.Automat. ,vol.15,no.2,pp.238250,1999. [21]E.MalisandF.Chaumette,Theoreticalimprovementsinthestability analysisofanewclassofmodel-freevisualservoingmethods, IEEETrans. Robot.Automat. ,vol.18,no.2,pp.176186,2002. [22]Y.Fang,W.E.Dixon,D.M.Dawson,andP.Chawda,Homography-based visualservoingofwheeledmobilerobots, IEEETrans.Syst.,Man,Cybern.PartB:Cybern. ,vol.35,no.5,pp.1041,2005. [23]W.E.Dixon,A.Behal,D.M.Dawson,andS.Nagarkatti, NonlinearControl ofEngineeringSystems:ALyapunov-BasedApproach .BirkhuserBoston, 2003. [24]G.Hu,S.Gupta,N.Fitz-coy,andW.E.Dixon,Lyapunov-basedvisual servotrackingcontrolviaaquaternionformulation,in Proc.IEEEConf. DecisionControl ,2006,pp.3861. [25]P.CorkeandS.Hutchinson,Anewpartitionedapproachtoimage-based visualservocontrol, IEEETrans.Robot.Automat. ,vol.17,no.4,pp. 507,2001.

PAGE 165

151 [26]G.Chesi,K.Hashimoto,D.Prattichizzo,andA.Vicino,Keepingfeaturesin the eldofviewineye-in-handvisualservoing:Aswitchingapproach, IEEE Trans.Robot. ,vol.20,no.5,pp.908,2004. [27]Y.MezouarandF.Chaumette,Optimalcameratrajectorywithimagebased control, Int.J.Robot.Res. ,vol.22,no.10,pp.781,2003. [28]H.ZhangandJ.Ostrowski,Visualmotionplanningformobilerobots, IEEETrans.Robot.Automat. ,vol.18,no.2,pp.199,2002. [29]J.Chen,D.M.Dawson,W.E.Dixon,andV.Chitrakaran,Navigation functionbasedvisualservocontrol, Automatica ,vol.43,pp.1165, 2007,toappear. [30]S.BenhimaneandE.Malis,Vision-basedcontrolwithrespecttoplanar andnonplanarobjectsusingazoomingcamera,in Proc.IEEEInt.Conf. AdvancedRobotics ,2003,pp.991. [31]N.Garcka-Aracil,E.Malis,R.Aracil-Santonja,andC.Perez-Vidal,Continuousvisualservoingdespitethechangesofvisibilityinimagefeatures, IEEE Trans.Robot. ,vol.21,no.6,pp.1214,2005. [32]E.HechtandA.Zadac, Optics ,3rded.Addison-Wesley,1997. [33]C.GeyerandK.Daniilidis,Catadioptricprojectivegeometry, Int.J. ComputerVision ,vol.45,no.3,pp.223,2001. [34]S.BakerandS.Nayar,Atheoryofsingle-viewpointcatadioptricimage formation, Int.J.ComputerVision ,vol.35,no.2,pp.175,1999. [35]J.BarretoandH.Araujo,Geometricpropertiesofcentralcatadioptricline images,in Proc.EuropeanConf.ComputerVision ,2002,pp.237. [36]C.GeyerandK.Daniilidis,Aunifyingtheoryforcentralpanoramicsystems andpracticalimplications,in Proc.EuropeanConf.ComputerVision ,2000, pp.445. [37]R.Y.Tsai,Aversatilecameracalibrationtechniqueforhigh-accuracy3D machinevisionmetrologyusingo -the-shelfTVcamerasandlenses, IEEE J.Robot.Automat. ,vol.3,no.4,pp.323,1987. [38], Synopsisofrecentprogressoncameracalibrationfor3Dmachine vision .Cambridge,MA,USA:MITPress,1989. [39]L.Robert,Cameracalibrationwithoutfeatureextraction, ComputerVision andImageUnderstanding ,vol.63,no.2,pp.314,1996.

PAGE 166

BIOGRAPHICAL SKETCH

Guoqiang Hu received his bachelor's degree in automation engineering from the University of Science and Technology of China (USTC) in 2002, and he received his Master of Philosophy in Automation and Computer-aided Engineering from The Chinese University of Hong Kong (CUHK) in 2004.

Currently, he is pursuing his PhD degree in the nonlinear control and robotics group at the University of Florida (UF) under the supervision of Dr. Warren Dixon.