
A Daisy-Chaining Approach for Vision-Based Control and Estimation

Permanent Link: http://ufdc.ufl.edu/UFE0041119/00001

Material Information

Title: A Daisy-Chaining Approach for Vision-Based Control and Estimation
Physical Description: 1 online resource (127 p.)
Language: english
Creator: Mehta, Siddhartha
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2010

Subjects

Subjects / Keywords: collaborative, control, cooperative, daisy, nonlinear, pegus, ransac, robust, slam, uav, ugv, vision, visual
Mechanical and Aerospace Engineering -- Dissertations, Academic -- UF
Genre: Mechanical Engineering thesis, Ph.D.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, territorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

Notes

Abstract: A DAISY-CHAINING APPROACH FOR VISION-BASED CONTROL AND ESTIMATION The research presented in this dissertation monograph lies within the general scope of guidance, navigation, and control of autonomous systems and centers on the design and analysis of visual servo control strategies and vision-based robust position and orientation (i.e., pose) estimation. The motivation behind the presented research is to enable a vision system to provide robust navigation and control of autonomous agents operating over a large area. To enable vision systems to provide pose estimates over a large area, a new daisy-chaining method is developed. Using concepts based on multi-view geometry, or photogrammetry, relationships are established between the current pose of an agent and the desired agent pose, even when the desired pose is out of the camera field-of-view (FOV). The daisy-chaining method is limited by the need to maintain a single reference object that is contained in both the current view and the final view of the desired pose of the vehicle. To overcome this limitation, the daisy-chaining method is extended to allow multiple reference objects to enter and leave the camera FOV, allowing theoretically infinite daisy-chaining and hence an unrestricted applicative area for a UGV. Error propagation analysis for the daisy-chaining method, which resembles a 'dead-reckoning' scheme, shows the method is susceptible to image noise and feature point outliers. To address the local pose estimation problem, a statistical method, coined Pose Estimation by Gridding of Unit Spheres (PEGUS), is developed that provides robust pose estimation in the presence of feature outliers and image noise. The accuracy of any vision-based control and estimation method largely depends on accurate feature point information. Feature point errors result in erroneous pose estimates that could potentially affect the stability and performance of the control and estimation methods. Accurate pose estimation is a non-trivial problem, especially when real-time requirements prohibit computationally complex algorithms. Chapter 2 presents PEGUS, a novel method for estimating the relative pose between two images captured by a calibrated camera. The method, based on statistical theory, utilizes redundant feature points in the captured images to develop a robust pose estimate. Experimental results indicate markedly better performance over existing popular methods such as RANSAC and the nonlinear mean shift algorithm, and the non-iterative structure of the algorithm makes it suitable for real-time applications. Control of a moving object using a stationary camera, and vice versa, are well-studied problems in the visual servo control literature, and various solutions exist for a class of autonomous systems. However, control of a moving object using image feedback from a moving camera has remained an open problem due to the unknown relative velocity associated with the moving camera and moving object. In Chapter 3, a collaborative visual servo controller, the daisy-chaining method, is developed with the objective of regulating a sensorless unmanned ground vehicle (UGV) to a desired pose utilizing feedback from a moving airborne monocular camera system. Multi-view photogrammetric methods are used to develop relationships between different camera frames and UGV coordinate systems, and Lyapunov-based methods are used to prove asymptotic regulation of the UGV.
Another technical challenge when using a vision system for autonomous systems is that the given feature points can leave the camera FOV. To address the issue of features leaving the FOV, an extension of the method developed in Chapter 3 is provided by considering multiple reseeding feature points. The presented multi-reference daisy-chaining scheme enables the UGV/camera pair to operate over an arbitrarily large area. Building on the results in Chapter 3, the complex problem of cooperative visual servo tracking control is formulated in Chapter 4 with the objective of enabling a UGV to follow a desired trajectory encoded as a sequence of images, utilizing the image feedback from a moving airborne monocular camera system. The association problem as well as the relative velocity problem is addressed by introducing a daisy-chaining structure to link a series of projective homographies and express them in a constant reference frame. An adaptive parameter update law is employed to actively compensate for the lack of object model and depth measurements. Based on the open-loop error system, a tracking control law is developed through the application of the Extended Barbalat's lemma in a Lyapunov-based framework to yield asymptotic stability. The tracking results are extended to include reseeding stationary feature points by formulating additional projective homography relationships to provide an unrestricted applicative area for the UGV/camera pair. Simulation results are provided demonstrating the tracking control of a UGV, and visual simultaneous localization and mapping (vSLAM) results are achieved by fusing the daisy-chaining method with the geometric reconstruction scheme. Since the development provided in Chapters 3 and 4 assumes that a stationary reference object can leave the camera FOV as a new reference object enters the FOV, it is necessary to determine the pose of the new reference object with respect to the receding object in order to provide the pose information of a moving agent such as the UGV or the camera itself. Consequently, the error in pose measurement between the stationary reference objects can propagate through the subsequent reference objects, leading to large localization errors. The error propagation is analyzed in Chapter 4 by performing a numerical simulation, and possible solutions are provided, along with simulation results, to mitigate the error propagation in daisy-chaining.
General Note: In the series University of Florida Digital Collections.
General Note: Includes vita.
Bibliography: Includes bibliographical references.
Source of Description: Description based on online resource; title from PDF title page.
Source of Description: This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Statement of Responsibility: by Siddhartha Mehta.
Thesis: Thesis (Ph.D.)--University of Florida, 2010.
Local: Adviser: Dixon, Warren E.
Local: Co-adviser: Barooah, Prabir.
Electronic Access: RESTRICTED TO UF STUDENTS, STAFF, FACULTY, AND ON-CAMPUS USE UNTIL 2011-04-30

Record Information

Source Institution: UFRGP
Rights Management: Applicable rights reserved.
Classification: lcc - LD1780 2010
System ID: UFE0041119:00001


Full Text

A DAISY-CHAINING APPROACH FOR VISION-BASED CONTROL AND ESTIMATION

By

SIDDHARTHA S. MEHTA

A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2010

© 2010 Siddhartha S. Mehta

To my parents Satish and Sulabha, and my friends and family members who constantly filled me with motivation and joy.

ACKNOWLEDGMENTS

I express my most sincere appreciation to Dr. Warren E. Dixon and Dr. Thomas F. Burks. Their contribution to my current and ensuing career cannot be overemphasized. I thank them for the education, the advice, and for introducing me to the interesting field of vision-based control. Special thanks go to Dr. Prabir Barooah for his technical insights, support, and active participation in the work on robust pose estimation. I would like to thank Dr. Richard Lind and Dr. Ryan Causey for the collaborative research on autonomous aerial refueling, and Dr. Carl Crane and Dr. Donald MacArthur for their active participation in the daisy-chaining control research.

I would like to thank my colleagues from the Nonlinear Controls and Robotics group for the technical discussions and frequent nerdy conversations, which have been insightful and thought provoking. Also, I would like to thank my colleagues from the Agricultural Robotics and Mechatronics Group (ARMG) for their support and encouragement.

TABLE OF CONTENTS

ACKNOWLEDGMENTS
LIST OF TABLES
LIST OF FIGURES
ABSTRACT

CHAPTER

1 INTRODUCTION AND MOTIVATION
  1.1 Vision-Based Pose Estimation
  1.2 Vision-Based Control of Autonomous Systems
  1.3 Dissertation Outline and Contributions

2 PEGUS: A NOVEL ALGORITHM FOR POSE ESTIMATION
  2.1 Introduction
  2.2 Related Work
  2.3 Problem Statement and Approach
  2.4 Proposed Algorithm
    2.4.1 Rotation Estimation
    2.4.2 Estimating Translation
      2.4.2.1 Case A: unit translation
      2.4.2.2 Case B: translation
  2.5 Performance Evaluation
  2.6 Discussion
    2.6.1 Sampling of the Hypotheses
    2.6.2 Robustness to Outliers
  2.7 Conclusion

3 VISUAL SERVO CONTROL OF AN UNMANNED GROUND VEHICLE VIA A MOVING AIRBORNE MONOCULAR CAMERA
  3.1 Introduction
  3.2 Daisy-Chaining Based Regulation Control
    3.2.1 Geometric Model
    3.2.2 Euclidean Reconstruction
    3.2.3 UGV Kinematics
    3.2.4 Control Objective
    3.2.5 Control Development
    3.2.6 Stability Analysis
  3.3 Multi-Reference Visual Servo Control of an Unmanned Ground Vehicle
    3.3.1 Geometric Model
    3.3.2 Euclidean Reconstruction
    3.3.3 UGV Kinematics
    3.3.4 Simulation Results
    3.3.5 Concluding Remarks

4 A DAISY-CHAINING VISUAL SERVOING APPROACH WITH APPLICATIONS IN TRACKING, LOCALIZATION, AND MAPPING
  4.1 Introduction
  4.2 Daisy-Chaining Based Tracking Control
    4.2.1 Problem Scenario
    4.2.2 Geometric Relationships
    4.2.3 Euclidean Reconstruction
    4.2.4 Control Objective
    4.2.5 Control Development
      4.2.5.1 Open-loop error system
      4.2.5.2 Closed-loop error system
    4.2.6 Stability Analysis
  4.3 Cooperative Tracking Control of a Nonholonomic Unmanned Ground Vehicle
    4.3.1 Problem Scenario
    4.3.2 Geometric Relationships
    4.3.3 Euclidean Reconstruction
    4.3.4 Control Objective
    4.3.5 Control Development
  4.4 Simultaneous Tracking, Localization and Mapping
    4.4.1 Problem Scenario
    4.4.2 Geometric Relationships
    4.4.3 Euclidean Reconstruction
    4.4.4 Tracking and Mapping
    4.4.5 Simulation Results
  4.5 Error Propagation in Daisy-Chaining
  4.6 Concluding Remarks

5 CONCLUSIONS
  5.1 Research Summary
  5.2 Recommendations for Future Work

REFERENCES

BIOGRAPHICAL SKETCH

LIST OF TABLES

2-1 Comparison of mean and variance of estimation error: (A) PEGUS, (B) RANSAC + least squares, (C) non-linear mean shift
3-1 Coordinate frame relationships for UGV regulation control
3-2 Coordinate frame relationships for multi-reference UGV regulation control
4-1 Coordinate frame relationships for 6-DOF planar object tracking control
4-2 Coordinate frame relationships for UGV tracking control
4-3 Coordinate frame relationships for multi-reference UGV tracking control

LIST OF FIGURES

2-1 Two views of a scene and the matched feature points between the images.
2-2 Histogram of the Euler angle data obtained from 5000 rotation hypotheses between the two images shown in Figure 2-1.
2-3 Multi-modal distribution of 5000 unit-translation hypotheses between the two images shown in Figure 2-1.
2-4 Histogram of the average overlap in 100 generated hypotheses.
2-5 Number of feature points tracked during a sequence of 970 images taken by a camera. The number of pairs of matched feature points between the first and the second image is 31, and that between the first and the 970-th image is 9.
2-6 Evaluation of the estimation accuracy achieved by the proposed PEGUS algorithm and its comparison with that achieved by RANSAC+LS and the non-linear mean shift algorithm of [1]. RANSAC+LS means outlier rejection by RANSAC followed by re-estimation of the pose by feeding all the inliers to the normalized 8-point algorithm. The pmf of the rotation and unit-translation estimation errors is computed from 9000 samples of the error obtained from the 9000 image pairs.
2-7 Comparison of the computation time required by the three algorithms. The pmf is estimated from 9000 samples of computation time.
2-8 Robustness comparison of the presented PEGUS algorithm with RANSAC+LS and the non-linear mean shift algorithm in terms of rotation estimation accuracy, using synthetic data with (a)-(i) 10% to 90% feature outliers. The pmf of the rotation estimation error is computed from 100 samples of the error obtained from the 100 image pairs.
2-9 Robustness comparison of the presented PEGUS algorithm with RANSAC+LS and the non-linear mean shift algorithm in terms of translation estimation accuracy, using synthetic data with (a)-(i) 10% to 90% feature outliers. The pmf of the translation estimation error is computed from 100 samples of the error obtained from the 100 image pairs.
2-10 (a) Mean rotation estimation error and (b) mean translation estimation error for the presented PEGUS algorithm, RANSAC+LS, and the non-linear mean shift algorithm, using the pose estimation results for synthetic data with varying feature outliers (10%-90%) presented in Figs. 2-8 and 2-9.
3-1 Camera coordinate frame relationships: a moving airborne monocular camera (coordinate frame I) hovering above a UGV (coordinate frame F) while viewing a fixed reference object (coordinate frame F*) regulates the UGV to the desired pose (coordinate frame F_d) captured by an a priori located camera (coordinate frame I_R).
3-2 Urban scenario describing regulation of a UGV to the desired pose using an airborne camera.
3-3 Camera to reference object relationships: a monocular camera (coordinate frame I) viewing a stationary reference object (coordinate frame F*_i) such that a stationary object can leave the camera FOV as a new object enters the FOV, while the stationary reference camera (coordinate frame I_R) is assumed to view the stationary reference object F*_1.
3-4 Camera to UGV relationships: a monocular camera (coordinate frame I) hovering above a UGV (coordinate frame F) while viewing a stationary reference object (coordinate frame F*_n) regulates the UGV to the desired pose (coordinate frame F_d) known a priori in the reference camera (coordinate frame I_R). The stationary pose (coordinate frame F_r) corresponds to a snapshot of the UGV visible from the reference camera (coordinate frame I_R).
3-5 Euclidean space trajectory of the moving camera I, initial and final positions of the time-varying UGV F(t), and the desired UGV F_d. F(0) denotes the initial position of the UGV, I(0) the initial position of the moving camera, I(t) the time-varying position of the moving camera, F*_i, i = 1, 2, ..., 7, the stationary reference objects, and F(t) the regulated position of the UGV coincident with the desired UGV F_d.
3-6 Linear (i.e., e_1(t) and e_2(t)) and angular (i.e., e_3(t)) regulation error.
3-7 Linear (i.e., v_c(t)) and angular (i.e., ω_c(t)) velocity control inputs.
3-8 Linear (i.e., e_1(t) and e_2(t)) and angular (i.e., e_3(t)) regulation error in the presence of an additive white Gaussian noise.
3-9 Linear (i.e., v_c(t)) and angular (i.e., ω_c(t)) velocity control inputs in the presence of an additive white Gaussian noise.
4-1 Geometric model for a moving camera (coordinate frame I), moving target (coordinate frame F), and stationary reference camera (coordinate frame I_R).
4-2 Geometric model for a moving camera, moving UGV, and stationary reference camera: a moving camera (coordinate frame I_M) records the desired trajectory of a UGV (coordinate frame F_d(t)) with respect to the stationary reference object F*, while the stationary coordinate frame F_s represents a snapshot of the UGV along the desired trajectory taken by I_R = I_M(t)|_{t=T}. A moving camera (coordinate frame I) views the current UGV (coordinate frame F(t)) and the stationary reference object F*.
4-3 Geometric model showing a snapshot of the UGV along the desired trajectory (coordinate frame F_s) taken by I_R = I_M(t)|_{t=T}, and a current camera (coordinate frame I) viewing the time-varying UGV (coordinate frame F) while observing the set of feature points attached to F*.
4-4 Geometric model showing a moving camera (coordinate frame I_M) recording the desired trajectory of a UGV (coordinate frame F_d(t)) with respect to the stationary reference object F*, while the stationary coordinate frame F_s represents a snapshot of the UGV along the desired trajectory taken by I_R = I_M(t)|_{t=T}.
4-5 Geometric model for a moving camera, moving UGV, and stationary reference camera: a moving camera (coordinate frame I_M) records the desired trajectory of a UGV (coordinate frame F_d(t)) with respect to the stationary reference object F*_1, while the stationary coordinate frame F_s represents a snapshot of the UGV along the desired trajectory taken by I_R = I_M(t)|_{t=T}. A moving camera (coordinate frame I) views the current UGV (coordinate frame F(t)) and the stationary reference object F*_j.
4-6 A simplified equivalent model showing a moving camera (coordinate frame I) observing the current UGV (coordinate frame F(t)) and the stationary reference object F*_j, where the pose of F*_j is expressed in terms of I_R.
4-7 Euclidean space trajectory of the feature points attached to the current (i.e., F(t)) and desired (i.e., F_d(t)) UGV taken by I and I_M, respectively, and the time-varying trajectories of the current and reference cameras I and I_M. F(0) denotes the initial position of the current UGV, F(t) the time-varying position of the current UGV, F_d(0) the initial position of the desired UGV, I(0) the initial position of the current camera, I(t) the time-varying position of the current camera, I_M(0) the initial position of the time-varying reference camera, I_M(t) the time-varying position of the time-varying reference camera, and F*_1, F*_2, and F*_3 the positions of the stationary reference objects.
4-8 Linear (i.e., e_1(t) and e_2(t)) and angular (i.e., e_3(t)) tracking error.
4-9 Linear (i.e., v_c(t)) and angular (i.e., ω_c(t)) velocity control inputs.
4-10 Linear (i.e., e_1(t) and e_2(t)) and angular (i.e., e_3(t)) tracking error in the presence of an additive white Gaussian image noise.
4-11 Linear (i.e., v_c(t)) and angular (i.e., ω_c(t)) velocity control inputs in the presence of an additive white Gaussian image noise.
4-12 Results of localization of the current UGV attached to F(t) and mapping of the reference targets attached to F*_1, F*_2, and F*_3, expressed in the constant reference frame I_R. Specifically, trajectory (1) shows the time-varying pose of the moving camera attached to I(t), trajectory (2) shows the time-varying pose of the moving camera attached to I_M(t), and trajectory (3) shows the time-varying pose of the current UGV attached to F(t), measured in the stationary reference camera frame I_R. F(0) denotes the initial position of the current UGV, and F*_1, F*_2, and F*_3 denote the positions of the stationary reference objects.
4-13 A simulation scenario depicting the circular trajectory of the camera and a set of stationary reference objects F*_i, i = 1, ..., 8.
4-14 Error propagation in the daisy-chaining pose estimation method in the absence of feature point noise after 240 daisy-chains, traversing the circular trajectory 30 times.
4-15 A simulation scenario depicting the estimated camera trajectory in the presence of white Gaussian image noise and a set of stationary reference objects F*_i, i = 1, ..., 8.
4-16 Error propagation in the daisy-chaining pose estimation method in the presence of white Gaussian noise after 240 daisy-chains, traversing the circular trajectory 30 times.
4-17 A simulation scenario depicting the estimated camera trajectory in the presence of white Gaussian image noise, with the camera position updated at the end of each circular trajectory, and a set of stationary reference objects F*_i, i = 1, ..., 8.
4-18 Error propagation in the daisy-chaining pose estimation method in the presence of white Gaussian noise, with the camera position updated at the end of each circular trajectory.

Abstract of Dissertation Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy

A DAISY-CHAINING APPROACH FOR VISION-BASED CONTROL AND ESTIMATION

By

Siddhartha S. Mehta

April 2010

Chair: Dr. Warren E. Dixon
Co-Chair: Dr. Prabir Barooah
Major: Mechanical Engineering

The research presented in this dissertation monograph lies within the general scope of guidance, navigation, and control of autonomous systems and centers on the design and analysis of visual servo control strategies and vision-based robust position and orientation (i.e., pose) estimation. The motivation behind the presented research is to enable a vision system to provide robust navigation and control of autonomous agents operating over a large area. To enable vision systems to provide pose estimates over a large area, a new daisy-chaining method is developed. Using concepts based on multi-view geometry, or photogrammetry, relationships are established between the current pose of an agent and the desired agent pose, even when the desired pose is out of the camera field-of-view (FOV). The daisy-chaining method is limited by the need to maintain a single reference object that is contained in both the current view and the final view of the desired pose of the vehicle. To overcome this limitation, the daisy-chaining method is extended to allow multiple reference objects to enter and leave the camera FOV, allowing theoretically infinite daisy-chaining and hence an unrestricted applicative area for a UGV. Error propagation analysis for the daisy-chaining method, which resembles a 'dead-reckoning' scheme, shows the method is susceptible to image noise and feature point outliers. To address the local pose estimation problem, a statistical method, coined Pose Estimation by Gridding of Unit Spheres (PEGUS), is developed that provides robust pose estimation in the presence of feature outliers and image noise.

The accuracy of any vision-based control and estimation method largely depends on accurate feature point information. Feature point errors result in erroneous pose estimates that could potentially affect the stability and performance of the control and estimation methods. Accurate pose estimation is a non-trivial problem, especially when real-time requirements prohibit computationally complex algorithms. Chapter 2 presents PEGUS, a novel method for estimating the relative pose between two images captured by a calibrated camera. The method, based on statistical theory, utilizes redundant feature points in the captured images to develop a robust pose estimate. Experimental results indicate markedly better performance over existing popular methods such as RANSAC and the nonlinear mean shift algorithm, and the non-iterative structure of the algorithm makes it suitable for real-time applications.

Control of a moving object using a stationary camera, and vice versa, are well-studied problems in the visual servo control literature, and various solutions exist for a class of autonomous systems. However, control of a moving object using image feedback from a moving camera has remained an open problem due to the unknown relative velocity associated with the moving camera and moving object. In Chapter 3, a collaborative visual servo controller, the daisy-chaining method, is developed with the objective of regulating a sensorless unmanned ground vehicle (UGV) to a desired pose utilizing feedback from a moving airborne monocular camera system. Multi-view photogrammetric methods are used to develop relationships between different camera frames and UGV coordinate systems, and Lyapunov-based methods are used to prove asymptotic regulation of the UGV.

Another technical challenge when using a vision system for autonomous systems is that the given feature points can leave the camera FOV. To address the issue of features leaving the FOV, an extension of the method developed in Chapter 3 is provided by considering multiple reseeding feature points. The presented multi-reference daisy-chaining scheme enables the UGV/camera pair to operate over an arbitrarily large area. Simulation results are provided that illustrate the performance of the developed cooperative control scheme.

Building on the results in Chapter 3, the complex problem of cooperative visual servo tracking control is formulated in Chapter 4 with the objective of enabling a UGV to follow a desired trajectory encoded as a sequence of images, utilizing the image feedback from a moving airborne monocular camera system. The association problem as well as the relative velocity problem is addressed by introducing a daisy-chaining structure to link a series of projective homographies and express them in a constant reference frame. An adaptive parameter update law is employed to actively compensate for the lack of object model and depth measurements. Based on the open-loop error system, a tracking control law is developed through the application of the Extended Barbalat's lemma in a Lyapunov-based framework to yield asymptotic stability. The tracking results are extended to include reseeding stationary feature points by formulating additional projective homography relationships to provide an unrestricted applicative area for the UGV/camera pair. Simulation results are provided demonstrating the tracking control of a UGV in the presence of multiple stationary reference objects, and visual simultaneous localization and mapping (vSLAM) results are achieved by fusing the daisy-chaining method with the geometric reconstruction scheme.

Since the development provided in Chapters 3 and 4 assumes that a stationary reference object can leave the camera FOV as a new reference object enters the FOV, it is necessary to determine the pose of the new reference object with respect to the receding object in order to provide the pose information of a moving agent such as the UGV or the camera itself. Consequently, the error in pose measurement between the stationary reference objects can propagate through the subsequent reference objects, leading to large localization errors. The error propagation is analyzed in Chapter 4 by performing a numerical simulation, and possible solutions are provided, along with simulation results, to mitigate the error propagation in daisy-chaining.

CHAPTER 1
INTRODUCTION AND MOTIVATION

The research presented in this dissertation monograph lies within the general scope of guidance, navigation, and control of autonomous systems and centers on the design and analysis of visual servo control strategies and vision-based robust position and orientation (i.e., pose) estimation. The Euclidean pose of an agent is typically required for autonomous navigation and control. Often the pose of an agent is determined by a global positioning system (GPS) or an inertial measurement unit (IMU). However, GPS may not be available in many environments, and IMUs can drift and accumulate errors over time in a similar manner as dead reckoning. Given recent advances in image extraction/interpretation technology, an interesting approach to overcome the pose measurement problem is to utilize a vision system. Specifically, rather than obtaining an inertial measurement of the agent, vision systems can be used to recast some navigation and control problems in terms of the image space, where the goal pose is compared to the relative pose via multiple images.

The aim of this chapter is to provide the reader with background in the area of navigation and control of autonomous systems using vision (i.e., a camera) as a sensor modality. The motivation behind the presented research is established by describing the problem scenarios and posing the open problems. The chapter is organized in three sections: Sections 1.1 and 1.2 provide an introduction to vision-based pose estimation and control of autonomous systems, and the outline of the dissertation, along with the contributions of the presented research, is detailed in Section 1.3.

1.1 Vision-Based Pose Estimation

Motivated by practical applications such as autonomous guidance, navigation, and control, various techniques have been developed, such as visual servo control, visual odometry, and structure from motion. Common to all these methods is the problem of estimating the relative pose (rotation and translation) between two images.

For a monocular camera, the rotation and the direction of translation are estimated, whereas in the case of a stereo camera system, the rotation and the full translation vector are estimated.

Existing methods for pose estimation use point correspondences between the two views, which are provided by a feature-tracking algorithm, such as the KLT algorithm [2]. Given a minimal set of point correspondences, the relative pose can be estimated by a number of algorithms (the eight-point algorithm [3], the five-point algorithm [4], etc.). However, point correspondences produced by a feature tracker invariably contain gross mismatches or large errors in feature point locations, which are commonly referred to as outliers. A central issue in accurate pose estimation is devising robust estimators that can reject such outliers. The most popular solution to this problem has been hypothesize-and-test methods, such as RANSAC [5] and its variants: MLESAC [6], PROSAC [7], GOODSAC [8], pre-emptive RANSAC [9], etc. In these methods, hypotheses are generated by randomly choosing a minimal set of corresponding feature point pairs that are required to generate a hypothesis. A hypothesis is typically scored based on how many of the observations are well explained by it, and the one with the best score is declared the desired estimate. Most of the extensions to the basic RANSAC scheme focus on reducing the computation time, since generating a large number of hypotheses (which is required to obtain a good estimate with high probability) and scoring them is computationally expensive.

RANSAC and other hypothesize-and-test methods choose only one of the many hypotheses that are or can be generated. All other hypotheses are ignored, even those that may be quite close to the true pose. Each hypothesis can be thought of as a noisy "measurement" of the relative pose that is to be estimated. In principle, one should be able to average these measurements in an appropriate sense to compute a more accurate estimate than any of the individual measurements (i.e., hypotheses).
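The hypothesize-and-test loop described above can be made concrete with a short sketch. The following is a minimal, generic RANSAC-style skeleton for intuition only; `solve_pose` (a minimal-set pose solver) and `residual` (a per-correspondence error measure) are hypothetical caller-supplied functions, not interfaces from the works cited here.

```python
import random

def ransac_pose(matches, solve_pose, residual, P=8, n_iters=500, tol=1e-3):
    """Generic hypothesize-and-test loop: draw minimal sets of P
    correspondences, fit a pose to each, and keep the hypothesis that
    explains the most observations (inliers)."""
    best_pose, best_inliers = None, -1
    for _ in range(n_iters):
        minimal_set = random.sample(matches, P)   # random minimal set
        pose = solve_pose(minimal_set)            # e.g., an 8-point solver
        if pose is None:
            continue                              # degenerate sample
        inliers = sum(residual(pose, m) < tol for m in matches)
        if inliers > best_inliers:                # score = inlier count
            best_pose, best_inliers = pose, inliers
    return best_pose
```

Note that only the single best-scoring hypothesis survives this loop; the averaging idea developed in this dissertation instead reuses the many near-correct hypotheses that such a loop discards.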

In Chapter 2, a novel robust pose estimation algorithm is presented based on the idea above. There are two hurdles that impede the development of this idea. First, many of the pose hypotheses will be corrupted by outliers and will have poor accuracy, so that including them in the averaging process may lead to little improvement, if any. The second difficulty is that, since a pose is not an element of a vector space, it is not clear how to average multiple noisy measurements of a pose.

To address these challenges, the pose estimation problem is treated as estimating the rotation and the (unit) translation separately. By expressing the rotation hypotheses as unit quaternions that lie on the unit sphere in 4 dimensions, and computing a histogram of this data by dividing the sphere into bins, the dominant cluster, or mode, of the rotations is identified. A subset of the rotations that are within a small geodesic distance of the mode is then extracted. These "low-noise" rotation hypotheses are then averaged using a method developed by Moakher [10] to produce the estimate of the rotation. Estimation of unit translations proceeds in exactly the same way, except that now the data lies on the surface of the unit sphere in 3 dimensions. When the translation (direction as well as magnitude) is available, say from a stereo camera, mode estimation and averaging are simpler since the data lies in a vector space. Because of the role played by gridding of the unit sphere in 3 or 4 dimensions, the proposed algorithm is called the Pose Estimation by Gridding of Unit Spheres (PEGUS) algorithm.

1.2 Vision-Based Control of Autonomous Systems

The Euclidean position and orientation (i.e., pose) of an unmanned ground vehicle (UGV) is typically required for autonomous navigation and control. Vision-based control schemes can benefit from the robust pose estimation method developed in Chapter 2.

Some examples of image-based visual servo control of mobile vehicles include [11-25]. Previous pure image-based visual servo control results have a known problem with potential singularities in the image-Jacobian, and, since the feedback is only in the image space, these methods may require impossible Euclidean motions. Motivated by the desire to eliminate these issues, some efforts have been developed that combine reconstructed Euclidean information and image-space information in the control design. The Euclidean information can be reconstructed by decoupling the interaction between the translation and rotation components of a homography matrix. This homography-based method yields an invertible triangular image-Jacobian with realizable Euclidean motion. Homography-based visual servo control results that have been developed for UGVs include [26-31]. In [29], a visual servo controller was developed to asymptotically regulate the pose of a UGV to a constant pose defined by a goal image, where the camera was mounted on board the UGV (i.e., the camera-in-hand problem). The camera-on-board result in [29] was extended in [26] to address the more general tracking problem. In [27], a stationary overhead camera (i.e., the camera-to-hand or fixed-camera configuration) was used to regulate a UGV to a desired pose.
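As an illustration of the homography decomposition mentioned above, OpenCV exposes a routine that recovers candidate rotation/translation pairs from a Euclidean homography given the camera intrinsics. This is a generic sketch under an assumed, hypothetical calibration matrix K; it is not the controller formulation of [26-31].

```python
import cv2
import numpy as np

# Hypothetical intrinsic matrix of a calibrated camera.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def decompose(H):
    """Decouple the rotation and scaled translation encoded in homography H.
    OpenCV returns up to four (R, t, n) candidates; the physically valid
    one is chosen in practice using positive-depth (visibility) checks."""
    _, Rs, ts, normals = cv2.decomposeHomographyMat(H, K)
    return list(zip(Rs, ts, normals))
```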

The development in Chapter 3 is motivated by the desire to address the well-known problem of controlling a moving object using a moving camera. A moving airborne monocular camera (e.g., a camera attached to a remote controlled aircraft, or a camera mounted on a satellite) is used to provide pose measurements of a moving sensorless UGV relative to a goal configuration. Distinguishing the relative velocity between the moving UGV and the moving camera presents a significant challenge. Geometric constructs developed for traditional camera-in-hand problems are fused with fixed-camera geometry to develop a set of Euclidean homographies so that a measurable error system for the nonholonomic UGV can be developed. The resulting open-loop error system is expressed in a form that is amenable to a variety of UGV controllers.

In addition to the vision-based control problem, image feedback can be used to localize a robot and map the environment (i.e., visual simultaneous localization and mapping, vSLAM) [32-38]. vSLAM is used in applications where the camera is the main sensor used to estimate the location of a robot in the world, as well as to estimate and maintain estimates of the surrounding terrain or features. Often a measure of estimate uncertainty is also maintained. vSLAM is a broad topic with varied approaches; see [32] and the references therein for a recent survey.

There are many overlapping ways to categorize vSLAM approaches. Some authors (e.g., [33, 34, 36]) make a distinction between "local vSLAM" and "global vSLAM". In this categorization, local vSLAM is concerned with estimating the current state of the robot and the world map by matching visual features from frame to frame, and global vSLAM is concerned with recognizing when features have been previously encountered and updating the estimates of the robot and map (sometimes referred to as "closing loops"). To address both issues, many researchers use pose-invariant features, such as SIFT [39], which can be accurately matched from frame to frame or from multiple camera viewpoints. Many vSLAM approaches use probabilistic filters (e.g., the extended Kalman filter or the particle filter) [32, 33, 36-38], typically estimating a state vector composed of the camera/robot position, orientation, and velocity, and the 3D coordinates of visual features in the world frame. An alternative to filter-based approaches is the use of epipolar geometry [34, 35]. A final category comprises methods that build a true 3D map (i.e., a map that is easily interpreted by a human being, such as walls or topography) [33, 34, 36-38], versus those that build a more abstract map designed to allow the camera/robot to accurately navigate and recognize its location, but not designed for human interpretation.

Chapter 4 utilizes the new daisy-chaining method developed in Chapter 3 for vision-based tracking control of a UGV, while also providing localization of the moving camera and moving object in the world frame, and mapping the location of static landmarks in the world frame. Hence, this approach can be used in vSLAM of the UGV, with applications toward path planning, real-time trajectory generation, obstacle avoidance, multi-vehicle coordination control, task assignment, etc. By using the daisy-chaining strategy, the coordinates of static features out of the field-of-view (FOV) can also be estimated. The estimates of static features can be maintained as a map, or can be used as measurements in existing vSLAM methods.

1.3 Dissertation Outline and Contributions

Chapter 2 describes a novel robust algorithm for the estimation of the relative pose between two calibrated images, called Pose Estimation by Gridding of Unit Spheres (PEGUS). The focus of the chapter is to develop a computationally deterministic pose estimation method that is robust to feature outliers. Pose estimation results using PEGUS are compared with popular methods such as RANSAC and the nonlinear mean-shift algorithm using an indoor image sequence and synthetic feature point data. The results in Chapter 2 demonstrate the improved performance of PEGUS over RANSAC + least squares as well as the non-linear mean shift method, both in terms of estimation accuracy and computation time. By virtue of the non-iterative formulation underlying the deterministic structure of PEGUS, the computation time is more predictable than that of RANSAC and the non-linear mean shift algorithm, making it amenable to a variety of real-time applications. Vision-based control of autonomous systems typically requires pose estimation between multiple views captured by a camera system; the robust pose estimation results developed in Chapter 2 can be used for such applications.

Control of a moving object using a stationary camera, and vice versa, are well-studied problems in the visual servo control literature, and numerous solutions exist for a general class of autonomous systems. However, control of a moving object using image feedback from a moving camera has remained a well-known open problem due to the unknown relative velocity associated with the moving camera and moving object. In Chapter 3, a collaborative visual servo controller, which is coined the daisy-chaining method, is developed with the objective of regulating a sensorless unmanned ground vehicle (UGV) to a desired pose utilizing feedback from a moving airborne monocular camera system.

The contribution of the research in Chapter 3 is the development of multi-view geometry, or photogrammetry, based concepts to relate the coordinate frames attached to the moving camera, the moving UGV, and the desired UGV pose specified by an a priori image. Geometric constructs developed for the traditional camera-in-hand problem are fused with fixed-camera geometry to develop a set of Euclidean homographies. Due to intrinsic physical constraints, one of the resulting Euclidean homographies is not measurable through a set of spatiotemporal images, as the corresponding projective homography cannot be developed. Hence, new geometric formulations, termed virtual homographies, are conceived to solve for this homography in order to develop a measurable error system for the nonholonomic UGV. Asymptotic regulation results are proved using a Lyapunov-based stability analysis.

Chapter 3 also illustrates a framework to achieve asymptotic regulation of a UGV based on the scenario that a given reference object can leave the camera FOV while another reference object enters the FOV. The controller is developed, with underlying geometric constructs that daisy-chain multiple reference objects, such that the airborne camera is not required to maintain a view of the static reference object; therefore, the airborne camera/UGV pair can navigate over an arbitrarily large area. Also, by taking advantage of the geometric reconstruction method, the assumption of equal Euclidean distance of the features for the UGV and the reference object is relaxed.

Building on the results in Chapter 3, the complex problem of cooperative visual servo tracking control is formulated with the objective of enabling a UGV to follow a desired trajectory encoded as a sequence of images, utilizing the image feedback from a moving airborne monocular camera system. The desired trajectory of the UGV is recorded by a moving airborne monocular camera I_M traversing an unknown time-varying trajectory. The control objective is to track the UGV along the desired trajectory using the image feedback from a moving airborne camera I that may traverse a different trajectory than that of I_M. The association problem as well as the relative velocity problem is addressed by introducing a daisy-chaining structure to link a series of projective homographies and express them in a constant reference frame. An adaptive parameter update law is employed to actively compensate for the lack of object model and depth measurements.

Based on the open-loop error system, a tracking control law is developed through the application of the Extended Barbalat's lemma in a Lyapunov-based framework to yield asymptotic stability.

The tracking results are extended to include the reseeding reference objects by formulating additional projective homography relationships to provide an unbounded applicative area of operation. The theoretical development in Chapter 4 manifests the coalescence of the daisy-chaining controller and the newly formed geometric reconstruction technique towards application in visual simultaneous localization and mapping (vSLAM). The chapter also provides simulation results demonstrating the propagation of pose estimation error in the daisy-chaining control scheme and discusses suitable methods for limiting the error propagation.

A summary of the research and recommendations for future work are provided in Chapter 5.
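Since the daisy-chaining idea recurs throughout Chapters 3 and 4, a compact numerical picture may help: each new reference object is registered to the previously viewed one, so poses are composed multiplicatively and expressed in a single constant reference frame, and link-wise measurement error compounds much like dead reckoning. The sketch below is an illustrative simplification (4x4 homogeneous transforms standing in for the homography relationships; all names are hypothetical).

```python
import numpy as np

def daisy_chain(relative_poses):
    """Express every frame in the constant reference frame by chaining
    relative poses: T_0k = T_01 @ T_12 @ ... @ T_(k-1)k."""
    T = np.eye(4)
    chain = []
    for T_rel in relative_poses:  # each link: a 4x4 homogeneous transform
        T = T @ T_rel             # small per-link errors accumulate here,
        chain.append(T.copy())    # which motivates the error analysis
    return chain
```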

CHAPTER 2
PEGUS: A NOVEL ALGORITHM FOR POSE ESTIMATION

2.1 Introduction

The focus of the research in this chapter is to develop a computationally deterministic pose estimation method that is robust to feature outliers. A novel robust algorithm is developed for the estimation of the relative pose between two calibrated images, which is coined Pose Estimation by Gridding of Unit Spheres (PEGUS). The key idea behind the method is that, if there are M matched pairs of feature points between two views, one can compute a maximum of \binom{M}{P} possible pose hypotheses by using a P-point algorithm. The developed algorithm selects a subset of "low-noise" hypotheses by empirically estimating the probability density function of the rotation and translation random variables, and averages them, conforming to manifold constraints, to compute a pose estimate.

In contrast to hypothesize-and-test methods such as RANSAC [5], the proposed algorithm averages the information from a number of hypotheses that are likely to be close to the true pose. As a result, it comes up with a more accurate estimate than that returned by RANSAC-type methods. The proposed algorithm has certain similarities with the non-linear mean shift algorithm proposed in [1]; the similarities and differences between the two are discussed in Section 2.2. Another advantage of the PEGUS algorithm is that it does not involve any iterative search, so that the time required for its execution is not only quite small but also highly predictable. This aspect of the algorithm makes it suitable for real-time applications.

In tests with real image data, the proposed algorithm significantly outperforms RANSAC as well as the non-linear mean shift algorithm of [1]. Improvement is seen not only in estimation accuracy but also in computation time and in the predictability of the execution time. The robustness of the presented algorithm is compared with RANSAC and the non-linear mean shift algorithm by varying the fraction of outliers from 10% to 90%.

The rest of the chapter is organized as follows. Section 2.2 describes some of the prior work that is relevant to the presented approach. Section 2.3 explains the approach behind the proposed algorithm. The proposed algorithm is described in detail in Section 2.4. Experimental results are presented in Section 2.5, a discussion is provided in Section 2.6, and concluding remarks are presented in Section 2.7.

2.2 Related Work

There are certain similarities between PEGUS and the non-linear mean shift algorithm by Subbarao et al. [1], in which a set of generated hypotheses is used to construct a kernel-based estimate of the probability density function (pdf) of the pose hypothesis in SE(3). A non-linear version of the mean-shift algorithm is then used to iteratively search for the mode of this pdf starting from an arbitrary initial condition. The identified mode is declared the pose estimate. Since all the hypotheses used to construct the pdf contribute to the mode, and the mode may not coincide with any of the hypotheses, the resulting estimate can be thought of as an average of the hypotheses, though the averaging is of an implicit nature. In short, the approaches in the proposed PEGUS algorithm as well as that in [1] treat the pose estimation problem as a clustering problem. Both construct estimates of the probability density (or mass) function from a set of generated hypotheses and return an averaged hypothesis as the pose estimate, rather than a single hypothesis from those generated.

Despite these similarities between the two approaches, there are significant differences between the proposed PEGUS algorithm and the non-linear mean shift algorithm of [1]. First, the PEGUS algorithm is far more robust to multi-modal densities of the generated hypotheses than mean shift. Experimental evidence suggests that the distribution of these hypotheses is typically multi-modal. As an example, all the possible hypotheses from 31 matched feature points between the image pair shown in Fig. 2-1 were computed using the normalized 8-point algorithm.

Fig. 2-2 shows a histogram of the Euler angles obtained from the rotation hypotheses, and Fig. 2-3 shows the unit-translation hypotheses (direction of translation), which are points on the surface of the unit sphere in 3 dimensions. The multi-modality of the distribution is clear from the figures. In such a situation, the iterative search involved in the mean shift algorithm will converge to a local maximum depending on the initial condition. In contrast, a histogram-based estimate of the pmf (probability mass function) of the hypotheses makes locating the global mode a trivial problem even with multi-modal densities. The pmf for the rotation hypotheses is constructed by gridding the surface of the unit sphere in 4 dimensions, on which the unit quaternion representations of the rotations lie. The same approach works for unit translations as well, by gridding the surface of the unit sphere in 3 dimensions. If both the magnitude and direction of translation can be estimated, the histogram is constructed by dividing a region of R^3 into a number of cells.

Figure 2-1. Two views of a scene and the matched feature points between the images.

The second major difference is that the non-linear mean shift algorithm returns the mode as the estimate, whereas the proposed method uses the mode only to identify a set of hypotheses that are likely to be close to the true pose. These "low-noise" hypotheses are then explicitly averaged in an appropriate manner to construct the final estimate. In addition, the proposed method does not involve iterative computation, whereas the mean-shift algorithm requires an iterative search for the mode. On the other hand, the nonlinear mean-shift algorithm is applicable to a wide variety of estimation problems

in which the data lies on Riemannian manifolds [40], whereas the proposed method is only applicable to problems in which the data lies on spherical surfaces or real coordinate spaces.

Figure 2-2. Histogram of the Euler angle data obtained from 5000 rotation hypotheses between the two images shown in Figure 2-1.

Figure 2-3. Multi-modal distribution of 5000 unit-translation hypotheses between the two images shown in Figure 2-1.

2.3 Problem Statement and Approach

The objective is to develop a robust pose estimation algorithm using two images captured by a monocular camera (or four images if a pair of cameras is used) and without knowledge of the scene. Let R denote the true rotation between the two views and t the true translation.

The translation can be a unit translation if scale information is not available.

If there are M pairs of feature points between the two views captured by the camera, and the minimal number of feature point pairs needed to generate a hypothesis is P, then the total number of pose hypotheses that can be computed is N_max := \binom{M}{P}. First, n such hypotheses are generated, where n is typically much smaller than N_max. Each pair of generated rotation and translation hypotheses is a "noisy measurement" of the true rotation R and the true (unit) translation t, respectively. Some of these measurements, i.e., hypotheses, suffer from large inaccuracy, as seen in Figs. 2-2 and 2-3. The proposed approach is to select a subset of "low-noise" hypotheses from the set of all possible hypotheses, so that they are close to the true rotation and translation, respectively. The low-noise hypotheses are then appropriately averaged to compute a pose estimate.

To facilitate the extraction of the low-noise hypotheses, each rotation hypothesis is expressed in terms of its unit-quaternion representation. Since the unit quaternions q and -q represent the same rotation, it is ensured that the unit-quaternion representation of a rotation hypothesis has a positive first component. That is, if q = q_r + i q_1 + j q_2 + k q_3, then q_r > 0. A unit quaternion representation of a rotation matrix can now be thought of as a unit-norm vector in R^4 whose first component is positive. That is, it lies on the "top half" of the 3-sphere S^3. The d-sphere S^d is defined as

    S^d := \{ x = [x_1, \ldots, x_{d+1}]^T \in \mathbb{R}^{d+1} \mid \|x\| = 1 \},    (2-1)

where \|\cdot\| denotes the Euclidean norm. Similarly, define

    S^d_+ := \{ x \in \mathbb{R}^{d+1} \mid \|x\| = 1, \; x_1 \geq 0 \}.    (2-2)

Therefore, each rotation hypothesis is an element of S^3_+. Similarly, each hypothesis of the unit translation is an element of S^2. If scale information is available, translation hypotheses are elements of R^3 instead of S^2.
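To make the S^3_+ convention concrete, here is a small sketch (not the dissertation's code) that maps a rotation matrix to a scalar-first unit quaternion and flips its sign so that the scalar part is non-negative; the use of SciPy's rotation utilities is an assumption of this sketch.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def canonical_quaternion(R):
    """Unit quaternion (q_r, q_1, q_2, q_3) of rotation matrix R,
    sign-flipped so that q_r >= 0, i.e., the representative in S^3_+."""
    x, y, z, w = Rotation.from_matrix(R).as_quat()  # SciPy order: (x, y, z, w)
    q = np.array([w, x, y, z])                      # reorder to scalar-first
    return q if q[0] >= 0 else -q                   # q and -q: same rotation
```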
Since each rotation hypothesis is a noisy measurement of the true rotation, the rotation hypotheses can be thought of as realizations of a random variable whose distribution is defined over the half-sphere S^3_+. By dividing the surface of the sphere S^3 into bins and counting the number of rotation hypotheses (rather, their unit-quaternion representations) in each bin, the pmf of this random variable can be estimated. The mode of the pmf is a point in the bin that has the largest number of unit quaternions. A subset of these quaternions that are within a predetermined geodesic distance of the mode is selected, and then averaged in an appropriate manner to obtain the final estimate of the rotation. Estimation of translations proceeds in a similar manner. The algorithm is described in detail in the next section.

2.4 Proposed Algorithm

2.4.1 Rotation Estimation

Step 1: Hypotheses generation engine: The total number of possible pose hypotheses, N_max, is typically extremely large, since N_max = C(M, P), where M is the number of point correspondences and P is the minimal number needed to generate a hypothesis. For example, even a small value of M, e.g., 21, with P = 8 yields N_max = 203490. Processing such a large number of hypotheses is computationally infeasible. In addition, processing all of them is not necessary, since most of these hypotheses are "correlated", as they are generated from overlapping feature point sets. A sampling-with-replacement strategy is used to generate a number of hypotheses that have small correlations among one another. The number of such hypotheses to be generated, n, is a design parameter that has to be specified a priori. However, even with a small value of n (approximately 100) the method yields beneficial results. The sampling strategy consists of selecting the first feature point pair at random from the M pairs, then selecting the second pair from the remaining M - 1 pairs, and so on until the P-th pair is selected. These P pairs of point correspondence are used to generate a hypothesis. This sampling procedure is repeated n times to generate n hypotheses, which are denoted by (q_i, t_i), where q_i is a unit quaternion and t_i is a translation vector (unit-norm or otherwise), for i = 1, ..., n. The set of these n rotation hypotheses is denoted by S_q, and the set of translation hypotheses is denoted by S_t.
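A minimal sketch of the sampling strategy just described, under the assumption that hypotheses are indexed by the feature-pair sets that generate them (the original experiments used MATLAB; Python with NumPy is an illustrative substitute, and the hypothesis generator itself, e.g., the 8-point algorithm, is abstracted away).

    import numpy as np
    from math import comb

    def sample_index_sets(M, P=8, n=100, seed=0):
        """Generate n index sets of P distinct feature-point pairs out of M.
        Within a set, pairs are drawn without replacement (first from M,
        second from the remaining M-1, and so on); the n sets themselves are
        drawn independently, so different sets may share a few pairs."""
        rng = np.random.default_rng(seed)
        n = min(n, comb(M, P))              # N_max = C(M, P) caps the supply
        return [rng.choice(M, size=P, replace=False) for _ in range(n)]

    index_sets = sample_index_sets(M=21, P=8, n=100)
    # Each index set would be fed to a hypothesis generator (e.g., the
    # normalized 8-point algorithm) to produce one (q_i, t_i) pair.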
Figure 2-4. Histogram of the average overlap in 100 hypotheses generated.

Figure 2-4 provides evidence of the small correlation among the hypotheses generated by the sampling strategy mentioned above. A set of hypotheses q_1, ..., q_j is said to have an overlap of l if there are exactly l feature point pairs that are common to the sets of points used to generate the hypotheses q_1, ..., q_j. The metric for measuring correlation among hypotheses is the number of overlaps among them. The figure shows a histogram of the overlap between 100 hypotheses generated in this fashion, computed based on the average of 1000 random trials of the hypotheses generation.

Step 2: Estimating the mode: Each q_i is imagined to be the realization of a random variable q with an unknown distribution defined over S^3_+. The 3-sphere S^3 is divided into a number of regions of equal area, or bins, that are denoted by B_j, j = 1, ..., K_q, where K_q is the (appropriately chosen) number of regions. The algorithm described in [41] is used for this purpose. The pmf of the random variable q over the bins B_j, which is denoted by p^(q), is an array of K_q numbers: p^(q)_j = P(q in B_j), where P denotes probability. A histogram estimate p-hat^(q) of the pmf p^(q) is computed by counting the number of points q_i inside each bin:

    p-hat^(q)_j = (1/n) sum_{i=1}^{n} I_{B_j}(q_i),

where I_A(x) is the indicator function of the set A. That is, I_A(x) = 1 if x in A and 0 otherwise.
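The histogram step can be sketched as follows. The equal-area gridding of the sphere from [41] is a specialized construction, so this illustration substitutes a nearest-bin-center (Voronoi) assignment over user-supplied unit-vector centers; the estimator p-hat and the mode-bin selection follow the definitions above.

    import numpy as np

    def histogram_mode_bin(samples, bin_centers):
        """Estimate the pmf over bins of a unit sphere and locate the mode bin.

        samples:     (n, d+1) array of unit vectors (e.g., points on S^3_+)
        bin_centers: (K, d+1) array of unit vectors, one per bin
        Returns (p_hat, j_star): histogram estimate and argmax bin index."""
        # nearest center by largest inner product (smallest angle)
        labels = np.argmax(samples @ bin_centers.T, axis=1)
        counts = np.bincount(labels, minlength=bin_centers.shape[0])
        p_hat = counts / samples.shape[0]   # p_hat_j = (1/n) sum_i I_{B_j}(q_i)
        return p_hat, int(np.argmax(p_hat))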
A useful property of the histogram-based estimate is that p-hat^(q)_j is an unbiased estimate of p^(q)_j even if the samples used to construct the estimate are correlated. Let B_{j*} be the bin in which the pmf attains its maximum value, i.e., j* = argmax_j p-hat^(q)_j. If the number of bins K_q is large, then the center of B_{j*} is taken as the estimate of the mode of the pmf p^(q), which is denoted by q* in S^3_+. If a small K_q is chosen, so that each bin may be large, the geodesic distances between the center of B_{j*} and the q_i's lying in B_{j*} are computed and then averaged to identify the dominant cluster of points inside this bin. If there is such a set of points that form a dominant cluster, their center is chosen as the mode. The center is computed by taking the arithmetic mean of the unit quaternions (thinking of them as 4-dimensional vectors) within the cluster and then normalizing the mean.

Step 3: Extracting low-noise measurements: Once the mode is identified, a subset Q_q of S_q is selected that consists of those q_i in S_q that satisfy

    d_q(q*, q_i) < eps_q,    (2-3)

where the distance function d_q(., .) is the Riemannian distance. The Riemannian distance between two rotations q_1, q_2 in S^3_+ is given by

    d(R_1, R_2) = (1/sqrt(2)) ||log(R_1^T R_2)||_F,    (2-4)

where R_1, R_2 in SO(3) are the rotation matrix representations of q_1, q_2, and the subscript F refers to the Frobenius norm.
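A sketch of Step 3. Rather than forming matrix logarithms, the distance (2-4) is evaluated through the standard identity that (1/sqrt(2)) ||log(R_1^T R_2)||_F equals the rotation angle of the relative rotation, which on unit quaternions is 2 arccos |<q_1, q_2>|; the function names and threshold handling are illustrative.

    import numpy as np

    def riemannian_distance(q1, q2):
        """Distance (2-4) evaluated on unit quaternions: the angle of the
        relative rotation, equal to (1/sqrt(2)) * ||log(R1^T R2)||_F."""
        c = np.clip(abs(np.dot(q1, q2)), 0.0, 1.0)  # |<q1,q2>| = cos(theta/2)
        return 2.0 * np.arccos(c)

    def low_noise_subset(S_q, q_star, eps_q):
        """Q_q of (2-3): hypotheses within geodesic distance eps_q of the mode."""
        return [q for q in S_q if riemannian_distance(q_star, q) < eps_q]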
Step 4: Averaging low-noise data: Let N_1 be the number of elements in the low-noise data set Q_q of rotations obtained as described above, and let R_i denote the rotation matrix corresponding to q_i in Q_q. The optimal average of the rotation matrices R_1, ..., R_{N_1} in the Euclidean sense is the matrix R-hat that satisfies [10]

    R-hat = argmin_{R in SO(3)} sum_{i=1}^{N_1} ||R_i - R||_F^2.    (2-5)

It was shown by Moakher [10] that R-hat defined by (2-5) can be computed by the orthogonal projection of the arithmetic average R-bar = (sum_{i=1}^{N_1} R_i) / N_1 onto the special orthogonal group SO(3) by

    R-hat = R-bar U diag(1/sqrt(lambda_1), 1/sqrt(lambda_2), s/sqrt(lambda_3)) U^T,    (2-6)

where the orthogonal matrix U is such that

    R-bar^T R-bar = U^T D U, with D = diag(lambda_1, lambda_2, lambda_3),    (2-7)

and s = 1 if det(R-bar) > 0 and s = -1 otherwise. The matrix R-hat computed using (2-6) is the desired estimate of the true rotation R.
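The averaging of Step 4 can be sketched as below. Instead of the eigendecomposition form (2-6)-(2-7), the sketch uses the equivalent and numerically convenient SVD-based projection of the arithmetic mean onto SO(3) (the nearest rotation in the Frobenius sense); this is an implementation substitution, not the dissertation's code.

    import numpy as np

    def average_rotations(R_list):
        """Solve (2-5): project the arithmetic average of rotation matrices
        onto SO(3).  The SVD projection is equivalent to Moakher's formula
        (2-6)-(2-7) for the Euclidean mean."""
        R_bar = np.mean(R_list, axis=0)             # arithmetic average
        U, _, Vt = np.linalg.svd(R_bar)
        s = np.sign(np.linalg.det(U @ Vt))          # enforce det(R_hat) = +1
        return U @ np.diag([1.0, 1.0, s]) @ Vt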
2.4.2 Estimating Translation

2.4.2.1 Case A: unit translation

The estimation scheme for unit translation is very similar to that for the rotation. The unit translation data t_i in S_t, i = 1, ..., n, represent realizations of the random variable t with an unknown distribution defined over the 2-sphere S^2. The 2-sphere S^2 is divided into a number of bins of equal area, B_j, j = 1, ..., K_t, where K_t is an (appropriately chosen) integer, by using the method described in [41]. A histogram estimate p-hat^(t) of the pmf p^(t), where p^(t)_j := P(t in B_j), is then computed by counting the number of points t_i in B_j. When K_t is large, the mode of the unit translation distribution, denoted by t*, is taken as the center of the bin B_{j*} in which the pmf takes its maximum value: j* = argmax_j p-hat^(t)_j. When the number of bins K_t is small, a method similar to the one used for rotations is used to estimate the mode. Once the mode t* is identified, the low-noise data set Q_t is selected by choosing those t_i in S_t that satisfy

    d_t(t*, t_i) < eps_t,    (2-8)

where eps_t is a pre-specified small positive number, and d_t(t_1, t_2) is the geodesic distance between the unit translation vectors t_1 and t_2. Let N_2 be the number of elements in the low-noise data set Q_t of the unit translations obtained above. The normalized arithmetic mean of the unit translations in the set Q_t, which is given by

    t-hat = ( (1/N_2) sum_{i=1}^{N_2} t_i ) / || (1/N_2) sum_{i=1}^{N_2} t_i ||,    (2-9)

is taken as the estimate of the unit translation t.

2.4.2.2 Case B: translation

When scale information is available, e.g., using a stereo camera pair, the t_i's are hypotheses of the translation between the two views and are elements of R^3, not of S^2. In this case, histogram construction, mode estimation, and low-noise hypotheses extraction are carried out in R^3, by dividing a particular volume of R^3 into K_t bins of equal volume, with each bin being a cube with equal sides. The volume to grid is chosen so that all the hypotheses lie in it. The rest of the algorithm stays the same, except that the Euclidean norm of the difference is used as the distance function d_t(., .) in (2-8), and the normalization step in (2-9) is omitted.
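For Case A, the selection (2-8) and the normalized mean (2-9) admit a direct sketch. The great-circle distance arccos(t_1^T t_2) is used as the geodesic distance on S^2, and the inputs are assumed to be unit vectors.

    import numpy as np

    def estimate_unit_translation(S_t, t_star, eps_t):
        """Apply (2-8) to pick the low-noise set Q_t around the mode t_star,
        then return the normalized arithmetic mean (2-9)."""
        S_t = np.asarray(S_t, dtype=float)
        geodesic = np.arccos(np.clip(S_t @ t_star, -1.0, 1.0))
        Q_t = S_t[geodesic < eps_t]
        mean = Q_t.mean(axis=0)
        return mean / np.linalg.norm(mean)  # project the mean back onto S^2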
2.5 Performance Evaluation

To test the performance of the proposed algorithm, 970 images of a stationary scene are captured by a moving monocular camera (make: Matrix Vision GmbH, model: mvBlueFox-120aC, resolution: 640 x 480) mounted on a 2-link planar manipulator. The 2-link manipulator is equipped with rotary encoders on each motor with 614,400 readings/revolution, which provides the ground truth of rotation and translation between any two time instants. Fig. 2-1 shows the initial and the final images of the scenes captured by the camera.

A multi-scale version of the Kanade-Lucas-Tomasi (KLT) [2] feature point tracker is implemented to track the features in successive images as the camera undergoes rotation and translation. 31 feature points were detected in the first image of the sequence. Due to the camera motion, restricted FOV, feature point tracking errors, etc., the number of tracked feature points reduces with time (see Fig. 2-5).

Figure 2-5. Number of feature points tracked during a sequence of 970 images taken by a camera. The number of pairs of matched feature points between the first and the second image pair is 31, and that between the first and the 970-th image is 9.

The performance of the PEGUS algorithm is evaluated by comparing the relative pose estimates to the ground truth. The performance of the proposed algorithm is also compared with that of RANSAC+least squares (outlier rejection by RANSAC followed by re-estimation of the pose by feeding all the inlier data into the normalized 8-point algorithm) and the non-linear mean shift algorithm [1]. For all three methods, a normalized 8-point algorithm is used to generate relative pose hypotheses (rotation and unit translation) between the two frames from a minimal number of 8 pairs of matched feature points. The same set of hypotheses is used as inputs to the PEGUS algorithm and the non-linear mean shift.

The parameters used for the proposed method are K_q = 11, eps_q = 0.0223, K_t = 7, eps_t = 0.017. The parameter n is nominally set to 100, except when the number of matched feature points M between two views is so small that N_max falls below this nominal value.
A total of 9000 image pairs are selected arbitrarily from the 970 images captured. The true rotation and translation between the frames in the i-th image pair are denoted as R(i) and t(i), respectively, which are obtained from encoder measurements. The rotation and translation estimation errors for the i-th image pair are defined as

    e_R(i) = ||I - R(i)^T R-hat(i)||,    e_t(i) = ||t(i) - t-hat(i)||,    (2-10)

where R-hat(i) and t-hat(i) are the estimates of the rotation R(i) and unit translation t(i), ||.|| denotes the 2-norm, and I denotes the 3 x 3 identity matrix. For each of the three algorithms, the 9000 samples of the errors (2-10) are computed and are used to estimate the pmf of the rotation and translation errors that result from the particular algorithm. The error probabilities for the three methods, PEGUS, RANSAC+least squares, and non-linear mean shift, are shown in Fig. 2-6(a) and (b). The mean and standard deviation of the rotation and translation errors are tabulated in Table 2-1. The figure and the table show that the estimation error with the proposed method is significantly lower than that with RANSAC as well as non-linear mean shift.

A computational time comparison for the three methods is shown in Fig. 2-7, which indicates that the proposed algorithm is also faster than both the RANSAC and mean shift algorithms. All computations were done using MATLAB on a desktop Linux machine. In addition, the computation time of PEGUS is more predictable compared to that of RANSAC and mean-shift. The reason is that, in contrast to both RANSAC and mean shift, PEGUS does not involve iterative search. It should be noted that these trends may change depending on the parameters of the algorithms. For example, the execution time of the mean shift algorithm may be reduced by increasing the bandwidth, which may affect its accuracy.
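The error measures (2-10) translate directly into code; here the matrix norm is taken as the induced (spectral) 2-norm, which is one reading of "2-norm" for a matrix argument.

    import numpy as np

    def pose_errors(R_true, R_hat, t_true, t_hat):
        """Rotation and translation estimation errors of (2-10)."""
        e_R = np.linalg.norm(np.eye(3) - R_true.T @ R_hat, ord=2)  # spectral norm
        e_t = np.linalg.norm(t_true - t_hat)                       # vector 2-norm
        return e_R, e_t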
Figure 2-6. Evaluation of the estimation accuracy achieved by the proposed PEGUS algorithm and its comparison with that achieved by RANSAC+LS and the non-linear mean shift algorithm of [1]: (a) rotation error e_R, (b) translation error e_t. RANSAC+LS means outlier rejection by RANSAC followed by re-estimation of the pose by feeding all the inliers to the normalized 8-point algorithm. Pmfs of the rotation and unit translation estimation errors are computed from 9000 samples of the error obtained from the 9000 image pairs.

Figure 2-7. Comparison of the computation time required by the 3 algorithms. The pmf is estimated from 9000 samples of computation time.

2.6 Discussion

2.6.1 Sampling of the Hypotheses

The accuracy of the estimate obtained by the PEGUS algorithm depends on the pose hypotheses that are generated in the beginning. In this section, some of the properties of the hypotheses generation scheme used in the algorithm are described, along with a discussion of why the scheme leads to good estimates.
Table 2-1. Comparison of mean and standard deviation of estimation error: (A) PEGUS, (B) RANSAC+least squares, (C) non-linear mean shift.

    Algorithm | Rot. Error          | Trans. Error
              | Mean      Std. Dev. | Mean      Std. Dev.
    A         | 0.0533    0.0390    | 0.1010    0.0438
    B         | 0.8726    0.9383    | 0.6177    0.7340
    C         | 0.1016    0.1163    | 0.1092    0.0371

Note that each distinct set of P pairs of point correspondences leads to a distinct hypothesis of q and t, since there is a mapping from each set of P feature point pairs to a hypothesis (e.g., through the 8-point algorithm). Let h be the random variable representing the hypothesis that is obtained when the Simple Random Sampling With Replacement (SRSWR) scheme is executed. The possible values that h can take are denoted by h_i, i = 1, ..., N_max. Each h_i is a pair (q_i, t_i).

Proposition 1. The SRSWR scheme for hypotheses generation ensures that each possible hypothesis is obtained with equal probability, i.e., P(h = h_i) = 1/N_max.

Proof. A hypothesis h is uniquely defined by the P point correspondences used to generate it, which are denoted by f_1, f_2, ..., f_P. Assume that all the feature point pairs are sorted to have increasing index from 1 through M. Then

    P(h = h_i) = P(f_1 = h_i^1, f_2 = h_i^2, ..., f_P = h_i^P)
               = [ prod_{k=2}^{P} P(f_k = h_i^k | f_{k-1} = h_i^{k-1}, ..., f_1 = h_i^1) ] P(f_1 = h_i^1)
               = (1/(M-P+1)) (1/(M-P+2)) ... (1/M),
where the second equality follows from the chain rule of conditional probability, and the third equality follows from the fact that once the first k point correspondences are picked, the probability of picking a specific point among the remaining ones is 1/(M - k).

2.6.2 Robustness to Outliers

Robustness of the presented algorithm to feature outliers was analyzed using synthetic feature point data. A random cloud of 100 Euclidean points was generated, and these Euclidean points were projected on the image plane using a pin-hole camera model. The Euclidean point cloud is viewed from two distinct camera positions, with the known relative rotation and translation serving as a ground truth. In each trial, white Gaussian noise of 0.01 standard deviation was added to the image data. In order to verify the robustness of the three pose estimation algorithms (PEGUS, RANSAC+LS, and non-linear mean shift), the percentage of outliers was varied from 10% to 90% with an increment of 10%, and the outliers were added randomly to the synthetic data. For each case, the experiment was repeated 100 times, and pose estimates were obtained using the PEGUS, RANSAC+LS, and non-linear mean shift algorithms. Using (2-10), the rotation and translation estimation errors were obtained, as shown in Figs. 2-8 and 2-9. Fig. 2-10 shows the mean of the rotation and translation errors obtained for different numbers of feature outliers. It can be seen that the mean pose estimation error obtained using PEGUS is the minimum among these algorithms.

2.7 Conclusion

In this chapter, a novel robust two-view relative pose estimation algorithm is presented. Hypothesize-and-test methods such as RANSAC ignore all but one of the good hypotheses, whereas the proposed algorithm identifies a set of "low-noise" pose hypotheses among the large number of possible ones and then averages them appropriately to compute an estimate. Identification of the "low-noise" set of hypotheses is simplified by expressing rotations as unit quaternions that lie on the 3-sphere S^3 and constructing a histogram by gridding S^3. The same technique is used for unit translations, except that
Figure 2-8. Robustness comparison of the presented PEGUS algorithm with RANSAC+LS and the non-linear mean shift algorithm in terms of the rotation estimation accuracy using synthetic data with (a) 10% feature outliers, (b) 20% feature outliers, (c) 30% feature outliers, (d) 40% feature outliers, (e) 50% feature outliers, (f) 60% feature outliers, (g) 70% feature outliers, (h) 80% feature outliers, and (i) 90% feature outliers. The pmf of the rotation estimation error is computed from 100 samples of the error obtained from the 100 image pairs.

the hypotheses are now points on the unit sphere in 3 dimensions. Experimental results demonstrate improved performance of the proposed method against the RANSAC+least squares method as well as non-linear mean shift, in terms of both estimation accuracy and computation time. Since the proposed method does not involve any iterative search, its computation time is more predictable than that of RANSAC and non-linear mean shift. Subsequent chapters will focus on the development and analysis of visual servo control
Figure 2-9. Robustness comparison of the presented PEGUS algorithm with RANSAC+LS and the non-linear mean shift algorithm in terms of the translation estimation accuracy using synthetic data with (a) 10% feature outliers, (b) 20% feature outliers, (c) 30% feature outliers, (d) 40% feature outliers, (e) 50% feature outliers, (f) 60% feature outliers, (g) 70% feature outliers, (h) 80% feature outliers, and (i) 90% feature outliers. The pmf of the translation estimation error is computed from 100 samples of the error obtained from the 100 image pairs.

of autonomous systems wherein vision-based pose estimation acts as the foundation. Therefore, the development provided in this chapter would be beneficial to such systems requiring robust pose estimation in a deterministic fashion.
Figure 2-10. (a) Mean rotation estimation error and (b) mean translation estimation error for the presented PEGUS algorithm, RANSAC+LS, and the non-linear mean shift algorithm using the pose estimation results for synthetic data of varying feature outliers (10%-90%) presented in Figs. 2-8 and 2-9.
CHAPTER 3
VISUAL SERVO CONTROL OF AN UNMANNED GROUND VEHICLE VIA A MOVING AIRBORNE MONOCULAR CAMERA

3.1 Introduction

The development in this chapter is motivated by the desire to address the well-known problem of controlling a moving target using a moving camera. A moving airborne monocular camera (e.g., a camera attached to a remote controlled aircraft, or a camera mounted on a satellite) is used to provide pose measurements of a moving sensorless UGV relative to a goal configuration. The relative velocity between the moving UGV and the moving camera presents a significant challenge. The contribution of this chapter is the development of multi-view geometry concepts (i.e., photogrammetry) to relate coordinate frames attached to the moving camera, the moving UGV, and the desired UGV pose specified by an a priori image; the control scheme is coined the daisy-chaining method.

For the results in [42, 43], the pose measurements are taken with respect to a stationary reference object, and restrictions are imposed on the area of operation for the UGV so that the stationary reference object never leaves the field-of-view of the on-board camera. Also, the method presented in [42, 43] assumes that the known Euclidean distances of the feature points on the UGV and the stationary reference object are identical, which imposes practical limitations on the implementation of the visual servo controller. The result in this chapter further develops the daisy-chaining method to achieve asymptotic regulation of the UGV based on the assumption that a given reference object can leave the field of view while another reference object enters the field of view.

This chapter is organized in two parts. The daisy-chaining method is developed in the first part with the objective of regulating an UGV to the desired position using image feedback from a moving monocular camera. The second part of the chapter presents the development of a multi-reference visual servo control scheme in the presence of reseeding stationary
reference objects. Simulation results are provided to demonstrate the performance of the multi-reference daisy-chaining method for regulation control.

3.2 Daisy-Chaining Based Regulation Control

In this section, a collaborative visual servo controller is developed with the objective of regulating an UGV to a desired pose utilizing the feedback from a moving airborne monocular camera system. In contrast to typical camera configurations used for visual servo control problems, the controller in this chapter is developed using a moving on-board camera viewing a moving target. Multi-view photogrammetric methods are used to develop relationships between different camera frames and UGV coordinate systems. Geometric constructs developed for traditional camera-in-hand problems are fused with fixed-camera geometry to develop a set of Euclidean homographies. One of the resulting Euclidean homographies is not measurable through a set of spatiotemporal images (i.e., a corresponding projective homography cannot be developed as in previous results). Hence, new geometric relationships are formulated to solve for the homography so that a measurable error system for the nonholonomic UGV can be developed. The resulting open-loop error system is expressed in a form that is amenable to a variety of UGV controllers. A benchmark controller originally proposed in [44] is proven to yield the asymptotic regulation result through a Lyapunov-based stability analysis.

3.2.1 Geometric Model

Consider a single camera that is navigating (e.g., by remote controlled aircraft) above(1) the planar motion of an UGV, as depicted in Fig. 3-1 and Fig. 3-2. The moving coordinate frame I is attached to the airborne camera, and the moving coordinate frame F is attached to the UGV at the center of the rear wheel axis (for simplicity and without

(1) No assumptions are made with regard to the alignment of the WMR plane of motion and the focal axis of the camera as in [26].
Figure 3-1. Camera coordinate frame relationships: A moving airborne monocular camera (coordinate frame I) hovering above an UGV (coordinate frame F) while viewing a fixed reference object (coordinate frame F*) regulates the UGV to the desired pose (coordinate frame F_d) captured by an a priori located camera (coordinate frame I_R).

loss of generality). The UGV is represented in the camera image by four(2) feature points that are coplanar and not collinear. The Euclidean distance (i.e., s_1i in R^3, for all i = 1, 2, 3, 4) from the origin of F to each of the feature points is assumed to be known. The plane defined by the UGV motion (i.e., the plane defined by the x-y axes of F) and the UGV feature points is denoted as pi. The linear velocity of the UGV along the x-axis is denoted by v_c(t) in R, and the angular velocity omega_c(t) in R is about the z-axis of F (see Figure 3-1). While viewing the feature points of the UGV, the camera is assumed to also view four additional coplanar and noncollinear feature points of a stationary reference object. The

(2) Image analysis methods can be used to determine planar objects (e.g., through color and texture differences). These traditional computer vision methods can be used to help determine and isolate the four coplanar feature points. If four coplanar target points are not available, then the subsequent development can exploit the classic eight-points algorithm [45] with no four of the eight target points being coplanar.
four additional feature points define the plane pi* in Fig. 3-1 and Fig. 3-2. The stationary coordinate frame F* is attached to the object, where the distance (i.e., s_2i in R^3, for all i = 1, 2, 3, 4) from the origin of the coordinate frame to each of the feature points is assumed to be known. The plane pi* is assumed to be parallel to the plane pi. The feature points that define pi* are also assumed to be visible when the camera is a priori located coincident with the position and orientation (i.e., pose) of the stationary coordinate frame I_R. When the camera is coincident with I_R, the desired pose of the UGV is assumed to be in the camera's field-of-view. When the UGV is located at the desired pose, the coordinate frame F is coincident with the coordinate frame F_d. Table 3-1 shows the relationships between the various coordinate frames.

Table 3-1. Coordinate frame relationships for UGV regulation control.

    Motion              | Frames
    R(t), x_f(t)        | F to I
    R*(t), x*_f(t)      | F* to I
    R_r(t), x_fr(t)     | I to I_R
    R*_r, x*_fr         | F* to I_R
    R_rd, x_frd         | F_d to I_R
    R'(t), x'_fr(t)     | F to I_R, expressed in I_R

To relate the coordinate systems, let R(t), R*(t), R_r(t), R_rd, R*_r in SO(3) denote the rotations from F to I, F* to I, I to I_R, F_d to I_R, and F* to I_R, respectively; let x_f(t), x*_f(t) in R^3 denote the respective time-varying translations from F to I and from F* to I, with coordinates expressed in I; and let x_fr(t), x'_fr(t), x_frd, x*_fr in R^3 denote the respective translations from I to I_R, F to I_R, F_d to I_R, and F* to I_R, expressed in the coordinates of I_R. From the geometry between the coordinate frames depicted in Fig. 3-1 and Fig. 3-2, the following relationships can be developed:

    m-bar_i = x_f + R s_1i,    m-bar_rdi = x_frd + R_rd s_1i    (3-1)
    m-bar*_i = x*_f + R* s_2i,    m-bar*_ri = x*_fr + R*_r s_2i    (3-2)
    m-bar'_i(t) = x'_fr + R*_r R*^T R s_1i    (3-3)
Figure 3-2. Urban scenario describing regulation of an UGV to the desired pose using an airborne camera.

where m-bar_i(t), m-bar*_i(t) in R^3 denote the Euclidean coordinates of the feature points of the UGV and the feature points on the plane pi*, expressed in I, as

    m-bar_i(t) = [x_i(t), y_i(t), z_i(t)]^T    (3-4)
    m-bar*_i(t) = [x*_i(t), y*_i(t), z*_i(t)]^T;    (3-5)

m-bar'_i(t), m-bar_rdi in R^3 denote the actual time-varying and constant desired Euclidean coordinates, respectively, of the feature points attached to the UGV, expressed in I_R, as

    m-bar'_i(t) = [x'_i(t), y'_i(t), z'_i(t)]^T    (3-6)
    m-bar_rdi = [x_rdi, y_rdi, z_rdi]^T;    (3-7)
and m-bar*_ri in R^3 denotes the constant Euclidean coordinates of the feature points on the plane pi*, expressed in I_R, as

    m-bar*_ri = [x*_ri, y*_ri, z*_ri]^T.    (3-8)

After some algebraic manipulation, the expressions for m-bar_i(t), m-bar_rdi, m-bar*_ri, and m-bar'_i(t) in (3-1)-(3-3) can be rewritten as

    m-bar_i = x-bar_f + R-bar m-bar*_i,    m-bar_rdi = x-bar_frd + R-bar_rd m-bar*_ri    (3-9)
    m-bar*_ri = x-bar_fr + R-bar_r m-bar*_i,    m-bar'_i(t) = x-bar_fr + R-bar_r m-bar_i    (3-10)

where R-bar(t), R-bar_rd, R-bar_r in SO(3) and x-bar_f(t), x-bar_frd, x-bar_fr(t) in R^3 are new rotational and translational variables, respectively, defined as

    R-bar = R R*^T,    R-bar_rd = R_rd R*_r^T,    R-bar_r = R*_r R*^T    (3-11)
    x-bar_f = x_f - R-bar x*_f + R (s_2i - s_1i)    (3-12)
    x-bar_frd = x_frd - R-bar_rd x*_fr + R_rd (s_2i - s_1i)    (3-13)
    x-bar_fr = x*_fr - R-bar_r x*_f = x'_fr - R-bar_r x_f.    (3-14)

By using the projective relationships (see Fig. 3-3 and Fig. 3-4)

    d(t) = n*^T m-bar_i,    d*(t) = n*^T m-bar*_i,    d*_r = n*^T m-bar*_ri    (3-15)

the relationships in (3-9) and (3-10) can be expressed as

    m-bar_i = (R-bar + (x-bar_f / d*) n*^T) m-bar*_i    (3-16)
    m-bar_rdi = (R-bar_rd + (x-bar_frd / d*_r) n*^T) m-bar*_ri    (3-17)
    m-bar*_ri = (R-bar_r + (x-bar_fr / d*) n*^T) m-bar*_i    (3-18)
    m-bar'_i = (R-bar_r + (x-bar_fr / d) n*^T) m-bar_i.    (3-19)
In (3-15)-(3-19), d(t), d*(t), d*_r > eps for some positive constant eps in R, and n* in R^3 denotes the constant unit normal to the planes pi and pi*.

Remark 1. As in [46], the subsequent development requires that the constant rotation matrix R*_r be known. The constant rotation matrix R*_r can be obtained a priori using various methods (e.g., a second camera, Euclidean measurements).

3.2.2 Euclidean Reconstruction

The relationships given by (3-16)-(3-19) provide a means to quantify the translation and rotation error between the different coordinate systems. Since the poses of F, F_d, and F* cannot be directly measured, a Euclidean reconstruction is developed in this section to obtain the position and rotation error information by comparing multiple images acquired from the hovering monocular vision system. Specifically, comparisons are made between the current UGV image and the reference image in terms of I, and between the a priori known UGV image and the reference image in terms of I_R. To facilitate the subsequent development, the normalized Euclidean coordinates of the feature points for the current UGV image and the reference image are expressed in terms of I as m_i(t) and m*_i(t) in R^3, respectively, as follows:

    m_i = m-bar_i / z_i,    m*_i = m-bar*_i / z*_i.    (3-20)

Similarly, the normalized Euclidean coordinates of the feature points for the current, goal, and reference images are expressed in terms of I_R as m'_i(t), m_rdi, m*_ri in R^3, respectively, as follows:

    m'_i(t) = m-bar'_i(t) / z'_i(t),    m_rdi = m-bar_rdi / z_rdi,    m*_ri = m-bar*_ri / z*_ri.    (3-21)

From the expressions given in (3-16) and (3-20), the rotation and translation between the coordinate systems F and F* can now be related in terms of the normalized Euclidean coordinates as follows:

    m_i = (z*_i / z_i) (R-bar + x_h n*^T) m*_i = alpha_i H m*_i.    (3-22)
In a similar manner, (3-17)-(3-21) can be used to relate the rotation and translation between m*_ri and m_rdi as

    m_rdi = (z*_ri / z_rdi) (R-bar_rd + x_hrd n*^T) m*_ri = alpha_rdi H_rd m*_ri    (3-23)

and between m*_i(t) and m*_ri as

    m*_ri = (z*_i / z*_ri) (R-bar_r + x_hr n*^T) m*_i = alpha_ri H_r m*_i.    (3-24)

The development provided in Remark 2 can be used to relate m_i(t) to m'_i(t) as

    m'_i = (z_i / z'_i) (R-bar_r + x_hr alpha_i ((n*^T m*_i) / (n*^T m_i)) n*^T) m_i = (z_i / z'_i) H'_r m_i.    (3-25)

In (3-22)-(3-25), alpha_i(t), alpha_rdi, alpha_ri(t) in R denote depth ratios, H(t), H_rd, H_r(t), H'_r(t) in R^{3x3} denote Euclidean homographies [47], and x_h(t), x_hrd, x_hr(t) in R^3 denote scaled translation vectors that are defined as follows:

    x_h = x-bar_f / d*,    x_hrd = x-bar_frd / d*_r,    x_hr = x-bar_fr / d*.    (3-26)

Remark 2. In order to find the relationship between the normalized Euclidean coordinates m_i(t) and m'_i(t), termed a virtual homography, (3-19) is expressed as

    m-bar'_i = (R-bar_r + (x-bar_fr / d*) (d* / d) n*^T) m-bar_i.    (3-27)

By substituting x_hr(t) from (3-26) into (3-27), the following expression can be obtained:

    m-bar'_i = (R-bar_r + x_hr (d* / d) n*^T) m-bar_i.    (3-28)
Utilizing (3-15), (3-20), and (3-22) and rearranging the terms yields

    m-bar'_i = (R-bar_r + x_hr alpha_i ((n*^T m*_i) / (n*^T m_i)) n*^T) m-bar_i.    (3-29)

The above expression can be written in terms of the normalized Euclidean coordinates by using (3-20)-(3-21) as follows:

    m'_i = (z_i / z'_i) (R-bar_r + x_hr alpha_i ((n*^T m*_i) / (n*^T m_i)) n*^T) m_i.    (3-30)

Each Euclidean feature point will have a projected pixel coordinate expressed in terms of I as

    p_i = [u_i, v_i, 1]^T,    p*_i = [u*_i, v*_i, 1]^T    (3-31)

where p_i(t) and p*_i(t) in R^3 represent the image-space coordinates of the time-varying feature points of the UGV and the reference object, respectively, and u_i(t), v_i(t), u*_i(t), v*_i(t) in R. Similarly, the projected pixel coordinates of the Euclidean features in the reference image can be expressed in terms of I_R as

    p_rdi = [u_rdi, v_rdi, 1]^T,    p*_ri = [u*_ri, v*_ri, 1]^T    (3-32)

where p_rdi and p*_ri in R^3 represent the constant image-space coordinates of the goal UGV and the reference object, respectively, and u_rdi, v_rdi, u*_ri, v*_ri in R.
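The image-space coordinates (3-31)-(3-32) are related to the normalized coordinates through the intrinsic matrix A of the pin-hole model used next; the following sketch makes that relation explicit. The calibration values are placeholders, not the dissertation's camera parameters.

    import numpy as np

    # Illustrative intrinsic calibration matrix (placeholder focal lengths
    # and principal point).
    A = np.array([[800.0,   0.0, 320.0],
                  [  0.0, 800.0, 240.0],
                  [  0.0,   0.0,   1.0]])

    def to_pixels(m):
        """p = A m: normalized coordinates [x/z, y/z, 1]^T -> pixels."""
        return A @ m

    def to_normalized(p):
        """m = A^{-1} p: homogeneous pixels -> normalized coordinates."""
        return np.linalg.solve(A, p)

    m = np.array([0.10, -0.05, 1.0])
    assert np.allclose(to_normalized(to_pixels(m)), m)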
To calculate the Euclidean homographies given in (3-22)-(3-25) from pixel information, the projected pixel coordinates are related to m_i(t), m*_i(t), m_rdi, and m*_ri by the pin-hole camera model as

    p_i = A m_i,    p*_i = A m*_i    (3-33)
    p_rdi = A m_rdi,    p*_ri = A m*_ri    (3-34)

where A in R^{3x3} is a known, constant, and invertible intrinsic camera calibration matrix. By using (3-22)-(3-25), (3-33), and (3-34), the following relationships can be developed:

    p_i = alpha_i (A H A^{-1}) p*_i = alpha_i G p*_i,    p_rdi = alpha_rdi (A H_rd A^{-1}) p*_ri = alpha_rdi G_rd p*_ri    (3-35)
    p*_ri = alpha_ri (A H_r A^{-1}) p*_i = alpha_ri G_r p*_i    (3-36)

where G(t) = [g_ij(t)], G_rd = [g_rdij], G_r = [g_rij], for all i, j = 1, 2, 3, in R^{3x3} denote projective homographies.

Sets of linear equations can be developed from (3-35) and (3-36) to determine the projective homographies up to a scalar multiple. Various techniques can be used (e.g., see [48, 49]) to decompose the Euclidean homographies to obtain alpha_i(t), alpha_rdi, alpha_ri(t), x_h(t), x_hrd, x_hr(t), R-bar(t), R-bar_rd, R-bar_r(t). Given that the constant rotation matrix R*_r is assumed to be known, the expressions for R-bar_rd and R-bar_r(t) in (3-11) can be used to determine R_rd and R*(t). Once R*(t) is determined, the expression for R-bar(t) in (3-11) can be used to determine R(t).

3.2.3 UGV Kinematics

The kinematic model for the UGV can be determined from Fig. 3-1 as

    [xdot_c; ydot_c; thetadot_d] = [cos theta_d, 0; sin theta_d, 0; 0, 1] [v_c; omega_c]    (3-37)

where xdot_c, ydot_c, and thetadot_d denote the time derivatives of x_c(t), y_c(t), and theta_d(t) in R, respectively, where x_c(t) and y_c(t) denote the planar position of F expressed in F_d, theta_d(t) in R denotes the right-handed rotation angle about the z-axis of F that aligns F with F_d, and v_c(t) and omega_c(t) were introduced in Section 3.2.1 and are depicted in Fig. 3-1 and Fig. 3-2.
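As a concrete reading of (3-37), a minimal explicit-Euler integration sketch of the unicycle kinematics; the step size and names are arbitrary illustrative choices.

    import numpy as np

    def step_unicycle(x_c, y_c, theta_d, v_c, w_c, dt=1e-3):
        """One explicit-Euler step of (3-37):
        x_dot = v_c cos(theta_d), y_dot = v_c sin(theta_d), theta_dot = w_c."""
        return (x_c + v_c * np.cos(theta_d) * dt,
                y_c + v_c * np.sin(theta_d) * dt,
                theta_d + w_c * dt)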
Based on the definitions for R-bar(t), R(t), R_rd, R*_r, and R-bar_r(t) provided in the previous development (see also Table 3-1), the rotation from F to F_d can be developed as

    R_rd^T R*_r R*^T R = [cos theta_d, -sin theta_d, 0; sin theta_d, cos theta_d, 0; 0, 0, 1].    (3-38)

Based on the fact that R(t), R*(t), R*_r, and R_rd are known from the homography decomposition, it is clear from (3-38) that theta_d(t) is a known signal that can be used in the subsequent control development.

The geometric relationships between the coordinate frames can be used to develop the following expression:

    [x_c, y_c, 0]^T = R_rd^T (x'_fr - x_frd).    (3-39)

After utilizing (3-1), (3-3), (3-25), and the assumption (as in [46]) that s_11 = [0, 0, 0]^T, the following expression can be obtained(3):

    [x_c / z_rd1, y_c / z_rd1, 0]^T = R_rd^T ((alpha_r1 alpha_rd1 / alpha_1) H'_r m_1 - m_rd1).    (3-40)

After utilizing (3-33) and (3-34), the expression in (3-40) can be rewritten as follows:

    [x_c / z_rd1, y_c / z_rd1, 0]^T = R_rd^T ((alpha_r1 alpha_rd1 / alpha_1) H'_r A^{-1} p_1 - A^{-1} p_rd1).    (3-41)

Since all terms on the right-hand side of (3-41) are measurable or known, x_c(t)/z_rd1 and y_c(t)/z_rd1 can be used in the subsequent control development.

3.2.4 Control Objective

The objective considered in this chapter is to develop a visual servo controller that ensures that the pose of an UGV is regulated to a desired pose. A challenging aspect of this problem is that the UGV pose information is supplied by a moving airborne monocular

(3) Any point s_1i, s_2i can be utilized in the subsequent development; however, to reduce the notational complexity, we have elected to select the image points s_11, s_21, and hence, the subscript 1 is utilized in lieu of i in the subsequent development.
camera system. That is, unlike traditional camera-in-hand configurations or fixed-camera configurations, the problem considered in this chapter involves a moving airborne camera observing a moving ground vehicle. Mathematically, the objective can be expressed as the desire to regulate m-bar_i(t) to m-bar_rdi (or, stated otherwise, x'_fr(t) -> x_frd and theta_d(t) -> 0). Based on (3-37)-(3-39), the objective can be quantified by a regulation error e(t) = [e_1, e_2, e_3]^T in R^3 defined by the following global diffeomorphism:

    [e_1; e_2; e_3] = [cos theta_d, sin theta_d, 0; -sin theta_d, cos theta_d, 0; 0, 0, 1] [x_c / z_rd1; y_c / z_rd1; theta_d].    (3-42)

If ||e(t)|| -> 0, then (3-39) and (3-42) can be used to conclude that x'_fr(t) -> x_frd and theta_d(t) -> 0. Based on (3-38) and (3-41), it is clear that e(t) is measurable.

3.2.5 Control Development

After taking the time derivative of (3-42) and using (3-37), the open-loop error system for e(t) can be determined as

    [edot_1; edot_2; edot_3] = [v_c / z_rd1 + omega_c e_2; -omega_c e_1; omega_c].    (3-43)

A variety of controllers could now be proposed to yield the regulation result based on the manner in which the open-loop error system given by (3-43) has been developed. Several controllers are provided in [50], including an explanation of how the UGV dynamics could also be easily incorporated into the control design. The following benchmark controller proposed in [44] is an example that can be used to achieve asymptotic regulation:

    v_c = -k_v e_1    (3-44)
    omega_c = -k_omega e_3 + e_2^2 sin t    (3-45)

where k_v, k_omega in R denote positive, constant control gains.
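A self-contained sketch of the closed loop formed by (3-43)-(3-45). The gains are rounded from the simulation values reported later in this chapter, the initial error is arbitrary, and z_rd1 is set to 1 for the purpose of the sketch; none of these values are prescribed by the original development.

    import numpy as np

    # Integrate the open-loop error dynamics (3-43) under the benchmark
    # controller (3-44)-(3-45).
    k_v, k_w, z_rd1, dt = 12.93, 2.20, 1.0, 1e-3
    e = np.array([1.0, -0.5, 0.8])                  # [e_1, e_2, e_3]
    t = 0.0
    for _ in range(int(30.0 / dt)):
        v_c = -k_v * e[0]                           # (3-44)
        w_c = -k_w * e[2] + e[1] ** 2 * np.sin(t)   # (3-45)
        e_dot = np.array([v_c / z_rd1 + w_c * e[1], # (3-43)
                          -w_c * e[0],
                          w_c])
        e = e + e_dot * dt
        t += dt
    print(e)  # components decay toward zero, consistent with Theorem 1
              # (convergence of e_2 relies on the sin(t) excitation and is slow)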
After substituting the controller designed in (3-44) and (3-45) into (3-43), the following closed-loop error system is obtained:

    z_rd1 edot_1 = -k_v e_1 + z_rd1 omega_c e_2    (3-46)
    edot_2 = -omega_c e_1
    edot_3 = -k_omega e_3 + e_2^2 sin t.

Remark 3. As stated in [44] and [50], the closed-loop dynamics for e_3(t) given in (3-46) represent a stable linear system subject to an additive disturbance given by the product e_2^2(t) sin(t). If the additive disturbance is bounded (i.e., if e_2(t) in L_inf), then it is clear that e_3(t) in L_inf. Furthermore, if the additive disturbance asymptotically vanishes (i.e., if e_2(t) -> 0), then it is clear from standard linear control arguments [51] that e_3(t) -> 0.

3.2.6 Stability Analysis

Theorem 1. The kinematic controller given in (3-44) and (3-45) ensures asymptotic pose regulation of the UGV in the sense that

    lim_{t -> inf} ||e(t)|| = 0.    (3-47)

Proof. Let V(t) in R denote the following non-negative function:

    V = (1/2) z_rd1 (e_1^2 + e_2^2).    (3-48)

The following simplified expression can be obtained by taking the time derivative of (3-48), substituting the closed-loop dynamics from (3-46) into the resulting expression, and then cancelling common terms:

    Vdot = -k_v e_1^2.    (3-49)

Based on (3-48) and (3-49), it is clear that e_1(t), e_2(t) in L_inf and that e_1(t) in L_2. Remark 3 can be used along with the fact that e_2(t) in L_inf to conclude that e_3(t) in L_inf. Based on the fact that e_1(t), e_2(t), e_3(t) in L_inf, (3-44), (3-45), and (3-46) can be used to prove
that v_c(t), omega_c(t), edot_1(t), edot_2(t), edot_3(t) in L_inf. The fact that edot_1(t), edot_2(t), edot_3(t) in L_inf is a sufficient condition for e_1(t), e_2(t), e_3(t) to be uniformly continuous. After taking the time derivatives of (3-44) and (3-45) and utilizing the aforementioned facts, we can show that vdot_c(t), omegadot_c(t) in L_inf, and hence, v_c(t) and omega_c(t) are uniformly continuous. Based on the facts that e_1(t), edot_1(t) in L_inf and e_1(t) in L_2, Barbalat's Lemma can be used to prove that

    lim_{t -> inf} e_1(t) = 0.    (3-50)

After taking the time derivative of the product e_1(t) e_2(t) and then substituting (3-46) into the resulting expression for the time derivative of e_1(t), the following expression is obtained:

    d/dt (e_1 e_2) = e_2^2 omega_c + e_1 (edot_2 - k_v e_2 / z_rd1).    (3-51)

Given the fact that lim_{t -> inf} e_1(t) = 0 and that the bracketed term in (3-51) is uniformly continuous (i.e., e_2(t) and omega_c(t) are uniformly continuous), we can invoke the extended Barbalat's lemma [50] to conclude that

    lim_{t -> inf} d/dt (e_1(t) e_2(t)) = 0,    lim_{t -> inf} e_2^2(t) omega_c(t) = 0.    (3-52)

After utilizing (3-44), (3-46), (3-50), and (3-52), we can conclude that

    lim_{t -> inf} omega_c(t) = 0,    lim_{t -> inf} edot_1(t) = 0,    lim_{t -> inf} edot_2(t) = 0.    (3-53)

To facilitate further analysis, we take the time derivative of the product e_2(t) omega_c(t) and utilize (3-43), (3-44), and (3-45) to obtain the following expression:

    d/dt (e_2 omega_c) = e_2^3 cos(t) + edot_2 (omega_c + 2 e_2^2 sin(t)) - k_omega e_2 omega_c.    (3-54)

Since the bracketed term in (3-54) is uniformly continuous, we can invoke the extended Barbalat's lemma to conclude that

    lim_{t -> inf} e_2^3(t) cos(t) = 0;    (3-55)
hence,

    lim_{t -> inf} e_2(t) = 0.    (3-56)

Based on (3-56), it is clear from Remark 3 that

    lim_{t -> inf} e_3(t) = 0.    (3-57)

After utilizing (3-50), (3-56), and (3-57), the asymptotic regulation result given in (3-47) is obtained.

3.3 Multi-Reference Visual Servo Control of an Unmanned Ground Vehicle

The result in this section further develops the daisy-chaining method introduced in Section 3.2 to achieve asymptotic regulation of the UGV based on the assumption that a given reference object can leave the field of view while another reference object enters the field of view. The contribution of this research is that, since the controller development is based on the ability to daisy-chain multiple reference objects, the restrictions on the applicative area of operation are removed. That is, since the control development does not require the airborne camera to maintain a view of a single static reference object, the airborne camera/UGV pair is able to navigate over an arbitrarily large area. The presented work also relaxes the assumption of identical Euclidean distances of the features for the UGV and the reference object by leveraging the geometric reconstruction method proposed by Dupree et al. [52].

3.3.1 Geometric Model

The geometric model for multi-reference regulation control is similar to that presented in Section 3.2. While viewing the feature points of the UGV, the camera is assumed to also view four additional coplanar and non-collinear feature points of a stationary reference object, such that at any instant of time along the camera motion trajectory at least one such reference target is in the field of view. The four additional feature points define the plane pi*_n in Fig. 3-3 and Fig. 3-4. The stationary coordinate frame F*_n (n = 1, 2, ..., m) is attached to the object, where the distance from the origin of the coordinate
Figure 3-3. Camera to reference object relationships: A monocular camera (coordinate frame I) viewing a stationary reference object (coordinate frame F*_i) such that a stationary object can leave the camera FOV as a new object enters the FOV, while the stationary reference camera (coordinate frame I_R) is assumed to view the stationary reference object F*_1.

frame to each of the feature points is assumed to be known, i.e., s_2ni in R^3, for all i = 1, 2, 3, 4. The plane pi*_n is assumed to be parallel to the plane pi. The feature points that define pi*_1, corresponding to a reference object F*_1 (i.e., F*_n corresponding to n = 1), are also assumed to be visible when the camera is a priori located coincident with the position and orientation (i.e., pose) of the stationary coordinate frame I_R. The stationary pose F_r corresponds to a snapshot of the UGV (e.g., at the starting location) visible from the reference camera coordinate system I_R. When the camera is coincident with I_R, the desired pose of the UGV F_d is assumed to be known. When the UGV is located at the desired pose, the coordinate frame F is coincident with the coordinate frame F_d.

To relate the coordinate systems, consider the coordinate frame relationships given in Table 3-2.
Figure 3-4. Camera to UGV relationships: A monocular camera (coordinate frame I) hovering above an UGV (coordinate frame F) while viewing a stationary reference object (coordinate frame F*_n) regulates the UGV to the desired pose (coordinate frame F_d) known a priori in the reference camera (coordinate frame I_R). The stationary pose (coordinate frame F_r) corresponds to a snapshot of the UGV visible from the reference camera (coordinate frame I_R).

Table 3-2. Coordinate frame relationships for multi-reference UGV regulation control.

    Motion                  | Frames
    R(t), x_f(t)            | F to I
    R*_n(t), x*_fn(t)       | F*_n to I
    R_r(t), x_fr(t)         | I to I_R
    R_rd, x_frd             | F_d to I_R
    R*_rn, x*_frn           | F*_n to I_R
    R_rr, x_frr             | F_r to I_R
    R''(t), x''_f(t)        | F_r to I
    R'_r(t), x'_fr(t)       | F to I_R
    R'(t), x'_f(t)          | F to F_r
    R'_rd(t), x'_frd(t)     | F to F_d
From the geometry between the coordinate frames depicted in Fig. 3-3 and Fig. 3-4, the following relationships can be developed:

    m-bar_i = x_f + R s_1i,    m-bar_rdi = x_frd + R_rd s_1i    (3-58)
    m-bar*_ni = x*_fn + R*_n s_2ni,    m-bar*_rni = x*_frn + R*_rn s_2ni    (3-59)
    m-bar'_i = x''_f + R'' s_1i,    m-bar'_ri = x'_fr + R'_r s_1i    (3-60)
    m-bar_ri = x_frr + R_rr s_1i.    (3-61)

In (3-58)-(3-61), m-bar_i(t), m-bar'_i(t), m-bar*_ni(t) in R^3 denote the Euclidean coordinates of the feature points of the current UGV (i.e., F), the constant reference UGV position, and the stationary reference object n (n = 1, 2, ..., m), respectively, expressed in I as

    m-bar_i(t) = [x_i(t), y_i(t), z_i(t)]^T    (3-62)
    m-bar'_i(t) = [x'_i(t), y'_i(t), z'_i(t)]^T    (3-63)
    m-bar*_ni(t) = [x*_ni(t), y*_ni(t), z*_ni(t)]^T;    (3-64)

m-bar_ri, m-bar'_ri(t), m-bar_rdi in R^3 denote the Euclidean coordinates of the constant reference UGV, the actual time-varying current UGV, and the constant desired UGV, respectively, expressed in I_R as

    m-bar_ri = [x_ri, y_ri, z_ri]^T    (3-65)
    m-bar'_ri = [x'_ri(t), y'_ri(t), z'_ri(t)]^T    (3-66)
    m-bar_rdi = [x_rdi, y_rdi, z_rdi]^T;    (3-67)

and m-bar*_rni in R^3 denotes the constant Euclidean coordinates of the feature points on the stationary reference plane pi*_n expressed in I_R as

    m-bar*_rni = [x*_rni, y*_rni, z*_rni]^T.    (3-68)
For simplicity and without loss of generality, we consider two reference targets F*_n (where n = 1, 2). After some algebraic manipulation, the expressions for m-bar*_rni, m-bar'_ri(t), and m-bar'_i(t) in (3-58)-(3-61) can be rewritten as

    m-bar*_r1i = x-bar_fr + R-bar_r m-bar*_1i,    m-bar*_r2i = x-bar_fr + R-bar_r m-bar*_2i    (3-69)
    m-bar'_ri = x-bar_fr + R-bar_r m-bar_i,    m-bar_ri = x-bar_fr + R-bar_r m-bar'_i    (3-70)
    m-bar'_i = x-bar'_f + R-bar' m-bar_i,    m-bar_rdi = x-bar'_frd + R-bar'_rd m-bar'_ri    (3-71)

where R-bar_r(t), R-bar'(t), R-bar'_rd(t) in R^{3x3} and x-bar_fr(t), x-bar'_f(t), x-bar'_frd(t) in R^3 denote new rotation and translation variables given as follows:

    R-bar_r = R*_rn R*_n^T,    x-bar_fr = x*_frn - R-bar_r x*_fn    (3-72)
    R-bar' = R'' R^T,    x-bar'_f = x''_f - R-bar' x_f    (3-73)
    R-bar'_rd = R_rd R'_r^T,    x-bar'_frd = x_frd - R-bar'_rd x'_fr.    (3-74)

By using the projective relationships

    d*_1 = n*_1^T m-bar*_1i,    d*_2 = n*_2^T m-bar*_2i    (3-75)
    d = n^T m-bar_i,    d'_r = n'_r^T m-bar'_ri    (3-76)

the relationships in (3-69) and (3-71) can be expressed as

    m-bar*_r1i = (R-bar_r + (x-bar_fr / d*_1) n*_1^T) m-bar*_1i    (3-77)
    m-bar*_r2i = (R-bar_r + (x-bar_fr / d*_2) n*_2^T) m-bar*_2i    (3-78)
    m-bar'_i = (R-bar' + (x-bar'_f / d) n^T) m-bar_i    (3-79)
    m-bar_rdi = (R-bar'_rd + (x-bar'_frd / d'_r) n'_r^T) m-bar'_ri.    (3-80)
In (3-77)-(3-80), d*_1(t), d*_2(t), d(t), d'_r(t) > eps for some positive constant eps in R, and n*_1(t), n*_2(t), n(t), n'_r(t) in R^3 denote the time-varying unit normals to the planes pi*_1, pi*_2, and pi, respectively.

3.3.2 Euclidean Reconstruction

The relationships given by (3-77)-(3-80) provide a means to quantify the translation and rotation error between the different coordinate systems. Comparisons are made between the current UGV image and the reference image in terms of I, between the a priori known desired UGV pose and the current pose in terms of I_R, and between the images of the stationary reference object in terms of I and I_R. To facilitate the subsequent development, the normalized Euclidean coordinates of the feature points for the current UGV image, the reference UGV image, and the reference object images can be expressed in terms of I as m_i(t), m'_i(t), m*_1i(t), and m*_2i(t) in R^3, respectively, as

    m_i = m-bar_i / z_i,    m'_i = m-bar'_i / z'_i    (3-81)
    m*_1i = m-bar*_1i / z*_1i,    m*_2i = m-bar*_2i / z*_2i.    (3-82)

Similarly, the normalized Euclidean coordinates of the feature points for the current UGV, goal UGV, and reference object images can be expressed in terms of I_R as m'_ri(t), m_rdi, m*_r1i, and m*_r2i in R^3, respectively, as

    m'_ri = m-bar'_ri / z'_ri,    m_rdi = m-bar_rdi / z_rdi    (3-83)
    m*_r1i = m-bar*_r1i / z*_r1i,    m*_r2i = m-bar*_r2i / z*_r2i.    (3-84)

From the expressions given in (3-77), (3-82), and (3-84), the rotation and translation between the coordinate systems I and I_R can now be related in terms of the normalized Euclidean coordinates of the reference object F*_1 as [42]

    m*_r1i = (z*_1i / z*_r1i) (R-bar_r + x_hr n*_1^T) m*_1i = alpha_ri H_r m*_1i.    (3-85)
At a future instant in time, when the static reference object F*_2 is in the field of view of the current camera (i.e., I) and the daisy-chaining method has been used to relate the camera frames I and I_R in terms of the reference object F*_2, then (3-78), (3-82), and (3-84) can be used to relate the rotation and translation between I and I_R in terms of the normalized Euclidean coordinates of the reference object F*_2 as(4)

    m*_r2i = (z*_2i / z*_r2i) (R-bar_r + x_hr n*_2^T) m*_2i = alpha_ri H_r m*_2i    (3-86)

where m*_r2i in R^3 represent virtual normalized Euclidean coordinates, since the stationary reference object F*_2 is not in the field of view of the stationary reference camera I_R. The relationship between F and F_r can be expressed as [42]

    m'_i = (z_i / z'_i) (R-bar' + x'_h n^T) m_i = alpha'_i H' m_i.    (3-87)

Similarly, using (3-80) and (3-83), the rotation and translation between the coordinate systems F and F_d can now be related in terms of the normalized Euclidean coordinates of the UGV expressed in I_R as

    m_rdi = (z'_ri / z_rdi) (R-bar'_rd + x'_hrd n'_r^T) m'_ri = alpha_rdi H_rd m'_ri.    (3-88)

In (3-85)-(3-88), alpha'_i(t), alpha_rdi(t), alpha_ri(t) in R denote depth ratios, H'(t), H_rd(t), H_r(t) in R^{3x3} denote Euclidean homographies [53], and x'_h(t), x'_hrd(t), x_hr(t) in R^3 denote scaled

(4) The homography relationship in (3-86) relates the camera frames I and I_R utilizing the static reference object F*_2; however, the given development can be generalized for any reference object F*_n (n = 2, 3, ..., m).
translation vectors that are defined as follows:

    x'_h = x-bar'_f / d,    x'_hrd = x-bar'_frd / d'_r,    x_hr = x-bar_fr / d*_1    (3-89)

where the scaled translation x_hr(t) in (3-89) is obtained when the relationship between I and I_R is expressed in terms of the static reference object F*_1;

    x_hr = x-bar_fr / d*_2.    (3-90)

In (3-90), the scaled translation x_hr(t) is obtained when the static reference object F*_2 is in the field of view of the current camera frame (i.e., I) and the daisy-chaining strategy has established a connection between the camera frames I and I_R in terms of the reference object F*_2. Each Euclidean feature point will have a projected pixel coordinate expressed in terms of I as

    p_i = [u_i, v_i, 1]^T,    p*_1i = [u*_1i, v*_1i, 1]^T    (3-91)
    p*_2i = [u*_2i, v*_2i, 1]^T    (3-92)

where p_i(t), p*_1i(t), and p*_2i(t) in R^3 represent the image-space coordinates of the time-varying feature points of the UGV and the reference objects F*_1 and F*_2, respectively, and u_i(t), v_i(t), u*_1i(t), v*_1i(t), u*_2i(t), v*_2i(t) in R. Similarly, the projected pixel coordinates of the Euclidean features in the reference image can be expressed in terms of I_R as

    p*_r1i = [u*_r1i, v*_r1i, 1]^T    (3-93)

where p*_r1i in R^3 represents the constant image-space coordinates of the stationary reference object F*_1, and u*_r1i, v*_r1i in R. To calculate the Euclidean homographies given in (3-85)-(3-88) from pixel information, the projected pixel coordinates are related to m_i(t),
m*_1i(t), m*_2i(t), and m*_r1i by the pin-hole camera model as

    p_i = A m_i,    p*_1i = A m*_1i    (3-94)
    p*_2i = A m*_2i,    p*_r1i = A m*_r1i.    (3-95)

Also, the pin-hole camera model relationships for the normalized Euclidean coordinates m*_r2i, m'_i(t), m'_ri(t), and m_rdi can be formulated in terms of the virtual pixel coordinates p*_r2i, p'_i(t), p'_ri(t), and p_rdi as follows:

    p*_r2i = A m*_r2i,    p'_i = A m'_i    (3-96)
    p'_ri = A m'_ri,    p_rdi = A m_rdi    (3-97)

where A in R^{3x3} is a known, constant, and invertible intrinsic camera calibration matrix. By using (3-85)-(3-88), (3-94), and (3-95), the following relationships can be developed:

    p*_r1i = alpha_ri G_r p*_1i,    p*_r2i = alpha_ri G_r p*_2i    (3-98)
    p'_i = alpha'_i G' p_i,    p_rdi = alpha_rdi G_rd p'_ri    (3-99)

where G_r(t) = [g_rij(t)], G'(t) = [g'_ij(t)], G_rd = [g_rdij], for all i, j = 1, 2, 3, in R^{3x3} denote projective homographies.

Sets of linear equations can be developed from (3-98) and (3-99) to determine the projective homographies up to a scalar multiple. Various techniques can be used (e.g., see [48, 49]) to decompose the Euclidean homographies to obtain alpha_ri(t), alpha'_i(t), alpha_rdi(t), x_hr(t), x'_h(t), x'_hrd(t), R-bar_r(t), R-bar'(t), R-bar'_rd(t). Using the known geometric length s_21i and the unit normal n*_1 obtained from the homography decomposition of (3-85), the geometric reconstruction method can be utilized to obtain m-bar*_1i(t) and d*_1(t). Hence, the translation x-bar_fr(t) between I and I_R can be recovered from (3-89). Also, the Euclidean coordinates m-bar_ri of the UGV corresponding to the stationary reference pose can be obtained from geometric reconstruction. Thus, m-bar'_i(t) can be computed from (3-70). Using (3-81), (3-87), (3-96), and (3-99), a projective homography can be defined between p'_i(t) and
p_i(t), which can be decomposed to obtain the unit normal n(t) and hence the time-varying Euclidean coordinates m-bar_i(t). The Euclidean coordinates m-bar'_ri(t), corresponding to the current UGV position as seen by the reference camera I_R, can be obtained using (3-70). Therefore, using (3-83) and (3-97), a projective homography relationship can be obtained between the current UGV (i.e., F(t)) and the desired UGV (i.e., F_d) in terms of the stationary reference camera coordinate system I_R, given by (3-99).

Further, when the reference object F*_2 appears in the field of view of I, the Euclidean position m-bar*_2i(t) can be obtained. Using (3-69), (3-82), (3-84), (3-86), and (3-98), a projective homography relationship can be defined between p*_r2i and p*_2i(t), which can be decomposed to obtain the rotation and translation R-bar_r(t), x-bar_fr(t) between I and I_R. Once R-bar_r(t) and x-bar_fr(t) have been determined, the future relationships can be expressed with respect to the new reference object (i.e., F*_2), and the development can similarly be generalized for n = 2, 3, ..., m.

3.3.3 UGV Kinematics

The kinematic model for the UGV can be determined from Fig. 3-3 as

    [xdot_c; ydot_c; thetadot_d] = [cos theta_d, 0; sin theta_d, 0; 0, 1] [v_c; omega_c]    (3-100)

where xdot_c, ydot_c, and thetadot_d denote the time derivatives of x_c(t), y_c(t), and theta_d(t) in R, respectively, where x_c(t) and y_c(t) denote the planar position of F expressed in F_d, theta_d(t) in R denotes the right-handed rotation angle about the z-axis of F that aligns F with F_d, and v_c(t) and omega_c(t) were introduced in Section 3.2.1 and are depicted in Figs. 3-3 and 3-4. Based on the definition for R-bar'_rd(t) provided in the previous development, the rotation from
F to F_d can be developed as

    R-bar'_rd = [cos theta_d, -sin theta_d, 0; sin theta_d, cos theta_d, 0; 0, 0, 1].    (3-101)

Based on the fact that R-bar'_rd(t) can be obtained from (3-88) and (3-99), it is clear from (3-101) that theta_d(t) is a known signal that can be used in the control development.

The geometric relationships between the coordinate frames can be used to develop the following expression:

    [x_c, y_c, 0]^T = R_rd^T (x'_fr - x_frd).    (3-102)

After utilizing (3-58), (3-60), (3-88), and the assumption (as in [46]) that s_11 = [0, 0, 0]^T, the following expression can be obtained(5):

    [x_c / z_rd1, y_c / z_rd1, 0]^T = R_rd^T ((z'_r1 / z_rd1) m'_r1 - m_rd1).    (3-103)

Since the terms on the right-hand side of (3-103) are known or measurable (refer to Section 3.3.2), x_c(t)/z_rd1 and y_c(t)/z_rd1 can be used in the control development. Based on the form of (3-101) and (3-103), the control development and Lyapunov-based stability analysis arguments provided in Section 3.2 can be used to prove asymptotic regulation of the UGV.

3.3.4 Simulation Results

A numerical simulation was performed to illustrate the performance of the multi-reference regulation control given the controller in (3-44) and (3-45). The simulation scenario is shown in Fig. 3-5, such that the pose of the current UGV F(t) is estimated with respect

(5) Any point s_1i, s_2ni can be utilized in the subsequent development; however, to reduce the notational complexity, we have elected to select the image points s_11, s_2n1, and hence, the subscript 1 is utilized in lieu of i in the subsequent development.
to four stationary reference objects F*_1, F*_2, F*_3, F*_4 while regulating to the desired pose corresponding to the coordinate frame F_d. The origins of the coordinate frames F, F*_1, F*_2, F*_3, F*_4, and F_d, and the four coplanar feature points on the planes pi, pi*_1, pi*_2, pi*_3, pi*_4, and pi_d, are chosen such that the Euclidean coordinates of the feature points in F, F*_1, F*_2, F*_3, F*_4, and F_d are given by s_i, s_1i, s_2i, s_3i, s_4i, s_i (where i = 1, 2, 3, 4), respectively. The initial pose of the current UGV F(0) = F(t)|_{t=0}, the stationary reference objects F*_j, j = 1, 2, 3, 4, and the desired pose F_d were considered as (rotation angles in degrees)

    F(0) = [cos(60), -sin(60), 0, -3.40; sin(60), cos(60), 0, -1.00; 0, 0, 1, 0; 0, 0, 0, 1]

    F*_1 = [cos(40), -sin(40), 0, -3.25; sin(40), cos(40), 0, 0.50; 0, 0, 1, -1.00; 0, 0, 0, 1]

    F*_2 = [cos(20), -sin(20), 0, -1.50; sin(20), cos(20), 0, -0.15; 0, 0, 1, -0.80; 0, 0, 0, 1]

    F*_3 = [cos(00), -sin(00), 0, -1.25; sin(00), cos(00), 0, 1.50; 0, 0, 1, -1.50; 0, 0, 0, 1]

    F*_4 = [cos(10), -sin(10), 0, 0.15; sin(10), cos(10), 0, 0.15; 0, 0, 1, -2.25; 0, 0, 0, 1]

    F_d = [cos(-30), -sin(-30), 0, 0.40; sin(-30), cos(-30), 0, 1.00; 0, 0, 1, 0; 0, 0, 0, 1].

The control gains in (3-44) and (3-45) were selected as k_omega = 2.2025 and k_v = 12.9275. The Euclidean space trajectory of the moving camera I, along with the initial and final positions of the time-varying UGV F(t) and the desired UGV F_d, is shown in Fig. 3-5. The regulation errors are plotted in Fig. 3-6, and asymptotically approach zero. The linear
and angular velocity control inputs are shown in Fig. 3-7. Figs. 3-8 and 3-9 show the regulation results in the presence of an additive white Gaussian noise of standard deviation sigma = 0.1 pixels.

3.3.5 Concluding Remarks

In this chapter, the pose of a moving sensorless UGV is regulated to a desired pose defined by a goal image using a collaborative visual servo control strategy. To achieve the result, multiple views of a reference object were used to develop Euclidean homographies. By decomposing the Euclidean homographies into separate translation and rotation components, reconstructed Euclidean information was obtained for the control development. The impact of this research is a new framework to relate the pose of a moving target through images acquired by a moving camera. Further, the results are extended to include the scenario in which stationary reference objects can leave the camera FOV and new objects enter the FOV, thus increasing the applicative area of the UGV/camera pair. The development in the next chapter will target an extension of the daisy-chaining method to a tracking control result.
Figure 3-5. Euclidean space trajectory of the moving camera I, initial and final position of the time-varying UGV F(t), and desired UGV F_d. F(0) denotes the initial position of the UGV, I(0) denotes the initial position of the moving camera, I(t) denotes the time-varying position of the moving camera, F*_i where i = 1, 2, ..., 7 denotes the stationary reference objects, and F(t) denotes the regulated position of the UGV coincident with the desired UGV F_d.
Figure 3-6. Linear (i.e., e_1(t) and e_2(t)) and angular (i.e., e_3(t)) regulation errors.

Figure 3-7. Linear (i.e., v_c(t)) and angular (i.e., omega_c(t)) velocity control inputs.
Figure 3-8. Linear (i.e., e_1(t) and e_2(t)) and angular (i.e., e_3(t)) regulation errors in the presence of an additive white Gaussian noise.

Figure 3-9. Linear (i.e., v_c(t)) and angular (i.e., omega_c(t)) velocity control inputs in the presence of an additive white Gaussian noise.


CHAPTER 4
A DAISY-CHAINING VISUAL SERVOING APPROACH WITH APPLICATIONS IN TRACKING, LOCALIZATION, AND MAPPING

4.1 Introduction

The new daisy-chaining method developed in Chapter 3 is used for vision-based tracking control of a rigid-body object, such as an UGV, while providing localization of the moving camera and moving object in the world frame, and mapping the location of static landmarks in the world frame. Hence, this approach can be used in vSLAM of the UGV, with applications toward path planning, real-time trajectory generation, obstacle avoidance, multi-vehicle coordination control and task assignment, etc. By using the daisy-chaining strategy, the coordinates of static features out of the FOV can also be estimated. The estimates of static features can be maintained as a map, or can be used as measurements in existing vSLAM methods.

For clarity, this chapter is presented in successive stages of increasing complexity. Section 4.2 introduces the imaging model and geometric model used in this chapter, as well as the daisy-chaining method as applied to the case of controlling a six-DOF planar object through visual data from a moving camera and a fixed reference camera. These results are extended to the case of an UGV with nonholonomic constraints and a moving camera and moving reference camera in Section 4.3. The efforts of the previous sections are then brought to bear on a tracking and mapping application, where the UGV is controlled to track a trajectory that takes the vehicle outside of the initial FOV of the camera. The daisy-chaining approach must be extended to allow for new fixed landmarks to enter the FOV and be related to previous landmarks and the UGV.

4.2 Daisy-Chaining Based Tracking Control

In this section, a visual servo tracking controller is developed for a moving six-DOF object based on daisy-chained image feedback from a moving camera. The control objective is to have the object track a desired trajectory determined by a sequence of prerecorded images from a stationary camera. To achieve this result, several technical


issues were resolved, including: discriminating the relative velocity between the moving camera and the moving object, compensating for the unknown time-varying distance measurement from the camera to the object, relating the unknown attitude of the control object to some measurable signals, and using the unit quaternion to formulate the rotation motion and rotation error system. The relative velocity issue is resolved by utilizing multi-view image geometry to daisy-chain homography relationships between the moving camera frame and the moving object coordinate frames. By using the depth ratios obtained from the homography decomposition, the unknown depth information is related to an unknown constant that can be compensated for by a Lyapunov-based adaptive update law. Lyapunov-based methods are provided to prove the adaptive asymptotic tracking result.

4.2.1 Problem Scenario

Over the past decade, a variety of visual servo controllers have been addressed for both camera-to-hand and camera-in-hand configurations (e.g., see [54-57]). Typical camera-to-hand and camera-in-hand visual servo controllers have required that either the camera or the target remain stationary so that an absolute velocity can be determined and used in the control development. For the problem of a moving camera tracking a moving target (i.e., control of relative pose/velocity), integral control or predictive Kalman filters have been used to overcome the unknown target velocity [58, 59]. In contrast to these methods, the development in this section and our previous preliminary work in [42, 43, 60] is motivated by the problem in which both the camera and the target are moving. A practical example application of this scenario is an airborne camera attached to a remote controlled aircraft that is used to determine pose measurements of an UGV and then relay the information to the UGV for closed-loop control.

The scenario examined in this section is depicted in Fig. 4-1, where various coordinate frames are defined as a means to develop the subsequent Euclidean reconstruction and control methods. In Fig. 4-1, a stationary coordinate frame I_R is attached to a


camera, and a time-varying coordinate frame F_d is attached to some mobile object (e.g., an aircraft, a ground vehicle, a marine vessel). The object is identified in an image through a collection of feature points that are assumed (without loss of generality¹) to be coplanar and non-collinear (i.e., a planar patch of feature points).

¹ Image processing techniques can often be used to select coplanar and non-collinear feature points within an image. However, if four coplanar target points are not available, then the subsequent development can also exploit the virtual parallax method [45, 61], where the non-coplanar points are projected onto a virtual plane.

Figure 4-1. Geometric model for a moving camera (coordinate frame I), moving target (coordinate frame F), and stationary reference camera (coordinate frame I_R).

The camera attached to I_R a priori records a series of snapshots (i.e., a video) of the motion of the coordinate frame F_d until F_d comes to rest. A stationary coordinate frame F* is attached to another planar patch of feature points that are assumed to be visible in every frame of the video recorded by the camera. For example, the camera attached to I_R is on-board a "stationary" satellite that takes a series of snapshots of the relative motion of F_d with respect to F*. Therefore, the desired motion of F_d can be encoded as a series of relative translations and rotations with respect to the stationary frame F* a priori. Spline functions or filter


algorithms can then be used to generate a smooth desired feature point trajectory as described in [46].

Fig. 4-1 also depicts a time-varying coordinate frame I that is attached to another camera (e.g., a camera attached to a remote controlled aircraft), and a time-varying coordinate frame F that is attached to the current pose of the planar patch. The camera attached to I captures snapshots of the planar patches associated with F and F*, respectively. The a priori motion of F_d represents the desired trajectory of the coordinate system F, where F and F_d are attached to identical objects, but at different points in time. The camera attached to I_R can be a different camera (with different calibration parameters) than the camera attached to I. Based on these coordinate frame definitions, the problem considered in this section is to develop a kinematic controller for the object attached to F so that the time-varying rotation and translation of F converges to the desired time-varying rotation and translation of F_d, where the motion of F is determined from the time-varying overhead camera attached to I.

4.2.2 Geometric Relationships

Relationships between the various coordinate frames are summarized in Table 4-1. In Table 4-1, R(t), R*(t), R_r(t), R'(t), R_rd(t), R*_r ∈ SO(3) denote rotation matrices, and x_f(t), x*_f(t), x_fr(t), x'_fr(t), x_frd(t), x*_fr ∈ R³ denote translation vectors. From Fig. 4-1, the translation x'_fr(t) and the rotation R'(t) can be expressed as

\[
x'_{fr} = x_{fr} + R_r R^{*T} (x_f - x_f^*), \qquad R' = R_r R^{*T} R. \tag{4-1}
\]

As illustrated in Fig. 4-1, π, π_d, and π* denote the planes of feature points associated with F, F_d, and F*, respectively. The constant Euclidean coordinates of the i-th feature point in F (and also F_d) are denoted by s_{1i} ∈ R³ ∀i = 1, 2, ..., n (n ≥ 4), and s_{2i} ∈ R³ ∀i = 1, 2, ..., n denotes the constant Euclidean coordinates of the i-th feature point in


F*.

Table 4-1. Coordinate frame relationships for 6-DOF planar object tracking control.

  Motion               Frames
  R(t), x_f(t)         F to I in I
  R*(t), x*_f(t)       F* to I in I
  R_r(t), x_fr(t)      I to I_R
  R'(t), x'_fr(t)      F to I_R in I_R
  R*_r, x*_fr          F* to I_R in I_R
  R_rd(t), x_frd(t)    F_d to I_R in I_R

From the geometry between the coordinate frames depicted in Fig. 4-1, the following relationships can be developed:

\[
\bar m_i = x_f + R\,s_{1i}, \qquad \bar m_{rdi} = x_{frd} + R_{rd}\,s_{1i} \tag{4-2}
\]
\[
\bar m_{ri} = x_{fr}^* + R_r^*\,s_{2i}, \qquad \bar m_i' = x_{fr}' + R'\,s_{1i} \tag{4-3}
\]
\[
\bar m_i^* = x_f^* + R^*\,s_{2i}. \tag{4-4}
\]

In (4-2)-(4-4), m̄_i(t), m̄*_i(t) ∈ R³ denote the Euclidean coordinates of the feature points on π and π*, respectively, expressed in I as

\[
\bar m_i(t) \triangleq \begin{bmatrix} x_i(t) & y_i(t) & z_i(t) \end{bmatrix}^T \tag{4-5}
\]
\[
\bar m_i^*(t) \triangleq \begin{bmatrix} x_i^*(t) & y_i^*(t) & z_i^*(t) \end{bmatrix}^T, \tag{4-6}
\]

m̄'_i(t), m̄_rdi(t) ∈ R³ denote the actual and desired time-varying Euclidean coordinates, respectively, of the feature points on π expressed in I_R as

\[
\bar m_i'(t) \triangleq \begin{bmatrix} x_i'(t) & y_i'(t) & z_i'(t) \end{bmatrix}^T \tag{4-7}
\]
\[
\bar m_{rdi}(t) \triangleq \begin{bmatrix} x_{rdi}(t) & y_{rdi}(t) & z_{rdi}(t) \end{bmatrix}^T, \tag{4-8}
\]

and m̄_ri ∈ R³ denotes the constant Euclidean coordinates of the feature points on the plane π* expressed in I_R as

\[
\bar m_{ri} \triangleq \begin{bmatrix} x_{ri} & y_{ri} & z_{ri} \end{bmatrix}^T. \tag{4-9}
\]


After some algebraic manipulation, the expressions in (4-2)-(4-4) can be rewritten as

\[
\bar m_i = x_n + R_n\,\bar m_i^* \tag{4-10}
\]
\[
\bar m_i^* = \bar x_f + \bar R\,\bar m_i, \qquad \bar m_{rdi} = \bar x_{frd} + \bar R_{rd}\,\bar m_{ri} \tag{4-11}
\]
\[
\bar m_{ri} = \bar x_{fr} + \bar R_r\,\bar m_i^*, \qquad \bar m_i' = \bar x_{fr} + \bar R_r\,\bar m_i, \tag{4-12}
\]

where R_n(t), R̄(t), R̄_rd(t), R̄_r(t) ∈ SO(3) and x_n(t), x̄_f(t), x̄_frd(t), x̄_fr(t) ∈ R³ are new rotation and translation variables, respectively, defined as²

\[
R_n = R R^{*T}, \qquad \bar R = R^* R^T \tag{4-13}
\]
\[
\bar R_{rd} = R_{rd} R_r^{*T}, \qquad \bar R_r = R_r^* R^{*T}
\]
\[
x_n = x_f - R_n \left( x_f^* + R^* (s_{2i} - s_{1i}) \right) \tag{4-14}
\]
\[
\bar x_f = x_f^* - \bar R\,x_f + R^* (s_{2i} - s_{1i}) \tag{4-15}
\]
\[
\bar x_{frd} = x_{frd} - \bar R_{rd}\,x_{fr}^* - R_{rd} (s_{2i} - s_{1i}) \tag{4-16}
\]
\[
\bar x_{fr} = x_{fr}^* - \bar R_r\,x_f^* = x_{fr}' - \bar R_r\,x_f. \tag{4-17}
\]

² Note that R_n(t), R̄(t), and R̄_rd(t) in (4-13) are the rotation matrices between F and F*, F* and F, and F* and F_d, respectively, but x_n(t), x̄_f(t), and x̄_frd(t) in (4-14)-(4-16) are not the translation vectors between the corresponding coordinate frames. Only the rotation matrices will be used in the controller development.

To facilitate the development of a relationship between the actual Euclidean translation of F and the Euclidean translation that is reconstructed from the image information, projective relationships are developed from Fig. 4-1 as

\[
d(t) = n^T \bar m_i, \qquad d^*(t) = n^{*T} \bar m_i^*, \qquad d_r^* = n_r^{*T} \bar m_{ri}, \tag{4-18}
\]

where d(t) ∈ R represents the distance from the origin of I to π along the unit normal (expressed in I) to π, denoted as n(t) ∈ R³, and d*(t) ∈ R represents the distance from the


origin of I to π* along the unit normal (expressed in I) to π*, denoted as n*(t) ∈ R³, and d*_r ∈ R represents the distance from the origin of I_R to π* along the unit normal (expressed in I_R) to π*, denoted as n*_r ∈ R³, where n*(t) = Rᵀ_r(t) n*_r. In (4-18), d(t), d*(t), d*_r > ε for some positive constant ε ∈ R. Based on (4-18), the relationships in (4-10)-(4-12) can be expressed as

\[
\bar m_i = \left( R_n + \frac{x_n}{d^*}\,n^{*T} \right) \bar m_i^* \tag{4-19}
\]
\[
\bar m_i^* = \left( \bar R + \frac{\bar x_f}{d}\,n^{T} \right) \bar m_i \tag{4-20}
\]
\[
\bar m_{rdi} = \left( \bar R_{rd} + \frac{\bar x_{frd}}{d_r^*}\,n_r^{*T} \right) \bar m_{ri} \tag{4-21}
\]
\[
\bar m_{ri} = \left( \bar R_r + \bar x_{fr} \frac{n^{*T}}{d^*} \right) \bar m_i^* \tag{4-22}
\]
\[
\bar m_i' = \left( \bar R_r + \bar x_{fr} \frac{n^{T}}{d} \right) \bar m_i. \tag{4-23}
\]

As in [46], the subsequent development requires that the constant rotation matrix R*_r be known. The constant rotation matrix R*_r can be obtained a priori using various methods (e.g., a second camera, additional on-board sensors, off-line calibration, Euclidean measurements). The subsequent development also assumes that the difference between the Euclidean distances (s_{2i} − s_{1i}) is a constant ∀i = 1, ..., n. While there are many practical applications that satisfy this assumption (e.g., a simple scenario is that the objects attached to F and F* are identical objects), the assumption is generally restrictive and is the focus of future research. As described in our preliminary work in [62], each of these assumptions can be avoided by using the geometric reconstruction approach in [52, 63, 64] under an alternative assumption that the Euclidean distance between two feature points is precisely known.

4.2.3 Euclidean Reconstruction

The relationships given by (4-19)-(4-23) provide a means to quantify a translation and rotation error between the different coordinate systems. Since the pose of F, F_d, and F* cannot be directly measured, a Euclidean reconstruction is developed to obtain the


pose error by comparing multiple images acquired from the hovering monocular vision system. To facilitate the subsequent development, the normalized Euclidean coordinates of the feature points in π and π* can be expressed in terms of I as m_i(t), m*_i(t) ∈ R³, respectively, as

\[
m_i \triangleq \frac{\bar m_i}{z_i}, \qquad m_i^* \triangleq \frac{\bar m_i^*}{z_i^*}. \tag{4-24}
\]

Similarly, the normalized Euclidean coordinates of the feature points in π_d and π* can be expressed in terms of I_R as m'_i(t), m_rdi(t), m_ri ∈ R³, respectively, as

\[
m_i'(t) \triangleq \frac{\bar m_i'(t)}{z_i'(t)}, \qquad m_{rdi}(t) \triangleq \frac{\bar m_{rdi}(t)}{z_{rdi}(t)}, \qquad m_{ri} \triangleq \frac{\bar m_{ri}}{z_{ri}}. \tag{4-25}
\]

From the expressions given in (4-20) and (4-24), the rotation and translation between the coordinate systems F and F*, between F and F_d, and between I and I_R can now be related in terms of the normalized Euclidean coordinates as

\[
m_i^* = \alpha_i \left( \bar R + x_h n^T \right) m_i, \tag{4-26}
\]
\[
m_i = \frac{1}{\alpha_i} \left( R_n + x_{nh} n^{*T} \right) m_i^* \tag{4-27}
\]
\[
m_{rdi} = \alpha_{rdi} \left( \bar R_{rd} + x_{hrd} n_r^{*T} \right) m_{ri}, \tag{4-28}
\]
\[
m_{ri} = \alpha_{ri} \left( \bar R_r + x_{hr} n^{*T} \right) m_i^*, \tag{4-29}
\]

where α_i(t), α_rdi(t), α_ri(t) ∈ R denote depth ratios defined as

\[
\alpha_i = \frac{z_i}{z_i^*}, \qquad \alpha_{rdi} = \frac{z_{ri}}{z_{rdi}}, \qquad \alpha_{ri} = \frac{z_i^*}{z_{ri}},
\]

and x_h(t), x_nh(t), x_hrd(t), x_hr(t) ∈ R³ denote scaled translation vectors that are defined as

\[
x_h = \frac{\bar x_f}{d}, \qquad x_{nh} = \frac{x_n}{d^*}, \qquad x_{hrd} = \frac{\bar x_{frd}}{d_r^*}, \qquad x_{hr} = \frac{\bar x_{fr}}{d^*}. \tag{4-30}
\]

Since the normalized Euclidean coordinates in (4-26)-(4-29) cannot be directly measured, the following relationships (i.e., the pin-hole camera model) are used to


determine the normalized Euclidean coordinates from pixel information:

\[
p_i = A_1 m_i, \qquad p_i^* = A_1 m_i^* \tag{4-31}
\]
\[
p_{rdi} = A_2 m_{rdi}, \qquad p_{ri} = A_2 m_{ri}, \tag{4-32}
\]

where A_1, A_2 ∈ R^{3×3} are known, constant, and invertible intrinsic camera calibration matrices of the current camera and the reference camera, respectively. In (4-31) and (4-32), p_i(t), p*_i(t) ∈ R³ represent the image-space coordinates of the Euclidean feature points on π and π* expressed in terms of I as

\[
p_i \triangleq \begin{bmatrix} u_i & v_i & 1 \end{bmatrix}^T, \qquad p_i^* \triangleq \begin{bmatrix} u_i^* & v_i^* & 1 \end{bmatrix}^T, \tag{4-33}
\]

respectively, where u_i(t), v_i(t), u*_i(t), v*_i(t) ∈ R. Similarly, p_rdi(t), p_ri ∈ R³ represent the image-space coordinates of the Euclidean features on π_d and π* expressed in terms of I_R as

\[
p_{rdi} \triangleq \begin{bmatrix} u_{rdi} & v_{rdi} & 1 \end{bmatrix}^T, \qquad p_{ri} \triangleq \begin{bmatrix} u_{ri} & v_{ri} & 1 \end{bmatrix}^T, \tag{4-34}
\]

respectively, where u_rdi(t), v_rdi(t), u_ri, v_ri ∈ R. By using (4-26)-(4-29) and (4-31)-(4-34), the following relationships can be developed:

\[
p_i^* = \alpha_i\,G\,p_i, \qquad G \triangleq A_1 \left( \bar R + x_h n^T \right) A_1^{-1} \tag{4-35}
\]
\[
p_i = \frac{1}{\alpha_i}\,G_n\,p_i^*, \qquad G_n \triangleq A_1 \left( R_n + x_{nh} n^{*T} \right) A_1^{-1} \tag{4-36}
\]
\[
p_{rdi} = \alpha_{rdi}\,G_{rd}\,p_{ri}, \qquad G_{rd} \triangleq A_2 \left( \bar R_{rd} + x_{hrd} n_r^{*T} \right) A_2^{-1} \tag{4-37}
\]
\[
p_{ri} = \alpha_{ri}\,G_r\,p_i^*, \qquad G_r \triangleq A_2 \left( \bar R_r + x_{hr} n^{*T} \right) A_1^{-1}, \tag{4-38}
\]


where G(t), G_n(t), G_rd(t), G_r(t) ∈ R^{3×3} denote projective homographies. Sets of linear equations can be developed from (4-35)-(4-38) to determine the projective homographies up to a scalar multiple. Various techniques can be used (e.g., see [49, 65]) to decompose the Euclidean homographies to obtain α_i(t), α_rdi(t), α_ri(t), x_h(t), x_nh(t), x_hrd(t), x_hr(t), R̄(t), R_n(t), R̄_rd(t), R̄_r(t), n(t), n*_r, and n*(t). Given that the constant rotation matrix R*_r is assumed to be known, the expressions for R̄_rd(t) and R̄_r(t) in (4-13) can be used to determine R_rd(t) and R*(t). Once R*(t) is determined, the expression for R̄(t) in (4-13) can be used to determine R(t). Also, once R*_r, R*(t), and R(t) have been determined, (4-1) can be used to determine R'(t). Since R̄_r(t), x_hr(t), α_i(t), n(t), n*(t), m_i(t), and m*_i(t) can be determined, the following relationship can be used to determine m'_i(t):

\[
m_i' = \frac{z_i}{z_i'} \left( \bar R_r + \frac{x_{hr}}{\alpha_i} \frac{n^{*T} m_i^*}{n^{T} m_i}\,n^{T} \right) m_i, \tag{4-39}
\]

where the inverse of the ratio z_i(t)/z'_i(t) can be determined as

\[
\frac{z_i'}{z_i} = \begin{bmatrix} 0 & 0 & 1 \end{bmatrix} \left( \bar R_r + \frac{x_{hr}}{\alpha_i} \frac{n^{*T} m_i^*}{n^{T} m_i}\,n^{T} \right) m_i. \tag{4-40}
\]

4.2.4 Control Objective

The control objective is for a controlled object (e.g., an UGV or an UAV) to track a desired trajectory that is determined by a sequence of images. This objective is based on the assumption that the control object is physically able to follow the desired image trajectory, that the linear and angular velocities of the camera are control inputs that can be independently controlled (i.e., unconstrained motion), and that the reference and desired cameras are calibrated (i.e., A_1 and A_2 are known). The control objective can be stated as the desire for the Euclidean feature points on π to track the corresponding feature points on π_d, which can be mathematically stated as the desire for m̄'_i(t) → m̄_rdi(t). Equivalently, the control objective can also be stated in terms of the rotation and translation of the object as the desire for x'_fr(t) → x_frd(t) and R'(t) → R_rd(t).
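As a concrete illustration of the reconstruction step in (4-39) and (4-40), the following numpy sketch (the function and variable names are hypothetical; its inputs are assumed to be available from the homography decomposition described above) computes m'_i and the depth ratio z'_i/z_i:

```python
import numpy as np

def reconstruct_m_prime(R_r_bar, x_hr, alpha_i, n, n_star, m_i, m_star_i):
    """Compute the depth ratio z'_i/z_i from (4-40) and the normalized
    coordinates m'_i from (4-39), given homography-decomposition outputs."""
    scale = (n_star @ m_star_i) / (alpha_i * (n @ m_i))
    M = R_r_bar + np.outer(x_hr * scale, n)
    ratio = (M @ m_i)[2]           # z'_i/z_i is the third row of (4-40)
    m_prime = (M @ m_i) / ratio    # normalize so the last entry is 1
    return m_prime, ratio
```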


As stated previously, R'(t) and R_rd(t) can be computed by decomposing the projective homographies in (4-35)-(4-38) and using (4-1). Once these rotation matrices have been determined, a variety of parameterizations can be used to describe the rotation. The unit quaternion parameterization is used to describe the rotation in the subsequent problem formulation, control development, and stability analysis, since the unit quaternion provides a globally nonsingular parameterization of the corresponding rotation matrices. The unit quaternion is a four-dimensional vector, which can be defined as

\[
q \triangleq \begin{bmatrix} q_0 & q_v^T \end{bmatrix}^T, \qquad q_v \triangleq \begin{bmatrix} q_{v1} & q_{v2} & q_{v3} \end{bmatrix}^T, \tag{4-41}
\]

where q_0(t), q_vi(t) ∈ R ∀i = 1, 2, 3 satisfy the following nonlinear constraint:

\[
q^T q = 1. \tag{4-42}
\]

Given the rotation matrices R'(t) and R_rd(t), the corresponding unit quaternions q(t) and q_d(t) can be calculated by using the numerically robust method presented in [31] and [66], based on the corresponding relationships

\[
R' = \left( q_0^2 - q_v^T q_v \right) I_3 + 2 q_v q_v^T + 2 q_0 q_v^{\times} \tag{4-43}
\]
\[
R_{rd} = \left( q_{0d}^2 - q_{vd}^T q_{vd} \right) I_3 + 2 q_{vd} q_{vd}^T + 2 q_{0d} q_{vd}^{\times}, \tag{4-44}
\]

where I_3 is the 3 × 3 identity matrix, and the notation q_v^×(t) denotes a skew-symmetric form of the vector q_v(t) as

\[
q_v^{\times} = \begin{bmatrix} 0 & -q_{v3} & q_{v2} \\ q_{v3} & 0 & -q_{v1} \\ -q_{v2} & q_{v1} & 0 \end{bmatrix}, \qquad \forall q_v = \begin{bmatrix} q_{v1} \\ q_{v2} \\ q_{v3} \end{bmatrix}. \tag{4-45}
\]
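Given a rotation matrix satisfying (4-43), a corresponding unit quaternion can be extracted numerically. The sketch below is a simple trace-based conversion for illustration only; it is not the numerically robust method of [31] and [66] cited above, which treats the small-trace cases separately:

```python
import numpy as np

def quat_from_rotation(R):
    """Unit quaternion (q0, qv) satisfying (4-43). Illustrative only:
    this simple formula degrades when q0 is near zero (trace near -1)."""
    q0 = 0.5 * np.sqrt(max(1.0 + np.trace(R), 0.0))
    qv = np.array([R[2, 1] - R[1, 2],
                   R[0, 2] - R[2, 0],
                   R[1, 0] - R[0, 1]]) / (4.0 * q0)
    return q0, qv
```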


To quantify the rotation error between the feature points on π and π_d, the multiplicative error between the rotation matrices R'(t) and R_rd(t) is defined as

\[
\tilde R = R'^T R_{rd} = \left( \tilde q_0^2 - \tilde q_v^T \tilde q_v \right) I_3 + 2 \tilde q_v \tilde q_v^T - 2 \tilde q_0 \tilde q_v^{\times}, \tag{4-46}
\]

where the error quaternion q̃(t) = (q̃_0(t), q̃_vᵀ(t))ᵀ is defined as

\[
\tilde q = \begin{bmatrix} \tilde q_0 \\ \tilde q_v \end{bmatrix} = \begin{bmatrix} q_0 q_{0d} + q_v^T q_{vd} \\ q_{0d} q_v - q_0 q_{vd} + q_v^{\times} q_{vd} \end{bmatrix}. \tag{4-47}
\]

Since q̃(t) is a unit quaternion, (4-46) can be used to quantify the rotation tracking objective as

\[
\| \tilde q_v(t) \| \to 0 \implies \tilde R(t) \to I_3 \quad \text{as } t \to \infty. \tag{4-48}
\]

The translation error, denoted by e(t) ∈ R³, is defined as

\[
e = m_e - m_{ed}, \tag{4-49}
\]

where m_e(t), m_ed(t) ∈ R³ are defined as³

\[
m_e = \begin{bmatrix} \dfrac{x_1'}{z_1'} & \dfrac{y_1'}{z_1'} & \ln\!\left( \dfrac{z_1'}{z_{r1}} \right) \end{bmatrix}^T, \qquad
m_{ed} = \begin{bmatrix} \dfrac{x_{rd1}}{z_{rd1}} & \dfrac{y_{rd1}}{z_{rd1}} & \ln\!\left( \dfrac{z_{rd1}}{z_{r1}} \right) \end{bmatrix}^T. \tag{4-50}
\]

³ Any point O_i can be utilized in the subsequent development; however, to reduce the notational complexity, we have elected to select the image point O_1, and hence, the subscript 1 is utilized in lieu of i in the subsequent development.

In (4-50), z'_1(t)/z_r1 and z_rd1(t)/z_r1 can be expressed in terms of known signals as

\[
\frac{z_1'}{z_{r1}} = \frac{z_1'}{z_1}\,\frac{z_1}{z_1^*}\,\frac{z_1^*}{z_{r1}} = \frac{z_1'}{z_1}\,\alpha_1\,\alpha_{r1}, \qquad
\frac{z_{rd1}}{z_{r1}} = \frac{1}{\alpha_{rd1}}.
\]

Based on (4-48) and (4-49), the subsequent control development targets the following objectives:

\[
\| \tilde q_v(t) \| \to 0 \quad \text{and} \quad \| e(t) \| \to 0 \quad \text{as } t \to \infty. \tag{4-51}
\]
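A minimal sketch of the error signals (4-47) and (4-49)-(4-50), assuming the quaternions and the depth ratios have already been computed from the decomposition above (argument names are hypothetical):

```python
import numpy as np

def quat_error(q0, qv, q0d, qvd):
    """Error quaternion (4-47) between the current and desired rotations."""
    qv_cross = np.array([[0, -qv[2], qv[1]],
                         [qv[2], 0, -qv[0]],
                         [-qv[1], qv[0], 0]])
    tq0 = q0 * q0d + qv @ qvd
    tqv = q0d * qv - q0 * qvd + qv_cross @ qvd
    return tq0, tqv

def m_e_vector(x, y, z, z_over_zr1):
    """Build an m_e-type vector per (4-50): scaled image coordinates plus
    a logarithmic depth term; e = m_e - m_ed then follows (4-49)."""
    return np.array([x / z, y / z, np.log(z_over_zr1)])
```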


4.2.5 Control Development

4.2.5.1 Open-loop error system

Based on (4-46) and (4-47), the open-loop rotation error system can be developed as [67]

\[
\dot{\tilde q} = \frac{1}{2} \begin{bmatrix} -\tilde q_v^T \\ \tilde q_0 I_3 + \tilde q_v^{\times} \end{bmatrix} \left( \omega_c - \tilde R\,\omega_{cd} \right), \tag{4-52}
\]

where ω_cd(t) denotes the angular velocity of π_d expressed in F_d, which can be calculated as [67]

\[
\omega_{cd} = 2 \left( q_{0d}\,\dot q_{vd} - \dot q_{0d}\,q_{vd} \right) - 2 q_{vd}^{\times}\,\dot q_{vd}, \tag{4-53}
\]

where (q_0d(t), q_vdᵀ(t))ᵀ and (q̇_0d(t), q̇_vdᵀ(t))ᵀ are assumed to be bounded; hence, ω_cd(t) is also bounded. The open-loop translation error system can be derived as

\[
z_{r1}\,\dot e = \frac{z_{r1}}{z_1'}\,L_v'\,R' \left( v_c + \omega_c^{\times} s_1 \right) - z_{r1}\,\dot m_{ed}, \tag{4-54}
\]

where v_c(t), ω_c(t) ∈ R³ denote the linear and angular velocity of π expressed in F, respectively, and the auxiliary measurable term L'_v(t) ∈ R^{3×3} is defined as

\[
L_v' = \begin{bmatrix} 1 & 0 & -\dfrac{x_1'}{z_1'} \\ 0 & 1 & -\dfrac{y_1'}{z_1'} \\ 0 & 0 & 1 \end{bmatrix}.
\]

4.2.5.2 Closed-loop error system

Based on the open-loop rotation error system in (4-52) and the subsequent Lyapunov-based stability analysis, the angular velocity controller is designed as

\[
\omega_c = -K_{\omega}\,\tilde q_v + \tilde R\,\omega_{cd}, \tag{4-55}
\]


where K_ω ∈ R^{3×3} denotes a diagonal matrix of positive constant control gains. From (4-52) and (4-55), the rotation closed-loop error system can be determined as

\[
\dot{\tilde q}_0 = \frac{1}{2}\,\tilde q_v^T K_{\omega}\,\tilde q_v \tag{4-56}
\]
\[
\dot{\tilde q}_v = -\frac{1}{2} \left( \tilde q_0 I_3 + \tilde q_v^{\times} \right) K_{\omega}\,\tilde q_v = -\frac{1}{2} K_{\omega}\,\tilde q_0\,\tilde q_v.
\]

Based on (4-54), the translation control input v_c(t) is designed as

\[
v_c = \frac{z_1'}{z_{r1}}\,R'^T L_v'^{-1} \left( -K_v e + \hat z_{r1}\,\dot m_{ed} \right) - \omega_c^{\times} s_1, \tag{4-57}
\]

where K_v ∈ R^{3×3} denotes a diagonal matrix of positive constant control gains. In (4-57), the parameter estimate ẑ_r1(t) ∈ R for the unknown constant z_r1 is designed as

\[
\dot{\hat z}_{r1} = -\gamma\,e^T \dot m_{ed}, \tag{4-58}
\]

where γ ∈ R denotes a positive constant adaptation gain. By using (4-54) and (4-57), the translation closed-loop error system is

\[
z_{r1}\,\dot e = -K_v e - \tilde z_{r1}\,\dot m_{ed}, \tag{4-59}
\]

where z̃_r1(t) ∈ R denotes the parameter estimation error

\[
\tilde z_{r1} = z_{r1} - \hat z_{r1}. \tag{4-60}
\]
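The control laws (4-55), (4-57) and the update law (4-58) translate directly into code. A minimal numpy sketch follows (argument names are hypothetical, and all rotation and translation quantities are assumed to come from the homography decomposition):

```python
import numpy as np

def skew(v):
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

def control_inputs(tqv, R_tilde, w_cd, e, z_r1_hat, m_ed_dot,
                   Rp, Lpv, z1p_over_zr1, s1, Kw, Kv):
    """Angular (4-55) and linear (4-57) velocity control inputs.
    omega_c is computed first because it also appears in v_c."""
    w_c = -Kw @ tqv + R_tilde @ w_cd
    v_c = z1p_over_zr1 * Rp.T @ np.linalg.solve(
        Lpv, -Kv @ e + z_r1_hat * m_ed_dot) - skew(w_c) @ s1
    return v_c, w_c

def z_r1_hat_dot(e, m_ed_dot, gamma):
    """Adaptive update law (4-58); integrate numerically, e.g. forward Euler."""
    return -gamma * (e @ m_ed_dot)
```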


4.2.6 Stability Analysis

Theorem 1. The controller given in (4-55) and (4-57), along with the adaptive update law in (4-58), ensures asymptotic tracking in the sense that

\[
\| \tilde q_v(t) \| \to 0, \quad \| e(t) \| \to 0, \quad \text{as } t \to \infty. \tag{4-61}
\]

Proof. Let V(t) ∈ R denote the following differentiable non-negative function (i.e., a Lyapunov candidate):

\[
V = \tilde q_v^T \tilde q_v + (1 - \tilde q_0)^2 + \frac{z_{r1}}{2}\,e^T e + \frac{1}{2\gamma}\,\tilde z_{r1}^2. \tag{4-62}
\]

The time-derivative of V(t) can be determined as

\[
\dot V = -\tilde q_0\,\tilde q_v^T K_{\omega}\,\tilde q_v - (1 - \tilde q_0)\,\tilde q_v^T K_{\omega}\,\tilde q_v + e^T \left( -K_v e - \tilde z_{r1}\,\dot m_{ed} \right) + \tilde z_{r1}\,e^T \dot m_{ed}
\]
\[
= -\tilde q_v^T \left( \tilde q_0 I_3 + (1 - \tilde q_0) I_3 \right) K_{\omega}\,\tilde q_v - e^T K_v e
\]
\[
= -\tilde q_v^T K_{\omega}\,\tilde q_v - e^T K_v e, \tag{4-63}
\]

where (4-56) and (4-58)-(4-60) were utilized. Based on (4-62) and (4-63), e(t), q̃_v(t), q̃_0(t), z̃_r1(t) ∈ L_∞ and e(t), q̃_v(t) ∈ L_2. Since z̃_r1(t) ∈ L_∞, it is clear from (4-60) that ẑ_r1(t) ∈ L_∞. Based on the fact that e(t) ∈ L_∞, (4-49) and (4-50) can be used to prove that m'_1(t) ∈ L_∞, and then L'_v(t), L'^{-1}_v(t) ∈ L_∞. Based on the fact that q̃_v(t) ∈ L_∞ and ω_cd(t) is a bounded function, (4-55) can be used to conclude that ω_c(t) ∈ L_∞. Since ẑ_r1(t), e(t), m'_1(t), L'_v(t), L'^{-1}_v(t) ∈ L_∞ and ṁ_ed(t) is bounded, (4-57) can be utilized to prove that v_c(t) ∈ L_∞. From the previous results, (4-52)-(4-54) can be used to prove that ė(t), q̃̇_v(t) ∈ L_∞. Since e(t), q̃_v(t) ∈ L_∞ ∩ L_2 and ė(t), q̃̇_v(t) ∈ L_∞, we can utilize a corollary to Barbalat's Lemma [68] to conclude the result given in (4-61).

4.3 Cooperative Tracking Control of a Nonholonomic Unmanned Ground Vehicle

In the previous section, a visual servo tracking controller was developed for a moving six-DOF object based on daisy-chained image feedback from a moving camera, where a stationary reference camera was used to encode a desired video. The development in this section and our preliminary work in [43] extends the previous section by allowing the reference camera to also move. The example of a reference camera in the previous section


was a "stationary" satellite that was used to encode the desired trajectory. In this section, the desired trajectory could be encoded by a moving camera (e.g., attached to a moving satellite, a dirigible, or another UAV). In addition, instead of considering a general six-DOF control object, the control object in this section is a nonholonomically constrained UGV. The control objective is for the UGV to track a desired trajectory determined by a sequence of prerecorded images from some moving overhead camera. An additional technical issue resolved in this section is the challenge of comparing the relative velocity between a moving camera and a moving UGV to the relative desired trajectory recorded by a moving camera.

4.3.1 Problem Scenario

Recent advances in image extraction/interpretation technology and advances in control theory have motivated results such as [11, 15-17, 21, 24, 27, 28, 30, 69, 70] and others, where camera-based vision systems are the sole sensor used for autonomous navigation of an UGV. See [28] for a detailed review of these and other related results. Typically these results are focused on the regulation result, and in all the results the targets are static with respect to the moving camera, or the camera is stationary and recording images of the moving UGV. In contrast to these methods, the development in this section and our previous preliminary work in [43] is motivated by the problem in which a moving camera is recording images of a moving UGV so that a second UGV can track a desired image trajectory. A practical example application of this scenario is an airborne camera attached to a remote controlled aircraft that is used to determine a desired video of an UGV moving in a terrain; then another moving camera (which does not have to follow the same trajectory as the previous camera) is used to relate and control the pose of a moving UGV with respect to the recorded video.

The scenario examined in this section is depicted in Fig. 4-2, where various coordinate frames are defined again as a means to develop the subsequent Euclidean reconstruction and control methods. In Fig. 4-2, a single camera that is navigating above


the planar motion of an UGV. The moving coordinate frame I is attached to an overhead camera, which records images for real-time tracking control. The moving coordinate frame I_M is attached to the overhead camera that recorded the desired image sequence, and the fixed coordinate frame I_R is some single snapshot of I_M.

Figure 4-2. Geometric model for a moving camera, moving UGV, and stationary reference camera: A moving camera (coordinate frame I_M) records the desired trajectory of an UGV (coordinate frame F_d(t)) with respect to the stationary reference object F*, while the stationary coordinate frame F_s represents a snapshot of an UGV along the desired trajectory taken by I_R = I_M(t)|_{t=T}. A moving camera (coordinate frame I) views the current UGV (coordinate frame F(t)) and the stationary reference object F*.

The moving coordinate frame F is attached to the UGV at the center of the rear wheel axis (for simplicity and without loss of generality). The UGV is represented in the camera image by four feature points that are coplanar and not collinear. The Euclidean


distance (i.e., s_{1i} ∈ R³ ∀i = 1, 2, 3, 4) from the origin of F to one of the feature points is assumed to be known. A priori information (such as a known target in the initial FOV [32]) is sometimes used in vSLAM methods to establish scale. The plane defined by the UGV motion (i.e., the plane defined by the xy-axis of F) and the UGV feature points is denoted by π. The linear velocity of the UGV along the x-axis of F is denoted by v_c(t) ∈ R, and the angular velocity ω_c(t) ∈ R is about the z-axis of F.

While viewing the feature points of the UGV, the camera is assumed to also view four additional coplanar and noncollinear feature points of a stationary reference object. The four additional feature points define the plane π* in Fig. 4-2. The stationary coordinate frame F* is attached to the object, where a distance from the origin of the coordinate frame to one of the feature points (i.e., s_{2i} ∈ R³) is assumed to be known. The plane π* is assumed to be parallel to the plane π. When the camera is coincident with I_R, a fixed (i.e., a single snapshot) reference pose of the UGV, denoted by F_s, is assumed to be in the camera's FOV. A desired trajectory is defined by a prerecorded time-varying trajectory of F_d that is assumed to be second-order differentiable, where v_cd(t), ω_cd(t) ∈ R denote the desired linear and angular velocity of F_d, respectively. The feature points that define π* are also assumed to be visible when the camera is a priori located coincident with the pose of the stationary coordinate frame I_R and the time-varying coordinate frame I_M. Based on these coordinate frame definitions, the problem considered in this section is to develop a kinematic controller for the object attached to F so that the time-varying rotation and translation of F converges to the desired time-varying rotation and translation of F_d, where the motion of F is determined from the time-varying overhead camera attached to I.

4.3.2 Geometric Relationships

The rotation matrices and translation vectors in Table 4-1 (except the last line) are also valid for this section. Additional relationships between the various coordinate frames are summarized in Table 4-2. In Table 4-2, R_rs, R_md(t), R_m(t), R_rm(t), R'_md(t) ∈


SO(3) denote rotation matrices, and x_frs, x_fmd(t), x_fm(t), x_frm(t), x'_frm(t) ∈ R³ denote translation vectors.

Table 4-2. Coordinate frame relationships for UGV tracking control.

  Motion                 Frames
  R_rs, x_frs            F_s to I_R
  R_md(t), x_fmd(t)      F_d to I_M
  R_m(t), x_fm(t)        F* to I_M
  R_rm(t), x_frm(t)      I_M to I_R
  R'_md(t), x'_frm(t)    F_d to I_R in I_M

Figure 4-3. Geometric model showing a snapshot of an UGV along the desired trajectory (coordinate frame F_s) taken by I_R = I_M(t)|_{t=T}. A current camera (coordinate frame I) views the time-varying UGV (coordinate frame F) while observing the set of feature points attached to F*.

4.3.3 Euclidean Reconstruction

The coordinate frame representation in Fig. 4-2 can be separated into Figs. 4-3 and 4-4 to relate I to I_R and I_R to I_M, respectively. The coordinate frames in each figure


Figure4-4.Geometricmodelshowingamovingcamera(coordi nateframe I M )recording thedesiredtrajectoryofanUGV(coordinateframe F d ( t ))withrespecttothe stationaryreferenceobject F whilestationarycoordinateframe F s represents asnapshotofanUGValongthedesiredtrajectorytakenby I R = I M ( t ) j t = T havethesamerelationshipsasinFig. 4-1 .Therefore,thesameEuclideanreconstruction processaspresentedinSection 4.2.1 4.2.3 canbeusedtwicetobuildtheEuclidean relationshipsforthisexample. ToreconstructtheEuclideanrelationshipforthegeometri cmodelasshowninFig. 4-3 ,let m rsi 2 R 3 denotetheconstantreferenceEuclideancoordinatesofthe featurepoints on s expressedin I R as m rsi x rsi y rsi z rsi T ; andlet p rsi 2 R 3 representtheconstantimage-spacecoordinatesofthefeat urepointson s takenbythecameraattachedto I M when I M iscoincidentwith I R p rsi u rsi v rsi 1 T : 90


Following the development in Sections 4.2.2 and 4.2.3, relationships can be obtained to determine the homographies and depth ratios as⁴

\[
p_i = \alpha_i\,G\,p_i^*, \qquad G \triangleq A \left( \bar R + x_h n^T \right) A^{-1} \tag{4-64}
\]
\[
p_{rsi} = \alpha_{rsi}\,G_{rs}\,p_{ri}, \qquad G_{rs} \triangleq A \left( \bar R_{rs} + x_{hrs} n_r^{*T} \right) A^{-1} \tag{4-65}
\]
\[
p_{ri} = \alpha_{ri}\,G_r\,p_i^*, \qquad G_r \triangleq A \left( \bar R_r + x_{hr} n^{*T} \right) A^{-1}, \tag{4-66}
\]

where

\[
\alpha_i = \frac{z_i^*}{z_i}, \qquad \alpha_{rsi} = \frac{z_{ri}}{z_{rsi}}, \qquad \alpha_{ri} = \frac{z_i^*}{z_{ri}}, \qquad
\bar R = R R^{*T}, \qquad \bar R_{rs} = R_{rs} R_r^{*T}, \qquad \bar R_r = R_r^* R^{*T}. \tag{4-67}
\]

Furthermore, the normalized Euclidean coordinates m_i(t) can be related to m'_i(t) as

\[
m_i' = \frac{z_i}{z_i'} \left( \bar R_r + x_{hr}\,\alpha_i \frac{n^{*T} m_i^*}{n^{T} m_i}\,n^{T} \right) m_i \tag{4-68}
\]
\[
\frac{z_i'}{z_i} = \begin{bmatrix} 0 & 0 & 1 \end{bmatrix} \left( \bar R_r + x_{hr}\,\alpha_i \frac{n^{*T} m_i^*}{n^{T} m_i}\,n^{T} \right) m_i. \tag{4-69}
\]

⁴ To simplify the notation, the cameras are assumed to have the same calibration matrix A in the following development. The reader can refer to Section 4.2.1 for the deductions when the calibration matrices are different.
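The projective homographies such as G, G_rs, and G_r are computed from feature point correspondences up to a scale factor. A minimal direct-linear-transform sketch is shown below (illustrative only; a practical implementation would normalize the pixel coordinates first, and the function name is hypothetical):

```python
import numpy as np

def homography_dlt(p_src, p_dst):
    """Estimate a 3x3 homography (up to scale) from n >= 4 pairs of
    homogeneous pixel coordinates by stacking the standard DLT
    constraints and taking the SVD null vector."""
    rows = []
    for (u1, v1, w1), (u2, v2, w2) in zip(p_src, p_dst):
        rows.append([0, 0, 0, -w2*u1, -w2*v1, -w2*w1, v2*u1, v2*v1, v2*w1])
        rows.append([w2*u1, w2*v1, w2*w1, 0, 0, 0, -u2*u1, -u2*v1, -u2*w1])
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return Vt[-1].reshape(3, 3)
```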


To reconstruct the Euclidean relationship for the geometric model as shown in Fig. 4-4, let m̄_mdi(t), m̄_mi(t) ∈ R³ denote the Euclidean coordinates of the feature points on π_d and π*, expressed in I_M as

\[
\bar m_{mdi}(t) \triangleq \begin{bmatrix} x_{mdi}(t) & y_{mdi}(t) & z_{mdi}(t) \end{bmatrix}^T, \qquad
\bar m_{mi}(t) \triangleq \begin{bmatrix} x_{mi}(t) & y_{mi}(t) & z_{mi}(t) \end{bmatrix}^T,
\]

let m̄'_mdi(t) ∈ R³ denote the desired Euclidean coordinates of the feature points on π_d expressed in I_R as

\[
\bar m'_{mdi}(t) \triangleq \begin{bmatrix} x'_{mdi}(t) & y'_{mdi}(t) & z'_{mdi}(t) \end{bmatrix}^T,
\]

and let p_mdi(t), p_mi(t) ∈ R³ represent the image-space coordinates of the feature points on π_d and π* captured by the camera attached to I_M, respectively, as

\[
p_{mdi} \triangleq \begin{bmatrix} u_{mdi} & v_{mdi} & 1 \end{bmatrix}^T, \qquad
p_{mi} \triangleq \begin{bmatrix} u_{mi} & v_{mi} & 1 \end{bmatrix}^T.
\]

The normalized coordinates of m̄'_mdi(t) and m̄_mdi(t), denoted as m'_mdi(t), m_mdi(t) ∈ R³, respectively, are defined as

\[
m'_{mdi}(t) \triangleq \frac{\bar m'_{mdi}(t)}{z'_{mdi}(t)}, \qquad m_{mdi}(t) \triangleq \frac{\bar m_{mdi}(t)}{z_{mdi}(t)}. \tag{4-70}
\]

Following the development in Sections 4.2.2 and 4.2.3, relationships can be developed to compute the homographies and depth ratios as

\[
p_{mdi} = \alpha_{mdi}\,G_{md}\,p_{mi}, \qquad G_{md} \triangleq A \left( \bar R_{md} + x_{hmd} n_m^T \right) A^{-1} \tag{4-71}
\]
\[
p_{ri} = \alpha_{rmi}\,G_{rm}\,p_{mi}, \qquad G_{rm} \triangleq A \left( \bar R_{rm} + x_{hrm} n_m^T \right) A^{-1}, \tag{4-72}
\]

where

\[
\alpha_{mdi} = \frac{z_{mi}}{z_{mdi}}, \qquad \alpha_{rmi} = \frac{z_{mi}}{z_{ri}}, \qquad
\bar R_{md} = R_{md} R_m^T, \qquad \bar R_{rm} = R_r^* R_m^T. \tag{4-73}
\]


The equations to relate m_mdi(t) to m'_mdi(t) can be developed as

\[
m'_{mdi} = \frac{z_{mdi}}{z'_{mdi}} \left( \bar R_{rm} + x_{hrm}\,\alpha_{mdi} \frac{n_m^T m_{mi}}{n_m^T m_{mdi}}\,n_m^T \right) m_{mdi} \tag{4-74}
\]
\[
\frac{z'_{mdi}}{z_{mdi}} = \begin{bmatrix} 0 & 0 & 1 \end{bmatrix} \left( \bar R_{rm} + x_{hrm}\,\alpha_{mdi} \frac{n_m^T m_{mi}}{n_m^T m_{mdi}}\,n_m^T \right) m_{mdi}. \tag{4-75}
\]

In (4-64)-(4-72), n(t), n_m(t), and n*_r ∈ R³ denote the constant unit normal to the planes π and π* as expressed in I, I_M, and I_R, respectively; x_h(t), x_hrs(t), x_hr(t), x_hmd(t), x_hrm(t) ∈ R³ denote the corresponding scaled translation vectors; and G(t), G_rs, G_r(t), G_md(t), G_rm(t) ∈ R^{3×3} denote projective homographies.

Sets of linear equations in (4-64)-(4-66), (4-71), and (4-72) can be used to determine and decompose the homographies to obtain α_i(t), α_rsi, α_mdi(t), α_ri(t), α_rmi(t), x_h(t), x_hrs, x_hr(t), x_hmd(t), x_hrm(t), R̄(t), R̄_rs, R̄_r(t), R̄_md(t), and R̄_rm(t). Given that the rotation matrix R*_r(t) is assumed to be known, the expressions for R̄_rs(t) and R̄_r(t) in (4-67) can be used to determine R_rs(t) and R*(t). Once R*(t) is determined, the expressions for R̄(t) and R̄_rm(t) in (4-67) and (4-73) can be used to determine R(t) and R_m(t). The rotation R_m(t) can then be used to calculate R_md(t) from the relationship for R̄_md in (4-73).

Based on the definitions for R(t), R*(t), R_md(t), R_m(t), R*_r, and R_rs provided in the previous development, the rotation from F to F_s and from F_d to F_s, denoted by R̄_1(t), R̄_d1(t) ∈ SO(3), respectively, are defined as

\[
\bar R_1(t) = R_{rs}^T R_r^* R^{*T}(t) R(t) = \begin{bmatrix} \cos\theta & \sin\theta & 0 \\ -\sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix} \tag{4-76}
\]


\[
\bar R_{d1}(t) = R_{rs}^T R_r^* R_m^T(t) R_{md}(t) = \begin{bmatrix} \cos\theta_d & \sin\theta_d & 0 \\ -\sin\theta_d & \cos\theta_d & 0 \\ 0 & 0 & 1 \end{bmatrix}, \tag{4-77}
\]

where θ(t) ∈ R denotes the right-handed rotation angle about the z-axis that aligns F with F_s, and θ_d(t) ∈ R denotes the right-handed rotation angle about the z-axis that aligns F_d with F_s. From the definitions of θ(t) and θ_d(t), it is clear that

\[
\dot\theta = \omega_c, \qquad \dot\theta_d = \omega_{cd}, \tag{4-78}
\]

where ω_c(t), ω_cd(t) ∈ R denote the angular velocities of F and F_d, respectively. Based on the fact that R(t), R*(t), R_md(t), R_m(t), R*_r, and R_rs are known, it is clear from (4-76)-(4-78) that θ(t) and θ_d(t) are known signals that can be used in the subsequent control development. To facilitate the subsequent development, θ(t) and θ_d(t) are assumed to be confined to the following regions:

\[
-\pi < \theta(t) \leq \pi, \qquad -\pi < \theta_d(t) \leq \pi. \tag{4-79}
\]
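Since R̄_1 and R̄_d1 in (4-76) and (4-77) are rotations about the z-axis, θ and θ_d can be read off numerically. A small sketch, assuming the matrices come from the composition above:

```python
import numpy as np

def yaw_angle(Rbar):
    """Rotation angle about the z-axis from a matrix of the form
    (4-76)/(4-77); atan2 of the first-row entries keeps the result
    in (-pi, pi], as required by (4-79)."""
    return np.arctan2(Rbar[0, 1], Rbar[0, 0])
```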


4.3.4 Control Objective

The objective is to develop a visual servo controller that ensures that the coordinate system F tracks the time-varying trajectory of F_d (i.e., m̄_i(t) measured in I tracks m̄_mdi(t) measured in I_M). To ensure that m̄_i(t) tracks m̄_mdi(t), the control objective can be stated by using the Euclidean reconstruction given in (4-64)-(4-72) as the desire for m̄'_1(t) → m̄'_md1(t). To quantify the control objective, translation and rotation tracking errors, denoted by e(t) ≜ [e_1(t), e_2(t), e_3(t)]ᵀ ∈ R³, are defined as [67]

\[
e_1 \triangleq \eta_1 - \eta_{d1}, \qquad e_2 \triangleq \eta_2 - \eta_{d2}, \qquad e_3 \triangleq \theta - \theta_d, \tag{4-80}
\]

where θ(t) and θ_d(t) are introduced in (4-76) and (4-77), respectively, and the auxiliary signals η(t) ≜ [η_1(t), η_2(t), η_3(t)]ᵀ, η_d(t) ≜ [η_d1(t), η_d2(t), η_d3(t)]ᵀ ∈ R³ are defined as

\[
\eta(t) \triangleq \frac{1}{z_{r1}}\,R^T(t) R^*(t) R_r^{*T}\,\bar m_1'(t), \qquad
\eta_d(t) \triangleq \frac{1}{z_{r1}}\,R_{md}^T(t) R_m(t) R_r^{*T}\,\bar m_{md1}'(t). \tag{4-81}
\]

Also, the normal unit vector n*_r is defined as [43]

\[
n_r^* = R_r^* R^{*T}(t) R(t) \begin{bmatrix} 0 & 0 & -1 \end{bmatrix}^T = R_r^* R_m^T(t) R_{md}(t) \begin{bmatrix} 0 & 0 & -1 \end{bmatrix}^T. \tag{4-82}
\]

The expressions in (4-82) and (4-81) can be used to determine that

\[
\eta_3 = \eta_{d3} = -\frac{d_r^*}{z_{r1}}. \tag{4-83}
\]

The expressions in (4-64)-(4-75) can be used to rewrite η(t) and η_d(t) in terms of the measurable signals α_1(t), α_r1(t), α_rm1(t), α_md1(t), R̄(t), R*(t), R*_r, R̄_md(t), R_m(t), p_1(t), and p_md1(t) as

\[
\eta(t) = \frac{\alpha_{r1}}{\alpha_1}\,R^T(t) R^*(t) R_r^{*T}\,H_r'\,A^{-1} p_1, \qquad
\eta_d(t) = \frac{\alpha_{rm1}}{\alpha_{md1}}\,R_{md}^T(t) R_m(t) R_r^{*T}\,H_{rm}'\,A^{-1} p_{md1}, \tag{4-84}
\]

where H'_r(t) and H'_rm(t) denote the parenthesized Euclidean homography terms in (4-68) and (4-74), respectively.

Based on (4-80), (4-84), and the fact that η(t) and η_d(t) are measurable, it is clear that e(t) is measurable. By examining (4-80)-(4-83), the control objective is achieved if ‖e(t)‖ → 0. Specifically, if e_3(t) → 0, then it is clear from (4-80) that R̄_1(t) → R̄_d1(t). If e_1(t) → 0 and e_2(t) → 0, then from (4-80) and (4-83) it is clear that η(t) → η_d(t). Given that R̄_1(t) → R̄_d1(t) and that η(t) → η_d(t), then (4-81) can be used to conclude that m̄'_1(t) → m̄'_md1(t). If m̄'_1(t) → m̄'_md1(t) and R̄_1(t) → R̄_d1(t), then the Euclidean relationships in the geometric model can be used to prove that m̄_i(t) measured in terms of I tracks m̄_mdi(t) measured in terms of I_M.


4.3.5 Control Development

The open-loop error system can be obtained by taking the time derivative of (4-81) as

\[
\dot\eta = \frac{1}{z_{r1}} \left( v + \omega^{\times} s_{11} \right) - \omega^{\times} \eta, \tag{4-85}
\]

where v(t), ω(t) ∈ R³ denote the respective linear and angular velocity of the UGV expressed in F as

\[
v \triangleq \begin{bmatrix} v_c & 0 & 0 \end{bmatrix}^T, \qquad \omega \triangleq \begin{bmatrix} 0 & 0 & \omega_c \end{bmatrix}^T. \tag{4-86}
\]

Without loss of generality, the location of the feature point s_1 is taken as the origin of F, so that s_{11} = [0, 0, 0]ᵀ. Then, based on (4-85) and (4-86), the error system can be further written as

\[
\dot\eta_1 = \frac{v_c}{z_{r1}} + \eta_2\,\omega_c, \qquad \dot\eta_2 = -\eta_1\,\omega_c. \tag{4-87}
\]

Since the desired trajectory is assumed to be generated in accordance with the UGV motion constraints, a similar expression to (4-87) can be developed as

\[
\dot\eta_{d1} = \frac{v_{cd}}{z_{r1}} + \eta_{d2}\,\omega_{cd}, \qquad \dot\eta_{d2} = -\eta_{d1}\,\omega_{cd}, \tag{4-88}
\]

where v_cd(t) ∈ R denotes the desired linear velocity of F_d. From (4-78), (4-80), (4-85), and (4-87), the open-loop error system can be obtained as

\[
z_{r1}\,\dot e_1 = v_c + z_{r1} \left( \eta_2\,\omega_c - \dot\eta_{d1} \right), \qquad
\dot e_2 = -\eta_1\,\omega_c + \eta_{d1}\,\omega_{cd}, \qquad
\dot e_3 = \omega_c - \omega_{cd}. \tag{4-89}
\]

To facilitate the subsequent development, the auxiliary variable ē_2(t) ∈ R is defined as

\[
\bar e_2 \triangleq e_2 + \eta_{d1}\,e_3. \tag{4-90}
\]


After taking the time derivative of (4-90) and utilizing (4-89), the following expression is obtained:

\[
\dot{\bar e}_2 = -e_1\,\omega_c + \dot\eta_{d1}\,e_3. \tag{4-91}
\]

Based on (4-90), it is clear that if ē_2(t), e_3(t) → 0, then e_2(t) → 0. Based on this observation and the open-loop dynamics given in (4-91), the following control development is based on the desire to show that e_1(t), ē_2(t), e_3(t) are asymptotically driven to zero.

Based on the open-loop error systems in (4-89) and (4-91), the linear and angular velocity control inputs for the UGV are designed as

\[
v_c \triangleq -k_v\,e_1 + \bar e_2\,\omega_c - \hat z_{r1} \left( \eta_2\,\omega_c - \dot\eta_{d1} \right) \tag{4-92}
\]
\[
\omega_c \triangleq -k_{\omega}\,e_3 + \dot\theta_d - \dot\eta_{d1}\,\bar e_2, \tag{4-93}
\]

where k_v, k_ω ∈ R denote positive, constant control gains. In (4-92), the parameter estimate ẑ_r1(t) ∈ R is generated by the differential equation

\[
\dot{\hat z}_{r1} = \gamma_1\,e_1 \left( \eta_2\,\omega_c - \dot\eta_{d1} \right), \tag{4-94}
\]

where γ_1 ∈ R is a positive, constant adaptation gain. After substituting the kinematic control signals designed in (4-92) and (4-93) into (4-89), the following closed-loop error systems are obtained:

\[
z_{r1}\,\dot e_1 = -k_v\,e_1 + \bar e_2\,\omega_c + \tilde z_{r1} \left( \eta_2\,\omega_c - \dot\eta_{d1} \right)
\]
\[
\dot{\bar e}_2 = -e_1\,\omega_c + \dot\eta_{d1}\,e_3 \tag{4-95}
\]
\[
\dot e_3 = -k_{\omega}\,e_3 - \dot\eta_{d1}\,\bar e_2,
\]

where (4-91) was utilized, and the depth-related parameter estimation error, denoted by z̃_r1(t) ∈ R, is defined as

\[
\tilde z_{r1} \triangleq z_{r1} - \hat z_{r1}. \tag{4-96}
\]
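A minimal sketch of the UGV control laws (4-92)-(4-94) follows (argument names are hypothetical; the error and auxiliary signals are assumed to be computed as described above):

```python
def ugv_control(e1, e2bar, e3, eta2, etad1_dot, thetad_dot, z_r1_hat, kv, kw):
    """UGV kinematic control laws (4-92) and (4-93). Note the coupling:
    omega_c appears inside v_c, so omega_c is computed first."""
    w_c = -kw * e3 + thetad_dot - etad1_dot * e2bar             # (4-93)
    v_c = -kv * e1 + e2bar * w_c - z_r1_hat * (eta2 * w_c - etad1_dot)  # (4-92)
    return v_c, w_c

def z_r1_hat_dot(e1, eta2, w_c, etad1_dot, gamma1):
    """Adaptive update law (4-94); integrate, e.g., with a forward-Euler step."""
    return gamma1 * e1 * (eta2 * w_c - etad1_dot)
```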


Theorem 2. The control input designed in (4-92) and (4-93), along with the adaptive update law defined in (4-94), ensures asymptotic tracking for the UGV in the sense that

\[
\| e(t) \| \to 0, \tag{4-97}
\]

provided the time derivative of the desired trajectory satisfies the following condition:

\[
\dot\eta_{d1} \nrightarrow 0 \tag{4-98}
\]

(i.e., η̇_d1(t) does not converge to zero). A Lyapunov-based analysis method and Barbalat's lemma can be used to prove Theorem 2 based on a Lyapunov function V(t) ∈ R defined as [43]

\[
V \triangleq \frac{1}{2}\,z_{r1}\,e_1^2 + \frac{1}{2}\,\bar e_2^2 + \frac{1}{2}\,e_3^2 + \frac{1}{2\gamma_1}\,\tilde z_{r1}^2. \tag{4-99}
\]

4.4 Simultaneous Tracking, Localization and Mapping

For vision-based autonomous systems applications (e.g., tracking, localization and mapping), the given reference object can leave the camera's FOV while another reference object enters the FOV. In comparison to the single reference object problem presented in Section 4.3, multiple reference objects are taken into consideration in this section. The daisy-chaining method is further developed to achieve asymptotic tracking of the UGV by mapping each reference object to a global coordinate system. Moreover, the time-varying Euclidean position of the UGV and the stationary positions of the reference objects can be localized with respect to the global coordinate system. In addition to achieving the visual servo tracking and localization objectives, the developed method generates data for the SLAM problem.

4.4.1 Problem Scenario

The geometric model in this section is the same as in Section 4.3, except that multiple reference objects are taken into consideration. While viewing the feature points of the UGV, the camera is assumed to also view four additional coplanar and noncollinear


feature points of a stationary reference object, such that at any instant of time along the camera motion trajectory at least one such reference object is in the FOV. The four additional feature points define the plane π*_j in Fig. 4-5. The stationary coordinate frame F*_j (j = 1, 2, ..., k) is attached to the object, where the distance from the origin of the coordinate frame to one of the feature points is assumed to be known, i.e., s_{2ji} ∈ R³ ∀i = 1, 2, 3, 4. The plane π*_j is assumed to be parallel to the plane π. The feature points that define π*_1, corresponding to a reference object F*_1 (i.e., F*_j corresponding to j = 1), are also assumed to be visible when the camera is a priori located coincident with the pose of the stationary coordinate frame I_R. The fixed coordinate frame I_R is a snapshot of I_M at the time instant that the first reference object π*_1 is visible to the reference camera. The reference object π*_1 is visible to I_R, but the other reference objects π*_j (j > 1) are not.

4.4.2 Geometric Relationships

In addition to the notation in Tables 4-1 and 4-2, further relationships between the various coordinate frames are summarized in Table 4-3. In Table 4-3, R_j(t), R_rj(t), R_mj(t) ∈ SO(3) denote rotation matrices and x_fj(t), x_frj, x_fmj(t) ∈ R³ denote translation vectors.

Table 4-3. Coordinate frame relationships for multi-reference UGV tracking control.

  Motion                Frames
  R_j(t), x_fj(t)       F*_j to I in I
  R_rj(t), x_frj        F*_j to I_R in I_R
  R_mj(t), x_fmj(t)     F*_j to I_M in I_M

4.4.3 Euclidean Reconstruction

The Euclidean reconstruction for the geometric model in Fig. 4-5 can be separated into three cases. Case 1: a single reference object π*_1 is within the reference camera's FOV, and therefore π*_1 is used as the reference object. Case 2: two reference objects (e.g., π*_1 and π*_2) are within the camera's FOV, and the reference object in use is going to be switched from one to the other (e.g., from π*_1 to π*_2). Case 3: π*_j (j ≥ 2) is used as the reference object.


Figure 4-5. Geometric model for a moving camera, moving UGV, and stationary reference camera: A moving camera (coordinate frame I_M) records the desired trajectory of an UGV (coordinate frame F_d(t)) with respect to the stationary reference object F*_1, while the stationary coordinate frame F_s represents a snapshot of an UGV along the desired trajectory taken by I_R = I_M(t)|_{t=T}. A moving camera (coordinate frame I) views the current UGV (coordinate frame F(t)) and the stationary reference object F*_j.

Let m̄_mji(t), m̄'_rji ∈ R³ denote the Euclidean coordinates of the feature points on π*_j expressed in I_M and I_R, respectively, as

\[
\bar m_{mji}(t) \triangleq \begin{bmatrix} x_{mji}(t) & y_{mji}(t) & z_{mji}(t) \end{bmatrix}^T, \qquad
\bar m'_{rji} \triangleq \begin{bmatrix} x'_{rji} & y'_{rji} & z'_{rji} \end{bmatrix}^T.
\]

Since the feature point plane π*_1 is visible to the reference camera when I_M is coincident with I_R, m̄'_r1i ≜ [x'_r1i, y'_r1i, z'_r1i]ᵀ can also be written as m̄_r1i ≜ [x_r1i, y_r1i, z_r1i]ᵀ.


Let p_mji(t), p'_rji ∈ R³ represent the image-space coordinates of the feature points on π*_j captured by the reference camera attached to I_M and I_R, respectively, as

\[
p_{mji}(t) \triangleq \begin{bmatrix} u_{mji}(t) & v_{mji}(t) & 1 \end{bmatrix}^T, \qquad
p'_{rji} \triangleq \begin{bmatrix} u'_{rji} & v'_{rji} & 1 \end{bmatrix}^T.
\]

When j = 1, p'_r1i ≜ [u'_r1i, v'_r1i, 1]ᵀ can be written as p_r1i ≜ [u_r1i, v_r1i, 1]ᵀ, which is measurable. When j > 1, p'_rji cannot be measured directly. It must be computed based on the corresponding normalized coordinates obtained from the daisy-chaining multi-view geometry. The normalized coordinates of m̄_mji(t) and m̄'_rji, denoted as m_mji(t), m'_rji ∈ R³, respectively, are defined as

\[
m_{mji}(t) \triangleq \frac{\bar m_{mji}(t)}{z_{mji}(t)}, \qquad m'_{rji} \triangleq \frac{\bar m'_{rji}}{z'_{rji}}.
\]

For the first case, the Euclidean reconstruction is exactly the same as that in Section 4.3. For the second case, consider the feature point planes π*_1 and π*_2 as an example. Similar to the Euclidean reconstruction development in Section 4.3.3, relationships can be obtained to determine the homographies and depth ratios among the coordinate frames F*_1, F*_2, I_R, and I_M as

\[
p_{m2i} = \alpha_{21i}\,G_{21}\,p_{m1i}, \qquad G_{21} \triangleq A \left( \bar R_{21} + x_{h21}\,n_{m1}^T \right) A^{-1} \tag{4-100}
\]
\[
p_{r1i} = \alpha_{rm1i}\,G_{rm1}\,p_{m1i}, \qquad G_{rm1} \triangleq A \left( \bar R_{rm1} + x_{hrm1}\,n_{m1}^T \right) A^{-1}, \tag{4-101}
\]

where

\[
\alpha_{21i} = \frac{z_{m1i}}{z_{m2i}}, \qquad \alpha_{rm1i} = \frac{z_{m1i}}{z_{r1i}} \tag{4-102}
\]
\[
\bar R_{21} = R_{m2} R_{m1}^T, \qquad \bar R_{rm1} = R_{r1} R_{m1}^T. \tag{4-103}
\]


The equations to relate m'_r2i(t) to m_r1i(t) can be developed as

\[
m'_{r2i} = \frac{z_{r1i}}{z'_{r2i}} \left( \bar R_{21} + x_{h21}\,\alpha_{rm1i} \frac{n_{m1}^T m_{m1i}}{n_{r1}^T m_{r1i}}\,n_{r1}^T \right) m_{r1i} \tag{4-104}
\]
\[
\frac{z'_{r2i}}{z_{r1i}} = \begin{bmatrix} 0 & 0 & 1 \end{bmatrix} \left( \bar R_{21} + x_{h21}\,\alpha_{rm1i} \frac{n_{m1}^T m_{m1i}}{n_{r1}^T m_{r1i}}\,n_{r1}^T \right) m_{r1i}. \tag{4-105}
\]

In (4-100)-(4-105), n_m1(t) and n_r1 ∈ R³ denote the unit normal to the plane π*_1 expressed in I_M and I_R, respectively; x_h21(t), x_hrm1(t) ∈ R³ denote the corresponding scaled translation vectors; and G_21(t), G_rm1(t) ∈ R^{3×3} denote projective homographies.

Linear equations in (4-100) and (4-101) can be used to determine and decompose the homographies to obtain α_21i(t), α_rm1i, x_h21(t), x_hrm1, R̄_21(t), and R̄_rm1. From m'_r2i(t) in (4-104), the virtual pixel coordinates p'_r2i(t) can be computed. Based on (4-104) and (4-105), the Euclidean coordinates of the feature points on π*_2 can be related to the fixed coordinate frame I_R. Following the same idea as used to relate π*_2 and π*_1, π*_j can be related to π*_{j−1} (j = 3, ..., k) based on the following projective homographies:

\[
p_{mji} = \alpha_{j(j-1)i}\,G_{j(j-1)}\,p_{m(j-1)i} \tag{4-106}
\]
\[
p_{r(j-1)i} = \alpha_{rm(j-1)i}\,G_{rm(j-1)}\,p_{m(j-1)i}, \tag{4-107}
\]

where G_{j(j−1)} and G_rm(j−1) are respectively defined as

\[
G_{j(j-1)} = A \left( \bar R_{j(j-1)} + x_{hj(j-1)}\,n_{m(j-1)}^T \right) A^{-1}, \qquad
G_{rm(j-1)} = A \left( \bar R_{rm(j-1)} + x_{hrm(j-1)}\,n_{m(j-1)}^T \right) A^{-1},
\]

and

\[
\alpha_{j(j-1)i} = \frac{z_{m(j-1)i}}{z_{mji}}, \qquad \alpha_{rm(j-1)i} = \frac{z_{m(j-1)i}}{z_{r(j-1)i}} \tag{4-108}
\]
\[
\bar R_{j(j-1)} = R_{mj} R_{m(j-1)}^T \tag{4-109}
\]
\[
\bar R_{rm(j-1)} = R_{r(j-1)} R_{m(j-1)}^T. \tag{4-110}
\]


Figure 4-6. A simplified equivalent model showing a moving camera (coordinate frame I) observing the current UGV (coordinate frame F(t)) and the stationary reference object F*_j, where the pose of F*_j is expressed in terms of I_R.

Relationships can also be developed in terms of the normalized Euclidean coordinates as

\[
m'_{rji} = \frac{z_{r(j-1)i}}{z'_{rji}} \left( \bar R_{j(j-1)} + x_{hj(j-1)}\,\alpha_{rm(j-1)i} \frac{n_{m(j-1)}^T m_{m(j-1)i}}{n_{r(j-1)}^T m_{r(j-1)i}}\,n_{r(j-1)}^T \right) m_{r(j-1)i} \tag{4-111}
\]
\[
\frac{z'_{rji}}{z_{r(j-1)i}} = \begin{bmatrix} 0 & 0 & 1 \end{bmatrix} \left( \bar R_{j(j-1)} + x_{hj(j-1)}\,\alpha_{rm(j-1)i} \frac{n_{m(j-1)}^T m_{m(j-1)i}}{n_{r(j-1)}^T m_{r(j-1)i}}\,n_{r(j-1)}^T \right) m_{r(j-1)i}. \tag{4-112}
\]

Recursively, from (4-100)-(4-112), m'_rji(t) can be related to the known normalized Euclidean coordinate m_r1i. For the third case, the geometric model can be simplified as depicted in Fig. 4-6.
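The recursion in (4-106)-(4-112) lends itself to a simple loop. The following sketch (a hypothetical data layout; each entry is assumed to hold the outputs of one homography decomposition along the chain) propagates the normalized coordinates of each newly acquired reference object back to I_R:

```python
import numpy as np

def chain_reference_objects(m_r_first, chain):
    """Propagate normalized reference-object coordinates to I_R per
    (4-111)-(4-112). `chain` is a list of dicts, one per handoff,
    holding the decomposition outputs R_bar, x_h, alpha, n_m, n_r, m_m."""
    m_r_prev = m_r_first
    results = [m_r_prev]
    for link in chain:
        scale = link["alpha"] * (link["n_m"] @ link["m_m"]) / \
                (link["n_r"] @ m_r_prev)
        M = link["R_bar"] + np.outer(link["x_h"] * scale, link["n_r"])
        v = M @ m_r_prev
        m_r_prev = v / v[2]   # v[2] is the depth ratio of (4-112)
        results.append(m_r_prev)
    return results
```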


Once m'_rji(t) is computed based on the deductions in case 2, the geometric model in Fig. 4-6 is equivalent to that in Fig. 4-2. Therefore, the Euclidean reconstruction in Section 4.3 can be used to build the Euclidean relationships among the different coordinate frames.

4.4.4 Tracking and Mapping

The tracking control design is the same as that in Section 4.3, once the Euclidean relationship between F and F_d is obtained based on the Euclidean reconstruction analysis as shown in Section 4.4.3. The time-varying Euclidean position of the UGV and the stationary positions of the reference objects can be localized with respect to the global coordinate system I_R. Using the known geometric length s_{21i} and a unit normal n_r1 (i.e., the normal to π*_1 expressed in I_R), the geometric reconstruction method in [52, 63, 64] can be utilized to obtain m̄_r1i. Based on the computed m̄_r1i, (4-105) can be used to find z'_r2i, and then (4-104) can be used to find m̄'_r2i(t). Recursively, based on (4-106)-(4-112), the Euclidean coordinates of the other reference objects, denoted as m̄'_rji(t) (j = 3, ..., k), can be computed. Similarly, using the known geometric length s_{1i} and a unit normal n(t) (i.e., the normal to π expressed in I), the geometric reconstruction method in [63] can also be utilized to obtain m̄_i(t). Based on the computed m̄_i(t), (4-68) and (4-69) can be used to find m̄'_i(t).

4.4.5 Simulation Results

A numerical simulation was performed to illustrate the localization and mapping performance given the controller in (4-92) and (4-93), and the adaptive update law in (4-94). The simulation scenario is shown in Fig. 4-7, such that the pose of the current UGV F(t) is estimated with respect to three stationary reference objects F*_1, F*_2, and F*_3 while tracking the desired trajectory F_d encoded as a sequence of images. The origins of the coordinate frames F, F*_1, F*_2, F*_3, and F_d, and the four coplanar feature points on the planes π, π*_1, π*_2, π*_3, and π_d, are chosen such that the Euclidean coordinates of the feature points in F, F*_1, F*_2, F*_3, and F_d are given by s_i, s_{1i}, s_{2i}, s_{3i}, and s_i (where i = 1, 2, 3, 4), respectively.


The initial pose of the current UGV F(0) = F(t)|_{t=0}, the stationary reference objects F*_j, j = 1, 2, 3, and the initial position of the time-varying desired UGV F_d(0) = F_d|_{t=0} were considered as

\[
F(0) = \begin{bmatrix} \cos(27) & \sin(27) & 0 & -2.50 \\ -\sin(27) & \cos(27) & 0 & 1.20 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \qquad
F^*_1 = \begin{bmatrix} \cos(45) & \sin(45) & 0 & -1.30 \\ -\sin(45) & \cos(45) & 0 & 0.80 \\ 0 & 0 & 1 & -0.50 \\ 0 & 0 & 0 & 1 \end{bmatrix},
\]
\[
F^*_2 = \begin{bmatrix} \cos(35) & \sin(35) & 0 & -1.25 \\ -\sin(35) & \cos(35) & 0 & 1.90 \\ 0 & 0 & 1 & -1.50 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \qquad
F^*_3 = \begin{bmatrix} \cos(25) & \sin(25) & 0 & 0.50 \\ -\sin(25) & \cos(25) & 0 & 2.75 \\ 0 & 0 & 1 & -2.00 \\ 0 & 0 & 0 & 1 \end{bmatrix},
\]
\[
F_d(0) = \begin{bmatrix} \cos(60) & \sin(60) & 0 & -2.10 \\ -\sin(60) & \cos(60) & 0 & 0.30 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}.
\]

The control gains in (4-92) and (4-93) and the adaptation gain in (4-94) were selected as

\[
k_{\omega} = 1.0038, \qquad k_v = 3.88, \qquad \gamma_1 = 10.
\]

The performance of the visual servo tracking controller is shown in Fig. 4-7, which shows the Euclidean space trajectory of the feature points attached to the planes π and π_d, taken by I and I_M, respectively, and the time-varying trajectory of the current and reference cameras, I and I_M, respectively. From Fig. 4-7, it can be seen that the current trajectory corresponding to the time-varying UGV F(t) is indistinguishable from the desired trajectory corresponding to the time-varying UGV F_d(t) due to the relatively low tracking error (see Fig. 4-8). The resulting tracking errors are plotted in Fig. 4-8; they asymptotically approach zero. The linear and angular velocity control inputs are shown


in Fig. 4-9. Figs. 4-10 and 4-11 show the regulation results in the presence of an additive white Gaussian noise of standard deviation σ = 0.1 pixels. Fig. 4-12 shows the results of localization of the current UGV attached to F(t) and mapping of the reference targets attached to F*_1, F*_2, and F*_3, expressed in the constant reference frame I_R.

Figure 4-7. Euclidean space trajectory of the feature points attached to the current (i.e., F(t)) and desired (i.e., F_d(t)) UGV taken by I and I_M, respectively, and the time-varying trajectory of the current and reference cameras, I and I_M, respectively. F(0) denotes the initial position of the current UGV, F(t) denotes the time-varying position of the current UGV, F_d(0) denotes the initial position of the desired UGV, I(0) denotes the initial position of the current camera, I(t) denotes the time-varying position of the current camera, I_M(0) denotes the initial position of the time-varying reference camera, I_M(t) denotes the time-varying position of the time-varying reference camera, and F*_1, F*_2, and F*_3 denote the positions of the stationary reference objects. [3D trajectory plot; axes x [m], y [m], z [m].]


Figure 4-8. Linear (i.e., e_1(t) and e_2(t)) and angular (i.e., e_3(t)) tracking error. [Three stacked plots of e_1(t) [m], e_2(t) [m], and e_3(t) [rad] versus time [s].]

Figure 4-9. Linear (i.e., v_c(t)) and angular (i.e., ω_c(t)) velocity control inputs. [Two stacked plots of v_c(t) [m/s] and ω_c(t) [rad/s] versus time [s].]


Figure 4-10. Linear (i.e., e_1(t) and e_2(t)) and angular (i.e., e_3(t)) tracking error in the presence of an additive white Gaussian image noise. [Three stacked plots of e_1(t) [m], e_2(t) [m], and e_3(t) [rad] versus time [s].]

Figure 4-11. Linear (i.e., v_c(t)) and angular (i.e., ω_c(t)) velocity control inputs in the presence of an additive white Gaussian image noise. [Two stacked plots of v_c(t) [m/s] and ω_c(t) [rad/s] versus time [s].]


Figure 4-12. Results of localization of the current UGV attached to F(t) and mapping of the reference targets attached to F*_1, F*_2, and F*_3, expressed in the constant reference frame I_R. Specifically, trajectory (1) shows the time-varying pose of the moving camera attached to I(t), trajectory (2) shows the time-varying pose of the moving camera attached to I_M(t), and trajectory (3) shows the time-varying pose of the current UGV attached to F(t) measured in the stationary reference camera frame I_R. F(0) denotes the initial position of the current UGV, and F*_1, F*_2, and F*_3 denote the positions of the stationary reference objects. [3D trajectory plot; axes x [m], y [m], z [m].]

4.5 Error Propagation in Daisy-Chaining

The daisy-chaining based control scheme developed in Section 4.4 is based on estimating the pose of an UGV with respect to stationary reference objects F*_i, where i = 1, ..., n, using a moving monocular camera. Since a stationary reference object can leave the camera FOV while a new reference object enters the FOV, it is necessary to determine the pose of the new reference object with respect to the receding object in order to provide the pose information of a moving agent such as an UGV or the camera itself. For example, if F*_j is leaving the FOV and F*_{j+1} is entering the FOV, then a homography


relationship is obtained between F*_j and F*_{j+1}. The errors in estimating the pose of F*_{j+1} are propagated to the subsequent reference objects F*_{j+n}, where n = 2, ..., m. The control law is established by comparing the current pose of the UGV with the desired pose measured in the stationary reference object. Therefore, the pose measurement error propagated through multiple stationary reference objects can result in erroneous control input and possibly lead to system instability.

In this section, the error propagation in daisy-chaining based pose estimation is analysed by performing a numerical simulation. The simulation scenario consists of a moving airborne monocular camera (e.g., a camera attached to an UAV) with the coordinate frame I(t) travelling at an altitude of 100 m and capturing images of the stationary reference objects F*_i, where i = 1, ..., 8, as shown in Fig. 4-13. The camera is assumed to traverse a circular trajectory with a ground speed of 10 m/s. At time t = 0, the pose of the camera is assumed to be known with respect to the world coordinate frame, and subsequently the camera pose is estimated based on the reference objects F*_i using the daisy-chaining method. The goal is to determine the deviation of the estimated trajectory from the actual camera path in terms of position and orientation estimation errors after a finite number of loops. The position and orientation errors, e_T(t) and e_R(t), respectively, are defined as

\[
e_T = \| t - \hat t \|, \qquad e_R = \| I - R^T \hat R \|, \tag{4-113}
\]

where t, t̂ ∈ R³ denote the actual and estimated camera position in the world frame, respectively, R, R̂ ∈ R^{3×3} denote the actual and estimated rotation of the camera coordinate frame with respect to the world frame, and I ∈ R^{3×3} represents an identity matrix. Fig. 4-14 shows the pose estimation error after establishing 240 daisy-chains by traversing the circular trajectory 30 times.
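A minimal sketch of the error metrics in (4-113); the matrix norm is taken here as the Frobenius norm, which is an assumption, since the text does not specify the norm:

```python
import numpy as np

def pose_errors(t_true, t_est, R_true, R_est):
    """Position and orientation estimation errors per (4-113)."""
    e_T = np.linalg.norm(t_true - t_est)
    e_R = np.linalg.norm(np.eye(3) - R_true.T @ R_est)  # Frobenius norm
    return e_T, e_R
```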


As seen from Fig. 4-14, in the absence of image noise the pose estimation error grows linearly; this is a result of numerical errors in estimating the rotation and translation during each daisy-chain that are propagated through the subsequent daisy-chains. Fig. 4-16 shows the result of error propagation in the presence of an additive white Gaussian noise with standard deviation σ = 0.5 pixels, which demonstrates similarity with "dead-reckoning" systems. The estimated camera trajectory shown in Fig. 4-15 indicates deviation from the actual path in the presence of noise when compared to Fig. 4-13. The error propagation problem can be addressed by mitigating the error using the known position of the camera in the scene or using an additional sensor. Specifically, the camera position can be updated when the camera revisits a known Euclidean point in space, which can be determined by observing a known feature point "constellation". Another approach, based on sensor fusion, could include an additional sensor (e.g., a GPS) to provide an absolute position of the camera when the measurement is available.

Figure 4-13. A simulation scenario depicting the circular trajectory of the camera and a set of stationary reference objects F*_i, where i = 1, ..., 8. [3D trajectory plot; axes x [m], y [m], z [m].]


Fig. 4-15 shows the estimated trajectory of the camera, and Fig. 4-18 shows the result of error propagation in the presence of an additive white Gaussian noise with standard deviation σ = 0.5 pixels after updating the camera position at the end of each circular trajectory. It can be seen from Fig. 4-18 that the pose estimation error in daisy-chaining can be bounded by using an additional sensor or by using knowledge of the scene.

Figure 4-14. Error propagation in the daisy-chaining pose estimation method in the absence of feature point noise, after 240 daisy-chains obtained by traversing the circular trajectory 30 times. [Two stacked plots of position error e_T [m] and orientation error e_R versus time [s].]

4.6 Concluding Remarks

In this chapter, a daisy-chaining vision-based control, localization, and mapping approach has been presented. A visual servo tracking controller is first developed using this daisy-chaining approach to enable a control object to track a desired trajectory


represented by a sequence of images. An example follows to show its application in the tracking control of a nonholonomic UGV. By fusing the daisy-chaining strategy with the geometric reconstruction method, the Euclidean positions of the UGV and the reference objects are identified to provide SLAM of the UGV.

Figure 4-15. A simulation scenario depicting the estimated camera trajectory in the presence of white Gaussian image noise and a set of stationary reference objects F*_i, where i = 1, ..., 8. [3D trajectory plot; axes x [m], y [m], z [m].]


Figure 4-16. Error propagation in the daisy-chaining pose estimation method in the presence of white Gaussian noise, after 240 daisy-chains obtained by traversing the circular trajectory 30 times. [Two stacked plots of position error e_T [m] and orientation error e_R versus time [s].]


Figure 4-17. A simulation scenario depicting the estimated camera trajectory in the presence of white Gaussian image noise, with the camera position updated at the end of each circular trajectory, and a set of stationary reference objects F*_i, where i = 1, ..., 8. [3D trajectory plot; axes x [m], y [m], z [m].]


Figure 4-18. Error propagation in the daisy-chaining pose estimation method in the presence of white Gaussian noise, with the camera position updated at the end of each circular trajectory. [Two stacked plots of position error e_T [m] and orientation error e_R versus time [s].]


CHAPTER 5
CONCLUSIONS

5.1 Research Summary

The research presented in this dissertation centers around the design and analysis of visual servo control strategies and vision-based robust pose estimation, with the objective of long-range navigation and control of autonomous systems. The focus of the research in Chapter 2 is to develop a computationally deterministic pose estimation method that is robust to feature outliers. Chapter 2 presents the development of a novel robust algorithm for estimating the relative pose between two calibrated images, which is coined Pose Estimation by Gridding of Unit Spheres (PEGUS).

The key idea behind the method is that, if there are M matched pairs of feature points between two views, one can compute a maximum of C(M, P) possible pose hypotheses using a P-point algorithm. The developed algorithm selects a subset of "low-noise" hypotheses by empirically estimating the probability density functions of the rotation and translation random variables, and averages them, conforming to the manifold constraints, to compute a pose estimate. The selection of low-noise hypotheses is facilitated by a unit-quaternion representation of rotation, which enables clustering of the rotation hypotheses on the 3-sphere S^3 to identify the dominant cluster, or mode. An identical approach facilitates estimation of the unit translation, which lies on the 2-sphere S^2.

The results in Chapter 2 demonstrate improved performance of PEGUS over RANSAC plus least squares as well as the nonlinear mean shift method, both in terms of estimation accuracy and computation time. By virtue of the non-iterative formulation underlying the deterministic structure of PEGUS, its computation time is more predictable than that of RANSAC and the nonlinear mean shift algorithm, thus making it amenable to a variety of real-time applications such as tracking control of an autonomous agent. Chapters 3 and 4 provide the development of vision-based control methods that can benefit from the robust pose estimation method given in Chapter 2.
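To make the hypothesis-selection step concrete, the following Python sketch implements a simplified version of the rotation branch of PEGUS. It replaces the histogram over a segmentation of S^3 with a neighbor-count density surrogate and averages the low-noise hypotheses with a normalized (chordal) mean; the function name, the radius parameter, and these simplifications are illustrative assumptions rather than the exact algorithm of Chapter 2.

    import numpy as np

    def pegus_rotation_sketch(quats, radius=0.1):
        """Simplified PEGUS-style rotation estimate from unit-quaternion
        hypotheses (shape (N, 4)); `radius` is the angular distance around
        the mode used to select the low-noise hypotheses."""
        q = np.asarray(quats, dtype=float)
        q = q / np.linalg.norm(q, axis=1, keepdims=True)
        q[q[:, 0] < 0] *= -1.0        # q and -q encode the same rotation
        # Density surrogate: count neighbors within `radius` of each sample
        # (stands in for the histogram over a segmentation of the 3-sphere).
        dist = np.arccos(np.clip(np.abs(q @ q.T), 0.0, 1.0))
        densest = np.argmax((dist < radius).sum(axis=1))
        low_noise = q[dist[densest] < radius]
        # Chordal mean of the low-noise set, projected back onto the 3-sphere.
        mean = low_noise.mean(axis=0)
        return mean / np.linalg.norm(mean)

The unit-translation branch follows the same pattern on the 2-sphere, without the sign ambiguity of the quaternion double cover.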


Control of a moving object using a stationary camera, and vice versa, are well-studied problems in the visual servo control literature, and various solutions exist for a class of autonomous systems. However, control of a moving object using image feedback from a moving camera has been a well-known open problem due to the unknown relative velocity associated with the moving camera and the moving object. In Chapter 3, a collaborative visual servo controller, which is coined the daisy-chaining method, is developed with the objective of regulating a sensor-less unmanned ground vehicle (UGV) to a desired pose utilizing the feedback from a moving airborne monocular camera system.

The contribution of the research in Chapter 3 is the development of multi-view geometry, or photogrammetry, based concepts to relate the coordinate frames attached to the moving camera, the moving UGV, and the desired UGV pose specified by an a priori image. Geometric constructs developed for traditional camera-in-hand problems are fused with fixed-camera geometry to develop a set of Euclidean homographies. Due to intrinsic physical constraints, one of the resulting Euclidean homographies is not measurable through a set of spatiotemporal images, as the corresponding projective homography cannot be developed. Hence, a new geometric formulation, termed the virtual homography, is conceived to solve for this homography in order to develop a measurable error system for the nonholonomic UGV. Asymptotic regulation results are proved using a Lyapunov-based stability analysis.

Further, in Chapter 3, the results are extended to include asymptotic regulation of an UGV under the scenario that a given reference object can leave the camera field-of-view (FOV) while another reference object enters the FOV. The controller is developed, with underlying geometric constructs that daisy-chain multiple reference objects, such that the airborne camera is not required to maintain a view of the static reference object; therefore, the airborne camera/UGV pair can navigate over an arbitrarily large area. Simulation results are provided to demonstrate the performance of the daisy-chaining based control.
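Since the Euclidean homography relating two views of a planar reference object has the form H = R + (t/d) n^T, its rotation and scaled-translation components can be recovered by a standard decomposition once the projective homography is measured from point correspondences. The following sketch uses OpenCV for both steps; the intrinsic matrix and the pixel coordinates are placeholder values for illustration only, not data from this dissertation.

    import numpy as np
    import cv2

    # Assumed intrinsic matrix of the calibrated camera (illustrative values).
    K = np.array([[800.0,   0.0, 320.0],
                  [  0.0, 800.0, 240.0],
                  [  0.0,   0.0,   1.0]])

    # Corresponding pixel coordinates of coplanar feature points on a
    # reference object seen in two views (placeholder data).
    pts_a = np.array([[100, 100], [400, 120], [380, 300], [120, 280]], float)
    pts_b = np.array([[140,  90], [430, 140], [390, 330], [150, 290]], float)

    # Projective homography from the point correspondences.
    G, _ = cv2.findHomography(pts_a, pts_b)

    # Candidate (R, t/d, n) triples for the Euclidean homography
    # H = R + (t/d) n^T; physical constraints (e.g., feature points in
    # front of the camera) select the unique valid solution.
    num, Rs, ts, normals = cv2.decomposeHomographyMat(G, K)
    print(num, "candidate decompositions")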


Building on the results in Chapter 3, the complex problem of cooperative visual servo tracking control is formulated in Chapter 4 with the objective of enabling an UGV to follow a desired trajectory, encoded as a sequence of images, utilizing the image feedback from a moving airborne monocular camera system. The desired trajectory of an UGV is recorded by a moving airborne monocular camera I_M traversing an unknown time-varying trajectory. The control objective is to track an UGV along the desired trajectory using the image feedback from a moving airborne camera I that may traverse a different trajectory than that of I_M. The association problem as well as the relative velocity problem is addressed by introducing a daisy-chaining structure to link a series of projective homographies and expressing them in a constant reference frame. An adaptive parameter update law is employed to actively compensate for the lack of an object model and depth measurements. Based on the open-loop error system, a tracking control law is developed through the application of the Extended Barbalat's lemma in a Lyapunov-based framework to yield an asymptotic stability result.

The tracking results are extended to reseed the stationary reference objects, while formulating the additional projective homography relationship, to provide an unrestricted applicative area of operation. The theoretical development in Chapter 4 manifests the coalescence of the daisy-chaining controller and the newly formed geometric reconstruction technique towards application in visual simultaneous localization and mapping (vSLAM). Simulation results are provided demonstrating the tracking control of an UGV in the presence of multiple stationary reference objects, and vSLAM results are presented.

The daisy-chaining based control scheme developed in Chapters 3 and 4 is based on estimating the pose of an UGV with respect to stationary reference objects F_i, where i = 1, ..., n, using a moving monocular camera. Since a stationary reference object can leave the camera FOV while a new reference object enters the FOV, it is necessary to determine the pose of the new reference object with respect to the receding object in
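Because homographies compose by matrix multiplication, expressing a chain of views in a constant reference frame amounts to accumulating the per-link projective homographies, which is the algebraic core of the daisy-chaining structure. The following minimal sketch illustrates this composition with hypothetical link matrices; the scale normalization and frame conventions are assumptions for illustration.

    import numpy as np

    def chain_homographies(links):
        """Compose projective homographies G_k, each mapping frame k to
        frame k+1, into the homography from the first frame to the last
        (i.e., an expression in a single constant reference frame)."""
        G_total = np.eye(3)
        for G in links:
            G_total = G @ G_total               # frame 0 -> frame k+1
            G_total = G_total / G_total[2, 2]   # fix the projective scale
        return G_total

    # Example: two hypothetical links (pure image-plane translations).
    G01 = np.array([[1.0, 0.0, 5.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
    G12 = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 3.0], [0.0, 0.0, 1.0]])
    print(chain_homographies([G01, G12]))   # maps frame 0 to frame 2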


order to provide the pose information of a moving agent, such as an UGV or the camera itself. Therefore, the error in the pose measurement between the stationary reference objects is propagated through the subsequent reference objects. The error propagation is analyzed in Chapter 4 by performing a numerical simulation, which shows that the pose estimation error grows linearly. Possible solutions are provided, along with simulation results, to mitigate the error propagation in daisy-chaining.

5.2 Recommendations for Future Work

The robust pose estimation method presented in Chapter 2 is based on estimating the probability density functions (pdfs) of the rotation and translation random variables using a histogram density estimator obtained by segmenting the 3-sphere and the 2-sphere, respectively. Future work would focus on developing a method to determine the 'best' segmentation of the spheres to minimize the pose estimation error and reduce the computation time. Also, an adaptive parameter law can be developed to determine the distance around the mode used to extract the low-noise measurements.

The daisy-chaining method developed in Chapters 3 and 4 assumes a known geometric length on the moving agent and the stationary reference objects. In practice, this assumption might be too restrictive. Therefore, future work would include a nonlinear observer-based range identification method in the daisy-chaining framework to estimate the required geometric length on the object. In order to estimate such a Euclidean parameter using two-dimensional image information, an additional sensor would be included to provide the six degree-of-freedom camera velocity. The velocity measurements can also be fused with the pose estimates in a Kalman-like structure to mitigate the error propagation in daisy-chaining.
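The histogram density estimator at the heart of this recommendation can be prototyped compactly. The sketch below bins unit-translation hypotheses on a simple latitude-longitude grid of the 2-sphere and returns the mean direction of the most populated bin; a latitude-longitude grid is a crude stand-in for the equal-area segmentation whose optimization is proposed above, and the bin counts (n_theta, n_phi) are illustrative assumptions.

    import numpy as np

    def mode_direction_on_s2(unit_vecs, n_theta=12, n_phi=24):
        """Histogram density estimate on the 2-sphere using a simple
        latitude-longitude grid; returns the normalized mean of the
        hypotheses falling in the densest bin."""
        v = np.asarray(unit_vecs, dtype=float)
        v = v / np.linalg.norm(v, axis=1, keepdims=True)
        theta = np.arccos(np.clip(v[:, 2], -1.0, 1.0))          # polar angle
        phi = np.mod(np.arctan2(v[:, 1], v[:, 0]), 2 * np.pi)   # azimuth
        ti = np.minimum((theta / np.pi * n_theta).astype(int), n_theta - 1)
        pj = np.minimum((phi / (2 * np.pi) * n_phi).astype(int), n_phi - 1)
        flat = ti * n_phi + pj
        densest_bin = np.bincount(flat).argmax()
        sel = v[flat == densest_bin]
        mean = sel.mean(axis=0)
        return mean / np.linalg.norm(mean)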

BIOGRAPHICAL SKETCH

Siddhartha Mehta received a Bachelor of Engineering degree in mechanical engineering from COEP, University of Pune, India. Subsequently, he worked at Thermax Limited for a short period of a year before joining the University of Florida in 2003. He graduated with concurrent master's degrees in mechanical engineering and agricultural engineering in 2007, with a thesis titled 'Vision-Based Control for Autonomous Robotic Citrus Harvesting' under the guidance of Dr. Thomas F. Burks, which received the 2007 best thesis award in agricultural engineering. Later, he joined the Nonlinear Controls and Robotics (NCR) group in mechanical engineering to pursue a Ph.D. under the advisement of Dr. Warren E. Dixon, with a focus on vision-based control and estimation. He is a student member of IEEE and AIAA.