
Experimental Demonstration of Structure Estimation of a Moving Object Using a Moving Camera Based on an Unknown Input Observer

Permanent Link: http://ufdc.ufl.edu/UFE0044359/00001

Material Information

Title: Experimental Demonstration of Structure Estimation of a Moving Object Using a Moving Camera Based on an Unknown Input Observer
Physical Description: 1 online resource (62 p.)
Language: english
Creator: Jang, Sujin
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2012

Subjects

Subjects / Keywords: nonlinear -- observer -- robotics -- sfm -- vision
Mechanical and Aerospace Engineering -- Dissertations, Academic -- UF
Genre: Mechanical Engineering thesis, M.S.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, territorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

Notes

Abstract: The problem of estimating the structure of a scene and the camera motion is referred to as structure from motion (SFM). In this thesis, an application and experimental verification of a recently developed online SFM method is presented to estimate the structure of a moving object using a moving camera. Chapter 2 describes the basic kinematics of the camera motion, the geometric image formation and a point tracking algorithm. The perspective camera model is used to describe the relationship between the camera and the moving object. Based on this model, a nonlinear dynamic system is developed in Chapter 3. The method of least squares is used to optimize the camera calibration matrix. A KLT (Kanade-Lucas-Tomasi) feature point tracker is used to track a static and a moving point in the experiments. In Chapter 3, an unknown input observer is designed to estimate the position of a moving object relative to a moving camera. The velocity of the object is considered as an unknown input to the perspective dynamical system. Lyapunov-based methods are used to prove the exponential or the uniformly ultimately bounded stability result of the observer. The observer gain design problem is formulated as a linear matrix inequality problem. The velocity kinematic analysis of a robot manipulator is introduced in Chapter 4. In the experiments, the forward kinematic analysis is used to determine the position and orientation of the end-effector of the robot manipulator. The joint velocities of the robot manipulator are related to the linear and angular velocity of the end-effector to control the motion of the robot. In Chapter 5, the unknown input observer is implemented for the structure estimation of a moving object attached to a two-link robot observed by a moving camera attached to a PUMA robot. A series of experiments is performed with different camera and object motions. The method is used to estimate the structure of the static object as well as the moving object. The position estimates are compared with ground-truth data computed using forward kinematics of the PUMA and the two-link robot.
General Note: In the series University of Florida Digital Collections.
General Note: Includes vita.
Bibliography: Includes bibliographical references.
Source of Description: Description based on online resource; title from PDF title page.
Source of Description: This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Statement of Responsibility: by Sujin Jang.
Thesis: Thesis (M.S.)--University of Florida, 2012.
Local: Adviser: Crane, Carl D.
Local: Co-adviser: Dixon, Warren E.

Record Information

Source Institution: UFRGP
Rights Management: Applicable rights reserved.
Classification: lcc - LD1780 2012
System ID: UFE0044359:00001



Full Text

EXPERIMENTAL DEMONSTRATION OF STRUCTURE ESTIMATION OF A MOVING OBJECT USING A MOVING CAMERA BASED ON AN UNKNOWN INPUT OBSERVER

By

SUJIN JANG

A THESIS PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE

UNIVERSITY OF FLORIDA

2012

© 2012 Sujin Jang

To my parents, Jin-sun Jang and Jong-sook Kim, my brother Suyoung Jang and my beloved Saerom Lee for their continuous love and prayer

ACKNOWLEDGMENTS

I would like to sincerely thank my advisor, Dr. Carl D. Crane III, whose experience and support have been instrumental in the completion of my master's degree. As an advisor, he always supported my research and gave me invaluable advice. As a mentor, he helped me understand the kinematic analysis of robots and guided me in the application of vision systems. I would like to thank my co-advisor, Dr. Warren E. Dixon, for his support and for the technical discussions that improved the quality of my thesis. Without his support and guidance, the experiments in this thesis could not have been done. I would like to thank my committee member Dr. Prabir Barooah for his teaching in the classroom meetings and the time he provided. I thank all of my colleagues and friends at CIMAR (Center for Intelligent Machines & Robotics): Darsan Patel, Drew Lucas, Jonathon Jeske, Ryan Chilton, Robert Kid, Jhon Waltz, Vishesh Vikas, Anubi Moses, Junsu Shin, Taeho Kim and Youngjin Moon. Occasional discussions with my colleagues at CIMAR have helped me to understand and solve the problems. I especially thank Ashwin P. Dani for his support and guidance during the last two semesters of my research. Finally, I would like to thank my parents, Jin-sun Jang and Jong-sook Kim, for their ceaseless love and prayer, my brother Suyoung Jang for his encouragement and prayer, and my beloved one, Saerom Lee, for her patience and love.

TABLE OF CONTENTS

ACKNOWLEDGMENTS 4
LIST OF TABLES 7
LIST OF FIGURES 8
ABSTRACT 10

CHAPTER
1 INTRODUCTION 12
2 PERSPECTIVE CAMERA MODEL AND FEATURE TRACKING 14
   2.1 Kinematic Modeling 14
   2.2 Camera Model and Geometric Image Formation 17
   2.3 Optimization of Camera Matrix 18
   2.4 A point tracking algorithm: KLT (Kanade-Lucas-Tomasi) point tracker 19
3 DESIGN OF AN UNKNOWN INPUT OBSERVER 22
   3.1 Nonlinear Dynamics 22
   3.2 Design of an Unknown Input Observer 23
   3.3 Stability Analysis 24
   3.4 Condition on A matrix 28
   3.5 Condition on object trajectories 28
   3.6 LMI Formulation 29
4 VELOCITY KINEMATICS FOR A ROBOT MANIPULATOR 31
   4.1 Forward Kinematic Analysis 31
   4.2 Velocity Kinematic Analysis 33
5 EXPERIMENTS AND RESULTS 35
   5.1 Testbed Setup 35
   5.2 Experiment I: Moving camera with a static object 39
      5.2.1 Set 1 39
      5.2.2 Set 2 41
      5.2.3 Set 3 44
      5.2.4 Set 4 46
   5.3 Experiment II: Moving camera with a moving object 49
      5.3.1 Set 1 50
      5.3.2 Set 2 52
      5.3.3 Set 3 54

6 CONCLUSION AND FUTURE WORK 58
REFERENCES 60
BIOGRAPHICAL SKETCH 62

LIST OF TABLES

4-1 Mechanism parameters for the PUMA 560. 32
5-1 Comparison of the RMS position estimation errors in set 2 of Experiment I. 39
5-2 Comparison of the RMS position estimation errors in set 1 of Experiment II. 39
5-3 RMS position estimation errors of the static point. 49
5-4 RMS position estimation errors of the moving point. 57

LIST OF FIGURES

2-1 A perspective projection and kinematic camera model. 14
4-1 Kinematic model of PUMA 560. 33
5-1 An overview of the experimental configuration. 36
5-2 Platforms. 36
5-3 A tracked static point (dot in solid circle). 37
5-4 A tracked moving point (dot in dashed circle). 37
5-5 Camera angular velocity. 40
5-6 Camera linear velocity. 40
5-7 Comparison of the actual (dash) and estimated (solid) position of a static object with respect to a moving camera. 41
5-8 Position estimation error for a static point. 41
5-9 Camera angular velocity. 42
5-10 Camera linear velocity. 43
5-11 Comparison of the actual (dash) and estimated (solid) position of a static object with respect to a moving camera. 43
5-12 Position estimation error for a static object. 44
5-13 Camera angular velocity. 45
5-14 Camera linear velocity. 45
5-15 Comparison of the actual (dash) and estimated (solid) position of a static object with respect to a moving camera. 46
5-16 Position estimation error for a static object. 46
5-17 Camera angular velocity. 47
5-18 Camera linear velocity. 48
5-19 Comparison of the actual (dash) and estimated (solid) position of a static object with respect to a moving camera. 48
5-20 Position estimation error for a static object. 49
5-21 Camera angular velocity. 50

5-22 Camera linear velocity. 51
5-23 Comparison of the actual (dash) and estimated (solid) position of a moving object with respect to a moving camera. 51
5-24 Position estimation error for a moving object. 52
5-25 Camera angular velocity. 53
5-26 Camera linear velocity. 53
5-27 Comparison of the actual (dash) and estimated (solid) position of a moving point with respect to a moving camera. 54
5-28 Position estimation error for a moving point. 54
5-29 Camera angular velocity. 55
5-30 Camera linear velocity. 56
5-31 Comparison of the actual (dash) and estimated (solid) position of a moving point with respect to a moving camera. 56
5-32 Position estimation error for a moving point. 57

Abstract of Thesis Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Master of Science

EXPERIMENTAL DEMONSTRATION OF STRUCTURE ESTIMATION OF A MOVING OBJECT USING A MOVING CAMERA BASED ON AN UNKNOWN INPUT OBSERVER

By Sujin Jang

August 2012

Chair: Carl D. Crane III
Cochair: Warren E. Dixon
Major: Mechanical Engineering

The problem of estimating the structure of a scene and the camera motion is referred to as structure from motion (SFM). In this thesis, an application and experimental verification of an online SFM method is presented to estimate the structure of a moving object using a moving camera. Chapter 2 describes the basic kinematics of the camera motion, the geometric image formation and a point tracking algorithm. The perspective camera model is used to describe the relationship between the camera and the moving object. Based on this model, a nonlinear dynamic system is developed in Chapter 3. The method of least squares is used to optimize the camera calibration matrix. A KLT (Kanade-Lucas-Tomasi) feature point tracker is used to track a static and a moving point in the experiments. In Chapter 3, an unknown input observer is designed to estimate the position of a moving object relative to a moving camera. The velocity of the object is considered as an unknown input to the perspective dynamical system. Lyapunov-based methods are used to prove the exponential or the uniformly ultimately bounded stability result of the observer. The observer gain design problem is formulated as a linear matrix inequality problem. The velocity kinematic analysis of a robot manipulator is introduced in Chapter 4. In the experiments, the forward kinematic analysis is used to determine the position and orientation of

the end-effector of the robot manipulator. The joint velocities of the robot manipulator are related to the linear and angular velocity of the end-effector to control the motion of the robot. In Chapter 5, the unknown input observer is implemented for the structure estimation of a moving object attached to a two-link robot observed by a moving camera attached to a PUMA robot. A series of experiments is performed with different camera and object motions. The method is used to estimate the structure of the static object as well as the moving object. The position estimates are compared with ground-truth data computed using forward kinematics of the PUMA and the two-link robot.

CHAPTER 1
INTRODUCTION

Vision sensors provide rich image information and rarely have limits on sensing range. Based on these image data, numerous methods have been developed to estimate and reconstruct the structure of a scene. These methods have been implemented in various robotics applications such as navigation, guidance and control of autonomous vehicles, autonomous surveillance robots and robotic manipulation. One of the intensively studied approaches to solve the estimation problem is structure from motion (SFM), which refers to the process of reconstructing both the three-dimensional structure of the scene and the camera motion. A number of approaches to solve the SFM problem in a dynamic scene have been studied in the past decade [1-8]. In [1], a batch algorithm is developed for points moving in straight lines or conic trajectories given five or nine views, respectively. In [2], a batch algorithm is presented for object motions represented by more general curves approximated by polynomials. In [3], assuming a weak perspective camera model, a factorization-based batch algorithm is developed for objects moving with constant speed in a straight line. An algebraic geometry approach is presented in [4] to estimate the motion of objects up to a scale given a minimum number of point correspondences. In [5], a batch algorithm is developed to estimate the structure and motion of objects moving on a ground plane observed by a moving airborne camera. The method relies on a static scene for estimating the projective depth, approximated by the depth of feature points on a static background, assuming that one of the feature points of the moving object lies on the static background. In [6], a batch algorithm is developed by approximating the trajectories of a moving object using a linear combination of discrete cosine transform (DCT) basis vectors. Batch algorithms use an algebraic relationship between 3D coordinates of points in the camera coordinate frame and corresponding 2D projections on the image frame collected over n images to estimate the structure. Hence, batch algorithms are not useful in real-time control algorithms. For visual servo control or video-based surveillance tasks, online structure estimation algorithms are required. Recently, a causal algorithm was presented in [7] to estimate the structure

and motion of objects moving with constant linear velocities observed by a moving camera with known camera motions. A new method based on an unknown input observer (UIO) is developed in [8] to estimate the structure of an object moving with time-varying velocities using a moving camera with known velocities. The contributions of this work are to experimentally verify the unknown input observer in [8] for structure estimation of a moving object and to prove the uniformly ultimately bounded (UUB) result for the observer when an additive disturbance term is considered in the nonlinear system. A series of experiments is conducted on a PUMA 560 and a two-link robot. A camera is attached to the PUMA and the target is attached to the moving two-link robot. The camera images are processed to track a feature point while camera velocities are measured using the joint encoders. The camera calibration matrix is optimized using a least-squares method to reduce the error in camera parameters obtained using a Matlab camera calibration routine. To obtain the ground truth data, the distance between the origin of the PUMA and the origin of the two-link robot is measured, and the positions of the camera and moving object with respect to the respective origins are obtained using the forward kinematics of the robots. The estimated position of the object is compared with ground-truth data. The experiments are conducted to estimate the structure of a static as well as a moving object keeping the same observer structure. The experiments demonstrate the advantage of the observer in the sense that a-priori knowledge of the object state (static or moving) is not required.

CHAPTER 2
PERSPECTIVE CAMERA MODEL AND FEATURE TRACKING

This chapter describes the basic kinematics of the camera motion (Section 2.1) and the geometric image formation (Section 2.2). The optimization of the camera matrix is presented in Section 2.3. A commonly used feature point tracking technique is introduced in Section 2.4.

Figure 2-1. A perspective projection and kinematic camera model.

2.1 Kinematic Modeling

Considering a moving camera observing an object, define an inertially fixed reference frame, $F:\{o;\,E_x,E_y,E_z\}$, and a reference frame fixed to the camera, $C:\{o_c;\,e_x,e_y,e_z\}$, as shown in Fig. 2-1. The position of a point $p$ relative to the point $o$ is denoted by $r_p$. The position of $o_c$ (the origin of $C$) relative to the point $o$ (the origin of $F$) is denoted by $r_{oc}$. In the following development every vector and tensor is expressed in terms of the basis $\{e_x,e_y,e_z\}$ fixed in $C$. (In Chapter 2 it is assumed that the subscript $\{\cdot\}_e$ is omitted in the vector representation, where $\{\cdot\}_e$ denotes the column-vector representation of the vector in the basis $\{e_x,e_y,e_z\}$, i.e. ${}^F V_p = \{{}^F V_p\}_e$.)

The position of $p$ measured relative to the point $o_c$ is expressed as
$$r_{p/o_c} = r_p - r_{oc} = \begin{bmatrix} X(t) & Y(t) & Z(t) \end{bmatrix}^T$$
where $X(t), Y(t), Z(t) \in \mathbb{R}$. The linear velocities of the object and the camera as viewed by an observer in the inertial reference frame are given by
$${}^F V_p = \tfrac{{}^F d}{dt}(r_p) = \begin{bmatrix} v_{px} & v_{py} & v_{pz} \end{bmatrix}^T \in V_p \subseteq \mathbb{R}^3, \qquad {}^F V_{o_c} = \tfrac{{}^F d}{dt}(r_{oc}) = \begin{bmatrix} v_{cx} & v_{cy} & v_{cz} \end{bmatrix}^T \in V_c \subseteq \mathbb{R}^3.$$
Using these relations, the velocity of $p$ as viewed by an observer in $C$ is given by
$${}^C V_{p/o_c} = \tfrac{{}^C d}{dt}(r_p - r_{oc}) = \begin{bmatrix} \dot{X}(t) & \dot{Y}(t) & \dot{Z}(t) \end{bmatrix}^T, \qquad {}^C V_{p/o_c} = {}^C V_p - {}^C V_{o_c} = {}^F V_p - {}^F V_{o_c} + {}^C\omega^F \times (r_p - r_{oc})$$
where ${}^C\omega^F$ denotes the angular velocity of $F$ as viewed by an observer in $C$. The angular velocity of the camera relative to $F$ is expressed as ${}^F\omega^C = \begin{bmatrix} \omega_x & \omega_y & \omega_z \end{bmatrix}^T$. Since ${}^C\omega^F$ and ${}^F\omega^C$ are related as ${}^C\omega^F = -{}^F\omega^C$,
$${}^C\omega^F = \begin{bmatrix} -\omega_x & -\omega_y & -\omega_z \end{bmatrix}^T.$$
Substituting the velocity expressions into the relative velocity relation yields
$$\begin{bmatrix} \dot{X}(t) \\ \dot{Y}(t) \\ \dot{Z}(t) \end{bmatrix} = \begin{bmatrix} v_{px}-v_{cx} \\ v_{py}-v_{cy} \\ v_{pz}-v_{cz} \end{bmatrix} + \begin{bmatrix} 0 & \omega_z & -\omega_y \\ -\omega_z & 0 & \omega_x \\ \omega_y & -\omega_x & 0 \end{bmatrix}\begin{bmatrix} X(t) \\ Y(t) \\ Z(t) \end{bmatrix} = \begin{bmatrix} v_{px}-v_{cx}+\omega_z Y(t)-\omega_y Z(t) \\ v_{py}-v_{cy}+\omega_x Z(t)-\omega_z X(t) \\ v_{pz}-v_{cz}+\omega_y X(t)-\omega_x Y(t) \end{bmatrix}.$$

The inhomogeneous coordinates of the relative position, $m(t) = \begin{bmatrix} m_1(t) & m_2(t) & 1 \end{bmatrix}^T \in \mathbb{R}^3$, are defined as
$$m(t) \triangleq \begin{bmatrix} \tfrac{X(t)}{Z(t)} & \tfrac{Y(t)}{Z(t)} & 1 \end{bmatrix}^T.$$
Considering the subsequent development, the state vector $x(t) = \begin{bmatrix} x_1(t) & x_2(t) & x_3(t) \end{bmatrix}^T \in Y \subseteq \mathbb{R}^3$ is defined as
$$x(t) \triangleq \begin{bmatrix} \tfrac{X}{Z} & \tfrac{Y}{Z} & \tfrac{1}{Z} \end{bmatrix}^T.$$
Using the relative kinematics above, the time derivative of the state can be expressed as
$$\dot{x}_1 = \frac{\dot{X}Z - X\dot{Z}}{Z^2} = \frac{v_{px}-v_{cx}+\omega_z Y-\omega_y Z}{Z} - \frac{X}{Z}\,\frac{v_{pz}-v_{cz}+\omega_y X-\omega_x Y}{Z} = v_{px}x_3 - v_{cx}x_3 + \omega_z x_2 - \omega_y - x_1\!\left(v_{pz}x_3 - v_{cz}x_3 + \omega_y x_1 - \omega_x x_2\right),$$
$$\dot{x}_2 = \frac{\dot{Y}Z - Y\dot{Z}}{Z^2} = \frac{v_{py}-v_{cy}+\omega_x Z-\omega_z X}{Z} - \frac{Y}{Z}\,\frac{v_{pz}-v_{cz}+\omega_y X-\omega_x Y}{Z} = v_{py}x_3 - v_{cy}x_3 + \omega_x - \omega_z x_1 - x_2\!\left(v_{pz}x_3 - v_{cz}x_3 + \omega_y x_1 - \omega_x x_2\right),$$
$$\dot{x}_3 = -\frac{\dot{Z}}{Z^2} = -\frac{v_{pz}-v_{cz}+\omega_y X(t)-\omega_x Y(t)}{Z^2} = -x_3\!\left(v_{pz}x_3 - v_{cz}x_3 + \omega_y x_1 - \omega_x x_2\right).$$
From these expressions, the dynamics of the state $x(t)$ can be expressed as
$$\dot{x}_1 = \Omega_1 + f_1 + v_{px}x_3 - x_1 v_{pz}x_3, \qquad \dot{x}_2 = \Omega_2 + f_2 + v_{py}x_3 - x_2 v_{pz}x_3, \qquad \dot{x}_3 = v_{cz}x_3^2 + (x_2\omega_x - x_1\omega_y)x_3 - v_{pz}x_3^2, \qquad y = \begin{bmatrix} x_1 & x_2 \end{bmatrix}^T$$

where $\Omega_1(u,y), \Omega_2(u,y), f_1(u,x), f_2(u,x), f_3(u,x) \in \mathbb{R}$ are defined as
$$\Omega_1(u,y) \triangleq x_1 x_2\omega_x - \omega_y - x_1^2\omega_y + x_2\omega_z, \qquad \Omega_2(u,y) \triangleq \omega_x + x_2^2\omega_x - x_1 x_2\omega_y - x_1\omega_z,$$
$$f_1(u,x) \triangleq (x_1 v_{cz} - v_{cx})x_3, \qquad f_2(u,x) \triangleq (x_2 v_{cz} - v_{cy})x_3, \qquad f_3(u,x) \triangleq v_{cz}x_3^2 + (x_2\omega_x - x_1\omega_y)x_3.$$

Assumption 2.1. The velocities of the camera and the object are assumed to be upper and lower bounded by constants.

Assumption 2.2. Since the states $x_1(t)$ and $x_2(t)$ are equivalent to the pixel coordinates in the image plane, and the size of the image plane is bounded by known constants, it can be assumed that $x_1(t)$ and $x_2(t)$ are also bounded by $\underline{x}_1 \le x_1(t) \le \bar{x}_1$ and $\underline{x}_2 \le x_2(t) \le \bar{x}_2$, where $\underline{x}_1,\bar{x}_1$ and $\underline{x}_2,\bar{x}_2$ are obtained using the width and height of the image plane.

Assumption 2.3. The distance between the camera and the object is assumed to be upper and lower bounded by some known positive constants. Thus the state $x_3(t)$ is bounded by $\underline{x}_3 \le x_3(t) \le \bar{x}_3$, where $\bar{x}_3, \underline{x}_3 \in \mathbb{R}$ are known constants.
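For concreteness, the following minimal numpy sketch evaluates these perspective dynamics for given camera and object velocities. The function name and the input ordering u = (v_cx, v_cy, v_cz, ω_x, ω_y, ω_z) are illustrative assumptions, not code from the thesis testbed.

```python
import numpy as np

def perspective_dynamics(x, u, vp):
    """Sketch of the perspective state dynamics for x = [X/Z, Y/Z, 1/Z].

    x  : state (3,)
    u  : camera velocities (vcx, vcy, vcz, wx, wy, wz)  -- assumed ordering
    vp : object linear velocity (vpx, vpy, vpz)
    """
    x1, x2, x3 = x
    vcx, vcy, vcz, wx, wy, wz = u
    vpx, vpy, vpz = vp

    # Omega(y, u): terms depending only on the measurable output and input
    Omega1 = x1 * x2 * wx - wy - x1**2 * wy + x2 * wz
    Omega2 = wx + x2**2 * wx - x1 * x2 * wy - x1 * wz

    # f(x, u): terms depending on the (unmeasurable) inverse depth x3
    f1 = (x1 * vcz - vcx) * x3
    f2 = (x2 * vcz - vcy) * x3
    f3 = vcz * x3**2 + (x2 * wx - x1 * wy) * x3

    # the object-velocity terms play the role of the unknown input d(t)
    return np.array([
        Omega1 + f1 + vpx * x3 - x1 * vpz * x3,
        Omega2 + f2 + vpy * x3 - x2 * vpz * x3,
        f3 - vpz * x3**2,
    ])
```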

2.2 Camera Model and Geometric Image Formation

In order to describe the image formation process, the geometric perspective projection is commonly used, as depicted in Fig. 2-1. The projection model consists of an image plane, a center of projection $o_c$, a center of the image plane $o_I$, the distance between the image plane and $o_c$ (the focal length), and the two-dimensional pixel coordinate system $(I_x, I_y)$ relative to the upper left corner of the image plane. The pixel coordinates of the projected point $p$ in the image plane are given by $\tilde{m}(t) = \begin{bmatrix} u(t) & v(t) & 1 \end{bmatrix}^T \in I \subseteq \mathbb{R}^3$. The three-dimensional coordinates $m(t)$ are related to the pixel coordinates $\tilde{m}(t)$ by the following relationship [9]
$$\tilde{m}(t) = K_c\, m(t)$$
where $K_c \in \mathbb{R}^{3\times 3}$ is an invertible, upper-triangular intrinsic camera matrix given by
$$K_c = \begin{bmatrix} f & 0 & c_x \\ 0 & \alpha f & c_y \\ 0 & 0 & 1 \end{bmatrix}$$
where $\alpha$ is the image aspect ratio, $f$ is the focal length and $(c_x, c_y)$ denotes the optical center $o_I$ expressed in pixel coordinates. To simplify the derivation of the perspective projection matrix, the following assumptions are considered.

Assumption 2.4. The projection center is assumed to coincide with the origin of the camera reference frame.

Assumption 2.5. The optical axis is aligned with the z-axis of the coordinate system fixed in the camera.

2.3 Optimization of Camera Matrix

Since the coordinate system fixed in the camera reference frame is considered to be aligned to the basis fixed in the end-effector of the robot manipulator, the projection center of the camera is assumed to be located at the tool point of the robot manipulator (Assumptions 2.4 and 2.5). However, physically, it is hard to define the exact position of the projection center relative to the tool position because of uncertainties in the measurement of dimensions (i.e., dimensions of a camera mount, the center of a varifocal lens). Considering this problem, a linear least-squares method is applied to obtain an optimized camera matrix. From the projection relationship, a linear regression model can be expressed with $n$ sets of $m$ and $\tilde{m}$ as
$$S\theta = \gamma$$

where $S \in \mathbb{R}^{2n\times 4}$ is defined as
$$S \triangleq \begin{bmatrix} m_{1,1} & 0 & 1 & 0 \\ 0 & m_{2,1} & 0 & 1 \\ \vdots & \vdots & \vdots & \vdots \\ m_{1,n} & 0 & 1 & 0 \\ 0 & m_{2,n} & 0 & 1 \end{bmatrix},$$
$\theta \in \mathbb{R}^4$ is defined as $\theta \triangleq \begin{bmatrix} f & \alpha f & c_x & c_y \end{bmatrix}^T$, and $\gamma \in \mathbb{R}^{2n}$ is defined as $\gamma \triangleq \begin{bmatrix} u_1 & v_1 & \cdots & u_n & v_n \end{bmatrix}^T$. To find the optimal solution of $\theta$, the following quadratic minimization problem is considered
$$\hat{\theta} = \arg\min_{\theta}\,\lVert S\theta - \gamma \rVert^2$$
where $\hat{\theta}$ denotes the least squares estimator given by $\hat{\theta} = S^{\dagger}\gamma$, and $S^{\dagger}$ is the generalized pseudoinverse defined as $S^{\dagger} = \left(S^T S\right)^{-1}S^T$. The solution $\hat{\theta}$ to this problem gives the optimized camera projection center and focal length.
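A minimal numpy sketch of this least-squares refinement is given below; it stacks the regression matrix S from the normalized coordinates (m_1i, m_2i) and the measured pixels (u_i, v_i) and recovers θ = [f, αf, c_x, c_y]. The array layout and names are assumptions made for illustration.

```python
import numpy as np

def optimize_camera_matrix(m, uv):
    """Least-squares refinement of the intrinsic parameters.

    m  : (n, 2) normalized coordinates (m1_i, m2_i) = (X/Z, Y/Z)
    uv : (n, 2) measured pixel coordinates (u_i, v_i)
    Returns the refined camera matrix Kc (3x3).
    """
    n = m.shape[0]
    S = np.zeros((2 * n, 4))
    gamma = np.zeros(2 * n)
    # each correspondence contributes two rows: u = f*m1 + cx, v = (alpha*f)*m2 + cy
    S[0::2, 0] = m[:, 0]   # f
    S[0::2, 2] = 1.0       # cx
    S[1::2, 1] = m[:, 1]   # alpha*f
    S[1::2, 3] = 1.0       # cy
    gamma[0::2] = uv[:, 0]
    gamma[1::2] = uv[:, 1]

    theta, *_ = np.linalg.lstsq(S, gamma, rcond=None)  # theta = S^+ gamma
    f, af, cx, cy = theta
    return np.array([[f,   0.0, cx],
                     [0.0, af,  cy],
                     [0.0, 0.0, 1.0]])
```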

2.4 A point tracking algorithm: KLT (Kanade-Lucas-Tomasi) point tracker

In this section, a point feature tracking algorithm is briefly described. To track a moving object in the image plane, a KLT (Kanade-Lucas-Tomasi) point tracking algorithm is used (a detailed derivation and discussion can be found in [10, 11]). The concept of the KLT tracking algorithm is to track features between the current and past frames using a sum of squared intensity differences over a fixed-size local area. The change of intensities in successive images can be expressed as
$$I(u, v, t+\tau) = I\!\left(u - \xi(u,v),\; v - \eta(u,v),\; t\right)$$
where $u$, $v$ and $t$ are assumed to be discrete and bounded. The amount of intensity change $\delta = (\xi, \eta)$ is called the displacement of the point at $m(u,v)$ between time $t$ and $t+\tau$. The image coordinates $m$ are measured relative to the center of fixed-size windows. The displacement function is represented as an affine motion field in the following form:
$$\delta = Dm + d$$
where the deformation matrix $D \in \mathbb{R}^{2\times 2}$ is given by
$$D = \begin{bmatrix} d_{uu} & d_{uv} \\ d_{vu} & d_{vv} \end{bmatrix},$$
and the translation vector $d \in \mathbb{R}^2$ is given by $d = \begin{bmatrix} d_u & d_v \end{bmatrix}^T$. Using these definitions, a local image model can be expressed as
$$I_1(Am + d) = I_0(m)$$
where $I_0$ is the first image, $I_1$ is the second image and $A$ is defined as $A = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} + D$. As discussed in [11], smaller matching windows are preferable for reliable tracking. Since the rotational motion becomes more negligible within smaller matching windows, the deformation matrix $D$ can be assumed to be zero.

Thus, a pure translation model is considered, and the local image model can be rewritten with $\delta = d$. The displacement parameters in the vector $d$ are chosen such that they minimize the following integral of squared dissimilarity
$$\epsilon = \int_{w}\left[I_1(Am + d) - I_0(m)\right]^2 \omega(m)\, dm$$
where $w$ is the given local matching window and $\omega(m)$ is a weighting function. As described in [10, 11], the minimization of the dissimilarity is equivalent to solving the following equation for $d$ when $D$ is set to zero:
$$Z d = e$$
where the vector $e \in \mathbb{R}^2$ is given by
$$e = \int_{w}\left[I_0(m) - I_1(m)\right] g(m)\, \omega(m)\, dm,$$
and the matrix $Z \in \mathbb{R}^{2\times 2}$ is given by
$$Z = \int_{w} g(m)\, g^T(m)\, \omega(m)\, dm$$
where the vector $g(m) \in \mathbb{R}^2$, derived from the truncated Taylor expansion of the local image model, is given by
$$g = \begin{bmatrix} \dfrac{\partial}{\partial u}\!\left(\dfrac{I_0 + I_1}{2}\right) & \dfrac{\partial}{\partial v}\!\left(\dfrac{I_0 + I_1}{2}\right) \end{bmatrix}^T.$$
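The thesis implements the tracker in C/C++ with OpenCV; the sketch below shows an analogous pyramidal KLT call in Python/OpenCV. The frame file names, corner count and window size are placeholder assumptions, not values from the experiments.

```python
import cv2
import numpy as np

# Placeholder frames; in the experiments the images come from the USB camera stream.
prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Select good corners to track (Shi-Tomasi criterion from [11]).
p0 = cv2.goodFeaturesToTrack(prev, maxCorners=50, qualityLevel=0.01, minDistance=10)

# Pyramidal KLT: solves Z d = e over a fixed-size matching window at each pyramid level.
p1, status, err = cv2.calcOpticalFlowPyrLK(
    prev, curr, p0, None,
    winSize=(21, 21),                     # local matching window w
    maxLevel=3,
    criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01),
)

tracked = p1[status.flatten() == 1].reshape(-1, 2)  # pixel coordinates (u, v) of tracked points
```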

CHAPTER 3
DESIGN OF AN UNKNOWN INPUT OBSERVER

In this chapter, an unknown input observer for a class of nonlinear systems is designed to estimate the position of a tracked object as in [8]. The problem of observer gain design is formulated as a linear matrix inequality (LMI) feasibility problem.

3.1 Nonlinear Dynamics

Based on the perspective dynamics developed in Chapter 2, the following nonlinear system can be constructed
$$\dot{x} = f(x,u) + g(y,u) + Dd, \qquad y = Cx$$
where $x(t) \in \mathbb{R}^3$ is the state of the system, $y(t) \in \mathbb{R}^2$ is the output of the system, $u(t) \in \mathbb{R}^6$ is a measurable input, $d(t) \in \mathbb{R}$ is an unmeasurable input, $g(y,u) = \begin{bmatrix} \Omega_1 & \Omega_2 & 0 \end{bmatrix}^T$ is nonlinear in $y(t)$ and $u(t)$, and $f(x,u) = \begin{bmatrix} f_1 & f_2 & f_3 \end{bmatrix}^T$ is nonlinear in $x(t)$ and $u(t)$, satisfying the Lipschitz condition $\lVert f(x,u) - f(\hat{x},u)\rVert \le \gamma_1\lVert x - \hat{x}\rVert$ where $\gamma_1 \in \mathbb{R}^+$. A full row rank matrix $C \in \mathbb{R}^{2\times 3}$ is selected as
$$C = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix},$$
and $D \in \mathbb{R}^{3\times 1}$ is full column rank. The system can be written in the following form:
$$\dot{x} = Ax + \bar{f}(x,u) + g(y,u) + Dd, \qquad y = Cx$$
where $\bar{f}(x,u) = f(x,u) - Ax$ and $A \in \mathbb{R}^{3\times 3}$. The function $\bar{f}(x,u)$ satisfies the Lipschitz condition [12, 13]
$$\lVert \bar{f}(x,u) - \bar{f}(\hat{x},u)\rVert = \lVert f(x,u) - f(\hat{x},u) - A(x - \hat{x})\rVert \le (\gamma_1 + \gamma_2)\lVert x - \hat{x}\rVert$$
where $\gamma_2 \in \mathbb{R}^+$.

3.2 Design of an Unknown Input Observer

The goal of the design is to achieve an exponentially stable observer. To quantify the objective, an error state $e(t) \in \mathbb{R}^3$ is defined as
$$e(t) \triangleq \hat{x}(t) - x(t).$$
Considering the nonlinear system above and the subsequent stability analysis, an unknown input observer is designed to estimate the state $x(t)$ in the presence of an unknown disturbance $d(t)$:
$$\dot{z} = Nz + Ly + M\bar{f}(\hat{x},u) + Mg(y,u), \qquad \hat{x} = z - Ey$$
where $\hat{x}(t) \in \mathbb{R}^3$ is an estimate of the unknown state $x(t)$, $z(t) \in \mathbb{R}^3$ is an auxiliary signal, and the matrices $N \in \mathbb{R}^{3\times 3}$, $L \in \mathbb{R}^{3\times 2}$, $E \in \mathbb{R}^{3\times 2}$, $M \in \mathbb{R}^{3\times 3}$ are designed as [14]
$$M = I_3 + EC, \qquad N = MA - KC, \qquad L = K(I_2 + CE) - MAE, \qquad E = -D(CD)^{\dagger} + Y\!\left(I_2 - (CD)(CD)^{\dagger}\right)$$
where $(CD)^{\dagger}$ denotes the generalized pseudoinverse of the matrix $CD$. The gain matrix $K \in \mathbb{R}^{3\times 2}$ and the matrix $Y \in \mathbb{R}^{3\times 2}$ are selected such that
$$Q \triangleq N^T P + PN + \left(\gamma_1^2 + \gamma_2^2\right)PMM^T P + 2I_3 < 0$$
where $P \in \mathbb{R}^{3\times 3}$ is a positive definite, symmetric matrix. With these definitions, the following equalities are satisfied:
$$NM + LC - MA = 0, \qquad MD = (I_3 + EC)D = 0.$$
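A minimal numpy sketch of this construction is shown below: it forms M, N, L and E from A, C, D and the gains K, Y, and propagates the observer with a simple Euler step. The step size, function names, and the explicit-Euler discretization are assumptions of the sketch, not the implementation used in the experiments.

```python
import numpy as np

def observer_matrices(A, C, D, K, Y):
    """Build the unknown input observer matrices M, N, L, E as in [14]."""
    I3, I2 = np.eye(3), np.eye(2)
    CD = C @ D
    CD_pinv = np.linalg.pinv(CD)
    E = -D @ CD_pinv + Y @ (I2 - CD @ CD_pinv)
    M = I3 + E @ C
    N = M @ A - K @ C
    L = K @ (I2 + C @ E) - M @ A @ E
    return M, N, L, E

def observer_step(z, y, u, A, M, N, L, E, f, g, dt=1.0 / 30.0):
    """One Euler step of  z_dot = N z + L y + M f_bar(x_hat, u) + M g(y, u),  x_hat = z - E y."""
    x_hat = z - E @ y
    f_bar = f(x_hat, u) - A @ x_hat            # f_bar(x, u) = f(x, u) - A x
    z_next = z + dt * (N @ z + L @ y + M @ f_bar + M @ g(y, u))
    return z_next, x_hat
```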

Taking the time derivative of the error and substituting the observer dynamics yields
$$\dot{e} = \dot{z} - (I_3 + EC)\dot{x} = Nz + Ly + M\bar{f}(\hat{x},u) - (I_3 + EC)Ax - (I_3 + EC)\bar{f}(x,u) - (I_3 + EC)Dd.$$
Using the matrix equalities above, the error dynamics can be expressed as
$$\dot{e} = Ne + (NM + LC - MA)x + M\!\left(\bar{f}(\hat{x},u) - \bar{f}(x,u)\right) - MDd = Ne + M\!\left(\bar{f}(\hat{x},u) - \bar{f}(x,u)\right).$$

3.3 Stability Analysis

The stability of the observer is proved using a Lyapunov-based method. The exponential stability of the observer is proved as in [8], and the uniform ultimate boundedness of the state estimation error is proved for the case in which the nonlinear system contains an additive disturbance term.

Theorem 3.1. The nonlinear unknown input observer given above is exponentially stable, in the sense that $\lVert e(t)\rVert \to 0$ as $t \to \infty$, if and only if the inequality $Q < 0$ is satisfied.

Proof. Consider a Lyapunov candidate function $V(t): \mathbb{R}^3 \to \mathbb{R}$ defined as
$$V(t) = e^T(t) P e(t)$$
where $P \in \mathbb{R}^{3\times 3}$ is a positive definite matrix. Since $P$ is positive definite, the Lyapunov function is also positive definite, satisfying the following inequalities
$$\lambda_{\min}(P)\lVert e\rVert^2 \le V \le \lambda_{\max}(P)\lVert e\rVert^2$$

where $\lambda_{\min}, \lambda_{\max} \in \mathbb{R}$ are the minimum and maximum eigenvalues of the matrix $P$. Based on the error dynamics, the time derivative of $V$ yields
$$\dot{V} = e^T\!\left(N^T P + PN\right)e + 2e^T PM\!\left(\bar{f}(\hat{x},u) - \bar{f}(x,u)\right) = e^T\!\left(N^T P + PN\right)e + 2e^T PM\!\left(f(\hat{x},u) - f(x,u)\right) - 2e^T PMA(\hat{x} - x),$$
$$\dot{V} \le e^T\!\left(N^T P + PN\right)e + 2\gamma_1\lVert e^T PM\rVert\,\lVert e\rVert + 2\gamma_2\lVert e^T PM\rVert\,\lVert e\rVert$$
where the positive constants $\gamma_1$ and $\gamma_2$ are, respectively, the Lipschitz constant and the norm of the $A$ matrix. The following inequalities can be obtained:
$$2\gamma_1\lVert e^T PM\rVert\,\lVert e\rVert \le \gamma_1^2\lVert e^T PM\rVert^2 + \lVert e\rVert^2, \qquad 2\gamma_2\lVert e^T PM\rVert\,\lVert e\rVert \le \gamma_2^2\lVert e^T PM\rVert^2 + \lVert e\rVert^2.$$
Using these bounds, $\dot{V}$ can be upper bounded by
$$\dot{V} \le e^T\!\left(N^T P + PN\right)e + \left(\gamma_1^2 + \gamma_2^2\right)e^T PMM^T Pe + 2e^T e = e^T\!\left(N^T P + PN + \left(\gamma_1^2 + \gamma_2^2\right)PMM^T P + 2I_3\right)e = e^T Q e.$$
If the condition $Q < 0$ is satisfied, $\dot{V} < 0$. The upper bound for $V(t)$ can then be expressed as $V \le V(0)\exp(-\kappa t)$ where $\kappa = \frac{-\lambda_{\max}(Q)}{\lambda_{\min}(P)} \in \mathbb{R}^+$, and the upper bound for the estimation error is given by
$$\lVert e(t)\rVert \le \sqrt{\beta}\,\lVert e(t_0)\rVert \exp\!\left(-\tfrac{\kappa}{2}\, t\right)$$

where $\beta = \frac{\lambda_{\max}(P)}{\lambda_{\min}(P)} \in \mathbb{R}^+$. From this bound, it can be shown that $\lVert e(t)\rVert \to 0$ as $t \to \infty$ for all $e(t_0)$.

If the number of unknown inputs, denoted by $n_d$, is less than or equal to the number of outputs, denoted by $n_y$, the conditions in Section 3.4 are necessary and sufficient conditions for the stability of an unknown input observer for a linear time-invariant system [14, 15]. However, the observability and the rank conditions do not necessarily guarantee the stability of the observer for a general nonlinear system when $n_d \le n_y$ [16]. For the stability of the nonlinear unknown input observer, the number of unknown inputs $n_d$ should be less than the number of outputs $n_y$ ($n_d < n_y$). When the nonlinear system instead includes an additional additive disturbance term $D_2 d_2(t)$, the estimation error remains uniformly ultimately bounded with an ultimate bound $\varrho$,
where $\varrho \in \mathbb{R}^+$ is proportional to the norm of the additive disturbance $d_2(t)$, iff the following inequality is satisfied
$$Q + I_3 = N^T P + PN + \left(\gamma_1^2 + \gamma_2^2\right)PMM^T P + 3I_3 < 0.$$

Proof. The Lyapunov candidate function defined above is used here. Using the previous upper-bound result and the error dynamics, the time derivative of $V$ can be expressed and upper bounded as
$$\dot{V} = e^T\!\left(N^T P + PN\right)e + 2e^T PM\!\left(\bar{f}(\hat{x},u) - \bar{f}(x,u)\right) - e^T P(MD_2 d_2) - d_2^T D_2^T M^T P e,$$
$$\dot{V} \le e^T Q e + 2\lVert e\rVert\,\lVert PMD_2 d_2\rVert \le e^T Q e + \lVert e\rVert^2 + \lVert PMD_2 d_2\rVert^2 = e^T\!\left(Q + I_3\right)e + d_2^T\!\left(D_2^T M^T P P M D_2\right)d_2.$$
Based on Assumptions 2.1 and 2.2 and the expression of the unknown disturbance, $d_2(t)$ can be upper bounded by a known positive constant. It is also given that the matrices $P$, $M$, $D_2$ are known and constant. Thus the last term on the right-hand side can be upper bounded by a known positive constant as
$$\lVert PMD_2 d_2\rVert^2 = d_2^T\!\left(D_2^T M^T P P M D_2\right)d_2 \le \epsilon_1$$
where $\epsilon_1 \in \mathbb{R}^+$. The bound on $\dot{V}$ can be rewritten using $\epsilon_1$ as
$$\dot{V} \le e^T\!\left(Q + I_3\right)e + \epsilon_1.$$

If $Q + I_3 < 0$, then the upper bound for $V(t)$ can be expressed as
$$V \le V(0)\exp(-\kappa_2 t) + \frac{\epsilon_1}{\kappa_2}\left(1 - \exp(-\kappa_2 t)\right)$$
where $\kappa_2 = \frac{-\lambda_{\max}(Q + I_3)}{\lambda_{\min}(P)} \in \mathbb{R}^+$, and the upper bound for the estimation error is given by
$$\lVert e\rVert \le \sqrt{\alpha_3\lVert e(0)\rVert^2\exp(-\kappa_2 t) + \alpha_4\left(1 - \exp(-\kappa_2 t)\right)}$$
where $\alpha_3 = \frac{\lambda_{\max}(P)}{\lambda_{\min}(P)} \in \mathbb{R}^+$ and $\alpha_4 = \frac{\epsilon_1}{\lambda_{\min}(P)\,\kappa_2} \in \mathbb{R}^+$. From this bound, it can be concluded that the estimation error is uniformly ultimately bounded and the ultimate bound is given by $\varrho = \sqrt{\alpha_4\left(1 - \exp(-\kappa_2 t)\right)}$, where $\varrho \in \mathbb{R}^+$ is proportional to the norm of the disturbance $d_2(t)$.

3.4 Condition on A matrix

If the matrix inequality condition above is satisfied, the pair $(MA, C)$ is observable [14]. Then the gain matrix $K$ can be chosen such that $N = MA - KC$ is Hurwitz. Since $\mathrm{rank}(CD) = \mathrm{rank}(D) = 1$, the following rank condition is equivalent to the observability of the pair $(MA, C)$ [14]:
$$\mathrm{rank}\begin{bmatrix} sI_3 - A & D \\ C & 0 \end{bmatrix} = 4, \quad \forall s \in \mathbb{C}.$$
Thus, the matrix $A$ should be chosen to satisfy this rank condition so that the pair $(MA, C)$ is observable.
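The rank condition can be checked numerically before a candidate A is adopted; the sketch below evaluates it at a handful of sampled values of s (including the eigenvalues of A, where the rank is most likely to drop). A finite sample is only a heuristic check of a condition that must hold for all s in the complex plane, and the function name is an assumption of this sketch.

```python
import numpy as np

def check_rank_condition(A, C, D):
    """Heuristic check of rank([sI - A, D; C, 0]) = 4 over sampled values of s."""
    n, m = A.shape[0], D.shape[1]                    # n = 3, m = 1 for the perspective system
    samples = [0.0, 1.0, -1.0, 1j, -1j] + list(np.linalg.eigvals(A))
    for s in samples:
        block = np.vstack([np.hstack([s * np.eye(n) - A, D]),
                           np.hstack([C, np.zeros((C.shape[0], m))])])
        if np.linalg.matrix_rank(block) != n + m:
            return False
    return True
```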

3.5 Condition on object trajectories

Considering the perspective dynamics of Chapter 2 and the nonlinear system above, the unknown input $d(t)$ can be expressed as
$$d(t) = \begin{bmatrix} v_{px}x_3 & v_{py}x_3 & -v_{pz}x_3^2 \end{bmatrix}^T.$$
Since the number of unknown components of $d(t)$ is larger than the number of outputs, the disturbance input cannot be directly expressed in the form $Dd(t)$. To resolve this problem, the following assumption is imposed on the motion of the moving object.

Assumption 3.1. The linear velocity of the moving object in the Z-direction of the camera is zero, $v_{pz}(t) = 0$ (in which case $Dd(t) = D_1 v_{px}x_3 + D_2 v_{py}x_3$), or the linear velocity of the tracked object in either the X or Y-direction and the Z-direction of the camera is zero, $v_{py}(t) = v_{pz}(t) = 0$ or $v_{px}(t) = v_{pz}(t) = 0$ (in which case $d(t) = v_{px}x_3$ or $d(t) = v_{py}x_3$).

Some practical scenarios satisfying Assumption 3.1 can be considered in many applications under constrained object motions:
1. an object moving along a straight line or moving along a circle, and
2. an object moving in a circular motion with time-varying radius on a plane.

Consider range detecting applications where people or ground vehicles move on the ground plane while they are observed by a downward looking camera fixed in an aerial vehicle. The Z-axis of the inertial reference frame is considered to be perpendicular to the ground plane and the X and Y-axes are in the ground plane. Since the Z-direction of the camera is perpendicular to the ground plane and the object moves on the ground plane, the linear velocity of the object in the Z-direction of the camera, $v_{pz}$, is zero. If the object moves on a straight line in the ground plane observed by the camera in translational motion, or the object moves on a circle in the ground plane observed by the camera in a circular motion, the choice of the unknown disturbance becomes $d(t) = v_{px}x_3$ or $d(t) = v_{py}x_3$. If the object moves on a circle of unknown time-varying radius observed by the camera in a circular motion, the unknown disturbance term is selected to be $Dd(t) = D_1 v_{px}x_3 + D_2 v_{py}x_3$ or $Dd(t) = D_1 v_{py}x_3 + D_2 v_{px}x_3$.

3.6 LMI Formulation

To find $E$, $K$ and $P$, the condition $Q < 0$ is reformulated in terms of a linear matrix inequality (LMI) as [8, 17]
$$\begin{bmatrix} X_{11} & X_{12} \\ X_{12}^T & -I_3 \end{bmatrix} < 0$$

where
$$X_{11} = A^T(I_3 + FC)^T P + P(I_3 + FC)A + A^T C^T G^T P_Y^T + P_Y GCA - C^T P_K^T - P_K C + 2I_3,$$
$$X_{12} = \bar{\gamma}\!\left(P + PFC + P_Y GC\right), \qquad P_Y = PY, \qquad P_K = PK, \qquad \bar{\gamma} = \sqrt{\gamma_1^2 + \gamma_2^2},$$
with $F = -D(CD)^{\dagger}$ and $G = I_2 - (CD)(CD)^{\dagger}$, so that $E = F + YG$. To solve the LMI feasibility problem, the CVX toolbox in Matlab is used [18]. Using $P$, $P_K$ and $P_Y$ obtained from the LMI, $K$ and $Y$ are computed as $K = P^{-1}P_K$ and $Y = P^{-1}P_Y$. If the additive disturbance term $d_2(t)$ is considered in the error dynamics, the LMI is defined with a slightly changed $X_{11}$ term as
$$\begin{bmatrix} X'_{11} & X_{12} \\ X_{12}^T & -I_3 \end{bmatrix} < 0, \qquad X'_{11} = X_{11} + I_3.$$
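The thesis solves this feasibility problem with the CVX toolbox in MATLAB [18]. The sketch below poses the same kind of LMI in CVXPY, a Python analog of CVX, under the assumption that E = F + YG with F and G as defined above; it is an illustrative reformulation, not the script used for the experiments, and it requires an SDP-capable solver such as SCS.

```python
import numpy as np
import cvxpy as cp

def solve_observer_lmi(A, C, D, gamma_bar, eps=1e-6):
    """Sketch of the LMI feasibility problem for P, P_K = P K and P_Y = P Y."""
    I3, I2 = np.eye(3), np.eye(2)
    CD_pinv = np.linalg.pinv(C @ D)
    F = -D @ CD_pinv                      # E = F + Y G  (assumed split)
    G = I2 - (C @ D) @ CD_pinv

    P = cp.Variable((3, 3), symmetric=True)
    PK = cp.Variable((3, 2))              # P_K = P K
    PY = cp.Variable((3, 2))              # P_Y = P Y

    X11 = (A.T @ (I3 + F @ C).T @ P + P @ (I3 + F @ C) @ A
           + A.T @ C.T @ G.T @ PY.T + PY @ G @ C @ A
           - C.T @ PK.T - PK @ C + 2 * I3)
    X12 = gamma_bar * (P + P @ F @ C + PY @ G @ C)
    block = cp.bmat([[X11, X12], [X12.T, -np.eye(3)]])

    # Symmetric slack variable so the semidefinite constraint is well posed for the solver.
    S = cp.Variable((6, 6), symmetric=True)
    constraints = [S == (block + block.T) / 2,
                   P >> eps * np.eye(3),
                   S << -eps * np.eye(6)]
    cp.Problem(cp.Minimize(0), constraints).solve()

    K = np.linalg.solve(P.value, PK.value)   # K = P^{-1} P_K
    Y = np.linalg.solve(P.value, PY.value)   # Y = P^{-1} P_Y
    return P.value, K, Y
```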

CHAPTER 4
VELOCITY KINEMATICS FOR A ROBOT MANIPULATOR

In this chapter, the velocity kinematic analysis is described. The forward kinematic analysis determines the position and orientation of the end-effector of a robot manipulator for given joint variables (Section 4.1). To control the motion of the robot manipulator, the joint velocities are related to the linear and angular velocity of the end-effector as described in Section 4.2.

4.1 Forward Kinematic Analysis

Given the joint variables, the position and orientation of the end-effector can be determined by a forward kinematic analysis [19]. The homogeneous matrix which transforms the coordinates of any point in frame B to frame A is called a transform matrix, and is denoted as
$${}^A_B T = \begin{bmatrix} {}^A_B R & {}^A P_{B0} \\ 0\;\;0\;\;0 & 1 \end{bmatrix},$$
where the matrix ${}^A_B R \in \mathbb{R}^{3\times 3}$ is the orientation of frame B relative to frame A and the vector ${}^A P_{B0} \in \mathbb{R}^3$ represents the coordinates of the origin of frame B measured in frame A. For a 6-link manipulator (i.e., the PUMA 560 used in the experiments), the transform matrix between the inertial reference frame F and the 6th joint is given by
$${}^F_6 T = {}^F_1 T\, {}^1_2 T\, {}^2_3 T\, {}^3_4 T\, {}^4_5 T\, {}^5_6 T,$$
where the general transform matrix between the $i$th and $j$th joints is given by
$${}^i_j T = \begin{bmatrix} c_j & -s_j & 0 & a_{ij} \\ s_j c_{ij} & c_j c_{ij} & -s_{ij} & -s_{ij}S_j \\ s_j s_{ij} & c_j s_{ij} & c_{ij} & c_{ij}S_j \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
where $c_j = \cos(\theta_j)$, $c_{ij} = \cos(\alpha_{ij})$, $s_j = \sin(\theta_j)$ and $s_{ij} = \sin(\alpha_{ij})$. The term $a_{ij}$ is the link length of link $ij$ and the term $S_j$ is the joint offset for joint $j$ [19]. The inertial reference frame is defined as having its origin at the intersection of the first joint axis and the line along link $a_{12}$.

Table 4-1. Mechanism parameters for the PUMA 560.

  Link length (cm)   Twist angle (deg)   Joint offset (cm)   Joint angle (deg)
  a12 = 0            α12 = -90                               θ1 = variable
  a23 = 43.18        α23 = 0             S2 = 15.05          θ2 = variable
  a34 = -1.91        α34 = 90            S3 = 0              θ3 = variable
  a45 = 0            α45 = -90           S4 = 43.31          θ4 = variable
  a56 = 0            α56 = 90            S5 = 0              θ5 = variable
                                                             θ6 = variable

The Z-axis of the inertial reference frame is parallel to the first joint axis direction. The transformation between the inertial reference frame and the first joint is given by
$${}^F_1 T = \begin{bmatrix} \cos(\phi_1) & -\sin(\phi_1) & 0 & 0 \\ \sin(\phi_1) & \cos(\phi_1) & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
where $\phi_1$ is the angle between the X-axis of the fixed coordinate system and the vector along link $a_{12}$. With the composed transform ${}^F_6 T$, the position of the tool point measured in the inertial reference frame is given by
$$\begin{bmatrix} {}^F P_{tool} \\ 1 \end{bmatrix} = {}^F_6 T \begin{bmatrix} {}^6 P_{tool} \\ 1 \end{bmatrix},$$
where the vector ${}^F P_{tool} \in \mathbb{R}^3$ denotes the position of the tool point measured in the inertial reference frame and the vector ${}^6 P_{tool} \in \mathbb{R}^3$ denotes the position of the tool point measured from the origin of the 6th joint. (In Chapter 4, it is assumed that all quantities are expressed using the basis $\{E_x, E_y, E_z\}$ fixed in the inertial reference frame F.) Table 4-1 lists the mechanism parameters for the PUMA 560. The kinematic model of the PUMA 560 is illustrated in Fig. 4-1. The vector $S_6$ in Fig. 4-1 is assumed to be aligned with the optical axis (see Assumption 2.5) and the vector $a_{67}$ is assumed to be aligned with the X direction of the camera.
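As an illustration of how Table 4-1 and the general transform are combined, the numpy sketch below composes the joint transforms for a set of joint angles. The pairing of each joint angle with its offsets, the choice S6 = 0, and the treatment of the first joint angle as the angle φ1 in the base transform are assumptions of this sketch.

```python
import numpy as np

def joint_transform(theta, alpha, a, S):
    """General transform between consecutive joints (convention of the equation above)."""
    c, s = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[c,      -s,      0.0,  a],
                     [s * ca,  c * ca, -sa,  -sa * S],
                     [s * sa,  c * sa,  ca,   ca * S],
                     [0.0,     0.0,     0.0,  1.0]])

def puma_forward_kinematics(q):
    """F_6 T for joint angles q = [th1, ..., th6] in radians (S6 = 0 assumed here)."""
    d = np.deg2rad
    # (alpha_ij, a_ij [cm], S_j [cm]) for 1_2T ... 5_6T, read from Table 4-1
    rows = [(d(-90), 0.0, 15.05), (d(0), 43.18, 0.0), (d(90), -1.91, 43.31),
            (d(-90), 0.0, 0.0), (d(90), 0.0, 0.0)]
    c1, s1 = np.cos(q[0]), np.sin(q[0])
    T = np.array([[c1, -s1, 0, 0], [s1, c1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]], float)  # F_1 T
    for (alpha, a, S), theta in zip(rows, q[1:]):
        T = T @ joint_transform(theta, alpha, a, S)
    return T

# Tool point in the inertial frame: [P_tool_F; 1] = puma_forward_kinematics(q) @ [P_tool_6; 1]
```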

Figure 4-1. Kinematic model of PUMA 560.

4.2 Velocity Kinematic Analysis

The velocity kinematic analysis is used to generate desired trajectories of the PUMA robot and to calculate the camera velocities as viewed by an observer in the inertial reference frame F. The velocity relationship between the joint velocities and the end-effector is defined [20] as
$$\begin{bmatrix} {}^F V_6 \\ {}^F \omega_6 \end{bmatrix} = J\dot{q}$$
where ${}^F V_6 \in \mathbb{R}^3$ denotes the linear velocity of the 6th joint measured in the inertial reference frame, ${}^F \omega_6 \in \mathbb{R}^3$ denotes the angular velocity of the 6th joint relative to the inertial reference frame, and the joint velocity vector $\dot{q} \in \mathbb{R}^6$ is defined as
$$\dot{q} = \begin{bmatrix} {}^F\omega_1 & {}^1\omega_2 & {}^2\omega_3 & {}^3\omega_4 & {}^4\omega_5 & {}^5\omega_6 \end{bmatrix}^T$$

where ${}^i\omega_j \in \mathbb{R}$ denotes the joint velocity of the $j$th joint relative to the $i$th joint. The matrix $J \in \mathbb{R}^{6\times 6}$ denotes the Jacobian, and is defined as
$$J^T \triangleq \begin{bmatrix} \left({}^F S_1 \times ({}^F P_6 - {}^F P_1)\right)^T & {}^F S_1^T \\ \left({}^F S_2 \times ({}^F P_6 - {}^F P_2)\right)^T & {}^F S_2^T \\ \left({}^F S_3 \times ({}^F P_6 - {}^F P_3)\right)^T & {}^F S_3^T \\ \left({}^F S_4 \times ({}^F P_6 - {}^F P_4)\right)^T & {}^F S_4^T \\ \left({}^F S_5 \times ({}^F P_6 - {}^F P_5)\right)^T & {}^F S_5^T \\ \left({}^F S_6 \times ({}^F P_6 - {}^F P_6)\right)^T & {}^F S_6^T \end{bmatrix}$$
where ${}^F S_i$ is the $i$th joint axis measured in F and ${}^F P_i$ is the origin of the $i$th joint relative to F. The first three elements of the third column of ${}^F_i T$ are identical to the vector ${}^F S_i$ and the first three elements of the fourth column of ${}^F_i T$ are identical to ${}^F P_i$. From the Jacobian relationship and the given joint velocities $\dot{q}$, the velocities of the tool point can be obtained as
$$\begin{bmatrix} {}^F V_{tool} \\ {}^F \omega_{tool} \end{bmatrix} = \begin{bmatrix} {}^F V_6 + {}^F\omega_6 \times {}^F P_{tool} \\ {}^F\omega_6 \end{bmatrix}.$$
Based on the inverse velocity analysis, the joint velocities $\dot{q}$ can be calculated from the desired tool point velocities using
$$\dot{q} = J^{-1}\begin{bmatrix} {}^F V_6 \\ {}^F\omega_6 \end{bmatrix} = J^{-1}\begin{bmatrix} {}^F V_{tool} - {}^F\omega_6 \times {}^F P_{tool} \\ {}^F\omega_{tool} \end{bmatrix}.$$
If the Jacobian matrix $J$ is non-singular, its inverse matrix $J^{-1}$ exists [20], and $\dot{q}$ can be obtained to generate the desired velocities of the tool point.
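A short numpy sketch of this inverse velocity computation is given below; it assumes the 6x6 Jacobian J defined above has already been built and is non-singular, and the function name and argument layout are illustrative assumptions.

```python
import numpy as np

def joint_rates_from_tool_velocity(J, V_tool, w_tool, P_tool):
    """qdot = J^{-1} [V6; w6] with V6 = V_tool - w6 x P_tool and w6 = w_tool."""
    V6 = V_tool - np.cross(w_tool, P_tool)     # undo the velocity transfer to the tool point
    # J must be non-singular; np.linalg.solve avoids forming an explicit inverse
    return np.linalg.solve(J, np.concatenate([V6, w_tool]))
```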

CHAPTER 5
EXPERIMENTS AND RESULTS

To verify the designed unknown input observer for real-time implementation, two sets of experiments are conducted on a PUMA 560 serial manipulator and a two-link planar robot. The first set is performed for the relative position estimation of a static object using a moving camera. The second set is performed for position estimation of a moving object. A schematic overview of the experimental configuration is illustrated in Fig. 5-1.

5.1 Testbed Setup

The testbed consists of five components: (1) robot manipulators, (2) camera, (3) image processing workstation (main), (4) robot control workstations (PUMA and two-link), and (5) serial communication. Figure 5-2 shows the experimental platforms. A camera is rigidly fixed to the end-effector of the PUMA 560. The PUMA and the two-link robot are rigidly attached to a work table. Experiments are conducted to estimate the position of the static as well as the moving object. A fiduciary marker is used as an object in all the experiments. For experiments involving a static object, the object is fixed to the work table. For experiments involving a moving object, the object is fixed to the end-effector of the two-link robot, which follows a desired trajectory. The PUMA 560 is used to move the camera while observing the static or moving object. An mvBlueFox-120a color USB camera is used to capture images. The camera is calibrated using the MATLAB camera calibration toolbox [21], and the resulting camera matrix is given by
$$K_c = \begin{bmatrix} 560.980050 & 0.000000 & 303.911960 \\ 0.000000 & 749.538523 & 345.999060 \\ 0.000000 & 0.000000 & 1.000000 \end{bmatrix}.$$
A Core2-Duo 2.53 GHz laptop (main workstation) operating under Windows 7 is used to carry out the image processing and to store data transmitted from the PUMA workstation. The image processing algorithms are written in C/C++ and developed in Microsoft Visual Studio 2008. The OpenCV and MATRIX-VISION API libraries are used to capture the images and to implement a KLT feature point tracker (Section 2.4).

Figure 5-1. An overview of the experimental configuration.

Figure 5-2. Platforms.

Figure 5-3. A tracked static point (dot in solid circle). (a) FRAME = 45. (b) FRAME = 90. (c) FRAME = 135.

Figure 5-4. A tracked moving point (dot in dashed circle). (a) FRAME = 45. (b) FRAME = 90. (c) FRAME = 135.

Tracked static and moving points using the KLT tracking algorithm are illustrated in Figs. 5-3 and 5-4. The sub-workstations (PUMA and two-link) are composed of two Pentium 2.8 GHz PCs operating under QNX. These two computers are used to host control algorithms for the PUMA 560 and the two-link robot via QMotor 3.0 [22]. A PID controller is employed to control the six joints of the PUMA 560. A RISE-based controller [23] is applied to control the two-link robot. Control implementation and data acquisition for the two robots are operated at 1.0 kHz frequency using the ServoToGo I/O board. The forward velocity kinematics [19, 20] are used to obtain the position and velocity of the camera and tracked point. The camera velocities computed on the PUMA workstation are transmitted to the main workstation via serial communication at 30 Hz. The pose (position and orientation) of the tracked point and the camera are computed and stored in the sub-workstations at 1.0 kHz. The positions of the camera and the point are used to compute the ground truth distance between the camera and the object as
$$\{r_{obj/cam}\}_e = \left[\{R\}_E^{\,e}\right]^{-1}\{r_{obj} - r_{cam}\}_E$$
where $\{R\}_E^{\,e} \in \mathbb{R}^{3\times 3}$ is the rotation matrix of the camera with respect to the inertial reference frame. The least-squares method (Section 2.3) is implemented to find the optimized camera matrix. Corresponding sets of $(m_{1i}(t), m_{2i}(t))$ and $(u_i(t), v_i(t))$ obtained from a static point are used. The optimized camera parameters are obtained using data in Set 1 of Experiment I. The resulting matrix is given by
$$\hat{K}_c = \begin{bmatrix} 551.9794 & 0.000000 & 304.0282 \\ 0.000000 & 737.5125 & 331.5052 \\ 0.000000 & 0.000000 & 1.000000 \end{bmatrix}.$$
The position estimation results using the original camera calibration matrix and the optimized camera calibration matrix are compared in Tables 5-1 and 5-2. Table 5-1 shows the comparison of the RMS (root-mean-square) error of the steady-state position estimation, using Set 2 of Experiment I, with and without the use of the optimized camera matrix. Table 5-2 presents another comparison of the RMS error of the steady-state position estimation using Set 1 of Experiment II. The matrix $\hat{K}_c$ is used for all of the experiments.
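As a small illustration of how the ground-truth comparison and the steady-state RMS errors reported below can be formed, the following numpy sketch expresses the object position in the camera basis and computes a per-axis RMS error; the array shapes and names are assumptions made for illustration.

```python
import numpy as np

def ground_truth_relative_position(R_cam, r_obj, r_cam):
    """Object position in the camera basis: {r_obj/cam}_e = R^{-1} (r_obj - r_cam).

    R_cam : 3x3 rotation of the camera frame with respect to the inertial frame
    r_obj, r_cam : object and camera positions in the inertial frame (from forward kinematics)
    """
    return R_cam.T @ (r_obj - r_cam)        # R^{-1} = R^T for a rotation matrix

def rms_error(estimates, ground_truth):
    """Per-axis RMS error between (N, 3) arrays of estimated and ground-truth positions."""
    e = np.asarray(estimates) - np.asarray(ground_truth)
    return np.sqrt(np.mean(e**2, axis=0))   # returns [RMS_x, RMS_y, RMS_z]
```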

Table 5-1. Comparison of the RMS position estimation errors in set 2 of Experiment I.
           w/o optimization of Kc   w/ optimization of Kc
  x (m)    0.0121                   0.0119
  y (m)    0.0349                   0.0179
  z (m)    0.0958                   0.0800

Table 5-2. Comparison of the RMS position estimation errors in set 1 of Experiment II.
           w/o optimization of Kc   w/ optimization of Kc
  x (m)    0.0172                   0.0172
  y (m)    0.0248                   0.0170
  z (m)    0.0548                   0.0519

5.2 Experiment I: Moving camera with a static object

In this section, the structure estimation algorithm is implemented for a static object observed using a moving camera. Given the angular and linear velocity of the moving camera, the position of the static object relative to the moving camera is estimated. A tracked point on the static object is observed by a downward-looking camera as shown in Fig. 5-3. Since $v_{px}$, $v_{py}$ and $v_{pz}$ are zero for a static object, the unmeasurable disturbance input $d(t)$ is zero.

5.2.1 Set 1

In this experiment set, the observer is tested with constant camera velocities. The angular and linear camera velocities are given in Figures 5-5 and 5-6. The matrices $A$, $C$ and $D$ are given by
$$A = \begin{bmatrix} 0.00 & -0.05 & -0.30 \\ 0.05 & 0.00 & -1.50 \\ 0.00 & 0.00 & 0.00 \end{bmatrix}, \qquad C = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix}, \qquad D = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}.$$

The matrix $Y$ and gain matrix $K$ are computed using the CVX toolbox in MATLAB [18] as
$$K = \begin{bmatrix} 1.3120 & 0.0000 \\ 0.0000 & 1.3120 \\ 0.0590 & 0.0000 \end{bmatrix}, \qquad Y = \begin{bmatrix} 0.0000 & 0.0000 \\ 0.0000 & -1.0000 \\ 0.0000 & 1.1793 \end{bmatrix}.$$
The estimation result is illustrated in Figs. 5-7 and 5-8. The steady-state RMS errors in the position estimation are given in Tab. 5-3.

Figure 5-5. Camera angular velocity.

Figure 5-6. Camera linear velocity.

Figure 5-7. Comparison of the actual (dash) and estimated (solid) position of a static object with respect to a moving camera.

Figure 5-8. Position estimation error for a static point.

5.2.2 Set 2

Again, the observer is tested with constant camera velocities but with different magnitudes. The camera velocities are given in Figures 5-9 and 5-10.

The matrices $A$, $C$ and $D$ are given by
$$A = \begin{bmatrix} 0.00 & -0.07 & 1.15 \\ 0.07 & 0.00 & 1.30 \\ 0.00 & 0.00 & 0.00 \end{bmatrix}, \qquad C = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix}, \qquad D = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}.$$
The computed matrix $Y$ and gain matrix $K$ are given as
$$K = \begin{bmatrix} 1.3082 & 0.0000 \\ 0.0000 & 1.3082 \\ 0.0896 & 0.0000 \end{bmatrix}, \qquad Y = \begin{bmatrix} 0.0000 & 0.0000 \\ 0.0000 & -1.0000 \\ 0.0000 & -1.2796 \end{bmatrix}.$$
The estimation result is illustrated in Figs. 5-11 and 5-12. The steady-state RMS errors in the position estimation are given in Tab. 5-3.

Figure 5-9. Camera angular velocity.

Figure 5-10. Camera linear velocity.

Figure 5-11. Comparison of the actual (dash) and estimated (solid) position of a static object with respect to a moving camera.

Figure 5-12. Position estimation error for a static object.

5.2.3 Set 3

This experiment set is designed to test the observer with a time-varying linear velocity of the camera. Figures 5-13 and 5-14 show the linear and angular camera velocities. The matrices $A$, $C$ and $D$ are selected to be
$$A = \begin{bmatrix} 0.00 & -0.10 & 0.00 \\ 0.10 & 0.00 & -1.50 \\ 0.00 & 0.00 & 0.00 \end{bmatrix}, \qquad C = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix}, \qquad D = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}.$$
The matrix $Y$ and gain matrix $K$ are computed using the CVX toolbox in MATLAB [18] as
$$K = \begin{bmatrix} 1.3292 & 0.0000 \\ 0.0000 & 1.3292 \\ 0.2016 & 0.0000 \end{bmatrix}, \qquad Y = \begin{bmatrix} 0.0000 & 0.0000 \\ 0.0000 & -1.0000 \\ 0.0000 & 2.0161 \end{bmatrix}.$$
The estimation result is illustrated in Figs. 5-15 and 5-16. The steady-state RMS errors in the position estimation are given in Tab. 5-3.

Figure 5-13. Camera angular velocity.

Figure 5-14. Camera linear velocity.

Figure 5-15. Comparison of the actual (dash) and estimated (solid) position of a static object with respect to a moving camera.

Figure 5-16. Position estimation error for a static object.

5.2.4 Set 4

This experiment set is designed to test the observer with two time-varying linear velocities of the camera. Figures 5-17 and 5-18 show the linear and angular camera velocities.

The matrices $A$, $C$ and $D$ are selected to be
$$A = \begin{bmatrix} 0.00 & -0.05 & -1.00 \\ 0.05 & 0.00 & 0.00 \\ 0.00 & 0.00 & 0.00 \end{bmatrix}, \qquad C = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix}, \qquad D = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}.$$
The matrix $Y$ and gain matrix $K$ are computed using the CVX toolbox in MATLAB [18] as
$$K = \begin{bmatrix} 1.3149 & 0.0000 \\ 0.0000 & 1.3149 \\ 0.0590 & -0.1256 \end{bmatrix}, \qquad Y = \begin{bmatrix} -1.0000 & 0.0000 \\ 0.0000 & 0.0000 \\ 2.5125 & 0.0000 \end{bmatrix}.$$
The estimation result is illustrated in Figs. 5-19 and 5-20. The steady-state RMS errors in the position estimation are given in Tab. 5-3.

Figure 5-17. Camera angular velocity.

Figure 5-18. Camera linear velocity.

Figure 5-19. Comparison of the actual (dash) and estimated (solid) position of a static object with respect to a moving camera.

Table 5-3. RMS position estimation errors of the static point.
           Set 1     Set 2     Set 3     Set 4     Avg.
  x (m)    0.0016    0.0040    0.0099    0.0027    0.0046
  y (m)    0.0047    0.0065    0.0362    0.0107    0.0145
  z (m)    0.0214    0.0284    0.0386    0.0399    0.0221

Figure 5-20. Position estimation error for a static object.

5.3 Experiment II: Moving camera with a moving object

In this section, the observer is used to estimate the position of a moving object using a moving camera. Given the angular and linear velocity of the moving camera, the position of the moving object relative to the moving camera is estimated. A downward-looking camera observes a moving point fixed to the moving two-link robot arm as illustrated in Fig. 5-4. In this case, the object is moving in the X-Y plane with unknown velocities $v_{px}(t)$ and $v_{py}(t)$. In experiment Set 3, the linear velocity of the camera has two time-varying components to test the observer with a more generalized trajectory of the moving camera.

5.3.1 Set 1

In this experiment set, the observer is tested with constant camera velocities. The camera velocities are given in Figures 5-21 and 5-22. The matrices $A$, $C$ and $D$ are given by
$$A = \begin{bmatrix} 0.00 & -0.05 & 1.28 \\ 0.05 & 0.00 & -0.38 \\ 0.00 & 0.00 & 0.00 \end{bmatrix}, \qquad C = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix}, \qquad D = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}.$$
The computed matrix $Y$ and gain matrix $K$ are given as
$$K = \begin{bmatrix} 1.2298 & 0.0000 \\ 0.0000 & 1.2298 \\ 0.3476 & 0.0000 \end{bmatrix}, \qquad Y = \begin{bmatrix} 0.0000 & 0.0000 \\ 0.0000 & -1.0000 \\ 0.0000 & 6.9530 \end{bmatrix}.$$
The estimation result is illustrated in Figs. 5-23 and 5-24. The steady-state RMS errors in the position estimation are given in Tab. 5-4.

Figure 5-21. Camera angular velocity.

Figure 5-22. Camera linear velocity.

Figure 5-23. Comparison of the actual (dash) and estimated (solid) position of a moving object with respect to a moving camera.

Figure 5-24. Position estimation error for a moving object.

5.3.2 Set 2

In this experiment set, the observer is tested with a time-varying linear velocity of the camera along the X direction. The camera velocities are shown in Figs. 5-25 and 5-26. The matrices $A$, $C$ and $D$ are given by
$$A = \begin{bmatrix} 0.00 & -0.05 & 0.00 \\ 0.05 & 0.00 & -0.30 \\ 0.00 & 0.00 & 0.00 \end{bmatrix}, \qquad C = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix}, \qquad D = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}.$$
The matrix $Y$ and gain matrix $K$ are computed using the CVX toolbox in MATLAB and are given as
$$K = \begin{bmatrix} 1.2443 & 0.0000 \\ 0.0000 & 1.2443 \\ 0.2763 & 0.0000 \end{bmatrix}, \qquad Y = \begin{bmatrix} 0.0000 & 0.0000 \\ 0.0000 & -1.0000 \\ 0.0000 & 5.5261 \end{bmatrix}.$$
The estimation result is illustrated in Figs. 5-27 and 5-28. The steady-state RMS errors in the position estimation are given in Tab. 5-4.

Figure 5-25. Camera angular velocity.

Figure 5-26. Camera linear velocity.

Figure 5-27. Comparison of the actual (dash) and estimated (solid) position of a moving point with respect to a moving camera.

Figure 5-28. Position estimation error for a moving point.

5.3.3 Set 3

In this experiment set, the linear camera velocities along the X and Y directions are time-varying, and the camera angular velocity is constant.

The camera velocities are depicted in Figs. 5-29 and 5-30. The matrices $A$, $C$ and $D$ are given by
$$A = \begin{bmatrix} 0.00 & -0.05 & -1.00 \\ 0.05 & 0.00 & -0.30 \\ 0.00 & 0.00 & 0.00 \end{bmatrix}, \qquad C = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix}, \qquad D = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}.$$
The matrix $Y$ and gain matrix $K$ are computed as
$$K = \begin{bmatrix} 1.2892 & 0.0000 \\ 0.0000 & 1.2892 \\ 0.2313 & 0.0000 \end{bmatrix}, \qquad Y = \begin{bmatrix} 0.0000 & 0.0000 \\ 0.0000 & -1.0000 \\ 0.0000 & 4.6261 \end{bmatrix}.$$
The estimation result is depicted in Figs. 5-31 and 5-32. The steady-state RMS errors in the position estimation are given in Tab. 5-4.

Figure 5-29. Camera angular velocity.

Figure 5-30. Camera linear velocity.

Figure 5-31. Comparison of the actual (dash) and estimated (solid) position of a moving point with respect to a moving camera.

Table 5-4. RMS position estimation errors of the moving point.
           Set 1     Set 2     Set 3     Avg.
  x (m)    0.0172    0.0291    0.0059    0.0174
  y (m)    0.0170    0.0030    0.0208    0.0136
  z (m)    0.0519    0.0663    0.0681    0.0621

Figure 5-32. Position estimation error for a moving point.

CHAPTER 6
CONCLUSION AND FUTURE WORK

The online SFM method using the unknown input observer in [8] is implemented to estimate the position of a static and a moving object. The observer is tested with different camera and object motions (i.e., constant or time-varying camera linear velocity) to estimate the position of the static or moving object. The conditions on the motion of the moving object are presented with practical scenarios in Chapter 3. The Lyapunov-based stability analysis of the observer is described in Chapter 3, and is verified in the two sets of experiments. Experimental results in Chapter 5 show that the observer yields an exponentially stable or uniformly ultimately bounded result in the position estimation according to the object motions. For the static object case (Experiment I), the number of unknown inputs is less than the number of measured outputs. Thus, the observer exponentially converges to the true state. Results in Experiment I show that the observer yields averaged RMS errors within 0.025 m precision. For the moving object case (Experiment II), when the number of disturbance inputs is equal to the number of outputs, the observer yields a uniformly ultimately bounded result (cf., Set 3 of Experiment II). Yet, the observer yields an average RMS error within 0.065 m precision with the moving object. As seen from the RMS errors in each experiment (Tabs. 5-3 and 5-4), the observer yields good performance for detecting the coordinates of the static as well as moving objects in the presence of sensor noise in feature tracking and camera velocities. The optimized camera matrix obtained from the least-squares method is used in each experiment. The improved position estimation results with the optimized camera matrix are given in Tabs. 5-1 and 5-2. In the application of the observer for structure estimation, some constraints are imposed on the object and camera motions. Future work should focus on eliminating the constraints on the camera and object motions.

Considering the structure of the nonlinear system equation, the matrix $A$ is designed to include some components of the linear and angular velocity of the camera as
$$A = \begin{bmatrix} 0 & \omega_z & -v_{px} \\ -\omega_z & 0 & -v_{py} \\ 0 & 0 & 0 \end{bmatrix}.$$
This choice of the $A$ matrix and the condition on the $A$ matrix described in Chapter 3 restrict the camera motions. Designing an unknown input observer for a time-varying $A$ matrix, or defining a more general choice of the $A$ matrix, could lift some constraints on the camera motion. To eliminate constraints on the object motion, more information (i.e., image velocities) from the image sequence should be incorporated in the state space dynamics equation in Chapter 2.

REFERENCES

[1] S. Avidan and A. Shashua, "Trajectory triangulation: 3D reconstruction of moving points from a monocular image sequence," IEEE Trans. Pattern Anal. Mach. Intell., vol. 22, no. 4, pp. 348, Apr. 2000.
[2] J. Kaminski and M. Teicher, "A general framework for trajectory triangulation," J. Math. Imag. Vis., vol. 21, no. 1, pp. 27, 2004.
[3] M. Han and T. Kanade, "Reconstruction of a scene with multiple linearly moving objects," Int. J. Comput. Vision, vol. 59, no. 3, pp. 285, 2004.
[4] R. Vidal, Y. Ma, S. Soatto, and S. Sastry, "Two-view multibody structure from motion," Int. J. Comput. Vision, vol. 68, no. 1, pp. 7, 2006.
[5] C. Yuan and G. Medioni, "3D reconstruction of background and objects moving on ground plane viewed from a moving camera," in Comput. Vision Pattern Recognit., vol. 2, 2006, pp. 2261-2268.
[6] H. Park, T. Shiratori, I. Matthews, and Y. Sheikh, "3D reconstruction of a moving point from a series of 2D projections," in Euro. Conf. on Comp. Vision, vol. 6313, 2010, pp. 158.
[7] A. Dani, Z. Kan, N. Fischer, and W. E. Dixon, "Structure and motion estimation of a moving object using a moving camera," in Proc. Am. Control Conf., Baltimore, MD, 2010, pp. 6962.
[8] A. P. Dani, Z. Kan, N. R. Fischer, and W. E. Dixon, "Structure estimation of a moving object using a moving camera: An unknown input observer approach," IEEE Conference on Decision and Control and European Control Conference (CDC-ECC), pp. 5005, 2011.
[9] R. Szeliski, Computer Vision: Algorithms and Applications. Springer, 2010.
[10] C. Tomasi and T. Kanade, "Detection and tracking of point features," International Journal of Computer Vision, Tech. Rep., 1991.
[11] J. Shi and C. Tomasi, "Good features to track," IEEE Conference on Computer Vision and Pattern Recognition, pp. 593, 1994.
[12] E. Yaz and A. Azemi, "Observer design for discrete and continuous nonlinear stochastic systems," Int. J. Syst. Sci., vol. 24, no. 12, pp. 2289, 1993.
[13] L. Xie and P. P. Khargonekar, "Lyapunov-based adaptive state estimation for a class of nonlinear stochastic systems," in Proc. of American Controls Conf., Baltimore, MD, 2010, pp. 6071.
[14] M. Darouach, M. Zasadzinski, and S. Xu, "Full-order observers for linear systems with unknown inputs," IEEE Trans. on Automatic Control, vol. 39, no. 3, pp. 606, 1994.

[15] M. Hautus, "Strong detectability and observers," Linear Algebra and its Applications, vol. 50, pp. 353, 1983.
[16] R. Rajamani, "Observers for Lipschitz nonlinear systems," IEEE Transactions on Automatic Control, vol. 43, no. 3, pp. 397, 1998.
[17] W. Chen and M. Saif, "Unknown input observer design for a class of nonlinear systems: an LMI approach," in Proc. of American Control Conf., 2006.
[18] M. Grant and S. Boyd, "CVX: Matlab software for disciplined convex programming," On the WWW, 2005, URL http://cvxr.com/cvx/.
[19] C. D. Crane and J. Duffy, Kinematic Analysis of Robot Manipulators. Cambridge, 1998.
[20] M. W. Spong and M. Vidyasagar, Robot Dynamics and Control. Wiley, 1989.
[21] J. Bouguet, "Camera calibration toolbox for Matlab," On the WWW, 2010, URL http://www.vision.caltech.edu/bouguetj/.
[22] M. Loffler, N. Costescu, and D. Dawson, "QMotor 3.0 and the QMotor robotic toolkit - an advanced PC-based real-time control platform," IEEE Control Systems Magazine, vol. 22, no. 3, pp. 12, 2002.
[23] P. M. Patre, W. Mackunis, C. Makkar, and W. E. Dixon, "Asymptotic tracking for systems with structured and unstructured uncertainties," IEEE Trans. Control Syst. Technol., vol. 16, pp. 373, 2008.

BIOGRAPHICAL SKETCH

Sujin Jang was born in Incheon, Republic of Korea. He received his Bachelor of Science in Mechanical and Automotive Engineering at Kookmin University, Republic of Korea. After his graduation in 2010, he joined the Center for Intelligent Machines and Robotics at the University of Florida, under the advisory of Carl D. Crane III. He received his Master of Science degree in Mechanical Engineering from the University of Florida in the summer of 2012.