Citation
Target Tracking in Unknown Environments Using a Monocular Camera Subject to Intermittent Sensing

Material Information

Title:
Target Tracking in Unknown Environments Using a Monocular Camera Subject to Intermittent Sensing
Creator:
Bell, Zachary I
Place of Publication:
[Gainesville, Fla.]
Florida
Publisher:
University of Florida
Publication Date:
Language:
english
Physical Description:
1 online resource (135 p.)

Thesis/Dissertation Information

Degree:
Doctorate ( Ph.D.)
Degree Grantor:
University of Florida
Degree Disciplines:
Mechanical Engineering
Mechanical and Aerospace Engineering
Committee Chair:
Dixon,Warren E
Committee Co-Chair:
Barooah,Prabir
Committee Members:
Crane III,Carl D
Shea,John Mark
Graduation Date:
12/13/2019

Subjects

Subjects / Keywords:
nonlinear-estimation -- structure-and-motion -- structure-and-motion-from-motion -- visual-odometry
Mechanical and Aerospace Engineering -- Dissertations, Academic -- UF
Genre:
bibliography ( marcgt )
theses ( marcgt )
government publication (state, provincial, terriorial, dependent) ( marcgt )
born-digital ( sobekcm )
Electronic Thesis or Dissertation
Mechanical Engineering thesis, Ph.D.

Notes

Abstract:
In most applications involving autonomous agents tracking a moving target through uncertain environments, it is necessary to estimate the structure of local features in the environment (e.g., relative positions of objects in the immediate surrounding environment), the pose of an agent (i.e., position and orientation), and the pose and velocity of the target. Many of these applications require traveling over large distances implying the local environment for an agent is always changing, introducing further difficulty. Furthermore, it is often only possible to intermittently sense the target (e.g., environmental obstructions or path constraints of the agent may cause occlusions of the target). It is often assumed that it is possible to use global sensing to measure the state of an agent. However, state feedback generally requires a sensor that can relate all the states to a common coordinate system (e.g., global positioning system (GPS)). However, GPS may be unavailable (e.g., agents could operate in environments where GPS is restricted or denied). Furthermore, assuming that the entire environment is known and state information from the target is available is a restrictive assumption since targets are not likely to communicate such information and directly sensing the pose and velocity of a target is challenging and not possible in many scenarios. These challenges motivate the development of techniques that rely on local sensing but still allow agents to estimate their own state (i.e., pose) as well as the state of a target (i.e., pose and velocity). Additionally, efforts are motivated by the fact that local sensing often has intermittent availability. Cameras are a potential sensor that can provide local feedback of the environment where coordinates of the target can be related to a common reference frame; however, camera systems don't inherently measure scale, have a limited field-of-view, and are susceptible to intermittent sensing (e.g., due to occlusions). The scale of the Euclidean coordinates of features in an image, (also known as the structure of the features) is not available because images are a two dimensional projection of a Euclidean environment. Monocular camera systems can recover scale by moving the camera (e.g., structure from motion) and tracking features over large periods of time; however, as the distance to a target increases, the distance between views of the feature must also increase making an accurate scale estimate challenging. Additionally, the camera's limited field-of-view can inhibit continuous observation of a specific object. Continuous observation can also be disrupted by occlusions or trajectory constraints that may require an agent to purposefully allow the target to leave its field-of-view periodically. Furthermore, agents may need to track the target over large distances requiring the agent to continuously reconstruct new objects from the global environment when reconstructed objects in the local environment permanently leave their field-of-view. In Chapter 3, a global exponentially stable observer for feature scale is developed under a finite excitation condition through the use of integral concurrent learning. Since the observer only requires finite excitation to be globally exponentially stable, the observer is more general than previous results. The result indicates that the Euclidean distance to a set of features on a stationary object and the path the camera travels while viewing that object are estimated exponentially fast implying the structure and path are reconstructed exponentially. Furthermore, the developed estimation method does not require the features on the objects to be planar and does not require the positive depth constraint. An experimental study is presented which compares the developed Euclidean distance observer to previous observers demonstrating the effectiveness of this result. In Chapter 4, an extension to the learning approaches in Chapter 3 is developed that ( en )
General Note:
In the series University of Florida Digital Collections.
General Note:
Includes vita.
Bibliography:
Includes bibliographical references.
Source of Description:
Description based on online resource; title from PDF title page.
Source of Description:
This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Thesis:
Thesis (Ph.D.)--University of Florida, 2019.
Local:
Adviser: Dixon,Warren E.
Local:
Co-adviser: Barooah,Prabir.
Statement of Responsibility:
by Zachary I Bell.

Record Information

Source Institution:
UFRGP
Rights Management:
Applicable rights reserved.
Classification:
LD1780 2019 ( lcc )

Downloads

This item has the following downloads:


Full Text

PAGE 1

TARGETTRACKINGINUNKNOWNENVIRONMENTSUSINGAMONOCULAR CAMERASUBJECTTOINTERMITTENTSENSING By ZACHARYIANBELL ADISSERTATIONPRESENTEDTOTHEGRADUATESCHOOL OFTHEUNIVERSITYOFFLORIDAINPARTIALFULFILLMENT OFTHEREQUIREMENTSFORTHEDEGREEOF DOCTOROFPHILOSOPHY UNIVERSITYOFFLORIDA 2019

PAGE 2

2019ZacharyIanBell 2

PAGE 3

TomyanceMichelleMariePeters,Icouldnothavemadeitthisfarwithoutyourlove andsupport 3

PAGE 4

ACKNOWLEDGMENTS IthankDr.WarrenE.Dixon,whohashelpedmetremendouslyimprovemycapacity asaresearcher.IextendmygratitudetomycommitteemembersDr.CarlCrane, Dr.PrabirBarooah,andDr.JohnSheafortheiroversightandrecommendations.I appreciateallthetimeIhavespentworkingonexcitingresearchwithallthepeoplein theNonlinearControlsandRoboticsgroupattheUniversityofFlorida. 4

PAGE 5

TABLEOFCONTENTS page ACKNOWLEDGMENTS..................................4 LISTOFTABLES......................................7 LISTOFFIGURES.....................................8 ABSTRACT.........................................10 CHAPTER 1INTRODUCTION...................................14 1.1Background...................................14 1.2OutlineoftheDissertation...........................20 2SYSTEMMODEL..................................23 2.1MotionModelUsingStationaryFeatures...................23 2.2ExtensionofMotionModelUsingStationaryFeatures...........26 2.3MotionModelofCamera............................30 2.4MotionModelofMovingFeatures.......................33 3SIMULTANEOUSESTIMATIONOFEUCLIDEANDISTANCESTOASTATIONARYOBJECT'SFEATURESANDTHEEUCLIDEANTRAJECTORYOF AMONOCULARCAMERA.............................39 3.1IntegralConcurrentLearningObserverUpdateLawsforEuclideanDistances......................................40 3.2ExtendedObserverUpdateLawforEuclideanDistancetoFeatures fromCamera..................................43 3.3StabilityAnalysis................................44 3.4ExperimentalResults..............................45 3.5Summary....................................52 4POSITIONESTIMATIONUSINGAMONOCULARCAMERASUBJECTTO INTERMITTENTFEEDBACK............................54 4.1LearningFeatureStructure..........................55 4.2FeatureObserverDesignWithoutObjectReturn..............58 4.2.1FeatureObserverDesign........................59 4.2.2ObserverDesignStabilityAnalysis..................60 4.3FeatureObserverDesignWithObjectReturn................62 4.3.1FeaturePredictorDesign........................63 4.3.2StabilityAnalysisofFeaturePredictorDesign............65 4.3.3EnsuringStabilityThroughDwell-TimeConditions..........66 4.4EstimatorDesignforPoseofCamera.....................69 5

PAGE 6

4.4.1StabilityofKeyFramePositionObserverandPredictorDesign...73 4.5Experiments...................................74 4.6Summary....................................98 5STRUCTUREANDMOTIONOFAMOVINGTARGETUSINGAMOVING MONOCULARCAMERASUBJECTTOINTERMITTENTFEEDBACK.....99 5.1LearningtheFirstFeatureStructure.....................99 5.2LearningtheStructureoftheRemainingFeatures.............103 5.3LearningtheObjectMotionModel......................107 5.4TargetEstimators................................112 5.5ObjectObserverandPredictorAnalysis...................115 5.6ObjectDwell-TimeAnalysis..........................117 5.7Summary....................................120 6CONCLUSIONS...................................121 REFERENCES.......................................125 BIOGRAPHICALSKETCH................................135 6

PAGE 7

LISTOFTABLES Table page 3-1RMSDepthErrorandPositionErrorinMetersOver15Experiments......49 3-2RMSDepthErrorandPositionErrorinMetersOver15ExperimentBefore Learning........................................51 3-3RMSDepthErrorandPositionErrorinMetersOver15ExperimentAfter Learning........................................51 7

PAGE 8

LISTOFFIGURES Figure page 2-1Examplegeometryfortrackingthepositionofasinglestationaryobject.....24 2-2Examplegeometryfortrackingthepositionofmultiplestationaryobjects....27 2-3Examplegeometryforpositionofcameraovertime................31 2-4Examplegeometryfortrackingasinglemovingobject...............34 3-1Imageofcheckerboard,KobukiTurtlebot,andaniDSuEyecamera.......46 3-2Thecameratrajectoriesforeachofthe15experiments..............48 3-3Exampleconvergenceofestimationerror......................49 3-4Exampleconvergenceofpositionerror.......................50 4-1ImageofKobukiTurtlebotandiDSuEyecamera..................75 4-2Imageofwoodenhallways..............................76 4-3Imageoflandmark...................................77 4-4Imageofhallwaycorner................................78 4-5Imageofhallwaywall.................................79 4-6Plotofthepathofthecameraduringtheexperimentandtheestimatedpath ofthecamerausingastandarddeviationof3pixelsforthehistorystackrejectionalgorithm....................................83 4-7Plotofthenormofthecamerapositionerrorduringtheexperimentusinga standarddeviationof3pixelsforacceptingdataontothehistorystack......84 4-8Plotofthedistanceestimatorconvergenceforkeyframe33thatshowsthe averagepercentageerrorofthedistancetothefeaturesrelativetothetrue distancetothefeaturesusingastandarddeviationof3pixelsforaccepting dataontothehistorystack..............................85 4-9Plotofthedistanceestimatorconvergenceforkeyframe173thatshowsthe averagepercentageerrorofthedistancetothefeaturesrelativetothetrue distancetothefeaturesusingastandarddeviationof3pixelsforaccepting dataontothehistorystack..............................86 4-10Plotofthedistanceestimatorconvergenceforkeyframe215thatshowsthe averagepercentageerrorofthedistancetothefeaturesrelativetothetrue distancetothefeaturesusingastandarddeviationof3pixelsforaccepting dataontothehistorystack..............................87 8

PAGE 9

4-11HistogramplotoftheRMSoftheaveragepercentageerrorofthedistance tothefeaturesrelativetothetruedistancetothefeaturesacrosstheentire experimentusingastandarddeviationof3pixelsforacceptingdataontothe historystack......................................88 4-12HistogramplotoftheRMSoftheaveragepercentageerrorofthedistance tothefeaturesrelativetothetruedistancetothefeaturesacrossbeforethe minimumdwell-timeconditionissatisedusingastandarddeviationof3pixelsforacceptingdataontothehistorystack.....................89 4-13HistogramplotoftheRMSoftheaveragepercentageerrorofthedistanceto thefeaturesrelativetothetruedistancetothefeaturesacrossaftertheminimumdwell-timeconditionissatisedusingastandarddeviationof3pixels foracceptingdataontothehistorystack.......................90 4-14Plotofthepathofthecameraduringtheexperimentandtheestimatedpath ofthecamerausingthepredictor-onlystrategywhennolandmarkisintheFOV.92 4-15Plotofthenormofthecamerapositionerrorduringtheexperimentusingthe predictor-onlystrategywhennolandmarkisintheFOV..............93 4-16Plotofthepathofthecameraduringtheexperimentandtheestimatedpath ofthecamerausingastandarddeviationof10pixelsforthehistorystackrejectionalgorithm....................................95 4-17Plotofthenormofthecamerapositionerrorduringtheexperimentusinga standarddeviationof10pixelsforacceptingdataontothehistorystack.....96 4-18HistogramplotoftheRMSoftheaveragepercentageerrorofthedistanceto thefeaturesrelativetothetruedistancetothefeaturesacrossaftertheminimumdwell-timeconditionissatisedusingastandarddeviationof10pixels foracceptingdataontothehistorystack.......................97 5-1Examplegeometryforasimpliedcamera.....................117 9

PAGE 10

AbstractofDissertationPresentedtotheGraduateSchool oftheUniversityofFloridainPartialFulllmentofthe RequirementsfortheDegreeofDoctorofPhilosophy TARGETTRACKINGINUNKNOWNENVIRONMENTSUSINGAMONOCULAR CAMERASUBJECTTOINTERMITTENTSENSING By ZacharyIanBell December2019 Chair:WarrenE.Dixon Major:MechanicalEngineering Inmostapplicationsinvolvingautonomousagentstrackingamovingtargetthrough uncertainenvironments,itisnecessarytoestimatethestructureoflocalfeatures intheenvironmente.g.,relativepositionsofobjectsintheimmediatesurrounding environment,theposeofanagenti.e.,positionandorientation,andtheposeand velocityofthetarget.Manyoftheseapplicationsrequiretravelingoverlargedistances implyingthelocalenvironmentforanagentisalwayschanging,introducingfurther difculty.Furthermore,itisoftenonlypossibletointermittentlysensethetargete.g., environmentalobstructionsorpathconstraintsoftheagentmaycauseocclusionsofthe target. Itisoftenassumedthatitispossibletouseglobalsensingtomeasurethestate ofanagent.However,statefeedbackgenerallyrequiresasensorthatcanrelateall thestatestoacommoncoordinatesysteme.g.,globalpositioningsystemGPS. However,GPSmaybeunavailablee.g.,agentscouldoperateinenvironmentswhere GPSisrestrictedordenied.Furthermore,assumingthattheentireenvironmentis knownandstateinformationfromthetargetisavailableisarestrictiveassumption sincetargetsarenotlikelytocommunicatesuchinformationanddirectlysensingthe poseandvelocityofatargetischallengingandnotpossibleinmanyscenarios.These challengesmotivatethedevelopmentoftechniquesthatrelyonlocalsensingbutstill allowagentstoestimatetheirownstatei.e.,poseaswellasthestateofatargeti.e., 10

PAGE 11

poseandvelocity.Additionally,effortsaremotivatedbythefactthatlocalsensingoften hasintermittentavailability. Camerasareapotentialsensorthatcanprovidelocalfeedbackoftheenvironment wherecoordinatesofthetargetcanberelatedtoacommonreferenceframe;however, camerasystemsdon'tinherentlymeasurescale,havealimitedeld-of-view,andare susceptibletointermittentsensinge.g.,duetoocclusions.ThescaleoftheEuclidean coordinatesoffeaturesinanimage,alsoknownasthestructureofthefeatures isnotavailablebecauseimagesareatwodimensionalprojectionofaEuclidean environment.Monocularcamerasystemscanrecoverscalebymovingthecamera e.g.,structurefrommotionandtrackingfeaturesoverlargeperiodsoftime;however, asthedistancetoatargetincreases,thedistancebetweenviewsofthefeaturemust alsoincreasemakinganaccuratescaleestimatechallenging.Additionally,thecamera's limitedeld-of-viewcaninhibitcontinuousobservationofaspecicobject.Continuous observationcanalsobedisruptedbyocclusionsortrajectoryconstraintsthatmay requireanagenttopurposefullyallowthetargettoleaveitseld-of-viewperiodically. Furthermore,agentsmayneedtotrackthetargetoverlargedistancesrequiringthe agenttocontinuouslyreconstructnewobjectsfromtheglobalenvironmentwhen reconstructedobjectsinthelocalenvironmentpermanentlyleavetheireld-of-view. InChapter3,aglobalexponentiallystableobserverforfeaturescaleisdeveloped underaniteexcitationconditionthroughtheuseofintegralconcurrentlearning. Sincetheobserveronlyrequiresniteexcitationtobegloballyexponentiallystable,the observerismoregeneralthanpreviousresults.TheresultindicatesthattheEuclidean distancetoasetoffeaturesonastationaryobjectandthepaththecameratravelswhile viewingthatobjectareestimatedexponentiallyfastimplyingthestructureandpath arereconstructedexponentially.Furthermore,thedevelopedestimationmethoddoes notrequirethefeaturesontheobjectstobeplanaranddoesnotrequirethepositive depthconstraint.Anexperimentalstudyispresentedwhichcomparesthedeveloped 11

PAGE 12

Euclideandistanceobservertopreviousobserversdemonstratingtheeffectivenessof thisresult. InChapter4,anextensiontothelearningapproachesinChapter3isdeveloped thatappliesanewlearningstrategythatmaintainsacontinuousestimateoftheposition ofthecameraandestimatesthestructureoffeaturesastheycomeintothecamera's eld-of-view.Furthermore,thedevelopedlearningstrategyallowssimulatedmeasurementsoffeaturesfromobjectsthatarenolongerintheeld-of-viewenablinga continuousestimateofthedistancetofeatureswithrespecttothecamera.Additionally, thisapproachshowshowtheextendedobserverremovesthepositivedepthconstraint requiredbyallpreviousstructurefrommotionapproaches.Usingthisapproach,a cameramaytraveloverlargedistanceswithoutkeepingspecicfeaturesintheeld-ofviewforalltimeandallowobjectstopermanentlyleavetheeld-of-viewifnecessary. ALyapunovbasedstabilityanalysisprovesthattheobserversforestimatingthepath ofthecameraaswellasthestructureofeachsetofobjectsaregloballyexponentially stablewhilefeaturesareinthecamera'seld-of-view.Aswitchedsystemsanalysisis usedtodevelopdwell-timeconditionstoindicatehowlongafeaturemustbetrackedto ensurethedistanceestimationerrorisbelowathreshold.Afterthedistanceestimates haveconvergedbelowthethreshold,thefeaturemaybeusedtoupdatetheposition ofthecamera.Ifafeaturedoesnotsatisfythedwell-timecondition,itisneverusedto updatethepositionoftheagent.Furthermore,theapproachdoesnotrequireanewset offeaturestobeinthecamera'sFOVwhenolderfeaturesleavethecamera'sFOV. InChapter5,theapproachinChapter4isusedtoprovideposeestimatesofthe cameraandanextensionofChapter3isdevelopedtoexponentiallyestimatethepose, velocity,andaccelerationofthemovingtarget.Specically,usingtheposeandvelocity ofthecamera,theestimationerroroftheEuclideantrajectoryofthetargetaswellas thestructureofthetarget,isgloballyexponentiallyconvergenttoanultimatebound assumingthetargetvelocityandaccelerationareboundedanddwell-timeconditions 12

PAGE 13

aresatised.Thedevelopedestimatorrelaxestherequirementtohavecontinuous observationofthetarget,toknowtheexactstructure,velocity,oraccelerationofthe target,anddoesnotrequirethepersistenceofexcitationassumptionorpositivedepth constraint. Chapter6concludesthedissertationwithadiscussionofthedevelopedestimation algorithmsandpotentialextensions. 13

PAGE 14

CHAPTER1 INTRODUCTION 1.1Background Inmanyapplications,thestatee.g.,positionandorientationofanautonomous agentanditslocalenvironmente.g.,relativepositionsofobjectsinthesurrounding environmentmustbedeterminedfromsensordata.Thisproblemiswellknownas simultaneouslocalizationandmappingSLAMcf.,[1–13].Often,aglobalpositioning systemGPSisusedtoestimatetheposition;however,inmanyenvironmentsGPSis unavailablee.g.,whenagentsoperateinGPSdeniedorcontestedenvironmentsmotivatingtheuseofonlylocalsensingdatae.g.,cameraimages,inertialmeasurement units,andwheelencoderstoestimatethepositionandmodelthesurroundingenvironment.Inapplicationsinvolvingtrackingamovingtargetthroughuncertainenvironments, itisadditionallynecessarytoestimatetheposeandvelocityofthetarget.Manyofthese applicationsrequiretravelingoverlargedistancesimplyingthelocalenvironmentfor anagentisalwayschangingintroducingfurtherdifculty.Thesechallengesmotivate thedevelopmentoftechniquesthatrelyonlocalsensingbutstillallowagentstoestimatetheirownstatei.e.,poseaswellasthestateofatargeti.e.,poseandvelocity. Additionally,effortsaremotivatedbythefactthatlocalsensingoftenhasintermittent availability. Usingcamerastoreconstructthesurroundingenvironmenti.e.,determinethe Euclideanscaleofobjectsintheenvironmentrequiresanassumptionthatobject featuresareinthecameraeld-of-viewFOVandmaybeextractedandtracked throughasequenceofimages.However,asignicantchallengearisesindetermining thescaleofobjectsinanimageusingacamerasincethelossofdepthinformation. Specically,imagesofobjectsare2Dprojectionsofthe3Denvironment.Approaches toreconstructi.e.,estimatethestructureobjectsusemultipleimagesofanobject alongwithscaleinformationcf.,[14,15]ormotioncf.,[16–32],suchas,linearand 14

PAGE 15

angularvelocitiesofthecamera.Thelatterofthesemethodsisreferredtoasstructure frommotionSfM.Generally,theEuclideanscaleofobjectsarenotknown;however, multiplecalibratedcamerasmaybeusedtorecoverthescalecf.,[14,15].However, thisapproachisinapplicableinallscenariosbecausesomeobjectsmayhavelimited ornoparallaxbetweenthecameraimages.InSfMapproaches,thepotentialforlimited parallaxstillexists;however,acameramaytraveltogenerateenoughparallax,whichis generallynotpossibleinstereovision. TheSfMproblemmaybeapproachedusingonlineiterativemethodscf.,[16–32] andofinebatchmethodscf.,[14,15,33]andthereferencescontainedwithin.These ofineapproachesperformanoptimizationoveranimagesequence,butonlyshow convergenceforlimitedcasescf.,[34,35].MostonlineSfMapproachesassume continuousmeasurementsofobjectsbythecameraoronlyupdatewhenanewimage isreceivedcf.,[16–26,28–32];however,recentresultsenableobjectstotemporarily leavethecamera'sFOVcf.,[36–38].ManyresultsapplytheextendedKalmanlter EKFtoestimatedepth,cf.,[16,18–20,27];however,theEKFgenerallydoesnot guaranteeconvergenceandmayfailinsomeapplicationswherethesystemisnot sufcientlyexcitedandortheinitialerroristoolarge[39,40].ComparedtotheEKF approach,techniquessuchas[21,23,24,26,29,31],showasymptoticconvergenceof thestructureestimationerrors.Furthermore,resultssuchas[17,22,25,28,30]show exponentialconvergenceofthescaleestimateassumingsomeformofapersistence ofexcitationPEconditionorthemorestrictExtendedOutputJacobianEOJis satised.Specically,theauthorsin[25]showexponentialconvergenceassuming thePEconditionismetandeithertheinitialestimationerrorissmallorthevelocities arelimited.Furthermore,thedevelopmentin[28]yieldsexponentialconvergence assumingtheobserversatisestheEOJcondition.In[30]anexponentiallystable observerisdevelopedthatrequiresthemotionalongatleastoneaxistobenonzero, andtheobserverremainsultimatelyboundedifthePEassumptiondoesnothold,while 15

PAGE 16

in[28]theobserverbecomessingular.Typically,SfMapproachesrequirethemotion e.g.,linearandangularvelocitiestobeknown;however,thedesignin[31],extending anapproachsimilarto[30],demonstratesapartialsolutiontothemorechallenging problemi.e.comparedtoSfMofstructureandmotionSaMwherenotonlyare thefeatureEuclideancoordinatesestimated,butalsotwoofthelinearvelocitiesand thethreeangularvelocitiesofthecamera,assumingPEandthelinearvelocityand accelerationaremeasurablealongoneaxis. In[32],exponentiallyconvergingobserversaredevelopedthatuseacamerato estimatetheEuclideandistancetofeaturesonastationaryobjectintheFOVwhilealso estimatingtheEuclideantrajectoryofthecameratrackingtheobject.Unlikeprevious methodssuchas[17,22,25,28,30]thatassumeaPEcondition,[32]onlyrequiresnite excitation.Theniteexcitationconditionresultsfromtheuseofconcurrentlearning CLcf.,[41–44].TheconceptofCListouserecordedinputandoutputdatafrom systemtrajectoriestoidentifyuncertainconstantparametersofthesysteminrealtime undertheassumptionthatthesystemissufcientlyexcitedforaniteamountoftime. ThisapproachrelaxesthePEassumptionandcanbemonitoredandveriedonline.To eliminatetheneedtomeasurethehighestorderderivativeofthestate,wespecically useintegralconcurrentlearningICLcf.,[32,45–49].ICLremovesthenecessityto estimatethehighestorderderivativeofthesystemrequiredintraditionalconcurrent learningcf.,[32,45,48,49]. AlthoughICLremovestheneedformeasuringthestatederivative,itstillrequires thestatetobemeasurable;yet,auniquechallengein[32]isthatthestatedependson theunmeasurabledistancetothetarget.Moreover,thetraditionalstateusedinresults suchas[16–26,28–31]includeaninherentsingularitywhenoneofthecoordinates becomeszeroi.e.,theso-calleddepthtothetarget.Specically,previousresults assumeapositivedepthconstraintwherethedistancefromthefocalpointofthe cameratothetargetalongtheaxisperpendiculartotheimageplaneremainspositive. 16

PAGE 17

ThepositivedepthconstraintissatisedifthefeaturesremainintheFOV;however,the constraintcanbeviolatedforsomecamerarotationsthatcausethefeaturetoleavethe FOV. Inmanyvisualservoingapproaches,thetrajectoryofthecameraisconstrained tomaintaincontinuousobservationoffeaturesonalandmarkcf.,[50–61];however, constrainingthetrajectorytokeepconstantobservationofalandmarkisnotpossible whentravelingoverlargedistances.Recentresultsdevelopapproachesforrelaxingthe continuousfeedbackrequirementcf.,[36–38,62,63],enablingtheabilitytotemporarily allowalandmarkortargettoleavetheFOVtemporarilysincedwell-timeconditionsare satisedontheamountoftimefeedbackisunavailableandavailablee.g.,featuresare inandoutoftheFOV.However,[36–38]stillrequirethepositivedepthconstraintwhile theobjectisnotintheFOV.Whilethesetypesofapproachesmaysolveissuesthat arisefromocclusionsoversmallperiodsorforreconstructingsmallerenvironments,it stillmaynotbepossibleordesirabletoreturntosomeregionsinlargeenvironments. Othersapproachtheproblemoftemporarylossoffeedbackbyassumingtherange toanobjectisavailablecf.,[64–66],whichisoftennotpossible,ordonotdetermine theposeofatargetcf.,[67].Otherapproachesusefunctionapproximationmethods tolearnamotionmodelforatargetcf.,[68–73];however,theseapproachesassume modelsareGaussianprocessesandshowsimulationandexperimentalresultsbutdo notprovideastabilityanalysis. AnapproachtoallowingfeaturestoleavetheFOVforanextendedperiodoftime, enablingtheabilitytotraveloverlargedistances,istheuseofmultiplesetsoftracked featurescf.,[1–4,6,8–13,74–80];however,mostofthetheseapproachesassume thateachnewsetoffeaturescanbeapproximatedbeforeoldersetsoffeaturesleave theFOV.Inmanyapplications,itisnotpossibletoensureanewsetoffeaturesis observedandestimatedbeforetheoldersetsleavetheFOV.Thisdrawbackmotivates thedevelopmentoftechniquesthatinitiallylearneachsetoffeaturesindependently. 17

PAGE 18

Additionally,manyapproachesuseknowninformationaboutfeatures,thedesired trajectory,orassumefeatureslieonaplanetodeveloprelationshipsbetweenposes orfeaturesusinghomographyrelationshipsbetweenplanescf.[74–80].Ingeneral, requiringhomographicrelationshipsmayintroducemoreerrorinpositionestimates orfailtoestimatepositionbecausefeaturesdonotalwayslieonaplane,especiallyin environmentscomprisedofmultipleobjects.Additionally,mostoftheseapproaches donottakeanobserverbasedapproachcf.,[1–4,6,8–13,74–80];instead,known geometryormeasurementsareusedcf.[74–80],oranoptimizationmethodisused whichmaynotconvergewithoutagoodinitialguessoronlyconvergeuptoascale factorcf.,[1–4,6,8–13]andcanbecomputationallyexpensive.Methodsthatrelyon measurementsorgeometryreducerobustnesstonoisebecausenocontroloftherate oflearningispossibleandthereisnoabilitytoremovepotentiallynoisydata;however, manyoftheoptimizationmethodshaveoutlierrejectionandconsidernoise.Stochastic homographybasedapproachesaredevelopedin[79]and[80]tohandleprocessand measurementnoise;however,theseapproacheswillstillsufferfromthesameissues thatarisefromtheplanarassumption.Foragentstotraveloverunknownenvironments, thelimitationspresentedbytheplanarassumptionandthoseintroducedbynotusing anobserverorlearningmustberemovedi.e.,itisnotpossibletoinstantlyreconstruct featuresiftheyarenonplanarandsothestructureofeachsetoffeaturesmustbe learned.Whilealloftheseapproachesenabletravelingoverlargedistances,theydo notallcombinetheposeandenvironmentestimationwithtrackingmovingobjectsi.e., thestructureandmotionfrommotionSaMfMproblem[81],whichisrequiredfortarget tracking. Theprimarydifcultythatarisesinestimatingthetrajectoryofamovingobject usingamovingcameraisthatthevelocityofthemovingobjectisnottypicallymeasurable.Someapproachesassumeaknownlengthontheobjectandestimatethepose directlycf.,[82–89].Oneoftherstresultstoestimatethepathofmovingobjects 18

PAGE 19

wasdiscussed[90]whereabatchmethodoftrajectorytriangulationwasdevelopedby restrictingtheobjectmotiontobealonglinesandconics.Theresultin[90]wasgeneralizedin[91]wheretheobjecttrajectorycouldbealonganycurve.Theworkin[92] and[93]discussEuclideanpathestimationofobjectsmovingalonglines.However, eventhoughtheseapproachesdiscussthenumberofimagesrequiredtocalculatetrajectoriesofmovingobjectsandareshowntoworkinselectenvironments,stabilityisnot guaranteed.Additionally,thesebatchmethodsarerestrictiveinthesensetheyrequire thevelocitiesofthecameraandobjectstobesmallandaredifculttoimplementonline becauserepeatedoptimizationoverimagesequencesisrequired. TherstSaMfMresultthatincludedastabilityanalysistoshowasymptoticestimationisin[81]wherearobustintegralsignoftheerrorRISEobserverisdeveloped undertheassumptionsofconstantvelocitiesformovingfeatures,knownboundsondistances,measurableopticalows,differentiablecameraaccelerations,andpersistence ofexcitationPE.Theresultin[94]asymptoticallyestimatedanobject'svelocitywhere somelengthontheobjectisassumedknownandtheobjectsinitialorientationareused todevelopaRISEestimatorforasinglestationarycamera.Resultsin[95]and[96] useapassivity-basedapproachtoshowtheestimationerrorasymptoticallyconverges undertheassumptionofknownlengthsonthemovingobject.Theassumptionthatthe movingfeatureshaveaconstantvelocityin[81]isrelaxedin[97]and[98]byrestricting themotionoftheobjecttoaplaneandrequiringPE,usinganextendedunknowninput observertoshowtheobservererrorconvergesexponentiallytozero.In[99],arecursive leastsquaresobserverisdevelopedthatassumesaknownboundonthedistanceto movingandstationaryfeatures,knowncameraangularvelocities,andPEtodevelop aRISE-basedmethodthatasymptoticallyestimatesthedistancetomovingfeatures andthelinearvelocitiesofthefeaturesandthecamera.Theresultin[99]isextended in[100]toremovethePEconditionbyassumingthevelocityofthemovingfeatures andtheangularvelocityofthecameraareknowntodevelopaRISE-basedapproach 19

PAGE 20

toasymptoticallyestimatethecameralinearvelocitiesanddistancestothefeatures. In[45],anapproachsimilarto[32]isdevelopedthatusesvelocitymeasurementsofthe cameraandtrackedstationaryfeaturestolearntheEuclideanpositionsofthestationary featuresandthetrajectoryofthecamera.Thevelocityofthecameraandtheestimates ofthepositionsofthestationaryfeaturesarethenusedtotheshowtheestimation erroroftheEuclideanpathandvelocityofamovingobject,aswellasthestructureof themovingobject,isgloballyexponentiallyconvergenttoanultimateboundassuming theobjectvelocityandaccelerationarebounded.Additionally,ifthevelocityofthe movingobjectisconstant,theultimateboundoftheestimationerrorwillbezero.The developedobserverforthemovingobjectfeaturesrelaxedtherequirementtomeasure opticalowandthePEassumption.In[101],estimatorsaredevelopedforanetwork ofstationarycamerastoexponentiallyestimatetheEuclideandistancetoamoving object'sfeaturesandtheobject'svelocity,withrespecttoeachcamera.Theobjective issimilartotheinverseofthedaisy-chainingtypeproblemcf.,[74–80].However,[45] andmostotherSaMfMapproachesassumecontinuousfeedbackofthemovingobject. Unlikethecameraposeestimationproblem,themovingobjectwilltypicallyonlyleave theFOVtemporarilyimplyingapproachesthatconsidertemporarylossoffeedback cf.,[36–38,62,63]maybedevelopedtodeterminehowlongatargetmayleavethe FOV. 1.2OutlineoftheDissertation Chapter2describesthedynamicsforamovingmonocularcameratrackingstationaryfeaturesandamovingtarget'sfeatures.Thedynamicspresentauniqueapproachto theSfMandSaMfMwhererelationshipsaredevelopedshowinghowtheEuclideandistancetostationaryfeaturesrelatestotheposeoftheagentandtheEuclideandistance tomovingfeaturesrelatestotheposeandvelocityofthemovingtarget. AglobalexponentiallystableobserverforfeaturescaleisdevelopedinChapter 3,underaniteexcitationconditionthroughtheuseofICL.Sincetheobserveronly 20

PAGE 21

requiresniteexcitationtobegloballyexponentiallystable,theobserverismore generalthanpreviousresults.TheresultindicatesthattheEuclideandistancetoaset offeaturesonastationaryobjectandthepaththecameratravelswhileviewingthat objectareestimatedexponentiallyfastimplyingthestructurei.e.,Euclideancoordinates ofthetrackedfeaturesandpatharereconstructedexponentially.Furthermore,the developedestimationmethoddoesnotrequirethefeaturesontheobjectstobeplanar anddoesnotrequirethepositivedepthconstraint.Anexperimentalstudyispresented whichcomparesthedevelopedEuclideandistanceobservertopreviousobservers demonstratingtheeffectivenessofthisresult. InChapter4,anextensiontothelearningapproachesinChapter3isdeveloped thatappliesanewlearningstrategythatmaintainsacontinuousestimateoftheposition ofthecameraandestimatesthestructureoffeaturesastheycomeintotheFOV. Furthermore,thedevelopedlearningstrategyallowssimulatedmeasurementsof featuresfromobjectsthatarenolongerintheFOVenablingacontinuousestimate ofthedistancetofeatureswithrespecttothecamera.Additionally,thisapproach showshowtheextendedobserverremovesthepositivedepthconstraintrequiredby allpreviousstructurefrommotionapproaches.Usingthisapproach,acameramay traveloverlargedistanceswithoutkeepingspecicfeaturesintheFOVforalltimeand allowobjectstopermanentlyleavetheFOVifnecessary.ALyapunovbasedstability analysisprovesthattheobserversforestimatingthepathofthecameraaswellasthe structureofeachsetofobjectsaregloballyexponentiallystablewhilefeaturesareinthe FOV.Aswitchedsystemsanalysisisusedtodevelopdwell-timeconditionstoindicate howlongafeaturemustbetrackedtoensurethedistanceestimationerrorisbelowa threshold.Afterthedistanceestimateshaveconvergedbelowthethreshold,thefeature maybeusedtoupdatethepositionofthecamera.Ifafeaturedoesnotsatisfythedwelltimecondition,itisneverusedtoupdatethepositionoftheagent.Furthermore,the approachdoesnotrequireanewsetoffeaturestobeintheFOVwhenolderfeatures 21

PAGE 22

leavetheFOV.Finally,ifarecognizedlandmarkenterstheFOV,thefeedbackisusedto compensatefordrifterror. InChapter5,theapproachinChapter4isusedtoprovideposeestimatesofthe cameraandanextensionofChapter3isdevelopedtoexponentiallyestimatethepose, velocity,andaccelerationofthemovingtarget.Specically,usingtheposeandvelocity ofthecamera,theestimationerroroftheEuclideantrajectoryofthetargetaswellas thestructureofthetarget,isgloballyexponentiallyconvergenttoanultimatebound assumingthetargetvelocityandaccelerationareboundedanddwell-timeconditions aresatised.Thedevelopedestimatorrelaxestherequirementtohavecontinuous observationofthetarget,toknowstructure,velocity,oraccelerationofthetarget,and doesnotrequirethepersistenceofexcitationassumptionorpositivedepthconstraint. Chapter6concludesthedissertationwithadiscussionofthedevelopedestimation algorithmsandpotentialextensions. 22

PAGE 23

CHAPTER2 SYSTEMMODEL Thedevelopmentinthischapterpresentsthedynamicsandassumptionsused inChapters3-5.Section2.1developsthedynamicsofasinglestationaryobject's featuresrelativetoamovingcamera,Section2.2extendsSection2.1toasequenceof objects,andSection2.4developsthedynamicsofamovingobject'sfeaturesrelativeto amovingcamera. 2.1MotionModelUsingStationaryFeatures Thissectiondescribesthedynamicrelationshipbetweenasingleobject'sfeatures andamovingcamera.Tofacilitatethesubsequentdevelopment,akeyframeisdened asthecameraframewhenfeaturesarerstextractedfromanimageofanobject. Furthermore,akeyframe,denotedby F k ,hasitsoriginattheprincipalpointofthat image,denotedby k ,andbasis n x k ;y k ;z k o .Theframewhenthecurrentimageistaken, denotedby F c ,hasitsoriginattheprincipalpointofthecurrentimage,denotedby c ,andbasis n x c ;y c ;z c o ,where z c isalignedwiththenormaltotheimageplane, x c is alignedwiththehorizontaloftheimageplanei.e.,totherightintheimage,and y c is alignedwiththeverticaloftheimageplanei.e.,downwardintheimage..Thisimplies that F k isestablishedtocoincidewith F c attime t =0 ,where t 2 R 0 representtime. Assumption2.1. Astationaryobject, s ,hasatleast p 2 Z 4 featuresthatcanbe detectedandtrackedprovidedtheyarewithintheFOVofthecamera. Remark 2.1 . FeaturesonanobjectintheFOVcanbetrackedusingdescriptorand matchingtechniquessuchas[102–108]orfeatureextractionandtrackingtechniques suchas[109–112]. Assumption2.2. Thecameraintrinsicmatrix A 2 R 3 3 isknownandinvertable[15]. Assumption2.3. Thecameralinearandangularvelocities, v c t ;! c t 2 R 3 ,are measurableandexpressedin F c andareupperboundedas k v c t k v c and k ! c t k ! c ,where v c ; ! c 2 R > 0 areknownconstants. 23

PAGE 24

Figure2-1.Examplegeometryfortrackingthepositionofthe i thfeatureof s wherethe camerastartsatthetopleftwherethekeyimageistakenandistraveling downwardfromtheupperlefttothelowerleftwhiletrackingastationary objectontheright. 24

PAGE 25

Assumption2.4. Theorigins k and c arenotcoincidentwhile t> 0 . AsshowninFigure2-1,thepositionofthe i thfeatureon s , s i 2 Z > 0 ; 8 i = f 1 ;:::;p g , canbedescribedas p s i =c t = p k=c t + R k=c t p s i =k ; where p k=c t 2 R 3 isthepositionofthe k withrespectto c expressedin F c , p s i =k t 2 R 3 isthepositionoffeature s i withrespectto k expressedin F k , R k=c t 2 R 3 3 isthe rotationmatrixdescribingtheorientationof F k withrespectto F c ,and p s i =c t 2 R 3 isthe positionoffeature s i withrespectto c expressedin F c .Rearranging2gives Y s i t 2 6 4 d s i =c t d k=c t 3 7 5 = R k=c t u s i =k d s i =k ; where Y s i t , u s i =c t )]TJ/F25 11.9552 Tf 9.299 0 Td [(u k=c t , d s i =c t 2 R > 0 and u s i =c t 2 R 3 arethedistance andunitvectoroffeature s i withrespectto c expressedin F c , d k=c t 2 R > 0 and u k=c t 2 R 3 arethedistanceandunitvectorof k withrespectto c expressedin F c ,and d s i =k 2 R > 0 and u s i =k 2 R 3 arethedistanceandunitvectoroffeature s i withrespectto k expressedin F k . Whileasetoffeatures, f s i g p i =1 ,areintheFOVand d k=c t > 0 ,therotation R k=c t andunitvector u k=c t canbedeterminedfromageneralsetofstationaryfeatures, usingexistingtechniquessuchasplanarhomographydecompositionoressential decomposition. 1 Additionally, u s i =k and u s i =c t canalwaysbedeterminedfrom u s i =k = A )]TJ/F18 5.9776 Tf 5.756 0 Td [(1 p s i =k k A )]TJ/F18 5.9776 Tf 5.756 0 Td [(1 p s i =k k and u s i =c t = A )]TJ/F18 5.9776 Tf 5.756 0 Td [(1 p s i =c t k A )]TJ/F18 5.9776 Tf 5.756 0 Td [(1 p s i =c t k where p s i =k ;p s i =c t 2 R 3 arethehomogeneouspixel coordinatesoffeature s i in F k and F c ,respectively.Whenthemotionofthecamerais notparalleltothedirectiontoafeature, 1 )-222(k u T k=c t u s i =c t k > a ,where a 2 ; 1 1 See[14],[15],and[113]forexamplesoncalculatingtherotationandnormalized translationfromplanarandnonplanarfeatures. 25

PAGE 26

isaselectedconstant,and d k=c t > 0 ,2canbewrittenas 2 6 4 d s i =c t d k=c t 3 7 5 = s i t d s i =k ; where s i t , )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(Y T s i t Y s i t )]TJ/F24 7.9701 Tf 6.587 0 Td [(1 Y T s i t R k=c t u s i =k isinvertableandmeasurablewhile 1 )-222(k u T k=c t u s i =c t k > a .Furthermore,since F k and s arestationary,thetime derivativesoftheunknowndistancesaremeasurableas d dt )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(d s i =c t = )]TJ/F25 11.9552 Tf 9.298 0 Td [(u T s i =c t v c t ; d dt )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(d k=c t = )]TJ/F25 11.9552 Tf 9.298 0 Td [(u T k=c t v c t ; and d dt d s i =k =0 : 2.2ExtensionofMotionModelUsingStationaryFeatures Consideringtheobjectiveistotraveloverlargedistances,thissectionextends Section2.1toasequenceofstationaryobjects. Let F k j p s t j =1 and f j g p s t j =1 beasequenceofkeyframesandtimesatwhichkey framesareestablished,respectively,where a j 2 R 0 denotesthetime, t 2 R 0 ,that the j thobjecthasfeedbackavailablei.e.,featuresarerstextractedfromtheobject afterenteringtheFOVestablishingthekeyframe, u j 2 R > a j denotesthetimethewhen feedbackforthe j thobjectunavailablei.e.,theobjectisnolongertrackedbecausetoo manyfeaturesontheobjectleavetheFOV, p s t 2 Z > 0 denotesthetotalnumberofkey frames,andthe j thkeyframe,denotedby F k j ,hasitsoriginattheprincipalpointofthat image,denotedby k j ,andhasthebasis n x k j ;y k j ;z k j o ,whichareselectedsuchthat F k j isestablishedtocoincidewith F c attime t = a j .Let V c R 3 representtheEuclidean spaceoftheFOVexpressedin F c . 26

PAGE 27

Figure2-2.Examplegeometryfortrackingthepositionofthe i thfeaturein S j wherethe keyimagetakenfromthetopleftandthecameratravelingdownwardfrom theupperlefttothelowerleftwhiletracking S j ontheright. 27

PAGE 28

Assumption2.5. TheEuclideanspaceoftheFOV, V c ,iscompactandthenormofeach pointin V c isboundedusingtheknownconstant d 2 R > 0 . Remark 2.2 . Assumption2.5isnecessaryforthesubsequentdevelopmentofthemaximumandminimumdwell-timeconditionsandisinherentlyrequiredwhenestimating distancestoanunknownobject'sfeaturesthatmayleavetheFOV. Assumption2.6. Thereexistsasetofstationaryobjects, fS j g p s t j =1 ,where S j R 3 representstheminimumEuclideansphereenclosingthe j thstationaryobjectexpressed in F c .Furthermore,thereexistsasetoftrackablefeaturesoneachstationaryobject, O s j p s t j =1 ,where O s j , f s j;i g p s j i =1 isthe j thstationaryobject'sfeatureset, s j;i represents the i thfeatureonthe j thstationaryobject,and p s j 2 Z 4 representsthetotalnumberof trackablefeaturesonthe j thstationaryobject. Assumption2.7. Theorigins k j and c arenotcoincidentwhile t> a j ,specically, d k j =c t >d 1 where d 1 2 R > 0 isaconstant. AsillustratedinFigure2-2,thepositionfeature s j;i 2O s j ,canbedescribedas p s j;i =c t = p k j =c t + R k j =c t p s j;i =k j ; where p s j;i =c t 2 R 3 isthepositionoffeature s j;i withrespectto c expressedin F c , p k j =c t 2 R 3 isthepositionof k j withrespectto c expressedin F c , p s j;i =k j t 2 R 3 is thepositionoffeature s j;i withrespectto k j expressedin F k j ,and R k j =c t 2 R 3 3 isthe rotationmatrixdescribingtheorientationof F k j withrespectto F c .Rearranging2 gives Y s j;i t 2 6 4 d s j;i =c t d k j =c t 3 7 5 = R k j =c t u s j;i =k j d s j;i =k j ; where Y s j;i t , u s j;i =c t )]TJ/F25 11.9552 Tf 9.299 0 Td [(u k j =c t , d s j;i =c t 2 R > 0 and u s j;i =c t 2 R 3 arethe distanceandunitvectoroffeature s j;i withrespectto c expressedin F c , d k j =c t 2 R > 0 and u k j =c t 2 R 3 arethedistanceandunitvectorof k j withrespectto c expressedin 28

PAGE 29

F c ,and d s j;i =k j 2 R > 0 and u s j;i =k j 2 R 3 arethedistanceandunitvectoroffeature s j;i with respectto k j expressedin F k j . SincefeaturesontheobjectmayleavetheFOVovertime,let P s j t O s j representtheremainingsetoffeaturesintheFOV,specically, P s j t , n s j;i 2O s j : p s j;i =c t 2S j V c o ,andlet p s j t 2 Z 0 representthenumberoffeaturesin P s j t ,and P c s j t , O s j nP s j t representthecomplimentof P s j t .Let u s j;i 2 R a j ,representthetimethefeatureindexedby s j;i leavestheFOV,specically, thetimeinstancewhen s j;i = 2P s j t ,andlet t a s j;i , u s j;i )]TJ/F25 11.9552 Tf 12.011 0 Td [( a j representthetotaltimethe featureindexedby s j;i wastrackedbythecamera.Let u j , max n u s j;i o p s j t i =1 i.e.,thetime thelastfeatureleavestheFOV. Whiletheorigins k j and c arenotcoincidenti.e., d k j =c t >d 1 where d 1 2 R > 0 isaconstant,Assumptions2.2and2.6ensuretherotation R k j =c t andunitvector u k j =c t canbedeterminedfromthesetofstationaryfeaturesin P s j t while p s j t 4 . Additionally, u s j;i =k j and u s j;i =c t canbedeterminedfrom u s j;i =k j = A )]TJ/F18 5.9776 Tf 5.756 0 Td [(1 p s j;i =k j k A )]TJ/F18 5.9776 Tf 5.756 0 Td [(1 p s j;i =k j k and u s j;i =c t = A )]TJ/F18 5.9776 Tf 5.756 0 Td [(1 p s j;i =c t k A )]TJ/F18 5.9776 Tf 5.756 0 Td [(1 p s j;i =c t k where p s j;i =k j ;p s j;i =c t 2 R 3 arethehomogeneouspixel coordinatesoffeature s j;i in F k j and F c ,respectively. Let s j;i t s j;i 2P s j t bethesetofswitchingsignalsforthefeaturesin P s j t , where s j;i t 2f u;a g indicateswhether 1 )-222(k u T k j =c t u s j;i =c t k < a or 1 )-222(k u T k j =c t u s j;i =c t k > a ,respectively.Whiletheorigins k j and c arenotcoincidenti.e., d k j =c t >d 1 andthemotionofthecameraisnotparalleltothedirection toafeaturei.e., 1 )-222(k u T k j =c t u s j;i =c t k > a , s j;i t = a i.e.,notparallelmotion, 2isinvertableinthesensethat Y s j;i t isfullcolumnrankand2maybewritten 2 6 4 d s j;i =c t d k j =c t 3 7 5 = a s j;i t d s j;i =k j ; where a s j;i t , Y T s j;i t Y s j;i t )]TJ/F24 7.9701 Tf 6.586 0 Td [(1 Y T s j;i t R k j =c t u s j;i =k j ; 29

PAGE 30

ismeasurablebasedonAssumptions2.2and2.6.When s j;i t = u i.e.,parallel motion,2isnotinvertable.However,2canalwaysbewritten d s j;i =c t = u T s j;i =c t u k j =c t d k j =c t + u T s j;i =c t R k j =c t u s j;i =k j d s j;i =k j yielding d s j;i =c t = u s j;i t 2 6 4 d k j =c t d s j;i =k j 3 7 5 ; where u s j;i t , u T s j;i =c t u k j =c t u T s j;i =c t R k j =c t u s j;i =k j : UsingAssumption2.3andwhile p s j t 4 ,thetimederivativesoftheunknown distances d s j;i =c t , d k j =c t ,and d s j;i =k j aremeasurablefor s j;i 2P s j t as d dt )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(d s j;i =c t = )]TJ/F25 11.9552 Tf 9.298 0 Td [(u T s j;i =c t v c t ; d dt )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(d k j =c t = )]TJ/F25 11.9552 Tf 9.298 0 Td [(u T k j =c t v c t ; and d dt d s j;i =k j =0 : Takingthetimederivativeof u s j;i =c t yields d dt u s j;i =c t = )]TJ/F25 11.9552 Tf 9.298 0 Td [(! c t u s j;i =c t + 1 d s j;i =c t u s j;i =c t u T s j;i =c t )]TJ/F25 11.9552 Tf 11.956 0 Td [(I 3 3 v c t ; where ! c t , 2 6 6 6 6 4 0 )]TJ/F25 11.9552 Tf 9.298 0 Td [(! z ! y ! z 0 )]TJ/F25 11.9552 Tf 9.298 0 Td [(! x )]TJ/F25 11.9552 Tf 9.299 0 Td [(! y ! x 0 3 7 7 7 7 5 ,and I 3 3 , 2 6 6 6 6 4 100 010 001 3 7 7 7 7 5 . 2.3MotionModelofCamera Aspreviouslydiscussed,itisnecessarytohaveacontinuousestimateofthe positionofthecameraovertimeregardlessofthevisibilityofobjectsi.e.,therewilltime periodswherenoobjectsremainintheFOV.AsshowninFigure2-3,thepositionofthe 30

PAGE 31

Figure2-3.Examplegeometryforposeofthecameraovertimewherethecamera startsatthetopwheretherstkeyframeislocatedandistraveling downwardtothelowerleft. 31

PAGE 32

cameramaybeexpressedthroughthesequenceofobjects.Sincethestartinglocation ofthecameramaybeunknown,thepositionofthecameraovertimecanbeexpressed relativetotherstkeyframeas p c=k 1 t = R k p s t =k 1 p c=k p s t t + p s t X j =2 R k j )]TJ/F18 5.9776 Tf 5.756 0 Td [(1 =k 1 p k j =k j )]TJ/F18 5.9776 Tf 5.756 0 Td [(1 : Thederivativeofthepositionwithrespecttotimecanbeexpressedsimilarly d dt p c=k 1 t = R c=k 1 v c t : Theorientationof R c=k 1 t isderivedusingaunitquaternion, q c=k 1 t 2 H ,which canberepresentedas q c=k 1 t 2 R 4 ,where q T c=k 1 t q c=k 1 t =1 .Thederivativewith respecttotimefor q c=k 1 t is d dt )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q c=k 1 t = 1 2 B )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(q c=k 1 t ! c t ; where B q , 2 6 6 6 6 6 6 6 4 )]TJ/F25 11.9552 Tf 9.298 0 Td [(q 2 )]TJ/F25 11.9552 Tf 9.298 0 Td [(q 3 )]TJ/F25 11.9552 Tf 9.299 0 Td [(q 4 q 1 )]TJ/F25 11.9552 Tf 9.298 0 Td [(q 4 q 3 q 4 q 1 )]TJ/F25 11.9552 Tf 9.299 0 Td [(q 2 )]TJ/F25 11.9552 Tf 9.298 0 Td [(q 3 q 2 q 1 3 7 7 7 7 7 7 7 5 ; q 1 ;q 2 ;q 3 ;q 4 2 R arethefourelementsofaunitquaternion q t and B T q t B q t = I 3 3 . 2 Therotationmatrixrepresentationofaunitquaternion q t is R q , 2 6 6 6 6 4 1 )]TJ/F15 11.9552 Tf 11.955 0 Td [(2 q 2 3 + q 2 4 2 q 2 q 3 )]TJ/F25 11.9552 Tf 11.955 0 Td [(q 4 q 1 2 q 2 q 4 + q 3 q 1 2 q 2 q 3 + q 4 q 1 1 )]TJ/F15 11.9552 Tf 11.955 0 Td [(2 q 2 2 + q 2 4 2 q 3 q 4 )]TJ/F25 11.9552 Tf 11.956 0 Td [(q 2 q 1 2 q 1 q 4 )]TJ/F25 11.9552 Tf 11.956 0 Td [(q 3 q 1 2 q 3 q 4 + q 2 q 1 1 )]TJ/F15 11.9552 Tf 11.955 0 Td [(2 q 2 2 + q 2 3 3 7 7 7 7 5 : 2 Timedependenceissuppressedexceptwhenneededforclarityorintroducing terms. 32

PAGE 33

Remark 2.3 . Theorientation q c=k 1 t isoftenmeasurableusinglocalsensorse.g., inertialmeasurementunitsandmagnetometers.Someasurementsof q c=k 1 t maybe availableatalltimesinmanyapplications.Inthoseapplications,noestimateof q c=k 1 t wouldberequired. Since u k j =c t isonlymeasurablelocally,anestimateofsubsequently p k j =c t is developed.Takingthederivativeofthepositionofthecameraexpressedinthecamera framewithrespecttotimeyields d dt p k j =c t = )]TJ/F25 11.9552 Tf 9.299 0 Td [(v c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t p k j =c t : Similarly,theorientationusedforeachkeyframeis R k j =c t asshownin2.The unitquaternionformoftheorientationis q k j =c t 2 H ,whichcanberepresentedas q k j =c t 2 R 4 ,where q T k j =c t q k j =c t =1 .Thederivativewithrespecttotimefor q k j =c t is d dt )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q k j =c t = )]TJ/F15 11.9552 Tf 10.494 8.088 Td [(1 2 B )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(q k j =c t ! c t : 2.4MotionModelofMovingFeatures SincetheobjectiveistotrackamovingobjectthatintermittentlyleavestheFOV, thissectiondevelopsthedynamicrelationshipsbetweentheamovingobjectand movingcamera.Trackingfeaturesonamovingobjectthatintermittentlyleavesthe FOVcanbeasignicantlymorechallengingproblemthantrackingstationaryfeatures. Descriptorandmatchingtechniquessuchas[102–108]mayhavebettertracking performanceformovingobjects. Assumption2.8. Amovingobjectrepresentedby M R 3 ,where M representsthe minimumEuclideansphereenclosingtheobjectexpressedin F c ,has n 2 Z 4 features thatcanbedetectedandtrackedwhile MV c i.e.,theobjectisintheFOV. While M isintheFOVofthecamera,thepositionofeachfeaturein M canbe relatedtothecamera.Let O m , f m i g n i =1 representthesetoffeaturesontheobject, p m i =c t 2 R 3 representthepositionoffeature m i withrespectto c expressedin F c , 33

PAGE 34

Figure2-4.Examplegeometryfortrackingthepositionofthe i thfeatureof m and m . 34

PAGE 35

andlet O t 2f u;a g beaswitchingsignalindicatingwhether MV c implyingwhen O t = a , p m i =c t n i =1 MV c .Furthermore,let a j 2 R 0 bethe j thtime t when O t = a occurs,andlet u j 2 R a j representthe j thtimewhen O t = u . Assumption2.9. ThefeaturesonthemovingobjectareinitiallycontainedintheFOV i.e., p m i =c a 1 n i =1 MV c andthemovingobjectandrststationaryobject, S 1 ,are detectedatthesametimei.e., a 1 = a 1 . Assumption2.10. Themovingobjectisinitiallystationary,where, v m t ;! m t 2 R 3 arethelinearandangularvelocityoftheobjectexpressedin F c .Specically, k v m t k = 0 and k ! m t k =0 foralltime t 2 [ a 1 ; s where s 2 R > 0 . AsillustratedinFigure2-4,theobjectframe,denotedby F m ,hasitsoriginat feature m 1 ,withbasis n x m ;y m ;z m o .Theinitialobjectframei.e.,theobjectattherst keyframe,denotedby F m ,hasitsoriginat m 1 andbasis n x m ;y m ;z m o ,where m represents m attime t = a 1 . Remark 2.4 . Theinitialorientationoftheobjectmaybeselectedarbitrarilysinceany coordinateframemaybeattachedtothemovingobjectbody.Toaidinthesubsequent development,theobjectinitialbasisisselectedtoalignwiththerstkeyframei.e., n x m ;y m ;z m o = n x k 1 ;y k 1 ;z k 1 o and R m =c a 1 = R m =k 1 = I 3 3 . AsalsoshowninFigure2-4,thepositionofthe i thfeatureon M maybedescribed in F c as p m i =c t = p m 1 =c t + R m=c t p m i =m 1 ; where R m=c t 2 R 3 3 istherotationmatrixdescribingtheorientationof F m with respectto F c ,and p m i =m 1 2 R 3 istheconstantpositionoffeature m i withrespectto m 1 expressedin F m .Thesameexpressionistruewhenfeaturesin M arerstextracted i.e.,when O t = a attime t = a 1 ;specically, p m i =c a 1 , p m i =c a 1 = p m i =k 1 andas describedinRemark2.4, R m =c a 1 = R m =k 1 = I 3 3 implying p m i =m 1 = p m i =k 1 )]TJ/F25 11.9552 Tf 11.955 0 Td [(p m 1 =k 1 : 35

PAGE 36

Substituting2into2,thepositionofthe i thfeaturein V c canbedescribedin F c by p m i =c t = p m=m t + R m=c t p m i =k 1 ; where p m=m t , p m 1 =c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(R m=c t p m 1 =k 1 .Rearranging2gives Y m i t 2 6 4 d m i =c t d m=m t 3 7 5 = R m=c t u m i =k 1 d m i =k 1 ; where Y m i t , u m i =c t )]TJ/F25 11.9552 Tf 9.298 0 Td [(u m=m t , d m i =c t 2 R > 0 and u m i =c t 2 R 3 arethe distanceandunitvectoroffeature m i withrespectto c expressedin F c , d m=m t 2 R > 0 and u m=m t 2 R 3 arethedistanceandunitvectorof p m=m t from2,and d m i =k 1 2 R > 0 and u m i =k 1 2 R 3 arethedistanceandunitvectoroffeature m i withrespect to k 1 expressedin F k 1 . UnderAssumptions2.2and2.8,while O t = a ,therotationmatrix R m=c t and unitvector u m=m t canbedeterminedfromageneralsetoffeatures,usingexisting techniques.Additionally, u m i =k 1 and u m i =c t canbedeterminedfrom u m i =k 1 = A )]TJ/F18 5.9776 Tf 5.756 0 Td [(1 p m i =k 1 k A )]TJ/F18 5.9776 Tf 5.756 0 Td [(1 p m i =k 1 k and u m i =c t = A )]TJ/F18 5.9776 Tf 5.756 0 Td [(1 p m i =c t k A )]TJ/F18 5.9776 Tf 5.756 0 Td [(1 p m i =c t k where p m i =k 1 ;p m i =c t 2 R 3 arethehomogeneouspixel coordinatesoffeature m i and m i in F k 1 and F c ,respectively. Let f m i t g m i 2O m bethesetofswitchingsignalsforthefeaturesin O m , where m i t 2f u;a g indicateswhether 1 )-222(k u T m=m t u m i =c t k a or 1 )-222(k u T m=m t u m i =c t k > a ,respectively.Furthermore,let a l j;m i 2 a j ; u j representthe l thinstancethe i thmovingfeaturesatisestheeigenvalueconditioni.e., m i t = a duringthe j thinstancetheobjectenterstheFOVi.e., t 2 a j ; u j andlet u l j;m i 2 a j ; u j representthe l thinstanceitdoesn'tsatisfytheeigenvalueconditioni.e., m i t = u . Remark 2.5 . Thesetoffeaturesexcludingtheorigini.e., f m i t g m i 2O m nf m 1 g are m i t = u iftheorigindoesnotsatisfytheeigenvalueconditioni.e., m 1 t = u . Thisrelationshipresultsbecausethefeaturesarealldependentontheorigininthe 36

PAGE 37

subsequentdevelopment.Thesetoffeatures i:e:; f m i t g m i 2O m aresetas m i t = u iftheobjectleavestheFOVi.e., O t = u . Whiletherelativedirectionofmotionisnotparalleltothedirectionofafeaturei.e., 1 )-222(k u T m=m t u m i =c t k > a , m i t = a and2mayberearrangedas 2 6 4 d m i =c t d m=m t 3 7 5 = m i t d m i =k 1 ; where m i t , )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(Y T m i t Y m i t )]TJ/F24 7.9701 Tf 6.587 0 Td [(1 Y T m i t R m=c t u m i =k 1 and m i t ismeasurable. Basedonthedenitionsof p m i =c t and p m=m t ,theirderivativeswithrespectto timeare d dt p m i =c t = v m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(v c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t p m i =c t + )]TJ/F25 11.9552 Tf 5.479 -9.683 Td [(! m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t R m=c t u m i =k 1 d m i =k 1 )]TJ/F31 11.9552 Tf 19.261 9.684 Td [()]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(! m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t R m=c t u m 1 =k 1 d m 1 =k 1 ; and d dt p m=m t = v m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(v c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t p m=m t )]TJ/F31 11.9552 Tf 19.261 9.683 Td [()]TJ/F25 11.9552 Tf 5.48 -9.683 Td [(! m t )]TJ/F25 11.9552 Tf 11.956 0 Td [(! c t R m=c t p m 1 =k 1 : Takingthetimederivativesoftheunknowndistancesandusing2and2 yields d dt )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(d m i =c t = u T m i =c t v m t )]TJ/F25 11.9552 Tf 11.956 0 Td [(u T m i =c t v c t + u T m i =c t )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(! m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t R m=c t u m i =k 1 d m i =k 1 37

PAGE 38

)]TJ/F25 11.9552 Tf 9.299 0 Td [(u T m i =c t )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(! m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t R m=c t u m 1 =k 1 d m 1 =k 1 ; d dt )]TJ/F25 11.9552 Tf 5.479 -9.683 Td [(d m=m t = u T m=m t v m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(u T m=m t v c t )]TJ/F25 11.9552 Tf 9.299 0 Td [(u T m=m t )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(! m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t R m=c t u m 1 =k 1 d m 1 =k 1 ; and d dt d m i =k 1 =0 : Takingthetimederivativesofthedirectionsyields d dt )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(u m i =c t = 1 d m i =c t m i t v m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(v c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t u m i =c t + 1 d m i =c t m i t )]TJ/F25 11.9552 Tf 5.479 -9.683 Td [(! m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t R m=c t u m i =k 1 d m i =k 1 )]TJ/F15 11.9552 Tf 27.797 8.087 Td [(1 d m i =c t m i t )]TJ/F25 11.9552 Tf 5.479 -9.683 Td [(! m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t R m=c t u m 1 =k 1 d m 1 =k 1 d dt )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(u m i =c t = 1 d m i =c t m i t v m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(v c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t u m i =c t + 1 d m i =c t m i t )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(! m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t R m=c t u m i =k 1 d m i =k 1 )]TJ/F15 11.9552 Tf 27.797 8.088 Td [(1 d m i =c t m i t )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(! m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t R m=c t u m 1 =k 1 d m 1 =k 1 and d dt )]TJ/F25 11.9552 Tf 5.48 -9.683 Td [(u m=m t = 1 d m=m t m=m t v m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(v c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t u m=m t )]TJ/F15 11.9552 Tf 30.286 8.088 Td [(1 d m=m t m=m t )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(! m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t R m=c t u m 1 =k 1 d m 1 =k 1 ; d dt )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(u m=m t = 1 d m=m t m=m t v m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(v c t )]TJ/F25 11.9552 Tf 11.956 0 Td [(! c t u m=m t )]TJ/F15 11.9552 Tf 30.286 8.088 Td [(1 d m=m t m=m t )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(! m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t R m=c t u m 1 =k 1 d m 1 =k 1 ; where m i t = I 3 3 )]TJ/F25 11.9552 Tf 11.955 0 Td [(u m i =c t u T m i =c t and m=m t , I 3 3 )]TJ/F25 11.9552 Tf 11.955 0 Td [(u m=m t u T m=m t . 38

PAGE 39

CHAPTER3 SIMULTANEOUSESTIMATIONOFEUCLIDEANDISTANCESTOASTATIONARY OBJECT'SFEATURESANDTHEEUCLIDEANTRAJECTORYOFAMONOCULAR CAMERA Inthischapter,imagegeometryinsightsareexploitedtoexpresstheerrorsystem withamoregeneraldistancemeasurethatonlybecomeszerowhenthetargetand cameraarecoincident;thereby,avoidingthepositivedepthconstraint.Whilethisresult alsorequiresthefeaturestoremainintheFOV,eliminatingthepositivedepthconstraint eliminatesabarrierforfuturedevelopmentthatwouldallowintermittentviewingofthe features.Although,thenewimage-geometrybasederrorsystemavoidsthepotential depthsingularity,theresultingerrorsystemstillcontainstheunmeasurabledistanceto thetarget.However,thedevelopmentinSection3.1illustrateshowtheunmeasurable statecanberelatedtoanunknownconstanttoenabletheuseofICL.Regardlessofthe systemidenticationmethodused,thereisadelaybeforesufcientexcitationoccursto identifytheparameters.Therefore,thepreliminaryresultin[32]andthedevelopment inSection3.1exhibitanarbitrarilylongdelaybeforedeterminingthefeatureEuclidean coordinates.InSection3.2,wemodifythedevelopedlearningstrategytoinclude gradienttermsthatenabletransientlearninguntilsufcientdatahasbeencollectedfor theICLterms. Toillustratetheperformanceofthedevelopedobservers,multipleexperiments arepresented,includingacomparisonoftheobserversinSection3.1andSection 3.2withtheresultsin[30]andanEKF.TheseresultsindicatethattheEKFandresult in[30]haveimprovedtransientperformanceovertheresultinSection3.1,before theICL-basedestimatesconverge.TheEKFandresultin[30]havesimilartransient responseastheobserverinSection3.2beforetheICL-basedestimatesconverge.After theICL-basedestimatesconverge,theobserversinSections3.1and3.2convergeto steady-statewithimprovedperformanceovertheEKFandobserverin[30]. 39

PAGE 40

3.1IntegralConcurrentLearningObserverUpdateLawsforEuclideanDistances Motivatedbythedevelopmentsin[32,45],anICLupdatelawisimplementedto estimatetheconstantunknowndistances, d s i =k ,byintegrating2and2overa timewindow & 2 R > 0 yielding 2 6 4 d s i =c t d k=c t 3 7 5 )]TJ/F31 11.9552 Tf 11.956 27.617 Td [(2 6 4 d s i =c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& d k=c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& 3 7 5 = Z t t )]TJ/F26 7.9701 Tf 6.587 0 Td [(& s i d;t>&; where & maybeconstantinsizeorchangeovertimeand s i t , 2 6 4 )]TJ/F25 11.9552 Tf 9.298 0 Td [(u T s i =c t v c t )]TJ/F25 11.9552 Tf 9.298 0 Td [(u T k=c t v c t 3 7 5 . While R t t )]TJ/F26 7.9701 Tf 6.586 0 Td [(& s i d isaknownquantity, 2 6 4 d s i =c t d k=c t 3 7 5 and 2 6 4 d s i =c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& d k=c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& 3 7 5 areunknown; however,therelationshipin2maybeutilizedatthecurrenttime t andtheprevious time t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& yielding Y s i t d s i =k = U s i t ; where Y s i t , 8 > > < > > : 0 2 1 ;t &; s i t )]TJ/F25 11.9552 Tf 11.955 0 Td [( s i t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& ;t>&; and U s i t , 8 > > < > > : 0 2 1 ;t &; R t t )]TJ/F26 7.9701 Tf 6.586 0 Td [(& s i d;t>&: Thedynamicsin3demonstratethatconcurrentlearningcanestimatetheconstant distances, d s i =k ,tothefeatureson s .Specically,multiplyingbothsidesof3by Y T s i t yields Y T s i t Y s i t d s i =k = Y T s i t U s i t : Ingeneral, Y s i t willnothavefullcolumnranke.g.,whenthecameraisstationary implying Y T s i t Y s i t 0 .However,theequalityin3maybeevaluatedatinstances 40

PAGE 41

intimeandsummedtogetheri.e.,historystacksas Y s i d s i =k = U s i ; where Y s i , P N h i =1 Y T s i t h i Y s i t h i , U s i , P N h i =1 Y T s i t h i U s i t h i , t h i 2 &;t ,and N 2 Z > 1 . Assumption3.1. Motionofthecameraoccurssothereexistsniteconstants s i 2 R >& ; 2 R > 0 suchthatforalltime t s i , min Y s i > ,where min fg and max fg aretheminimumandmaximumeigenvaluesof fg . 1 Assumption3.1canbeveriedonlineandisheuristicallyeasytosatisfybecauseitonlyrequiresanitecollectionofsufcientlyexciting Y s i t and U s i t toyield min Y s i > .Thetime s i isunknown;however,itcanbedeterminedonlineby checkingtheminimumeigenvalueof Y s i .After s i , min Y s i > impliesthata constantunknowndistance, d s i =k ,canbedeterminedfrom3as d s i =k = X s i ;t s i ; where X s i , 8 > > < > > : 0 ;t< s i ; )]TJ/F24 7.9701 Tf 6.586 0 Td [(1 Y s i U s i ;t s i : When t s i ,3canbesubstitutedinto2toyield d s i =c t = s i ; 1 t ;t s i ; and d k=c t = s i ; 2 t ;t s i ; where s i ; 1 t ; s i ; 2 t aretherstandsecondelementsof s i t , s i t X s i . 1 See[114]or[115]forsomeexamplesofmethodsforselectingdatatosatisfythe assumption. 41

PAGE 42

Theestimationerrors, ~ d s i =c t ; ~ d k=c t ; ~ d s i =k t 2 R ,aredenedas ~ d s i =c t , d s i =c t )]TJ/F15 11.9552 Tf 14.021 3.155 Td [(^ d s i =c t ; ~ d k=c t , d k=c t )]TJ/F15 11.9552 Tf 14.021 3.154 Td [(^ d k=c t ; and ~ d s i =k t , d s i =k )]TJ/F15 11.9552 Tf 14.021 3.155 Td [(^ d s i =k t ; where ^ d s i =c t ; ^ d k=c t ; ^ d s i =k t 2 R aretheestimates.Motivatedbythesubsequentstabilityanalysis,theimplementableobserverupdatelawsfortheestimatesaredesigned using3-3as d dt ^ d s i =c t , 8 > > < > > : s i ; 1 t ;t< s i ; s i ; 1 t + k 1 s i ; 1 t )]TJ/F15 11.9552 Tf 14.021 3.155 Td [(^ d s i =c t ;t s i ; d dt ^ d k=c t , 8 > > < > > : s i ; 2 t ;t< s i ; s i ; 2 t + k 2 s i ; 2 t )]TJ/F15 11.9552 Tf 14.021 3.155 Td [(^ d k=c t ;t s i ; and d dt ^ d s i =k t , 8 > > < > > : 0 ;t< s i ; k 3 X s i )]TJ/F15 11.9552 Tf 14.021 3.155 Td [(^ d s i =k t ;t s i ; where k 1 ;k 2 ;k 3 2 R > 0 areconstants.Takingthetimederivativeof3-3,and substituting3-3,2-2,and3-3yields d dt ~ d s i =c t = 8 > > < > > : 0 ;t< s i ; )]TJ/F25 11.9552 Tf 9.298 0 Td [(k 1 ~ d s i =c t ;t s i ; d dt ~ d k=c t = 8 > > < > > : 0 ;t< s i ; )]TJ/F25 11.9552 Tf 9.298 0 Td [(k 2 ~ d k=c t ;t s i ; 42

PAGE 43

and d dt ~ d s i =k t = 8 > > < > > : 0 ;t< s i ; )]TJ/F25 11.9552 Tf 9.298 0 Td [(k 3 ~ d s i =k t ;t s i ; implyingforalltime t s i ,theestimationerrorderivativesarenegativedenite functionsoftheestimationerrors.Theformoftheupdatelawsin30-3areimplementableandusedinpractice,whiletheformofthetime-derivativeoftheestimation errorsin3-3areanalyticalandprovidedtofacilitatethesubsequentanalysis. 3.2ExtendedObserverUpdateLawforEuclideanDistancetoFeaturesfrom Camera Thesubsequentanalysisdemonstratesthat3and3willremainbounded while t< s i .However,aftersufcientdataisgathered,forall t s i ,3isbounded byanexponentialenvelope.Thedelayrequiredtogetsufcientexcitationmayreduce transientperformancei.e.,theerrorisnotguaranteedtoreduceuntilaftertime t s i whichisadisadvantagecomparedtopreviousapproachessuchas[30],whichimprove estimationerrorsbyestimatingopticalow.Motivatedbytheopticalowestimatorform oftheinversedepthestimatorin[30],thetimerateofchangeof u s i =c t isapproximated andusedtoprovideadditionalinformationtotheestimatorin3whichwillimprove transientperformanceuntilsufcientexcitationoccurs. Thetimederivativeof u s i =c t is d dt )]TJ/F25 11.9552 Tf 5.48 -9.683 Td [(u s i =c t = )]TJ/F25 11.9552 Tf 9.299 0 Td [(! c t u s i =c t + 1 d s i =c t )]TJ/F25 11.9552 Tf 5.48 -9.683 Td [(u s i =c t u T s i =c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(I 3 3 v c t ; and T s i t s i t d s i =c t = T s i t s i t ; 43

PAGE 44

where s i t , )]TJ/F26 7.9701 Tf 8.204 -4.976 Td [(d dt )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(u s i =c t + ! c t u s i =c t , s i t , u s i =c t u T s i =c t )]TJ/F25 11.9552 Tf 11.956 0 Td [(I 3 3 v c t , ! c t , 2 6 6 6 6 4 0 )]TJ/F25 11.9552 Tf 9.298 0 Td [(! z ! y ! z 0 )]TJ/F25 11.9552 Tf 9.298 0 Td [(! x )]TJ/F25 11.9552 Tf 9.298 0 Td [(! y ! x 0 3 7 7 7 7 5 ,and I 3 3 , 2 6 6 6 6 4 100 010 001 3 7 7 7 7 5 .Toaidinthesubsequentanalysislet s i t , s i ; 1 t + k T s i t s i t )]TJ/F25 11.9552 Tf 11.956 0 Td [( T s i t s i t ^ d s i =c t ,thenanextendedversionofthe estimatorin3isdesignedas d dt ^ d s i =c t , 8 > > < > > : s i t ;t< s i ; s i t + k 1 s i ; 1 t )]TJ/F15 11.9552 Tf 14.021 3.155 Td [(^ d s i =c t ;t s i ; where k 2 R > 0 .Using3and3in3thensimplifyingyields d dt ^ d s i =c t = 8 > > < > > : s i ; 1 t + k s i t ~ d s i =c t ;t< s i ; s i ; 1 t + k 1 + k s i t ~ d s i =c t ;t s i ; where s i t , T s i t s i t .Substituting3intothetimederivativeof3yields d dt ~ d s i =c t = 8 > > < > > : )]TJ/F25 11.9552 Tf 9.299 0 Td [(k s i t ~ d s i =c t ;t< s i ; )]TJ/F15 11.9552 Tf 11.291 0 Td [( k 1 + k s i t ~ d s i =c t ;t s i : Remark 3.1 . UnderAssumption3.1, s i t 0 since k v c t k maybezeroforany periodoftime;however,forAssumption3.1tobesatised,therewillbetimeswhere s i t > 0 .Specically,therewillexistasetoftimes T s i S N h i =1 t h i )]TJ/F25 11.9552 Tf 12.356 0 Td [(&;t h i suchthat s i t > 0 ; 8 t 2T s i ,where h i ;t h i arefrom3,implyingthedesignin3may improvetransientperformanceunderAssumption3.1. 3.3StabilityAnalysis Sincetheobserverin3isanextensionof3,theresultingstabilityanalysisof3isidenticaltoTheorem3.1andisexcluded.Let t , 44

PAGE 45

~ d s i =c t ~ d k=c t ~ d s i =k t T and V t : R 3 ! R beacandidateLyapunovfunctiondenedas V t , 1 2 T t t ; whichcanbeboundedas 1 2 k t k 2 V t 1 2 k t k 2 . Theorem3.1. Theobserverupdatelawsdenedin3,3,and3ensure theestimationerrorsin t areboundedandgloballyexponentiallystableinthesense that k t kk k exp s i exp )]TJ/F25 11.9552 Tf 9.299 0 Td [(t : Proof. Takingthetimederivativeof3thensubstitutingtheerrorderivativesin 3,3,and3,resultsintheupperbound d dt V t 8 > > < > > : 0 ;t< s i ; )]TJ/F15 11.9552 Tf 9.299 0 Td [(2 V t ;t s i ; where = min f k 1 ;k 2 ;k 3 g .From3and3,[116,Theorem8.4]canbe invokedtoconcludethat k t k 2 k k 2 ; 8 t s i .From[116,Theorem4.10], k t k 2 k s i k 2 exp s i exp )]TJ/F15 11.9552 Tf 9.298 0 Td [(2 t ; 8 t s i .Evaluatingtherstboundon k t k 2 at t = s i thensubstitutingintothesecondboundon k t k 2 andtakingthe squarerootyields3. 3.4ExperimentalResults Fifteenexperimentsareprovidedtodemonstratetheperformanceofthedeveloped observers.Theperformanceofthedevelopedobserversin3-3and3– 17weretestedusingtheEigen3,OpenCV,andROSc++librariescf.,[117],[118], and[119],respectively.AKobukiTurtlebotwitha 1920 1080 monochromeiDSuEye camera,showninFigure3-1,providesimagesat30Hz.Featureswereextractedfrom imagesofacheckerboard,showninFigure3-1,with 8 6 cornerstotalfeatures 45

PAGE 46

Figure3-1.Photocourtesyofauthor.Imageshowsthecheckerboard,KobukiTurtlebot, andaniDSuEyecamerausedforexperiments. whereeachsquareis 0 : 06 meters 0 : 06 meters.Thelinearandangularvelocityof thecamerawerecalculatedusingtheTurtlebotwheelencodersandagyroscopeat 50Hz.AnOptitrackmotioncapturesystemoperatingat120Hzmeasuredthepose ofthecameraandcheckerboard,allowingforthepositionofeachfeaturerelativeto thecameratobeknownforcomparison.Imageprocessingandestimatorsexecuted simultaneouslyonacomputerwithanInteli7processorrunningat3.4GHz.The errorofthedistanceestimatorsin3and3arecomparedtotheestimator in[30]andanEKF.Sincetheestimatorin[30]andtheEKFestimatetheinversedepth i.e., 1 z s i =c t ,where z s i =c t isthedepthtofeature s i from c expressedin F c ,whilethe estimatorsin3and3estimatethedistance, z s i =c t thethirdelementof u s i =c t d s i =c t isusedtocomparethefourmethods. Foreachexperiment,theTurtlebotstartedapproximately3metersawayfrom thecheckerboard,andvarioustrajectoriesweretaken,showninFigure3-2,while maintainingthecheckerboardintheFOV.Ineachexperiment,theTurtlebotinitially 46

PAGE 47

startedatrest,andaftertraveling2.5meterstheestimatorswerestoppedtoprovide alargebaseline.AftertheTurtlebotstarteditsmotion,theTurtlebottraveledwithout stoppinguntilaftertheestimatorswerestoppedinanefforttohavetheidealconditions forestimationi.e.,continuousmotionofthefeaturesintheFOVandcontinuouslinear motionofthecameraasisrequiredfor[30]andtheEKF.Theinitialdistancesforthe estimatorsin3,3,3,theestimatorin[30]andtheEKFwereinitialized withadepthof0.5meters.Theestimatorin3wasinitializedto0.0meters.The gainsfor3-3and3wereselectedas k 1 = k 2 = k 3 =25 : 0 and k =25 : 0 k 1 ,respectively.Themaximumvaluefor & was5seconds.The48feature estimateswerecombinedusingameanateachinstancetoupdate3.Thegain forthemethodin[30]wasselectedtobe100.0.ThecovariancematricesfortheEKF weredeterminedthroughexperimentationandvaluesfoundtohavelowsteadystate errorandfastconvergencewere R = r 2 6 4 10 01 3 7 5 forthemeasurementcovariance, Q = r 2 6 6 6 6 4 10000 01000 00100000 3 7 7 7 7 5 fortheprocesscovariance,and P = r 2 6 6 6 6 4 100 010 00150000 3 7 7 7 7 5 for theinitialstatecovariance,respectively,where r =0 : 00001 . Remark 3.2 . Forageneralsystem,theoptimalmethodtoselectgooddataandremove baddatafor3e.g.,duetonoiseorparameterchangesremainsanopenproblem andisoftenlefttointuition.Intheseexperiments,theselectionofdatawasbasedoff ofknowledgeaboutapproximatenoisemagnitudesinfeaturetrackingandvelocity measurements.Specically,datawasonlyselectedif kY s i t k Y and kU s i t k U where Y ; U 2 R > 0 arevaluesselectedbasedontrialanderror.Because Y s i t isfull columnrankwhen kY s i t k Y and kU s i t k U ,thevalueof d s i =k approximated by Y s i t and U s i t canbedetermined.Since d s i =k > 0 ,andsomeknowledgeabout reasonablevaluesforthedistances,valuesof d s i =k canbedeterminedandonly Y s i t 47

PAGE 48

Figure3-2.Thecameratrajectoriesforeachofthe15experiments. and U s i t valuesthathad d s i =k estimatesfallingintheseboundsaresavedto Y s i and U s i .Thevaluesfor Y and U were Y = U =0 : 1 andtheboundsonthedistancewere selectedas0.5metersand6.0meters. Acomparisonoftheexampleperformanceovertimeoftheestimatorsisshownin Figure3-3andTables3-1-3-3,wherebeforelearningrefersto t< max f s i g andafter learningrefersto t max f s i g .Figure3-3showsacomparisonofthesumofthenorm ofeachdeptherroracrossthe48featuresi.e., P 48 s i =1 k ~ z s i =c t k onthecheckerboard fortheestimatorsin3,3,[30],andtheEKF,respectively.AsshowninFigure 3-3,theEKFestimatorstartsconvergingthefastest,butreachessteadystateslower thantheestimatorsin3,3,and[30].However,afterconverging,theEKFhas asimilarerrortotheestimatorsin3and3.Figure3-3alsoshowsthatthe estimatorin3doesnotconvergeuntilsufcientlearningoccursat t =3 : 6 seconds forexperiment11.TheextensioninSection3.2showsanadvantageofusingcurrent input-outputdataintheestimator,asshownbythemeanRMSerrorsinTable3-1and 48

PAGE 49

Figure3-3.Thesumofthenormofeachdeptherroracrossthe48featuresi.e., P 48 s i =1 k ~ z s i =c t k forExperiment11.Estimator1redrefersto3, Estimator2magentarefersto3,Estimator3greenrefersto[30], andEstimator4bluereferstotheEKF.Theblackverticallineindicatesthe timewhenenoughinformationwascollectedforlearning. Table3-1.RMSDepthErrorandPositionErrorinMetersOver15Experiments ExperimentEstimator1Estimator2Estimator3Estimator4Trajectory 170.49955.52558.32555.4350.017 266.35842.06552.82742.8380.026 380.46664.70167.61663.7060.037 476.25059.58365.59359.5650.032 579.41963.24470.79865.6740.032 682.61169.30071.99668.5580.018 765.97148.19257.28746.3920.020 873.86460.00264.29059.8580.027 977.69961.46869.22760.7080.020 1072.05355.91663.67955.7710.023 1175.47259.75759.91660.7980.016 1277.57163.62366.45163.6860.022 1383.66367.81772.99568.2740.036 1474.03058.76466.94159.2450.030 1577.74256.06764.98356.6120.029 Mean75.57859.06864.86259.1410.026 StandardDeviation5.0866.8195.5226.9380.007 49

PAGE 50

Figure3-4.Thepositionerrorofthecameraandthedistanceerrorusingtheestimator in3forExperiment11. 50

PAGE 51

Table3-2.RMSDepthErrorandPositionErrorinMetersOver15ExperimentBefore Learning ExperimentEstimator1Estimator2Estimator3Estimator4Trajectory 1133.393105.884109.492104.9840.007 2137.85587.811107.41088.2520.005 3137.133110.694114.093108.1390.005 4134.047105.321113.742104.7320.005 5138.285110.761120.448112.1890.005 6137.122115.571118.032113.9510.005 7126.25492.370103.99287.5030.004 8128.662104.859109.866103.4210.004 9136.862108.817119.151106.5690.005 10128.454100.033110.13698.8360.004 11137.677109.201107.660110.5060.006 12132.034108.494111.989108.1050.005 13139.927113.816119.680112.4770.009 14131.922104.987114.095101.1540.005 15142.222103.213116.301103.4890.006 Mean134.790105.455113.073104.2870.005 StandardDeviation4.4407.2124.8257.6450.001 Table3-3.RMSDepthErrorandPositionErrorinMetersOver15ExperimentAfter Learning ExperimentEstimator1Estimator2Estimator3Estimator4Trajectory 113.3743.97912.8388.5670.017 210.6551.86813.8948.1660.029 312.5132.65714.24610.1230.045 412.5171.18015.6567.6290.038 513.4311.90320.77717.7590.038 613.3592.41416.8897.5620.022 710.4033.50222.17710.3900.024 811.6323.80417.09911.6100.033 912.6611.68319.9599.8470.024 1011.1621.86120.0359.4570.028 1110.2392.82813.4838.1190.019 129.9961.57612.8457.8330.027 1313.0933.26119.96716.6080.044 1411.0443.09024.67721.2830.036 1513.8852.59918.5428.4940.034 Mean11.9982.54717.53910.8970.031 StandardDeviation1.2840.8313.5674.0800.008 51

PAGE 52

3-2.Specically,theestimatorin3isatadisadvantagetotheotherestimators beforesufcientexcitationhasoccurred,whiletheestimatorin3startsconverging tothetruedepthsatasimilarrateastheestimatorin[30].TheaverageRMSerrorof 3ismorethan10metersgreaterthantheotherestimatorsovertheentiretyof eachexperiment,andmorethan20metersgreaterbeforelearning.However,Table3-3 showsthattheaverageerrorof3afterlearningisonly1metergreaterthanthe EKFonaverage. TheextensioninSection3.2,specicallythedesignin3,improvestheerror convergenceof3suchthattheRMSerrorislowerthantheEKFonaverage.As showninTable3-1,theaverageerrorovertheentireexperimentruntimewas59.068 metersfor3comparedto59.141metersfortheEKF.Afterlearning,theaverage RMSerrorfortheestimatorin3wassmaller.547meterscomparedtotheEKF .897meters.However,asshowninTable3-2,theRMSerrorbeforelearningwas smallerfortheEKFcomparedto3,wheretheerrorswere104.287metersforthe EKFand105.455metersfortheestimatorin3.Additionally,Tables3-1-3-3show thatthedesignin3hasasmallerRMSerrorthanthedesignin[30]onaverage. Figure3-4andTables3-1-3-3showthepositionerrorusing3issmallwithan averageRMSerrorof0.026metersovertheentirerun;0.005metersbeforelearning and0.031afterlearning,whichisapproximately1.2%errorrelativetotrajectorylength. Theerrorincreaseafterlearningisaresultofnoise,whichasshowninFigure3-3and Table3-3causesthedeptherrortoremainsmallbutboundedatapproximately1.8% oftheinitialerror.Theseexperimentalresultsdemonstratetheabilityoftheobserverin 3and3toleveragebothimmediateinformationandlearningtobothconverge quicklywithlowRMSerrorandmaintainalowRMSerrorafterconverging. 3.5Summary Novelobserversusingasinglecameraandstructurefrommotiontheoryare developedtoestimatetheEuclideandistancetofeaturesonastationaryobjectand 52

PAGE 53

theEuclideantrajectorythecameratakeswhileobservingtheobject.Unlikeprevious resultsthatestimatetheinversedepthtofeatures,thedevelopedobserverforestimating theEuclideandistancetofeaturesdoesnotrequirethepositivedepthconstraint.A Lyapunov-basedstabilityanalysisshowstheobservererrorexponentiallyconverges wherepersistenceofexcitationisreplacedbyniteexcitationthroughtheuseofICL.An experimentalcomparisonofthedevelopedestimatortoexistingestimatorsshowsthatit achieveslowerRMSerrorwhencomparingfeaturedepthestimatesonaverage,andthe RMSerrorofthepositionalsoremainswithin1.8%. 53

PAGE 54

CHAPTER4 POSITIONESTIMATIONUSINGAMONOCULARCAMERASUBJECTTO INTERMITTENTFEEDBACK Inthischapter,anextensiontothelearningapproachesin[32],[45],andChapter3, isdevelopedthatappliesanewlearningstrategythatmaintainsacontinuousestimate ofthepositionofthecameraandestimatesthestructureoffeaturesastheycomeinto theFOV.Furthermore,thedevelopedlearningstrategyallowssimulatedmeasurements offeaturesfromobjectsthatarenolongerintheFOVenablingacontinuousestimate ofthedistancetofeatureswithrespecttothecamera.Additionally,thisapproach showshowtheextendedobserverremovesthepositivedepthconstraintrequiredby allpreviousstructurefrommotionapproaches.Usingthisapproach,acameramay traveloverlargedistanceswithoutkeepingspecicfeaturesintheFOVforalltimeand allowobjectstopermanentlyleavetheFOVifnecessary.ALyapunovbasedstability analysisprovesthattheobserversforestimatingthepathofthecameraaswellasthe structureofeachsetofobjectsaregloballyexponentiallystablewhilefeaturesareinthe FOV.Aswitchedsystemsanalysisisusedtodevelopdwell-timeconditionstoindicate howlongafeaturemustbetrackedtoensurethedistanceestimationerrorisbelowa threshold.Afterthedistanceestimateshaveconvergedbelowthethreshold,thefeature maybeusedtoupdatethepositionofthecamera.Ifafeaturedoesnotsatisfythedwelltimecondition,itisneverusedtoupdatethepositionoftheagent.Furthermore,the approachdoesnotrequireanewsetoffeaturestobeintheFOVwhenolderfeatures leavetheFOV.Finally,ifarecognizedlandmarkenterstheFOV,thefeedbackisused tocompensatefordrifterror.Theresultsinthischapterdemonstratethattheobserver andpredictorstrategyoutperformsapredictor-onlystrategycf.,[62]and[63]when feedbackisunavailable,providedstructureestimationerrorislessthanthethresholds usedtodevelopthedevelopeddwell-times. 54

PAGE 55

4.1LearningFeatureStructure Ingeneral,thereisnorelationshipbetweenanytwoobjectsthatmaybeexploited toimmediatelyestimatethestructureofthe i thfeatureonthe j thobjecti.e., s j;i when t = a j ,sincetheobjectsareunknownandtheremayonlybeoneobjectintheFOV. Whilethe i thfeatureonthe j thobjecthastheeigenvalueconditionsatisedi.e., s j;i t = a ,anapproachmotivatedbythedevelopmentin[32]and[45]isusedtolearn aconstantunknowndistance, d s j;i =k j .Specically,2and2areintegratedover atimewindow & 2 R > 0 yielding 2 6 4 d s j;i =c t d k j =c t 3 7 5 )]TJ/F31 11.9552 Tf 11.955 27.616 Td [(2 6 4 d s j;i =c t )]TJ/F25 11.9552 Tf 9.963 0 Td [(& d k j =c t )]TJ/F25 11.9552 Tf 9.963 0 Td [(& 3 7 5 = )]TJ/F26 7.9701 Tf 14.452 18.663 Td [(t Z t )]TJ/F26 7.9701 Tf 6.587 0 Td [(& 2 6 4 u T s j;i =c u T k j =c 3 7 5 v c d;t>&; where & maybeconstantinsizeorchangeovertime.Asdescribedin2and 2,while s j;i 2P s j t , )]TJ/F26 7.9701 Tf 14.453 12.022 Td [(t R t )]TJ/F26 7.9701 Tf 6.587 0 Td [(& 2 6 4 u T s j;i =c u T k j =c 3 7 5 v c d isknown,but 2 6 4 d s j;i =c t d k j =c t 3 7 5 and 2 6 4 d s j;i =c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& d k j =c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& 3 7 5 areunknownindicatingtheleftsideoftheequalityin4isunknown. However,when s j;i t = a ,therelationshipin2maybeutilizedin4yielding Y s j;i t d s j;i =k j = U s j;i t ;t> a j ; where Y s j;i t ; U s j;i t 2 R 2 aredenedas Y s j;i t , 8 > > > > > > < > > > > > > : a s j;i t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a s j;i a l s j;i ;t 2 a l s j;i ; a l s j;i + & i ; a s j;i t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a s j;i t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& ;t 2 a l s j;i + &; u l s j;i i ; 0 2 1 ;t 2 u l s j;i ; a l +1 s j;i i ; 55

PAGE 56

U s j;i t , 8 > > > > > > > > > > > > > > < > > > > > > > > > > > > > > : )]TJ/F26 7.9701 Tf 15.85 12.022 Td [(t R a l s j;i 2 6 6 4 u T s j;i =c u T k j =c 3 7 7 5 v c d;t 2 a l s j;i ; a l s j;i + & i ; )]TJ/F26 7.9701 Tf 14.453 12.022 Td [(t R t )]TJ/F26 7.9701 Tf 6.586 0 Td [(& 2 6 6 4 u T s j;i =c u T k j =c 3 7 7 5 v c d;t 2 a l s j;i + &; u l s j;i i ; 0 2 1 ;t 2 u l s j;i ; a l +1 s j;i i ; and a l s j;i ; u l s j;i 2 h a j ; u s j;i i representtimeinstanceswhen s j;i t = a and s j;i t = u , respectively,and l 2 Z > 0 ; representstheindexcorrespondingtoeachswitchforfeature s j;i .Multiplyingbothsidesof4by Y T s j;i t yields Y T s j;i t Y s j;i t d s j;i =k j = Y T s j;i t U s j;i t : Ingeneral, Y s j;i t willisnotfullcolumnrankwhile s j;i t = a e.g.whenthecamera isstationaryimplying Y T s j;i t Y s j;i t 0 andcannotbedeterminedwhile s j;i t = u . However,theequalityin4maybeevaluatedatanyinstanceintimeandsummed togetheri.e.,historystacksyielding Y s j;i d s j;i =k j = U s j;i ; where Y s j;i , N P h =1 Y T s j;i t h Y s j;i t h , U s j;i , N P h =1 Y T s j;i t h U s j;i t h , t h 2 a j ; u s j;i i ,and N 2 Z > 1 . Assumption4.1. Thecameramotionoccurssothereexistsasetoffeatures A s j t O s j ,constant 2 R > 0 ,andasetoftimes j , s j;i s j;i 2A s j ,suchthatforalltime t> s j;i , min n Y s j;i o > ,where s j;i 2 a j ; u s j;i and min fg istheminimumeigenvalueof 56

PAGE 57

fg .Let A c s j t , O s j nA s j t .Furthermore, a s j )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [( u j 4 ,where a s j t 2 Z 0 represents thenumberoffeaturesin A s j t . 1 Learningthesubsetin A s j t islessrestrictivethanassumingallofthefeatures in O s j arelearnedbecausethereisnoguaranteethatthemotionofthecamerawill besufcientbeforeeveryfeatureleavestheFOVpermanently.Cameramotionin Assumption4.1canbeveriedonlineandisheuristicallyeasytosatisfybecause itonlyrequiresanitecollectionofsufcientlyexciting Y s j;i t and U s j;i t toyield min n Y s j;i o > .Thetimesin j areunknown;however,theycanbedetermined onlinebycheckingtheminimumeigenvalueof Y s j;i foreacheachfeature. IfmotionoccursasdiscussedinAssumption4.1,theconstantunknowndistance, d s j;i =k j ,canbedeterminedforfeature s j;i 2A s j t from4yielding d s j;i =k j = X s j;i ;s j;i 2A s j t ; where X s j;i , )]TJ/F24 7.9701 Tf 6.587 0 Td [(1 Y s j;i U s j;i ;s j;i 2A s j t .Substituting4into2yields Y s j;i t 2 6 4 d s j;i =c t d k j =c t 3 7 5 = R k j =c t u s j;i =k j X s j;i ;s j;i 2A s j t : Sincetherewillalwaysbeadelaybefore X s j;i isdeterminedfor s j;i 2A s j t ,an additionalrelationshipisdevelopedinanefforttoprovidefeedbackbasedontherateof changeofthedirectiontothefeature,motivatedbythedevelopmentin[30].Specically, thetimerateofchangeof u s j;i =c t isapproximatedandusedtoprovidefeedback. Takingthetimederivativeof u s j;i =c t yields d dt u s j;i =c t = )]TJ/F25 11.9552 Tf 9.298 0 Td [(! c t u s j;i =c t 1 See[114]or[115]forsomeexamplesofmethodsforselectingdatatosatisfythe assumption. 57

PAGE 58

+ 1 d s j;i =c t u s j;i =c t u T s j;i =c t )]TJ/F25 11.9552 Tf 8.966 0 Td [(I 3 3 v c t ; implying s j;i t d s j;i =c t = s j;i t ; where s j;i t , d dt u s j;i =c t + ! c t u s j;i =c t , s j;i t , u s j;i =c t u T s j;i =c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(I 3 3 v c t , ! c t , 2 6 6 6 6 4 0 )]TJ/F25 11.9552 Tf 9.299 0 Td [(! z ! y ! z 0 )]TJ/F25 11.9552 Tf 9.299 0 Td [(! x )]TJ/F25 11.9552 Tf 9.298 0 Td [(! y ! x 0 3 7 7 7 7 5 ,and I 3 3 , 2 6 6 6 6 4 100 010 001 3 7 7 7 7 5 . Substituting2into4yields s j;i t u s j;i t 2 6 4 d k j =c t d s j;i =k j 3 7 5 = s j;i t : Letacompositesignal s j;i t 2 R 3 bedenedas s j;i t , 2 6 6 6 6 4 d s j;i =c t d k j =c t d s j;i =k j 3 7 7 7 7 5 .Combining4 and4yields Y X s j;i t s j;i t = u X s j;i t ; where Y X s j;i t , 2 6 4 Y s j;i t 0 3 1 0 1 2 1 3 7 5 and u X s j;i t , X s j;i 2 6 4 R k j =c t u s j;i =k j 1 3 7 5 ,andcombining 4and4yields Y s j;i t s j;i t = u s j;i t ; where Y s j;i t , 2 6 4 s j;i t 0 3 2 0 3 1 s j;i t u s j;i t 3 7 5 and u s j;i t , 2 6 4 s j;i t s j;i t 3 7 5 . 4.2FeatureObserverDesignWithoutObjectReturn Theestimationerrorsforfeature s j;i 2O s j , ~ d s j;i =c t ; ~ d k j;i =c t ; ~ d s j;i =k j t 2 R ,are denedas 58

PAGE 59

~ d s j;i =c t , d s j;i =c t )]TJ/F15 11.9552 Tf 14.021 3.155 Td [(^ d s j;i =c t ; ~ d k j;i =c t , d k j =c t )]TJ/F15 11.9552 Tf 14.021 3.155 Td [(^ d k j;i =c t ; and ~ d s j;i =k j t , d s j;i =k j )]TJ/F15 11.9552 Tf 14.021 3.155 Td [(^ d s j;i =k j t ; where ^ d s j;i =c t ; ^ d s j;i =k j t 2 R aretheestimatesof d s j;i =c t and d s j;i =k j ,respectively,and ^ d k j;i =c t istheestimateof d k j =c t byfeature s j;i .Thecombinederrorforfeature s j;i is quantiedusing4-4as ~ s j;i t , s j;i t )]TJ/F15 11.9552 Tf 12.781 0 Td [(^ s j;i t ; where ^ s j;i t , 2 6 6 6 6 4 ^ d s j;i =c t ^ d k j;i =c t ^ d s j;i =k j t 3 7 7 7 7 5 istheestimateof s j;i t implying ~ s j;i t , 2 6 6 6 6 4 ~ d s j;i =c t ~ d k j;i =c t ~ d s j;i =k j t 3 7 7 7 7 5 . 4.2.1FeatureObserverDesign Ifthe j thobjectwillneverreturntotheFOV,noupdatescanbeguaranteedafter s j;i 2P c s j t .Inthiscase,eachfeatureisdesignedasthoughitwouldremainintheFOV andthelastknownestimateisusedafterthefeatureleavestheFOVi.e.,azero-order hold.Motivatedbythesubsequentanalysis,theestimatorupdatelawfor ^ s j;i t is denedas d dt )]TJ/F15 11.9552 Tf 6.306 -9.683 Td [(^ s j;i t , 8 > > > > > > < > > > > > > : 0 3 1 ;s j;i 2P c s j t ; proj s j;i t ;s j;i 2P s j t ; proj s j;i t + X s j;i t ;s j;i 2A s j P s j t ; where s j;i t , 2 6 6 6 6 4 )]TJ/F25 11.9552 Tf 9.298 0 Td [(u T s j;i =c t v c t )]TJ/F25 11.9552 Tf 9.298 0 Td [(u T k j =c t v c t 0 3 7 7 7 7 5 + K Y T s j;i t u s j;i t )]TJ/F25 11.9552 Tf 12.511 0 Td [(K Y T s j;i t Y s j;i t ^ s j;i t , X s j;i t , K X Y T X s j;i t u X s j;i t )]TJ/F25 11.9552 Tf 13.009 0 Td [(K X Y T X s j;i t Y X s j;i t ^ s j;i t ,and K ;K X 2 R 3 3 59

PAGE 60

arepositivedenitegainmatrices,and proj isaprojectionoperatortobound d 2 ^ d s j;i =c t d , d 2 ^ d s j;i =k j t d ,and d 1 ^ d k j;i =c t . 2 Takingthetimederivativeof4,substituting2-2,4,4, 4,and4,andsimplifyingyields d dt )]TJ/F15 11.9552 Tf 6.305 -9.684 Td [(~ s j;i t = 8 > > > > > > > > > > > > > > < > > > > > > > > > > > > > > : 2 6 6 6 6 6 6 4 )]TJ/F25 11.9552 Tf 9.298 0 Td [(u T s j;i =c t v c t )]TJ/F25 11.9552 Tf 9.298 0 Td [(u T k j =c t v c t 0 3 7 7 7 7 7 7 5 ;s j;i 2P c s j t ; )]TJ/F15 11.9552 Tf 9.299 0 Td [( s j;i t ~ s j;i t ;s j;i 2P s j t ; )]TJ/F15 11.9552 Tf 9.299 0 Td [( s j;i t ~ s j;i t ;s j;i 2A s j t P s j t ; d dt )]TJ/F15 11.9552 Tf 6.306 -9.683 Td [(~ s j;i t = 8 > > < > > : )]TJ/F15 11.9552 Tf 9.298 0 Td [( s j;i t ~ s j;i t ;s j;i 2P s j t )]TJ/F15 11.9552 Tf 9.298 0 Td [( s j;i t ~ s j;i t ;s j;i 2A s j t P s j t where s j;i t , s j;i t + X s j;i t , s j;i t , K Y T s j;i t Y s j;i t ,and X s j;i t , K X Y T X s j;i t Y X s j;i t .Whilefeature s j;i 2P s j t ,theremaybeaset oftimeswhere s j;i t canimprovetheestimateifaPEassumptionissatised. Let B s j t , s j;i 2A s j t P s j t : s j;i t = a and B c s j t , O s j nB s j t .If feature s j;i 2B s j t , s j;i t > 0 and min s j;i t > a ;however,iffeature s j;i 2A s j t P s j t B c s j t , s j;i t 0 since Y T s j;i t Y s j;i t 0 .Afterfeature s j;i 2P c s j t ,theobjectneverreturnstotheFOVandtheerrorwillgrowsincenoupdate isavailable. 4.2.2ObserverDesignStabilityAnalysis Tofacilitatethesubsequentdevelopment,let L s j;i , n l 2 Z > 0 : a l s j;i > s j;i o and a l s j;i , u l s j;i )]TJ/F25 11.9552 Tf 12.045 0 Td [( a l s j;i .Let V s j;i )]TJ/F15 11.9552 Tf 6.305 -9.684 Td [(~ s j;i t : R 3 ! R beacandidateLyapunovfunctiondened 2 See[32,AppendixE]or[33,Remark3.7]forexamplesonimplementingasmooth projectionoperator 60

PAGE 61

as V s j;i )]TJ/F15 11.9552 Tf 6.306 -9.684 Td [(~ s j;i t , 1 2 ~ T s j;i t ~ s j;i t ; whichcanbeboundedas 1 2 k ~ s j;i t k 2 V s j;i )]TJ/F15 11.9552 Tf 6.305 -9.684 Td [(~ s j;i t 1 2 k ~ s j;i t k 2 . Lemma4.1. Theobserverupdatelawdenedin4ensurestheestimationerror ~ s j;i t isboundedforthefeature s j;i 2P s j t inthesensethat k ~ s j;i t kk ~ s j;i )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [( a j k ;s j;i 2P s j t : Proof. Takingthetimederivativeof4,substituting4forthecasewhenfeature s j;i 2P s j t andusing min n s j;i t o 0 for s j;i 2P s j t yields d dt )]TJ/F25 11.9552 Tf 5.479 -9.683 Td [(V s j;i )]TJ/F15 11.9552 Tf 6.306 -9.683 Td [(~ s j;i t 0 : Invoking[116,Theorem8.4]on4yields k ~ s j;i t k 2 k ~ s j;i )]TJ/F25 11.9552 Tf 5.48 -9.683 Td [( a j k 2 andtakingthe squarerootyields4. Lemma4.2. Theobserverupdatelawdenedin4ensurestheestimationerror ~ s j;i t isboundedforfeature s j;i 2A s j t P s j t B c s j t inthesensethat k ~ s j;i t kk ~ s j;i u l s j;i k ;s j;i 2A s j t P s j t B c s j t : Proof. Takingthetimederivativeof4,substituting4forthecasewhenfeature s j;i 2A s j t P s j t andusing min s j;i t 0 for s j;i 2A s j t P s j t B c s j t yields d dt )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(V s j;i )]TJ/F15 11.9552 Tf 6.306 -9.684 Td [(~ s j;i t 0 : Invoking[116,Theorem8.4]on4yields k ~ s j;i t k 2 k ~ s j;i u l s j;i k 2 andtakingthe squarerootyields4. Lemma4.3. Theobserverupdatelawdenedin4ensurestheestimationerror ~ s j;i t isexponentiallyconvergingforfeature s j;i 2B s j t inthesensethat k ~ s j;i t kk ~ s j;i a l s j;i k exp )]TJ/F25 11.9552 Tf 9.298 0 Td [( t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l s j;i ;s j;i 2B s j t : 61

PAGE 62

Proof. Takingthetimederivativeof4,substituting4forthecasewhenfeature s j;i 2A s j t P s j t ,andusing min s j;i t > a for s j;i 2B s j t yields d dt )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(V s j;i )]TJ/F15 11.9552 Tf 6.306 -9.684 Td [(~ s j;i t )]TJ/F15 11.9552 Tf 21.918 0 Td [(2 V s j;i )]TJ/F15 11.9552 Tf 6.305 -9.684 Td [(~ s j;i t ; where , a min f K X g .Invoking[116,Theorem4.10]on4yields k ~ s j;i t k 2 k ~ s j;i a l s j;i k 2 exp )]TJ/F15 11.9552 Tf 9.299 0 Td [(2 t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l s j;i andtakingthesquarerootyields4. Theorem4.1. Whenfeature s j;i 2A s j t P s j t leavestheFOV,theswitchedsystem denedby s j;i t andtheobserverupdatelawdenedin4is globallyuniformly ultimatelyboundedGUUB as k ~ s j;i u s j;i kk ~ s j;i a s j;i k exp 0 @ )]TJ/F25 11.9552 Tf 9.299 0 Td [( X l 2L s j;i a l s j;i 1 A : Proof. Usingtheboundsin4,4,and4implies k ~ s j;i u l s j;i k k ~ s j;i a l s j;i k exp )]TJ/F25 11.9552 Tf 9.298 0 Td [( a l s j;i and k ~ s j;i )]TJ/F25 11.9552 Tf 5.479 -9.683 Td [( u l +1 s j;i kk ~ s j;i u l s j;i k exp )]TJ/F28 11.9552 Tf 5.48 -9.683 Td [()]TJ/F25 11.9552 Tf 9.298 0 Td [( a l +1 s j;i .Substitutingtherstinequalityintothesecond,andusingtherelationshipforall l 2L s j;i leadsto 4. Asshownin4,thenalerrorwhenafeatureleavesisbounded;however, once s j;i 2P c s j t ,theestimationerrorsin ~ d s j;i =c t and ~ d k j;i =c t willdivergesinceno observationsaremade.Forexample,using d dt )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(V s j;i )]TJ/F15 11.9552 Tf 6.305 -9.684 Td [(~ s j;i t =~ T s j;i t d dt )]TJ/F15 11.9552 Tf 6.306 -9.684 Td [(~ s j;i t andsubstituting4and4forthecasewhenfeature s j;i 2P c s j t implies d dt )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(V s j;i )]TJ/F15 11.9552 Tf 6.306 -9.684 Td [(~ s j;i t k ~ d s j;i =c t kk v c t k + k ~ d k j;i =c t kk v c t k ,implyingtheerrorgrows. Inapplicationswhereitisnotpossibletoreturntoanobject,thisgrowthcannotbe compensatedfor,implyingtheestimatorforfeature s j;i isnotcontinuedafter s j;i 2 P c s j t . 4.3FeatureObserverDesignWithObjectReturn Asdiscussedinrecentworksuchas[62]and[63],theobjectiveofexploring unknownenvironmentswherefeedbackisunavailablerequiresanagenttoreturnto 62

PAGE 63

regionswherefeedbackisavailabletocompensateforerrorgrowth.Thisreturnwill enabletheabilitytoreducetheultimateboundoftheerrordescribedin4and compensatefortheerrorgrowththroughthedevelopmentofdwell-timeconditions. 4.3.1FeaturePredictorDesign Asshownin4,iffeedbackisunavailablei.e., s j;i 2P c s j t , u s j;i =c t is unknown.Therefore,apredictorisdesignedtoestimate u s j;i =c t as d dt ^ u s j;i =c t , u s j;i =c t ;s j;i 2P c s j t ; where u s j;i =c t , )]TJ/F25 11.9552 Tf 9.299 0 Td [(! c t ^ u s j;i =c t + 1 ^ d s j;i =c t ^ u s j;i =c t ^ u T s j;i =c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(I 3 3 v c t .While s j;i 2P s j t ,aresetmapcf.,[62]and[63]isusedtoset ^ u s j;i =c t ! u s j;i =c t .Also, since u s j;i =c t isaunitvector, k ^ u s j;i =c t k =1 .Letthepredictorerrorfor u s j;i =c t be quantiedas ~ u s j;i =c t , u s j;i =c t )]TJ/F15 11.9552 Tf 12.685 0 Td [(^ u s j;i =c t ; where k ~ u s j;i =c t k 2 . Sincethetimederivativeof u k j =c t is d dt u k j =c t = )]TJ/F25 11.9552 Tf 9.298 0 Td [(! c t u k j =c t + 1 d k j =c t u k j =c t u T k j =c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(I 3 3 v c t ; apredictorsimilarto4isdesignedtoestimate u k j =c t as d dt ^ u k j;i =c t , u k j;i =c t ;s j;i 2P c s j t ; where u k j;i =c t , )]TJ/F25 11.9552 Tf 9.299 0 Td [(! c t ^ u k j;i =c t + 1 ^ d k j;i =c t ^ u k j;i =c t ^ u T k j;i =c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(I 3 3 v c t .Additionally,while s j;i 2P s j t ,aresetmapisusedtoset ^ u k j;i =c t ! u k j =c t .Also,since u k j =c t isaunitvector, k ^ u k j;i =c t k =1 .Letthepredictorerrorfor u k j =c t bequantied 63

PAGE 64

as ~ u k j;i =c t , u k j =c t )]TJ/F15 11.9552 Tf 12.685 0 Td [(^ u k j;i =c t where k ~ u k j;i =c t k 2 . Anestimateof R k j =c t isestablishedusingtheunitquaternionformoftheorientation,whichcanberepresentedas q k j =c t 2 R 4 ,where q T k j =c t q k j =c t =1 .The derivativewithrespecttotimefor q k j =c t is d dt )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q k j =c t = )]TJ/F15 11.9552 Tf 10.494 8.088 Td [(1 2 B )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(q k j =c t ! c t ; where B q , 2 6 6 6 6 6 6 6 4 )]TJ/F25 11.9552 Tf 9.299 0 Td [(q 2 )]TJ/F25 11.9552 Tf 9.298 0 Td [(q 3 )]TJ/F25 11.9552 Tf 9.299 0 Td [(q 4 q 1 )]TJ/F25 11.9552 Tf 9.298 0 Td [(q 4 q 3 q 4 q 1 )]TJ/F25 11.9552 Tf 9.299 0 Td [(q 2 )]TJ/F25 11.9552 Tf 9.298 0 Td [(q 3 q 2 q 1 3 7 7 7 7 7 7 7 5 ; q 1 ;q 2 ;q 3 ;q 4 2 R arethefourelementsofaunitquaternion q t and B T q t B q t = I 3 3 . 3 Therotationmatrixrepresentationofaunitquaternion q t is R q , 2 6 6 6 6 4 1 )]TJ/F15 11.9552 Tf 11.955 0 Td [(2 q 2 3 + q 2 4 2 q 2 q 3 )]TJ/F25 11.9552 Tf 11.955 0 Td [(q 4 q 1 2 q 2 q 4 + q 3 q 1 2 q 2 q 3 + q 4 q 1 1 )]TJ/F15 11.9552 Tf 11.955 0 Td [(2 q 2 2 + q 2 4 2 q 3 q 4 )]TJ/F25 11.9552 Tf 11.955 0 Td [(q 2 q 1 2 q 1 q 4 )]TJ/F25 11.9552 Tf 11.955 0 Td [(q 3 q 1 2 q 3 q 4 + q 2 q 1 1 )]TJ/F15 11.9552 Tf 11.955 0 Td [(2 q 2 2 + q 2 3 3 7 7 7 7 5 : Similarto4,apredictorisdesignedfor q k j =c t as d dt )]TJ/F15 11.9552 Tf 6.339 -9.684 Td [(^ q k j =c t = )]TJ/F15 11.9552 Tf 10.494 8.088 Td [(1 2 B )]TJ/F15 11.9552 Tf 6.338 -9.684 Td [(^ q k j =c t ! c t ; where ^ q k j =c t 2 R 4 istheestimateof q k j =c t .Onlyonepredictor ^ q k j =c t isnecessary since d dt )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q k j =c t isnotdependentonfeatureestimates.Additionally,theorientationis 3 Timedependenceissuppressedexceptwhenneededforclarityorintroducing terms. 64

PAGE 65

oftenestimatedthroughothermethodsandapredictormaynotbenecessary.Since d dt )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(q k j =c t isonlyafunctionof q k j =c t ,initializing ^ q k j =c )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [( u j = q k j =c )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [( u j implies4 and4areequivalentforall t> u j . Usingtheestimatesfromthepredictorsin4,4,and4,apredictoris designedfor s j;i t when s j;i 2P c s j t as d dt )]TJ/F15 11.9552 Tf 6.306 -9.683 Td [(^ s j;i t , proj ^ s j;i t ;s j;i 2P c s j t ; where ^ s j;i t , 2 6 6 6 6 4 )]TJ/F15 11.9552 Tf 10.028 0 Td [(^ u T s j;i =c t v c t )]TJ/F15 11.9552 Tf 10.029 0 Td [(^ u T k j;i =c t v c t 0 3 7 7 7 7 5 andtheprojectionoperatorisusedtobound d 2 ^ d s j;i =c t d , d 2 ^ d s j;i =k j t d ,and d 1 ^ d k j;i =c t . Takingthetimederivativeof4,substituting2-2,4,4,and 4,simplifyingyields d dt )]TJ/F15 11.9552 Tf 6.305 -9.684 Td [(~ s j;i t = )]TJ/F31 11.9552 Tf 11.291 38.377 Td [(2 6 6 6 6 4 ~ u T s j;i =c t v c t ~ u T k j;i =c t v c t 0 3 7 7 7 7 5 : 4.3.2StabilityAnalysisofFeaturePredictorDesign Toquantitativelydescribethestabilityoftheobserverandpredictor,let o s j;i t 2 f a;u g describewhetheranobserverisactivatedorapredictorisactivated,respectively. Specically,when o s j;i t = a , s j;i 2P s j t indicatingthefeature s j;i isintheFOV andanobserverisusedforfeature s j;i .Similarly,when o s j;i t = u , s j;i 2P c s j t indicatingthefeature s j;i leavestheFOVandapredictorisusedforfeature s j;i .Let t a n s j;i representthe n thinstanceintimefeature s j;i enterstheFOVi.e., s j;i 2P s j t a n s j;i with o s j;i t a n s j;i = a ^ s j;i t a n s j;i = a ,where t a 1 s j;i a j .Furthermore,let t u n s j;i representthe n th instanceintimefeature s j;i leavestheFOVi.e., s j;i 2P c s j t u n s j;i . 65

PAGE 66

Theorem4.2. Thepredictordesignin4forfeature s j;i 2P c s j t ensuresthe estimationerror ~ s j;i t isboundedas k ~ s j;i t kk ~ s j;i t u n s j;i k +4 v c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(t u n s j;i : Proof. Takingthetimederivativeof4,substituting4,andusingthebounds k ~ u k j;i =c t k 2 and k ~ u s j;i =c t k 2 yields d dt )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(V s j;i )]TJ/F15 11.9552 Tf 6.305 -9.684 Td [(~ s j;i t 4 p 2 v c q V s j;i )]TJ/F15 11.9552 Tf 6.306 -9.684 Td [(~ s j;i t : InvokingtheComparisonLemma[116,Lemma3.4]on4yieldstheresultin 4. 4.3.3EnsuringStabilityThroughDwell-TimeConditions Tofacilitatethesubsequentdevelopment,let L n s j;i , n l 2 Z > 0 : a l s j;i 2 h t a n s j;i ;t u n s j;i i t> s j;i o ,where l n ; l n ; l n 2 Z > 0 representtherstand last l 2L n s j;i ,respectively,and l n , l n )]TJ/F25 11.9552 Tf 12.219 0 Td [(l n .Additionally,let l n 2 1 ; 2 ;:::; l n +1 representtheindexof L n s j;i ,andlet a l n ;n s j;i 2 h t a n s j;i ;t u n s j;i i t> s j;i and u l n ;n s j;i 2 h t a n s j;i ;t u n s j;i i t> s j;i representtheinstancesintimeforthe n threturnsuchthat s j;i t = a and s j;i t = u , respectively.Furthermore,let t a n s j;i , l n P l n =1 a l n ;n s j;i and t u n s j;i , t a n +1 s j;i )]TJ/F25 11.9552 Tf 12.983 0 Td [(t u n s j;i ,where a l n ;n s j;i , u l n ;n s j;i )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l n ;n s j;i , u l n ;n s j;i , a l n +1 ;n s j;i )]TJ/F25 11.9552 Tf 11.955 0 Td [( u l n ;n s j;i ,and u n s j;i , t u n s j;i )]TJ/F25 11.9552 Tf 11.955 0 Td [( u l n ;n s j;i . Toensurethatthesystemdenedbytheswitchingsignals s j;i t and o s j;i t remainsbounded,minimumandmaximumdwell-timesmustbedevelopedforeach observerandpredictor,respectively.Approacheslikethosetakenin[62]and[63]will notbepossibleforswitchedsystemslikethosedenedby s j;i t and o s j;i t since ~ s j;i t a n s j;i and ~ s j;i t u n s j;i areunknownandcannotbereset.However, ~ d s j;i =c t a 1 s j;i d , ~ d k j;i =c t a 1 s j;i d ,and ~ d s j;i =k j t a 1 s j;i d implying k ~ s j;i t a 1 s j;i k p 3 d . Let ~ s j;i 2 )]TJ/F15 11.9552 Tf 5.479 -9.684 Td [(0 ; d beauser-denedthresholdsuchthat k ~ s j;i u l n ;n s j;i k ~ s j;i i.e.,theuserwillneedtheerrorbelowsomethresholdbeforeallowingtheobjectto leavetheFOV.Furthermore,let ~ s j;i 2 ~ s j;i ; d i beauser-denedthresholdsuchthat 66

PAGE 67

~ s j;i < k ~ s j;i )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [( a 1 ;n s j;i k ~ s j;i forall n> 1 .Thethresholds ~ s j;i and ~ s j;i representthe acceptableamountoferrorforauser'sapplicationbeforeafeaturemayleaveandmust returntotheFOV,respectively.Whilethetrueerror ~ s j;i t isunknown,boundsonthe distancesareknownandestablishingthebounds ~ s j;i and ~ s j;i asdescribedwillensure thattheerrorsarewithinthethresholdsprovidedthesubsequentlydevelopeddwell-time conditionsaresatised.Specically,thedwell-timesareestablishedsuchthattheupper boundonthedistanceerrorsconvergeimplyingthetrueerrorsmustalsoconverge. Assumption4.2. Itispossibleforthesystemtosatisfythesubsequentlydeveloped dwell-timeconditionsforthesetoffeatures A s j t .Also,after t>t u 1 s j;i ,theonlytracked featuresfromthe j thobjectwillbethosecontainedin A s j t .Specically, P s j t A s j t fortime t>t u 1 s j;i ,implying A s j t P s j t = P s j t . UnderAssumptions2.1and2.2,therotationmatrix R k j =c t andunitvector u k j =c t canbedeterminedfromthesetofstationaryfeaturesin P s j t while p s j t 4 .This impliesthatitisnotsufcienttoonlyconsiderthedwell-timesforindividualfeatures sincetheobserversareusedontheassumption u k j =c t isavailable.Let o s j t 2 f a;u g beaswitchingsignalthatindicateswhenthereareenoughfeaturesintheFOV todetermine u k j =c t and R k j =c t ;specically,therstandsecondmodeof o s j t representswhen p s j t 4 and p s j t < 4 ,respectively.However,fortheobject tobesuccessfullyrecaptured, o s j t = a ^ s j;i 2A s j t o s j;i t = a ^ s j;i t = a implyingeachfeature s j;i 2A s j t isintheFOVandtherelativemotionissufcientfor learning.Let t a n s j , max a 1 ;n s j;i , t u n s j , min n u l n ;n s j;i o ,and t a n s j , t u n s j )]TJETq1 0 0 1 444.001 238.915 cm[]0 d 0 J 0.478 w 0 0 m 14.322 0 l SQBT/F25 11.9552 Tf 444.001 229.888 Td [(t a n s j forthe n th switchingcycleacrossallfeaturesin A s j t .Furthermore,let t a n +1 s j , max a 1 ;n +1 s j;i and t u n s j , t a n +1 s j )]TJ/F25 11.9552 Tf 11.955 0 Td [(t u n s j . Theorem4.3. Foreachfeatureinthesetoffeatures A s j t ,theerrorsoftheswitched systemdenedbytheswitchingsignals s j;i t , o s j;i t ,and o s j t ,andtheobserver updatelawin4ensuretheestimationerrorin ~ s j;i t attime t = t u 1 s j;i is GUUB as 67

PAGE 68

k ~ s j;i t u 1 s j;i k ~ s j;i providedtheswitchingsignalssatisfytheinitialminimumfeedback availabilitydwell-timecondition t a 1 s j )]TJ/F15 11.9552 Tf 23.822 8.088 Td [(1 ln ~ s j;i p 3 d ! > )]TJ/F15 11.9552 Tf 11.203 8.088 Td [(1 ln 1 p 3 : Proof. Using4fortherstinstanceimplies k ~ s j;i t u 1 s j;i k k ~ s j;i t a 1 s j;i k exp )]TJ/F25 11.9552 Tf 9.299 0 Td [( t a 1 s j;i .Itisdesiredtohave k ~ s j;i t u 1 s j;i k ~ s j;i andtheinitial errorisboundedas k ~ s j;i t a 1 s j;i k p 3 d .Substitutingtheseboundsintotherst inequalityandsolvingfor t a 1 s j;i yields t a 1 s j;i )]TJ/F24 7.9701 Tf 26.284 4.707 Td [(1 ln ~ s j;i p 3 d .Because t a 1 s j must lowerboundthedwell-timestoensureallofthefeatureobserversareimplementable, t a 1 s j )]TJ/F24 7.9701 Tf 23.577 4.707 Td [(1 ln ~ s j;i p 3 d .Since ~ s j;i < d , ~ s j;i p 3 d < 1 p 3 yieldingtheboundin4. Theorem4.4. Foreachfeatureinthesetoffeatures A s j t ,theerrorsoftheswitched systemdenedbytheswitchingsignals s j;i t , o s j;i t ,and o s j t ,andtheobserver updatelawin4ensuretheestimationerrorin ~ s j;i t isGUUBas k ~ s j;i t u n s j;i k ~ s j;i providedtheswitchingsignalssatisfytheminimumfeedbackavailabilitydwell-time condition t a n s j )]TJ/F15 11.9552 Tf 23.823 8.088 Td [(1 ln ~ s j;i ~ s j;i ! > 0 ;n> 1 Proof. TheprooffollowsTheorem4.3usingtheupperbound k ~ s j;i t a n s j;i k ~ s j;i . Theorem4.5. Foreachfeatureinthesetoffeatures A s j t P c s j t ,theerrorsof theswitchedsystemdenedbytheswitchingsignals s j;i t , o s j;i t ,and o s j t ,and thepredictorupdatelawin4ensuretheestimationerrorin ~ s j;i t isGUUBas k ~ s j;i t a n +1 s j;i k ~ s j;i providedtheswitchingsignalssatisfythemaximumlossof feedbackdwell-timecondition 0 < t u n s j ~ s j;i )]TJ/F15 11.9552 Tf 12.782 0 Td [(~ s j;i 4 v c 68

PAGE 69

Proof. Using4forthe n thinstanceimplies k ~ s j;i t a n +1 s j;i kk ~ s j;i t u n s j;i k +4 v c t u n s j;i . Itisdesiredtohave k ~ s j;i t a n +1 s j;i k ~ s j;i and k ~ s j;i t u n s j;i k ~ s j;i .Substitutingthe secondboundsintotherstandsolvingfor t u n s j;i yields t u n s j;i ~ s j;i )]TJ/F24 7.9701 Tf 7.207 0 Td [(~ s j;i 4 v c .Because t u n s j mustupperboundthedwell-timestoensureallofthefeatureobserversare implementable, t u n s j ~ s j;i )]TJ/F24 7.9701 Tf 7.208 0 Td [(~ s j;i 4 v c .Since ~ s j;i > ~ s j;i , ~ s j;i )]TJ/F15 11.9552 Tf 12.98 0 Td [(~ s j;i > 0 ,yieldingtheboundin 4. Ensuringthat4-4aresatisedguaranteestheerrorremainsboundedas ~ s j;i < k ~ s j;i t k p 3 d foralltime t
PAGE 70

Consideringthe j thobjectonlyprovidesmeasurementsof u k j =c t and q k j =c t andestimatesof d k j =c ,anobservermustbeusedtoestimate p c=k 1 t and q c=k 1 t while the j thobjectisintheFOV.Whennoobjecthasfeedbackavailable,apredictormust beusedtoestimate p c=k 1 t and q c=k 1 t .Basedontheminimumdwell-timeanalysis inTheorem4.3,let t s j;i t 2f a;u g beaswitchingsignalfor i thfeatureonthe j th objectindicatingwhenthetotaltimeconverginghasexceededtheminimumdwelltimeconditionorhasnotexceededtheminimumdwelltimecondition.Specically, t s j;i t = a implies l n t P l n =1 a l n ;n s j;i > t a n s j ! ^ t 2 h t a n s j;i ;t u n s j;i i ,where l n t 2 Z > 0 represents thecurrentindexofthe n thcycleforthe i thfeatureonthe j thobject.Similarly,let t s j t 2f a;u g beaswitchingsignalindicatingwhenalltheremainingfeatureson the j thobjectwithfeedbackavailablehaveeithersatisedornotsatisedthedwelltimecondition,specically, t s j t = a impliesallremainingfeaturesonthe j thobject s j;i 2P s j t have t s j;i t = a .Furthermore,let a n s j , min n t>t a n s j;i : t s j = a o ,represent thetimethatalloftheremainingfeaturesonthe j thobject s j;i 2P s j t havesatised theminimumdwell-timeforthe n thcycle.Whenthedwell-timeconditionhasbeen satisedforthe j thobjectduringthe n thcyclei.e., t 2 h a n s j ;t u n s j;i i ,theerrorineachof thefeatureobserversislessthanthedesiredthresholdi.e., k ~ s j;i t k ~ s j;i ,implying ~ d k j;i =c t ~ s j;i . Lettheposeerrorbequantiedas ~ p c=k 1 t 2 R 3 and ~ q c=k 1 t 2 R 4 ,where ~ p c=k 1 t , p c=k 1 t )]TJ/F15 11.9552 Tf 12.941 0 Td [(^ p c=k 1 t ; ~ q c=k 1 t , q c=k 1 t )]TJ/F15 11.9552 Tf 12.814 0 Td [(^ q c=k 1 t ; and ^ p c=k 1 t 2 R 3 and ^ q c=k 1 t 2 R 4 aretheestimatesof p c=k 1 t and q c=k 1 t ,respectively. Takingthetimederivativeof p c=k 1 t yields d dt p c=k 1 t = R )]TJ/F25 11.9552 Tf 5.479 -9.683 Td [(q c=k 1 t v c t : 70

PAGE 71

Similarly,thetimederivativeof q c=k 1 t is d dt )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q c=k 1 t = 1 2 B )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(q c=k 1 t ! c t : Takingthetimederivativeof ~ p c=k 1 t and ~ q c=k 1 t ,andsubstituting4and4 yields d dt ~ p c=k 1 t = R )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q c=k 1 t v c t )]TJ/F25 11.9552 Tf 15.264 8.088 Td [(d dt ^ p c=k 1 t and d dt )]TJ/F15 11.9552 Tf 6.338 -9.684 Td [(~ q c=k 1 t = 1 2 B )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(q c=k 1 t ! c t )]TJ/F25 11.9552 Tf 15.264 8.088 Td [(d dt )]TJ/F15 11.9552 Tf 6.338 -9.684 Td [(^ q c=k 1 t : WhenthelandmarkisintheFOV, L t = a andfeedbackoftheposeofthe cameraisdirectlyavailableunderAssumption4.3.While L t = a ,aresetmapisused toresetbothpositionandorientationas ^ p c=k 1 t ! p c=k 1 t and ^ q c=k 1 t ! q c=k 1 t : However,while L t = u ,anobserverorpredictorisusedtoestimatetheposeofthe cameradependingonthesetofswitchingsignals, n t s j t o p s t j =1 . Whenthefeatureestimatorsforthe j thobjecthavesatisedtheminimumdwelltimecondition, t s j t = a ,andenoughfeaturesareremainingontheobject o s j t = a , p s j t 4 ,both R k j =c t and u k j =c t aremeasurableandtheestimationerrorforeach remainingfeature k ~ d k j;i =c t k s j;i .While t s j t = a ^ o s j t = a ,anestimateofthe positionofthecamerawithrespecttothe j thkeyframeisavailableas ^ p c=k j t , u c=k j t ^ d k j =c t ; where ^ d k j =c t , 1 p s j t p s j t P i =1 ^ d k j;i =c t , u c=k j t = )]TJ/F25 11.9552 Tf 9.299 0 Td [(R c=k j t u k j =c t , R c=k j t = R T k j =c t ,and ^ p c=k j t isexpressedinthe j thkeyframebasedon4.Using4,anobserveris 71

PAGE 72

designedfortheposeofthecamerawhile L t = u ^ t s j t = a ^ o s j t = a as d dt ^ p c=k 1 t = R )]TJ/F15 11.9552 Tf 6.338 -9.684 Td [(^ q c=k 1 t v c t + K p ^ p k j =k 1 + R )]TJ/F15 11.9552 Tf 6.338 -9.684 Td [(^ q k j =k 1 ^ p c=k j t )]TJ/F15 11.9552 Tf 10.949 0 Td [(^ p c=k 1 t ; and d dt )]TJ/F15 11.9552 Tf 6.338 -9.684 Td [(^ q c=k 1 t = 1 2 B )]TJ/F15 11.9552 Tf 6.338 -9.684 Td [(^ q c=k 1 t ! c t + K q )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(Q )]TJ/F15 11.9552 Tf 6.339 -9.684 Td [(^ q k j =k 1 q c=k j t )]TJ/F15 11.9552 Tf 12.813 0 Td [(^ q c=k 1 t ; where ^ q k j =k 1 =^ q c=k 1 a 1 , ^ p k j =k 1 =^ p c=k 1 a 1 , K p 2 R 3 3 and K q 2 R 4 4 areconstant positivegainmatrices,and Q q , 2 6 6 6 6 6 6 6 4 q 1 )]TJ/F25 11.9552 Tf 9.299 0 Td [(q 2 )]TJ/F25 11.9552 Tf 9.298 0 Td [(q 3 )]TJ/F25 11.9552 Tf 9.299 0 Td [(q 4 q 2 q 1 )]TJ/F25 11.9552 Tf 9.298 0 Td [(q 4 q 3 q 3 q 4 q 1 )]TJ/F25 11.9552 Tf 9.299 0 Td [(q 2 q 4 )]TJ/F25 11.9552 Tf 9.299 0 Td [(q 3 q 2 q 1 3 7 7 7 7 7 7 7 5 : However,iftheminimumdwell-timeconditionfortheobjectisunsatisedortheobject hastoofewfeaturesintheFOV,apredictorisusedtoupdatetheposeestimate. Specically,when L t = u ^ t s j t = u _ o s j t = u ,apredictorisdesignedforthe poseas d dt ^ p c=k 1 t = R )]TJ/F15 11.9552 Tf 6.339 -9.684 Td [(^ q c=k 1 t v c t and d dt )]TJ/F15 11.9552 Tf 6.338 -9.684 Td [(^ q c=k 1 t = 1 2 B )]TJ/F15 11.9552 Tf 6.338 -9.684 Td [(^ q c=k 1 t ! c t : Onlyananalysisofthepositionestimatordesignisconsideredsincetheprimaryresult ofthisworkisestimatingthepositionofthecamera.Additionally,theorientationis 72

PAGE 73

oftenestimatedthroughothermethodsandapredictormaynotbenecessary.Since d dt )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(q c=k 1 t isonlyafunctionof q c=k 1 t ,initializing ^ q c=k 1 a 1 = 1000 T implies 4isequivalentto4and4forall t> a 1 . While L t = u and t s j t = a ^ o s j t = a forthe j thobject,substitutingthe positionobserverupdatelawin4intothederivativeofthepositionerrorin4, using ^ q c=k 1 t = q c=k 1 t ,andsimplifyingyields d dt ~ p c=k 1 t = )]TJ/F25 11.9552 Tf 9.298 0 Td [(K p ~ p c=k 1 t + K p ~ p k j =k 1 + K p R )]TJ/F25 11.9552 Tf 5.48 -9.683 Td [(q k j =k 1 u c=k j t 1 p s j t p s j t X i =1 ~ d k j;i =c t : While L t = u and t s j t = u _ o s j t = u forallobjects,substitutingthe positionpredictorin4intothederivativeofthepositionerrorin4,using ^ q c=k 1 t = q c=k 1 t ,andsimplifyingyields d dt ~ p c=k 1 t =0 3 1 : 4.4.1StabilityofKeyFramePositionObserverandPredictorDesign Let V c=k 1 ~ p c=k 1 t : R 3 ! R beacandidateLyapunovfunctiondenedas V c=k 1 ~ p k j =c t , 1 2 ~ p T c=k 1 t ~ p c=k 1 t ; whichcanbeboundedas 1 2 k ~ p c=k 1 t k 2 V c=k 1 ~ p c=k 1 t 1 2 k ~ p c=k 1 t k 2 . Theorem4.6. Theswitchingsignals L t , t s j t ,and o s j t andtheobserverupdate lawdesignedin4ensurethepositionerrorofthecamera ~ p c=k 1 t isGUUBwhile t 2 h a n s j ;t u n s j i inthesense k ~ p c=k 1 t k 2 k ~ p c=k 1 a n s j k 2 exp )]TJ/F25 11.9552 Tf 9.298 0 Td [( a p t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a n s j +2 " k j a p 73

PAGE 74

where a p , min f K p g , " k j , max f K p g k ~ p k j =k 1 k +~ s j;i 2 2 min f K p g ,and max fg isthemaximum eigenvalueof fg . Proof. Takingthederivativeof4withrespecttotimeandsubstituting4yields d dt V c=k 1 ~ p c=k 1 t )]TJ/F25 11.9552 Tf 21.918 0 Td [( a p V c=k 1 ~ p c=k 1 t + " k j : InvokingtheComparisonLemma[116,Lemma3.4]on4thenupperbounding yields4. Theorem4.7. Theswitchingsignals L t , t s j t ,and o s j t andthepredictorupdate lawdesignedin4ensurethepositionerrorofthecamera ~ p c=k 1 t isboundedwhile t= 2 h a n s j ;t u n s j i inthesense k ~ p c=k 1 t kk ~ p c=k 1 t u n s j k : Proof. Takingthederivativeof4withrespecttotimeandsubstituting4yields d dt V c=k 1 ~ p c=k 1 t 0 : InvokingtheComparisonLemma[116,Lemma3.4]ontheresultthenupperbounding yields4. 4.5Experiments Anexperimentisprovidedtodemonstratetheperformanceofthedeveloped estimatorstrategyusingtheobserverandpredictordesigncomparedtoapredictor-only strategycf.,[62]and[63].Theexperimentassumednoreturntopreviousobjectsi.e., eachnewsetoffeatureswasconsideredanewobjectimplyingtheinitialminimum dwell-timeconditionin4mustalwaysbesatisedbeforeusinganewobjectinthe positionobserver.Theperformanceexaminedisthatofthedevelopedfeatureobserver in4andtheposeestimationstrategyusingtheresetmapsin4and4 whenalandmarkisintheFOVi.e., L t = a ,theobserverupdatelawsin4 and4whilenolandmarkisintheFOVandanobjectintheFOVhassatised 74

PAGE 75

Figure4-1.Photocourtesyofauthor.ImageoftheKobukiTurtlebotandiDSuEye camerausedforexperiments. 75

PAGE 76

Figure4-2.Photocourtesyofauthor.Imageofasectionofthewoodenhallwaysinthe environmentandlocationsofthemotioncapturecamerasinthatsection. Themotioncapturecameraswerelocatedthroughouttheenvironment attachedtotheupperportionofeachwall. 76

PAGE 77

Figure4-3.Photocourtesyofauthor.Imageofthelandmarkcapturedwithinakey framei.e.,thecheckerboardisleaningonthewoodenwall,wherethe whitedotsdrawnintheimagearetheextractedcornerfeatures. theminimumdwell-timeconditionandhasenoughremainingfeaturesintheFOVi.e., L t = u and t s j t = a ^ o s j t = a ,andthepredictorupdatelawsin4and4– 52whilenolandmarkisintheFOVandnoobjecthassatisedthedwell-timecondition ordoesnothaveenoughremainingfeaturesi.e., L t = u and t s j t = u _ o s j t = u . AKobukiTurtlebotwitha 1920 1080 monochromeiDSuEyecamera,showninFigure4-1,providedimagesandvelocityat30Hzasitdrovethethroughtheenvironment e.g.,Figure4-2.Theestimatorcanranreal-timei.e.,at30Hzandwasimplemented usingEigen3,OpenCV,andROSc++librariescf.,[117],[118],and[119],respectively.Thelandmarkwasacheckerboardwith 8 6 cornerswhereeachsquareis 25 : 4 millimeters 25 : 4 millimeters.AstheexampleinFigure4-2shows,theenvironment wasanenclosedseriesofwoodenhallwayswhereamotioncapturesystemprovided groundtruth.AnOptitrackmotioncapturesystemoperatedat120Hzandmeasuredthe poseofthecamera,allowingforthepositionofeachfeaturerelativetothecameratobe 77

PAGE 78

Figure4-4.Photocourtesyofauthor.Imageofthefeaturesextractedfromakeyframe imageofanonplanarobjecti.e.,twowoodenwallswitha 90 angle betweenthem,wherethewhitedotsaretheextractedcornerfeatures. 78

PAGE 79

Figure4-5.Photocourtesyofauthor.Imageofthefeaturesextractedfromakeyframe imageofaplanarobject,wherethewhitedotsaretheextractedcorner features. knownforcomparison.AcomputerwithanInteli7processorrunningat3.4GHzwas usedtosimultaneouslyperformimageprocessingandestimatorupdates. Betweenapproximately30and50cornerfeaturescf.,[111]or[118]wereextractedfromeachkeyframeimagewithaninitialspacingof100pixels. 4 Anewobject andassociatedkeyframewasaddedafterthepreviousobjectlefttheFOV.Anobject andtheassociatedkeyframewerenolongertrackedandconsideredoutoftheFOV whenlessthan20cornerfeaturesremainedi.e., p s j t 20 implied o s j t = a and p s j t < 20 implied o s j t = u . 5 Theenvironmentconsistedofbothnonplanare.g., 4 Theextractionmethodcouldonlyndbetween30and50cornerfeaturesinakey frameimagedependingonwhatwasintheimage. 5 Whileanabsoluteminimumof4featuresisrequired,4cornerfeatureswilltypically provideapoorestimateof q k j =c t and u k j =c t anditwasexperimentallydetermined 79

PAGE 80

Figure4-4andplanare.g.,Figure4-5surfacesdemonstratingtheplanarassumption wasnotalwaysvalid.IfthelandmarkwasintheFOVwhiletrackinganextractedsetof featurese.g.,Figure4-3,thegroundtruthwasprovidedtothevehicleandthereset mapsin4and4whereusedtoupdatetheposeestimate. ExtractedcornerfeaturesweretrackedwhileintheFOVbyrstpredictingthe locationofacornerfeatureinanewimageusingthecurrentestimateofthedistance tothefeatureandthelocationofthecornerfeatureintheoldimage.Forexample, considertrackingthelocationofthe i thfeatureonthe j thobjectwhileitisintheFOV i.e., s j;i 2P s j t ,integrating4fromthepreviousimageattime t p 2 R > 0 using u s j;i =c t p , ^ d s j;i =c t p , v c t p ,and ! c t p tothetimeofthenewimage t yields ^ u s j;i =c t = u s j;i =c t p )]TJ/F31 11.9552 Tf 11.955 16.273 Td [(Z t t p ! c u s j;i =c d + Z t t p 1 ^ d s j;i =c u s j;i =c u T s j;i =c v c d )]TJ/F31 11.9552 Tf 10.295 16.272 Td [(Z t t p 1 ^ d s j;i =c v c d: Theaverageoftheshiftestimatedbyallthefeaturesin P s j t wasthenusedtoestimate anafnetransformationbetweenthetwoimagesi.e.,theunitvectorswereconverted backintopixelsandtheaverageofthechangeinpixelswasusedtodeterminean afnetransformationbetweentheimages.Theapproximatedafnetransformationwas thenappliedtoa 50 pixel 50 pixelpatchofthepreviousimagearoundtheprevious pixellocationofeachfeature s j;i .Thetransformedpatchwasusedasatemplateto searchforamatchina 90 pixel 90 pixelpatcharoundthepredictedfeaturelocation inthenewimageusingnormalizedcrosscorrelationcoefcienttemplatematching cf.,[118]or[14].Thebestmatchesprovidedbythetemplatematchingwerethenused that20cornerfeatureswasthelowestnumberoffeaturesthatcouldconsistentlyprovidegoodestimates. 80

PAGE 81

todeterminethesetof u s j;i =c t .Theaverageshiftaftertemplatematchingbyallthe featuresin P s j t wasthenusedtodetermineoutliersbycalculatingthe 2 valueof afeaturesshiftcomparedtotheaverageshiftusingastandarddeviationof3pixels. Ifafeaturehada 2 valuegreaterthan6.63itwasconsideredanoutlier.Thevalue of d dt u s j;i =c t wasthenestimatedusingalteredbackwardsdifferenceon u s j;i =c t andusedtoupdate4andsubsequentlyupdate4,where K =40 I 3 3 was determinedtoconsistentlyimproveperformancewithlownoisesensitivity.Afternding thefeatures,multiplemethodsi.e.,essential,homography,andperspective-n-point decompositions,[118]wereusedtoapproximate q k j =c t and u k j =c t fromthesetof features P s j t ,andanysolutionthathadanormdifferencebetweenexpectedsolution andapproximatedsolutionlessthan 0 : 1 wasaveragedtogetherandpassedintoalow passlter.Ifnosolutionhadasmallenougherror,theexpectedsolutionwasused instead,wheretheexpected q k j =c t and u k j =c t weredeterminedusingthecurrent poseestimatei.e., ^ p c=k 1 t and ^ q c=k 1 t . Anupdatetotheestimatorin4wasthenprocessedforeachfeaturein P s j t . Usingavalueof a =0 : 3 foreachfeature,thestateof s j;i wasdeterminedandif s j;i = a , Y s j;i t and U s j;i t from4werecalculatedwherethemaximumvaluefor & was1.0second.Theposeestimateof ^ p c=k 1 t and ^ q c=k 1 t wasusedtodetermine ^ p k j =c t expressedinthecameraframeand ^ q k j =c t .Usingastandarddeviationof3 pixels,thereprojectionerrorbetweenthepixelcoordinatesofthefeaturedetermined from u s j;i =c t andthepixelcoordinatesdeterminedusingtheestimateof d s j;i =k j from Y s j;i t and U s j;i t wasusedtocalculatethe 2 valueiftheestimateof d s j;i =k j from Y s j;i t and U s j;i t fellwithintheknownbounds, d 2 d s j;i =k j d ,andthevalueof kY s j;i t k > 0 : 1 and kU s j;i t k > 0 : 15 ,wherethethresholdson kY s j;i t k and kU s j;i t k andthedistancebounds d 2 =0 : 5 metersand d =5 : 0 meterswereselectedbasedon theenvironment.Ifthe 2 valuewassmallerthan6.63,thevalueof d s j;i =k j from Y s j;i t and U s j;i t wasconsideredaninlierandthe Y s j;i t and U s j;i t pairwereaddedtothe 81

PAGE 82

historystackin4where N =50 .Using =0 : 0001 ,when min n Y s j;i o > ,the valueof X s j;i in4wasdeterminedandusedtoupdate4,where K X =30 I 3 3 wasdeterminedtoconsistentlyperformwellwithlownoisesensitivity.Additionally,after min n Y s j;i o > ,thevalueof X s j;i wasalsousedtoestimatethereprojectionerror andifthe 2 valuewastoolargewhencomparingthemeasuredpixelcoordinatesfrom u s j;i =c t tothepixelcoordinatesdeterminedfrom X s j;i ,thenew Y s j;i t and U s j;i t wereconsideredoutliers,enablinganothermethodofrejectingnoisymeasurements. Additionally,thedwell-timeforfeature s j;i startedaccumulatingtimewhiletheswitching signal s j;i = a .Oncethedwell-timeexceededtheinitialminimumdwell-timecondition in4,theswitchingsignal t s j;i t activatedi.e., t s j;i t = a ,where ~ s j;i was selectedtobe1centimeterandsince min f K X g =30 , a =0 : 3 ,and d =5 : 0 meters, t a 1 s j 0 : 75 seconds.Whenalloftheremainingfeaturesonthe j thobjectwere activated t s j t = a andwhile o s j t = a i.e.,atleast20featuresonthe j thobject remainedintheFOV,theobserverupdatelawsforthecameraposein4and 4whereusedtoestimatethepose.After o s j t = u ,thefeaturesonthe j thobject werenolongertrackedandanewsetoffeatureswereextractedestablishingthenext objectandkeyframe.Aftertheobjectwasnolongertracked,thepredictorupdatelaws in4and4wereusedtoestimatetheposeuntilthenextobjectsatised theminimumdwell-timecondition.Ifanobjectneversatisedtheminimumdwell-time condition,thatobjectwasneverusedtoupdatethepose. Theexperimentwasapproximately560secondsandconsistedofdrivingthe groundvehicleshowninFigure4-1overapathofapproximately250metersthrough theseriesofwoodenhallwaysshowninFigure4-2.Thetrueandestimatedpathover theexperimentisshowninFigure4-6wheretheestimatorstateestimateismarkedin red,green,orbluedependingifthelandmarkisintheFOV,thepredictorisactivated,or theobserverisactivated,respectively.Thenormofthepositionerrorfortheexperiment isshowninFigure4-7wheretheerrorismarkedinred,green,orbluedependingif 82

PAGE 83

Figure4-6.Plotofthepathofthecameraduringtheexperimentandtheestimatedpath ofthecamerausingastandarddeviationof3pixelsforthehistorystack rejectionalgorithm.Thetruepathismarkedinblack.Theestimatedpath showswhichestimatorisactivatedovertheexperiment.Whilethelandmark isintheFOV,theestimatedpathisshownusingtheredmarker.Similarly, theestimatedpathismarkedingreenorbluewhenthepredictororobserver areactive,respectively. 83

PAGE 84

Figure4-7.Plotofthenormofthecamerapositionerrorduringtheexperimentusinga standarddeviationof3pixelsforacceptingdataontothehistorystack.The camerapositionerrorshowswhichestimatorisactivatedoverthe experiment.WhilethelandmarkisintheFOV,theestimatedpathisshown usingtheredmarker.Similarly,theestimatedpathismarkedingreenor bluewhenthepredictororobserverareactive,respectively.Asshown,the errorresetstozeroeachtimethelandmarkenterstheFOV.Themaximum positionerrorwasapproximately1.18meterswhiletheaverageofthe maximumswas1.0metersandtheRMSofthepositionerrorwas0.58 meters. 84

PAGE 85

Figure4-8.Plotofthedistanceestimatorconvergenceforkeyframe33thatshowsthe averagepercentageerrorofthedistancetothefeaturesrelativetothetrue distancetothefeaturesusingastandarddeviationof3pixelsforaccepting dataontothehistorystack.Theplotshowswhilethesetofestimatorshave notsatisedtheeigenvaluecondition,markedinred, s j;i t enables convergence.Whenenoughdatahasbeencollectedandtheeigenvalue conditionissatised,markedbythegreenverticallineandgreenmarkers, theerrordemonstratesexponentialdecay.Aftertheinitialminimum dwell-timeconditionissatised,markedbytheverticalbluelineandblue markers,thesetoffeatureswasusedinthepositionobserver.Asshown, theerrorpercentagerelativetothedistanceisapproximately1.6%. 85

PAGE 86

Figure4-9.Plotofthedistanceestimatorconvergenceforkeyframe173thatshowsthe averagepercentageerrorofthedistancetothefeaturesrelativetothetrue distancetothefeaturesusingastandarddeviationof3pixelsforaccepting dataontothehistorystack.Theplotshowswhilethesetofestimatorshave notsatisedtheeigenvaluecondition,markedinred, s j;i t enables convergence.Whenenoughdatahasbeencollectedandtheeigenvalue conditionissatised,markedbythegreenverticallineandgreenmarkers, theerrordemonstratesexponentialdecay.Aftertheinitialminimum dwell-timeconditionissatised,markedbytheverticalbluelineandblue markers,thesetoffeatureswasusedinthepositionobserver.Asshown, theerrorpercentagerelativetothedistanceisapproximately2.3%. 86

PAGE 87

Figure4-10.Plotofthedistanceestimatorconvergenceforkeyframe215thatshows theaveragepercentageerrorofthedistancetothefeaturesrelativetothe truedistancetothefeaturesusingastandarddeviationof3pixelsfor acceptingdataontothehistorystack.Theplotshowswhilethesetof estimatorshavenotsatisedtheeigenvaluecondition,markedinred, s j;i t enablesconvergence.Whenenoughdatahasbeencollectedand theeigenvalueconditionissatised,markedbythegreenverticallineand greenmarkers,theerrordemonstratesexponentialdecay.Aftertheinitial minimumdwell-timeconditionissatised,markedbytheverticalblueline andbluemarkers,thesetoffeatureswasusedinthepositionobserver.As shown,theerrorpercentagerelativetothedistanceisapproximately1.9%. 87

PAGE 88

Figure4-11.HistogramplotoftheRMSoftheaveragepercentageerrorofthedistance tothefeaturesrelativetothetruedistancetothefeaturesacrosstheentire experimentusingastandarddeviationof3pixelsforacceptingdataonto thehistorystack.ThehistogramshowsthetheRMSerrorsovertheentire timeakeyframetracked.TheRMSerrorwasonaverageof15.6%witha standarddeviationof3.7%andamedianerrorof15.4%overtheentire timeakeyframewastracked. 88

PAGE 89

Figure4-12.HistogramplotoftheRMSoftheaveragepercentageerrorofthedistance tothefeaturesrelativetothetruedistancetothefeaturesacrossbeforethe minimumdwell-timeconditionissatisedusingastandarddeviationof3 pixelsforacceptingdataontothehistorystack.Thehistogramshowsthe RMSerrorsoverthetimefromextractingthefeaturesfromakeyframeto thetimejustbeforetheminimumdwell-timeconditionissatised.TheRMS errorwasonaverageof19.4%withastandarddeviationof4.5%anda medianerrorof19.1%beforetheminimumdwell-timeconditionwas satised. 89

PAGE 90

Figure4-13.HistogramplotoftheRMSoftheaveragepercentageerrorofthedistance tothefeaturesrelativetothetruedistancetothefeaturesacrossafterthe minimumdwell-timeconditionissatisedusingastandarddeviationof3 pixelsforacceptingdataontothehistorystack.Thehistogramshowsthe RMSerrorsoverthetimetheminimumdwell-timeconditionissatisedto thetimeakeyframewasnolongertracked.TheRMSerrorwason averageof4.2%withastandarddeviationof2.9%andamedianerrorof 3.4%aftertheminimumdwell-timeconditionwassatised. 90

PAGE 91

thelandmarkisintheFOV,thepredictorisactivated,ortheobserverisactivated, respectively.AsshowninFigure4-7,theRMSerrorovertheentireexperimentwas 0.58meters.Themaximumerrorbeforearesetwas1.18metersimplyingtheerrorover pathlengthbeforeresetwasapproximately2.8%.Theaverageofthemaximumswas 1.0metersimplyingtheaveragemaximumerroroverpathlengthwasapproximately 2.4%.Overtheexperimenttherewere246keyframes;however,only83keyframes satisedtheminimumdwell-timeconditionandwereusedinthepositionobserver.The distancetothefeatureswasapproximately1.5metersonaverageandasshowninthe exampleconvergenceinFigures4-8-4-10,andthehistogramsoftheRMSerrori.e., RMS 100 1 p P p i =1 j ~ d s j;i =c t j 1 p P p i =1 d s j;i =c t p s t j =1 inFigures4-11-4-13,theaveragefeaturedistance errorofthe83keyframeswasatitslowestwhenusedbythepositionestimatoraswas predictedbytheminimumdwell-time.Specically,thehistogramsshowedthatbefore thedwell-timeconditionissatisedthepercentageerrorwas19.4%onaverageas showninFigure4-12;however,aftertheminimumdwell-timeconditionwassatisedthe errorwas4.2percentonaveragebuthadamedianerrorof3.4%. Similartoourpreviousworkin[62]and[63],thedevelopedpositionestimatorstrategyistoensuretheerrorinthepositiondoesnotexceedadesiredthresholdthrough thedevelopmentofdwell-timeconditions.However,[62]and[63]useapredictor-only strategywhenfeedbackonthepositionisnotdirectlyavailable.Thepurposeofthe developedestimatorinthispaperistoimproveuponapredictor-onlyestimatorand guaranteetheerrorinthepositionestimategrowsataslowerratewhenavehicleis operatinginanenvironmentwithnofeedbackfromlandmarksorapositioningsystem.Thisisachievedthroughtheuseofthepositionobserverstrategywhenfeatures satisfytheminimumdwell-timecondition.AsshowninFigures4-14and4-15,usinga predictor-onlystrategy,similarto[62]and[63],resultsinlargermaximumpositionerror comparedtousingthepredictorandobserverstrategy.Specically,thepositionerror usingapredictor-onlystrategyhadanoverallRMSpositionerrorthatwas8%larger 91

PAGE 92

Figure4-14.Plotofthepathofthecameraduringtheexperimentandtheestimated pathofthecamerausingthepredictor-onlystrategywhennolandmarkisin theFOV.Thetruepathismarkedinblack.Theestimatedpathshowsifthe landmarkisintheFOVorthepredictorisactivated.Whilethelandmarkis intheFOV,theestimatedpathisshownusingtheredmarker.Similarly,the estimatedpathismarkedingreenwhenthepredictorisactive. 92

PAGE 93

Figure4-15.Plotofthenormofthecamerapositionerrorduringtheexperimentusing thepredictor-onlystrategywhennolandmarkisintheFOV.Thecamera positionerrorshowswhichestimatorisactivatedovertheexperiment. WhilethelandmarkisintheFOV,theestimatedpathisshownusingthe redmarker.Similarly,theestimatedpathismarkedingreenwhenthe predictorisactive.Asshown,theerrorresetstozeroeachtimethe landmarkenterstheFOV.Themaximumpositionerrorwasapproximately 1.18meterswhiletheaverageofthemaximumswas1.09metersandthe RMSofthepositionerrorwas0.63meters. 93

PAGE 94

i.e.,0.63metersforthepredictor-onlystrategywhilethepredictorandobserverstrategy was0.58metersandanaveragemaximumthatwas8.3%largeri.e.,1.09metersfor thepredictor-onlystrategycomparedto1.0metersusingthepredictorandobserver strategy.Theseexperimentalresultsdemonstratethattheestimatorstrategyusing theminimumdwell-timeconditionensuresthatonlyfeatureswithlowerrorareusedin thepositionobserver.Theresultofthispaperenablesthepositionestimationerrorto remainsmallercomparedtousingapredictor-onlystrategywhichenablesavehicleto operateinanenvironmentwithnofeedbackfromlandmarksorapositioningsystemfor longerperiodsoftime. Remark 4.1 . Reducingthestandarddeviationforacceptingdataontothehistorystack improvestheoverallperformanceofboththedistanceestimatorsandtheposition estimatorasdemonstratedbycomparingFigures4-6,4-7,and4-13toFigures4-164-18.Specically,themaximumpositionerrorusinga3pixelstandarddeviationfor acceptingdatawasonaverage1.0meterscomparedto1.21meterswhenusinga10 pixelstandarddeviationforacceptingdata.Thisshowsthatrelaxingthethresholdto acceptdatacausesmoreerrortobeinjectedintothesystemreducingperformance comparedtothepredictor-onlystrategywhichhadanaveragemaximumerrorof1.1 meters.Similarly,theRMSerrorofthedistancesaftersatisfyingtheminimumdwell-time was4.2%onaveragewithastandarddeviationof2.9%andamedianerrorof3.4%for the3pixelstandarddeviationforacceptingdatacomparedtousinga10pixelstandard deviationforacceptingdatawhichhada6.6%averageRMSerrorwithastandard deviationof4.3%andamedianof5.5%.However,reducingthethresholdresultsinless featuressatisfyingtheminimumdwell-timesince147keyframesoutof246satisedthe minimumdwell-timeconditionwhenusingthe10pixelstandarddeviationforaccepting datacomparedto83keyframesoutof246whenusingthe3pixelstandarddeviation foracceptingdata.Sincetheobjectiveistoprovidebetterpositionestimateswhenthe landmarkisnotinthecamera'sFOV,thistrade-offisacceptable;however,ifthegoal 94

PAGE 95

Figure4-16.Plotofthepathofthecameraduringtheexperimentandtheestimated pathofthecamerausingastandarddeviationof10pixelsforthehistory stackrejectionalgorithm.Thetruepathismarkedinblack.Theestimated pathshowswhichestimatorisactivatedovertheexperiment.Whilethe landmarkisintheFOV,theestimatedpathisshownusingtheredmarker. Similarly,theestimatedpathismarkedingreenorbluewhenthepredictor orobserverareactive,respectively. 95

PAGE 96

Figure4-17.Plotofthenormofthecamerapositionerrorduringtheexperimentusinga standarddeviationof10pixelsforacceptingdataontothehistorystack. Thecamerapositionerrorshowswhichestimatorisactivatedoverthe experiment.WhilethelandmarkisintheFOV,theestimatedpathisshown usingtheredmarker.Similarly,theestimatedpathismarkedingreenor bluewhenthepredictororobserverareactive,respectively.Asshown,the errorresetstozeroeachtimethelandmarkenterstheFOV.Themaximum positionerrorwasapproximately1.81meterswhiletheaverageofthe maximumswas1.21metersandtheRMSofthepositionerrorwas0.65 meters. 96

PAGE 97

Figure4-18.HistogramplotoftheRMSoftheaveragepercentageerrorofthedistance tothefeaturesrelativetothetruedistancetothefeaturesacrossafterthe minimumdwell-timeconditionissatisedusingastandarddeviationof10 pixelsforacceptingdataontothehistorystack.Thehistogramshowsthe histogramoftheRMSerrorsoverthetimetheminimumdwell-time conditionissatisedtothetimeakeyframewasnolongertracked.The RMSerrorwasonaverageof6.6%withastandarddeviationof4.3%anda medianerrorof5.5%aftertheminimumdwell-timeconditionwassatised. 97

PAGE 98

wastoestimatemoreoftheenvironmentthenallowingforahigherstandarddeviation foracceptingdatawouldbeacceptable.Additionally,iftheresultingstructureofall theobjectsandtheresultingpathofthecamerawerepassedintoanoptimization algorithmimplementingbundleadjustments,theresultmayenablearicherestimateof theenvironment.Anoptimizationcouldbeappliedregardless;however,havingmore featureswillresultinamoredenseestimateoftheenvironment. 4.6Summary Inthischapter,anextensiontothelearningapproachesin[32],[45],andChapter3, isdevelopedthatappliesanewlearningstrategythatmaintainsacontinuousestimate ofthepositionofthecameraandestimatesthestructureoffeaturesastheybecome visible.Thedevelopedlearningstrategyallowssimulatedmeasurementsoffeatures fromobjectsthatarenolongerintheFOVenablingacontinuousestimateofthe distancetofeatureswithrespecttothecamera.Additionally,thisapproachshowshow theextendedobserverremovesthepositivedepthconstraintrequiredbyallprevious structurefrommotionapproaches.Usingthisapproach,acameramaytraveloverlarge distanceswithoutkeepingspecicfeaturesintheFOVforalltimeandallowobjectsto permanentlyleavetheFOVifnecessary.ALyapunovbasedstabilityanalysisproves thattheobserversforestimatingthepathofthecameraaswellasthestructureof eachsetofobjectsaregloballyexponentiallystablewhilefeaturesareintheFOV.A switchedsystemsanalysisisusedtodevelopdwell-timeconditionstoindicatehowlong afeaturemustbetrackedtoensurethedistanceestimationerrorisbelowathreshold. Afterdistanceestimateshaveconvergedbelowthethreshold,thefeaturemaybeused toupdatethecameraposition.Ifafeaturedoesnotsatisfythedwell-timecondition, itisneverusedtoupdatethepositionoftheagent.Furthermore,theapproachdoes notrequireanewsetoffeaturestobeintheFOVwhenolderfeaturesleavetheFOV. Finally,ifarecognizedlandmarkenterstheFOV,thefeedbackisusedtocompensatefor drifterror. 98

PAGE 99

CHAPTER5 STRUCTUREANDMOTIONOFAMOVINGTARGETUSINGAMOVING MONOCULARCAMERASUBJECTTOINTERMITTENTFEEDBACK Inthischapter,anapproachsimilarto[45]andChapter3isdevelopedtoestimate theinitialstructureofamovingobject.UnlikeChapter3,intermittentfeedbackofthe objectisconsidered.Afterestimatingtheinitialstructureoftheobject,anobserverand predictorfortheobject'spose,velocity,andaccelerationmodeli.e.,SaMfM[81]is guaranteedtobeGUUBprovideddwell-timeconditionsontheavailabilityoffeedback aresatised.Specically,thedwell-timeconditions,developedusingaLyapunov-based stabilityanalysis,giveupperboundsonthetimetolearntheobject'sinitialstructure andupperboundsonthetimefeedbackoftheobject'sfeaturescanbeunavailable. Theapproachtolearnthemotionmodelismotivatedby[38];however,anacceleration modelislearnedinthischapterinsteadofavelocitymodelasdonein[38].Estimating anaccelerationmodelrelaxesconstraintsonthemotionandenablestheuseofmore generalsystemmodels.Furthermore,theapproachtodevelopthedwell-timesis motivatedby[63]wherethedwell-timesinthischapterarebasedonthesizeoftheFOV, ensuringestimationerrorcannotexceedupperthresholds.Thedwell-timesdeveloped inthischapterensureanobjectisrecapturedintheFOVafterleavingwhichisnot guaranteedin[38].Specically,[38]givesanobjectanumberofcyclesofleaving andreturningtotheFOVtoguaranteestability;however,inmanyapplicationsitis notpossibletoensureanobjectreturnstotheFOVifestimationerrorgrowstoolarge motivatingthedwell-timesdevelopedinthischapter.Furthermore,thisdevelopmentin thischapterrelaxesthepositivedepthconstraintrequiredin[38]. 5.1LearningtheFirstFeatureStructure UnderAssumption2.10,anapproachsimilartothestationaryfeaturescanbeused tolearntheinitialstructureoftheorigin.Specically,whiletheobjectisstationary,the 99

PAGE 100

relationshipsin2-2areextendedfortheoriginoftheobjectas Y s m 1 t 2 6 4 d m 1 =c t d k 1 =c t 3 7 5 = R k 1 =c t u m 1 =k 1 d m 1 =k 1 ;t< s ; where Y s m 1 t , u m 1 =c t )]TJ/F25 11.9552 Tf 9.299 0 Td [(u k 1 =c t .Toaidinthesubsequentdevelopment,while t s , m 1 t = a indicates 1 )-222(k u T k 1 =c t u m 1 =c t k > a and m 1 = u indicates 1 )-222(k u T k 1 =c t u m 1 =c t k a .While 1 )-222(k u T k 1 =c t u m 1 =c t k > a ,5canbe writtenas 2 6 4 d m 1 =c t d k 1 =c t 3 7 5 = s m 1 t d m 1 =k 1 ;t< s ; where s m 1 t , )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(Y sT m 1 t Y s m 1 t )]TJ/F24 7.9701 Tf 6.586 0 Td [(1 Y sT m 1 t R k 1 =c t u m 1 =k 1 isinvertableandmeasurablewhiletheobjectisintheFOV.Furthermore,thetimederivativesoftheunknown distancesaremeasurableas d dt )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(d m 1 =c t = )]TJ/F25 11.9552 Tf 9.298 0 Td [(u T m 1 =c t v c t ;t< s ; d dt )]TJ/F25 11.9552 Tf 5.479 -9.683 Td [(d k 1 =c t = )]TJ/F25 11.9552 Tf 9.298 0 Td [(u T k 1 =c t v c t ;t< s ; and d dt d m 1 =k 1 =0 ;t< s : While O t = a ^ m 1 t = a ^ t< s ,theinitialdistancetotherstfeatureis determinedbyintegrating5and5overatimewindow & 2 R > 0 yielding 2 6 4 d m 1 =c t d k 1 =c t 3 7 5 )]TJ/F31 11.9552 Tf 11.955 27.617 Td [(2 6 4 d m 1 =c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& d k 1 =c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& 3 7 5 = Z t t )]TJ/F26 7.9701 Tf 6.587 0 Td [(& 2 6 4 )]TJ/F25 11.9552 Tf 9.298 0 Td [(u T m 1 =c v c )]TJ/F25 11.9552 Tf 9.299 0 Td [(u T k 1 =c v c 3 7 5 d;t< s 100

PAGE 101

where & maybeconstantinsizeorchangeovertime,and O = a ^ m 1 = a; 8 2 [ t )]TJ/F25 11.9552 Tf 11.955 0 Td [(&; s ] .Substitutingtherelationshipin5into5for 2 6 4 d m 1 =c t d k 1 =c t 3 7 5 and 2 6 4 d m 1 =c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& d k 1 =c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& 3 7 5 yields Y m 1 t d m 1 =k 1 = U m 1 t ;t< s where Y m 1 t , 8 > > < > > : s m 1 t )]TJ/F25 11.9552 Tf 11.955 0 Td [( s m 1 )]TJ/F25 11.9552 Tf 5.479 -9.683 Td [(t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l j;m i ;t )]TJ/F25 11.9552 Tf 11.956 0 Td [( a l j;m 1 <&; s m 1 t )]TJ/F25 11.9552 Tf 11.955 0 Td [( s m 1 t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& ;t )]TJ/F25 11.9552 Tf 11.956 0 Td [( a l j;m 1 &; U m 1 t , 8 > > > > > > > > > > < > > > > > > > > > > : R t t )]TJ/F26 7.9701 Tf 6.586 0 Td [( a l j;m 1 2 6 6 4 )]TJ/F25 11.9552 Tf 9.299 0 Td [(u T m 1 =c v c )]TJ/F25 11.9552 Tf 9.298 0 Td [(u T k 1 =c v c 3 7 7 5 d;t )]TJ/F25 11.9552 Tf 11.956 0 Td [( a l j;m 1 <&; R t t )]TJ/F26 7.9701 Tf 6.586 0 Td [(& 2 6 6 4 )]TJ/F25 11.9552 Tf 9.299 0 Td [(u T m 1 =c v c )]TJ/F25 11.9552 Tf 9.298 0 Td [(u T k 1 =c v c 3 7 7 5 d;t )]TJ/F25 11.9552 Tf 11.956 0 Td [( a l j;m 1 &; and a l j;m 1 isthe l thinstancetherstfeaturesatisestheeigenvalueconditionduring the j thinstancetheobjectenterstheFOVi.e., t 2 a j ; u j .Thetime a l j;m 1 mustbe consideredsincethereisnoguaranteetheobjectislearnedbeforetime u l j;m 1 ortime u j . Multiplyingbothsidesof5by Y T m 1 t yields Y T m 1 t Y m 1 t d m 1 =k 1 = Y T m 1 t U m 1 t ;t< s : Ingeneral, Y m 1 t willnothavefullcolumnrankwhile O = a ^ m 1 = a; 8 2 [ &; s ] e.g.,whenthecameraandobjectarestationaryimplying Y T m 1 t Y m 1 t 0 .However, theequalityin5maybeevaluatedatanyinstanceintimeandsummedtogether i.e.,historystacksyielding Y m 1 d m 1 =k 1 = U m 1 ; 101

PAGE 102

where Y m 1 , N P h =1 Y T m 1 t h Y m 1 t h , U m 1 , N P h =1 Y T m 1 t h U m 1 t h , t h 2 a 1 ; s ,and N 2 Z > 1 . Assumption5.1. Thereissufcientrelativemotionbetweenthecameraandtarget sothereexistsatime m 1 2 [ a 1 ; s ,suchthatforalltime t> m 1 , min Y m 1 > . Specically,theoriginsinitialstructurecanbedeterminedbeforetheobjectbegins moving. Remark 5.1 . Learningtheinitialdistancetotherstfeatureenableslearningthe remainingfeaturesonthetarget.Specictothetargettrackingobjective,learningthe initialdistancetotherstfeatureprovidessufcientinformationtodeterminetherelative positionofthetargetwithrespecttothecamera,whilethetargetisintheFOVandthe relativemotionisnotparallel.However,sinceitisoftendesirabletoobtainacontinuous estimateofthetargetsvelocity,learningthestructureofallthefeatureswillenable amorerobustestimateoftheobject'sposeandvelocity.Thetime m 1 isunknown; however,itcanbedeterminedonlinebycheckingtheminimumeigenvalueof Y m 1 . OncesufcientrelativemotionoccursasdiscussedinAssumption5.1,theconstant unknowndistance, d m 1 =k 1 ,canbedeterminedfrom5yielding d m 1 =k 1 = X m 1 ; where X m 1 , )]TJ/F24 7.9701 Tf 6.587 0 Td [(1 Y m 1 U m 1 .Since p m 1 =c t = u m 1 =c t d m 1 =c t ,while O = a ^ m 1 = a ^ t> m 1 using5in2yields p m 1 =c t = u m 1 =c t m 1 ; 1 t X m 1 ; where m 1 ; 1 t istherstelementof m 1 t . 102

PAGE 103

5.2LearningtheStructureoftheRemainingFeatures Learningtheinitialstructurefortheoriginenableslearningtheremainingfeatures structureandthemotionmodeloftheobject.Let q m=c t bethequaternionrepresentationoftheorientationof F m withrespectto F c whereitstimederivativeis d dt )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(q m=c t = 1 2 B )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q m=c t R T m=c t ! m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t : IftheorientationoftheobjectismeasurablewhenintheFOV, ! m t canbeestimated byapproximating d dt )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q m=c t while O t = a as ! m t =2 R m=c t B T )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q m=c t d dt )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q m=c t + ! c t ; where q m=c t canbedeterminedwhile O t = a implyingestimatesof d dt )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(q m=c t canalsobedetermined.Thetarget'slinearvelocitydoesnothaveanyrelationshipthat allowsforadirectapproach.Yetbyexaminingtherateofchangeofthedirectionforthe rstfeaturei.e.,theoriginoftheobjectandtherateofchangeoftherelativemotion direction, v m t canbewrittenasafunctionofmeasurablequantitiesandtheinitial distancetotherstfeature.Specically,using5in2fortherstfeatureand 2yields 2 6 4 m 1 t 0 3 1 0 3 1 m=m t 3 7 5 2 6 4 d m 1 =c t d m=m t 3 7 5 + 2 6 4 0 3 1 m=m 1 t 3 7 5 d m 1 =k 1 + 2 6 4 m 1 t m=m t 3 7 5 = 2 6 4 m 1 t m=m t 3 7 5 v m t ; where m i t , )]TJ/F26 7.9701 Tf 8.204 -4.977 Td [(d dt )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(u m i =c t + ! c t u m i =c t , m i t , m i t v c t , m=m t , )]TJ/F26 7.9701 Tf 8.204 -4.977 Td [(d dt )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(u m=m t + ! c t u m=m t , m=m t , m=m t v c t ,and m=m 1 t , m=m t ! m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t R m=c t u m 1 =k 1 areallmeasurablewhilethe targetisintheFOV.Substitutingtherelationshipin2fortherstfeatureinto5 for 2 6 4 d m 1 =c t d m=m t 3 7 5 andsimplifyingyields v m t d m 1 =k 1 +P v m t = v m t v m t ; 103

PAGE 104

where v m t , 2 6 4 m 1 t m=m t 3 7 5 , v m t , 0 B @ 2 6 4 m 1 t 0 3 1 0 3 1 m=m t 3 7 5 m 1 t + 2 6 4 0 3 1 m=m 1 t 3 7 5 1 C A , and P v m t , 2 6 4 m 1 t m=m t 3 7 5 areallmeasurablewhilethetargetisintheFOV.Let v m t 2f u;a g beanindicatorsignalindicatingif min n T v m t v m t o v m or min n T v m t v m t o > v m ,respectively.If O t = a ^ v m t = a ^ m 1 t = a ,the bodyvelocitycanbewrittenasafunctionoftheconstantinitialdistancetotheoriginof theobjecti.e.,therstfeatureas v m t = + v m t v m t d m 1 =k 1 + v c t ; where + v m t = T v m t v m t )]TJ/F24 7.9701 Tf 6.587 0 Td [(1 T v m t . Remark 5.2 . Thesetoffeatures, f m i t g m i 2O m ,aredenotedby m i t = u ifthe velocitydoesnotsatisfytheeigenvalueconditioni.e., v m t = u .Resettingthe switchingsignalisnecessarybecausethefeaturesarealldependentonthevelocityin thesubsequentdevelopment. Aftertheinitialstructurefortheoriginhasbeenlearned,thelinearvelocityofthe objectcanbedeterminedwhile O t = a ^ v m t = a ^ m 1 t = a ^ t> m 1 using 5in5yielding v m t = + v m t v m t X m 1 + v c t : Using5and5while m i t = a ^ O t = a ^ v m t = a ^ m 1 t = a ^ t> m 1 the i thfeatureontheobject, m i 2O m ,islearnedwhileitsatisestheeigenvalue conditioni.e., m i t = a byintegrating2for m i and2yielding 2 6 4 d m i =c t d m=m t 3 7 5 )]TJ/F31 11.9552 Tf 7.97 27.616 Td [(2 6 4 d m i =c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& d m=m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& 3 7 5 = d m i =k 1 Z t t )]TJ/F26 7.9701 Tf 6.587 0 Td [(& 2 6 4 u T m i =c ! m )]TJ/F25 11.9552 Tf 9.963 0 Td [(! c R m=c u m i =k 1 0 3 7 5 d 104

PAGE 105

+ X m 1 Z t t )]TJ/F26 7.9701 Tf 6.586 0 Td [(& 2 6 4 u T m i =c + v m v m u T m=m + v m v m 3 7 5 d )-167(X m 1 Z t t )]TJ/F26 7.9701 Tf 6.586 0 Td [(& 2 6 4 u T m i =c ! m )]TJ/F25 11.9552 Tf 9.963 0 Td [(! c R m=c u m 1 =k 1 u T m=m ! m )]TJ/F25 11.9552 Tf 9.962 0 Td [(! c R m=c u m 1 =k 1 3 7 5 d; where O = a ^ v m = a ^ m 1 = a ^ m i = a; 8 2 [ t )]TJ/F25 11.9552 Tf 11.955 0 Td [(&;t ] ^ t> m 1 .Substituting therelationshipin2for m i into5for 2 6 4 d m i =c t d m=m t 3 7 5 and 2 6 4 d m i =c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& d m=m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& 3 7 5 yields Y m i t d m i =k 1 = U m i t ; where Y m 1 t , 8 > > > > > > > > > > > > > > > > > > < > > > > > > > > > > > > > > > > > > : m i t )]TJ/F25 11.9552 Tf 11.955 0 Td [( m i )]TJ/F25 11.9552 Tf 5.479 -9.683 Td [(t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l j;m i )]TJ/F31 11.9552 Tf 11.291 9.631 Td [(R t t )]TJ/F26 7.9701 Tf 6.587 0 Td [( a l j;m i 2 6 6 4 u T m i =c ! m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t R m=c u m i =k 1 0 3 7 7 5 d;t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l j;m i <&; m i t )]TJ/F25 11.9552 Tf 11.955 0 Td [( m i t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& )]TJ/F31 11.9552 Tf 11.291 9.631 Td [(R t t )]TJ/F26 7.9701 Tf 6.587 0 Td [(& 2 6 6 4 u T m i =c ! m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t R m=c u m i =k 1 0 3 7 7 5 d;t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l j;m i &; 105

PAGE 106

and U m i t , 8 > > > > > > > > > > > > > > > > > > > > > > > > > > < > > > > > > > > > > > > > > > > > > > > > > > > > > : X m 1 R t t )]TJ/F26 7.9701 Tf 6.586 0 Td [( a l j;m i 2 6 6 4 u T m i =c + v m v m u T m=m + v m v m 3 7 7 5 d X m 1 R t t )]TJ/F26 7.9701 Tf 6.587 0 Td [( a l j;m i 2 6 6 4 u T m i =c ! m )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c R m=c u m 1 =k 1 u T m=m ! m )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c R m=c u m 1 =k 1 3 7 7 5 d;t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l j;m i <&; X m 1 R t t )]TJ/F26 7.9701 Tf 6.586 0 Td [(& 2 6 6 4 u T m i =c + v m v m u T m=m + v m v m 3 7 7 5 d X m 1 R t t )]TJ/F26 7.9701 Tf 6.587 0 Td [(& 2 6 6 4 u T m i =c ! m )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c R m=c u m 1 =k 1 u T m=m ! m )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c R m=c u m 1 =k 1 3 7 7 5 d;t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l j;m i &; and a l j;m i isthe l thtimethe i thfeaturesatisestheeigenvalueconditionduringthe j th timetheobjectenterstheFOVandsufcientdatahasbeencollectedfortherstfeature i.e., a l j;m i 2 a j ; u j t> m 1 .Thetime a l j;m i 2 a j ; u j t> m 1 mustbeconsidered sincethereisnoguaranteetheobjectislearnedbeforetime u l j;m i or u j . Multiplyingbothsidesof5by Y T m i t yields Y T m i t Y m i t d m i =k 1 = Y T m i t U m i t : Ingeneral, Y m i t willnothavefullcolumnrankwhile O = a ^ v m = a ^ m 1 = a ^ m i = a; 8 2 [ t )]TJ/F25 11.9552 Tf 11.955 0 Td [(&;t ] ^ t> m 1 e.g.whenthecameraandobjectarestationary implying Y T m i t Y m i t 0 .However,theequalityin5maybeevaluatedatany instanceintimeandsummedtogetheri.e.,historystacksyielding Y m i d m i =k 1 = U m i ; where Y m i , N P h =1 Y T m i t h Y m i t h , U m i , N P h =1 Y T m i t h U m i t h , t h 2 m 1 ;t ] ,and N 2 Z > 1 . Assumption5.2. Thereissufcientrelativemotionbetweenthecameraandtargetso thereexistsatime m i 2 R > a 1 ,suchthatforalltime t> m i , min Y m i > . 106

PAGE 107

OncesufcientrelativemotionoccursasdiscussedinAssumption5.2,theconstant unknowndistance, d m i =k 1 ,canbedeterminedfrom5yielding d m i =k 1 = X m i ; where X m i , )]TJ/F24 7.9701 Tf 6.587 0 Td [(1 Y m i U m i . 5.3LearningtheObjectMotionModel Afterlearningthestructureoftheobject,estimatesofobjectposeandvelocityare availablewhiletheobjectisintheFOVandtheeigenvalueconditionsaresatised; however,theobjectmayperiodicallyleavetheFOVovertimeandtheeigenvalue conditionswillnotalwaysbesatised.Theworkin[38]usedmotionmodellearning todesignapredictorfortheposeofatargetwhilethetargetisoutsidetheFOVwhich isnaturallyextendedtoincludetimeperiodswheretheeigenvalueconditionsarenot satised.Inthesubsequentdevelopment,anestimatorforthemotionmodelofthe vehicleispresented;however,theprimarydifferenceinthesubsequentdesignisthe modellearnedisfortheaccelerationofthevehicleandnotthevelocity.Specically,[38] developedamethodformodelingthevelocityofthetargetbutestimatingamodelofthe accelerationenablestheuseofkineticmodelsoftargetstobeexploitedratherthanonly usingkinematicmodels.Asdescribedin[38],therearenumerousapplicationswherea target'svelocityisdirectlyafunctionofitsposeintheworldortherelativeposebetween thetargetandthecamera;however,thesekinematicmodelsorapproximationsofthe desiredtrajectoryofatargetdon'taccuratelymodelvehicletrajectoriesinnumerous applications.Inthesubsequentdevelopment,amoregeneralmodelispresentedwhich enablestheestimationofalargerclassofsystems. 107

PAGE 108

Let c t , 2 6 4 p c=k 1 t q c=k 1 t 3 7 5 and m t , 2 6 4 p m 1 =c t q m=c t 3 7 5 representtheposeofthecamera expressedin F k 1 andtheposeoftheobjectwithrespecttothecameraexpressedin F c . Additionally,let m t , 2 6 4 v m t ! m t 3 7 5 representthevelocityoftheobjectexpressedin F c . Assumption5.3. Theposeofthecamera, c t ,isknown. Assumption5.4. Theposeandvelocityofthemovingobjectandcameraarebounded. Specically, m t 2 N m , c t 2 N c ,and m t 2 N m where N m ;N c R 7 and N m R 6 areconvex,compactsetsand k m t k m . Remark 5.3 . Assumption5.4isageneralrequirementforanyestimatortoconverge i.e.,thestatetobeestimatedmustremainboundedforanestimatortoremain bounded.Thisisequivalenttotherequirementofdesiredtrajectoriesremaining boundedincontrolproblems. Assumption5.5. Theaccelerationofthemovingobjectisboundedandlimitedtothe classofsystemsthatareboundedandarelocallyLipschitzfunctionsoftheposeand velocityofthemovingobjectandposeofthecamera.Specically,thetimederivativeof thevelocityis d dt m t = f m m t ; c t ; m t where f m : R 7 R 7 R 6 ! R 6 isalocallyLipschitzandboundedfunction. Remark 5.4 . Assumption5.5ensuresthekineticmodelofthetargetcanbeapproximatedusinguniversalfunctionapproximatorstoanarbitrarylevelofaccuracyvia theStone-Weierstrasstheorem[120].Specically,aNNissubsequentlyusedtoapproximate f m m t ; c t ; m t .Thisassumptionholdsinvarioustargettracking objectivese.g.,projectileandorbitalmotionandpursuit-evasiongames. Remark 5.5 . Theaccelerationmodelisnotlimitedtotargetsthatareafunctionof m t , c t ,and m t .Specically,thetarget'smotionmodelmaybeafunctionofany combinationof m t , c t ,and m t . 108

PAGE 109

Inthisdevelopment,thetarget'saccelerationmodelisapproximatedusingaNNas f m m t ; c t ; m t = W T m m m t ; c t ; m t + " m m t ; c t ; m t ; where L 2 Z > 0 isthenumberofbasisfunctions, W m 2 R L 6 isamatrixoftheconstant unknownidealweights, m : R 7 R 7 R 6 ! R 6 isadesignedmatrixofbasis functionsthatareboundedandlocallyLipschitz,and " m : R 7 R 7 R 6 ! R 6 isthe functionapproximationresidual,whichislocallyLipschitzandcanbeboundedwith aboundthatcanbemadearbitrarilysmallbasedontheStone-Weierstrasstheorem " m , sup m 2 N m ; c 2 N c ; m 2 m ;t 2 [0 ; 1 k " m m t ; c t ; m t k .Furthermore,toaidinthe subsequentanalysis, k W m k W m 2 R > 0 , m , sup m 2 N m ; c 2 N c ; m 2 m ;t 2 [0 ; 1 k m m t ; c t ; m t k ; m; m , sup m 2 N m ; c 2 N c ; m 2 m ;t 2 [0 ; 1 k @ m @ m t m t ; c t ; m t k ; and m; m , sup m 2 N m ; c 2 N c ; m 2 m ;t 2 [0 ; 1 k @ m @ m t m t ; c t ; m t k : Remark 5.6 . Aftertheinitialstructureoftherstfeature m 1 2O m isknown,while theeigenvalueconditionissatisedandtheobjectremainsintheFOVi.e., O t = a ^ v m t = a ^ m 1 t = a ^ t> m 1 , m t canbeapproximatedusing5.Using 5and5, m t canbedetermined.Furthermore,aftertheinitialstructurefor theotherfeaturesislearned,anyorallofthefeaturescanbeusedtoestimate m t and m t ;however,inthesubsequentdevelopmentonlytheoriginoftheobjectisused i.e.,therstfeature m 1 2O m . Similartotheapproachtakenforthemovingfeatures,using5,thetime derivativeofthevelocityisintegratedoveratimewindow & while O t = a ^ v m t = 109

PAGE 110

a ^ m 1 t = a ^ t> m 1 m t )]TJ/F25 11.9552 Tf 11.955 0 Td [( m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& = W T m Z t t )]TJ/F26 7.9701 Tf 6.586 0 Td [(& m m ; c ; m d + Z t t )]TJ/F26 7.9701 Tf 6.586 0 Td [(& " m m ; c ; m d; where O = a ^ v m = a ^ m 1 = a; 8 2 [ t )]TJ/F25 11.9552 Tf 11.955 0 Td [(&;t ] ^ t> m 1 .Using5and 5, m t = 2 6 4 + v m t v m t X m 1 + v c t 2 R m=c t B T )]TJ/F25 11.9552 Tf 5.48 -9.683 Td [(q m=c t d dt )]TJ/F25 11.9552 Tf 5.479 -9.683 Td [(q m=c t + ! c t 3 7 5 ; implying5canbewrittenas Y m t W m = U m t + E m t ; where Y m t , 8 > > < > > : R t t )]TJ/F26 7.9701 Tf 6.587 0 Td [( a l j;m 1 T m m ; c ; m d;t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l j;m 1 <&; R t t )]TJ/F26 7.9701 Tf 6.587 0 Td [(& T m m ; c ; m d;t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l j;m 1 &; 110

PAGE 111

U m t , 8 > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > < > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > : 2 6 6 4 + v m t v m t X m 1 2 R m=c t B T )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(q m=c t d dt )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q m=c t 3 7 7 5 T + 2 6 6 4 v c t ! c t 3 7 7 5 T )]TJ/F31 11.9552 Tf 11.291 31.204 Td [(2 6 6 4 + v m )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l j;m 1 v m )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l j;m 1 X m 1 2 R m=c )]TJ/F25 11.9552 Tf 5.48 -9.683 Td [(t )]TJ/F25 11.9552 Tf 11.956 0 Td [( a l j;m 1 B T )]TJ/F25 11.9552 Tf 5.479 -9.683 Td [(q m=c )]TJ/F25 11.9552 Tf 5.479 -9.683 Td [(t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l j;m 1 d dt )]TJ/F25 11.9552 Tf 5.48 -9.683 Td [(q m=c )]TJ/F25 11.9552 Tf 5.48 -9.683 Td [(t )]TJ/F25 11.9552 Tf 11.956 0 Td [( a l j;m 1 3 7 7 5 T )]TJ/F31 11.9552 Tf 11.291 31.203 Td [(2 6 6 4 v c )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l j;m 1 ! c )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l j;m 1 3 7 7 5 T ;t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l j;m 1 <&; 2 6 6 4 + v m t v m t X m 1 2 R m=c t B T )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(q m=c t d dt )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q m=c t 3 7 7 5 T + 2 6 6 4 v c t ! c t 3 7 7 5 T )]TJ/F31 11.9552 Tf 11.291 31.203 Td [(2 6 6 4 + v m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& v m t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& X m 1 2 R m=c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& B T )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q m=c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& d dt )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q m=c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& 3 7 7 5 T )]TJ/F31 11.9552 Tf 11.291 31.204 Td [(2 6 6 4 v c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& ! c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(& 3 7 7 5 T ;t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l j;m 1 &; and E m t , 8 > > < > > : )]TJ/F31 11.9552 Tf 11.291 9.631 Td [(R t t )]TJ/F26 7.9701 Tf 6.587 0 Td [( a l j;m 1 " T m m ; c ; m d;t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l j;m 1 <&; )]TJ/F31 11.9552 Tf 11.291 9.63 Td [(R t t )]TJ/F26 7.9701 Tf 6.587 0 Td [(& " T m m ; c ; m d;t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a l j;m 1 &: Multiplyingbothsidesof5by Y T m t yields Y T m t Y m t W m = Y T m t U m t + Y T m t E m t : Thematrix Y m t willneverhavefullcolumnrank;however,theequalityin5may beevaluatedatanyinstanceintimeandsummedtogetheri.e.,historystacksyielding Y m W m = U m + E m ; 111

PAGE 112

where Y m , N W P h =1 Y T m t h Y m t h , U m , N W P h =1 Y T m t h U m t h , E m , N W P h =1 Y T m t h E m t h , t h 2 m 1 ;t ] ,and N W 2 Z >L . Assumption5.6. Thereissufcientrelativemotionbetweenthecameraandtargetso thereexistsatime m 2 R > m 1 ,suchthatforalltime t> m , min f Y m g > . Remark 5.7 . Thetime m isunknown;however,itcanbedeterminedonlinebychecking theminimumeigenvalueof Y m . 5.4TargetEstimators Toquantifytheposeandvelocityestimationobjective,let ~ m t , m t )]TJ/F15 11.9552 Tf 12.781 0 Td [(^ m t ; ~ m t , m t )]TJ/F15 11.9552 Tf 13.468 3.155 Td [(^ m t ; and ~ W m t , W m )]TJ/F15 11.9552 Tf 15.367 3.022 Td [(^ W m t ; where ^ m t 2 R 7 , ^ m t 2 R 6 ,and ^ W m t 2 R L 6 aretheestimatesof m t , m t , and W m ,respectively.Takingthetimederivativeof5-5andusing2for feature m 1 ,5,5,5,andsince d dt W m =0 L 6 yields d dt ~ m t = m )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q m=c t m t + 2 6 4 )]TJ/F25 11.9552 Tf 9.298 0 Td [(v c t )]TJ/F25 11.9552 Tf 11.956 0 Td [(! c t p m 1 =c t )]TJ/F24 7.9701 Tf 10.494 4.707 Td [(1 2 B )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(q m=c t R T m=c t ! c t 3 7 5 )]TJ/F25 11.9552 Tf 15.264 8.088 Td [(d dt ^ m t ; d dt ~ m t = W T m m m t ; c t ; m t + " m m t ; c t ; m t )]TJ/F25 11.9552 Tf 15.264 8.088 Td [(d dt ^ m t ; and d dt ~ W m t , )]TJ/F25 11.9552 Tf 12.607 8.088 Td [(d dt ^ W m t ; 112

PAGE 113

where k m )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q m=c t k 1 , m )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q m=c t , 2 6 4 I 3 3 0 3 3 0 4 3 1 2 B )]TJ/F25 11.9552 Tf 5.48 -9.683 Td [(q m=c t R T m=c t 3 7 5 ; and m;q m=c , sup m 2 N m ;t 2 [0 ; 1 k @ m @q m=c t )]TJ/F25 11.9552 Tf 5.479 -9.683 Td [(q m=c t k : WhilethetargetisintheFOVandtheeigenvalueconditionsaresatisedbutthe rstfeaturelearningconditionisnotsatised, O t = a ^ v m t = a ^ m 1 t = a ^ t< m 1 ,observerupdatelawsaredesignedfor d dt ^ m t , d dt ^ m t ,and d dt ^ W m t as d dt ^ m t , proj 0 B @ m )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(q m=c t ^ m t + 2 6 4 )]TJ/F25 11.9552 Tf 9.299 0 Td [(v c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t ^ p m 1 =c t )]TJ/F24 7.9701 Tf 10.494 4.707 Td [(1 2 B )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q m=c t R T m=c t ! c t 3 7 5 1 C A ; d dt ^ m t , proj ^ W T m m ^ m t ; c t ; ^ m t ; and d dt ^ W m t , 0 L 6 : Afterthelearningconditionissatisedi.e., t m 1 ,andsince u m 1 =c t and q m=c t can bedeterminedwhile O t = a ^ v m t = a ^ m 1 t = a ,5and5canbeused todetermine m t and m t as m t = 2 6 4 u m 1 =c t m 1 ; 1 t X m 1 q m=c t 3 7 5 and m t = 2 6 4 + v m t v m t X m 1 + v c t 2 R m=c t B T )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q m=c t d dt )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(q m=c t + ! c t 3 7 5 : 113

PAGE 114

After5beginssavingdata,theobserverupdatelawsuse m t , m t , U m ,and Y m as d dt ^ m t , proj 0 B @ m )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(q m=c t m t + 2 6 4 )]TJ/F25 11.9552 Tf 9.298 0 Td [(v c t )]TJ/F25 11.9552 Tf 11.956 0 Td [(! c t p m 1 =c t )]TJ/F24 7.9701 Tf 10.494 4.707 Td [(1 2 B )]TJ/F25 11.9552 Tf 5.479 -9.683 Td [(q m=c t R T m=c t ! c t 3 7 5 + K m ~ m t 1 C A ; d dt ^ m t , proj ^ W T m m m t ; c t ; m t + K m ~ m t ; d dt ^ W m t , proj )]TJ/F26 7.9701 Tf 7.314 -1.793 Td [(m m m t ; c t ; m t ~ T m t +)]TJ/F26 7.9701 Tf 26.285 -1.793 Td [(m K W m U m )]TJ/F15 11.9552 Tf 11.955 0 Td [( Y m ^ W m t ; where K m 2 R 7 7 , K m 2 R 6 6 , )]TJ/F26 7.9701 Tf 7.314 -1.794 Td [(m 2 R L L ,and K W m 2 R L L areconstant,positive denitegainmatrices. Beforethelearningconditionissatisedi.e., t< m 1 ,whentheobjectisnotin theFOVortheoriginorvelocityeigenvalueconditionsarenotsatisedi.e., O t = u _ v m t = u _ m 1 t = u ,thepredictorupdatelawsaredesignedtoupdatethe estimatesas d dt ^ m t , proj 0 B @ m )]TJ/F15 11.9552 Tf 6.338 -9.683 Td [(^ q m=c t ^ m t + 2 6 4 )]TJ/F25 11.9552 Tf 9.299 0 Td [(v c t )]TJ/F25 11.9552 Tf 11.955 0 Td [(! c t ^ p m 1 =c t )]TJ/F24 7.9701 Tf 10.494 4.707 Td [(1 2 B )]TJ/F15 11.9552 Tf 6.338 -9.684 Td [(^ q m=c t ^ R T m=c t ! c t 3 7 5 1 C A ; d dt ^ m t , proj ^ W T m t m ^ m t ; c t ; ^ m t ; and d dt ^ W m t , 0 L 6 ; whereafterthethelearningconditionissatised, t m 1 ,theaccelerationmodel weightsareupdatedusingthehistorystacksas d dt ^ W m t , )]TJ/F26 7.9701 Tf 7.314 -1.793 Td [(m K W m U m )]TJ/F15 11.9552 Tf 11.955 0 Td [( Y m ^ W m t : 114

PAGE 115

5.5ObjectObserverandPredictorAnalysis Tosimplifythesubsequentanalysis,if m 1 t = u then O t = u implying a l j;m 1 = a j and u l j;m 1 = u j .LetaLyapunovcandidatefunction, V m Z m t : R 7+6+6 L ! R , bedenedas V m Z m t , 1 2 ~ T m t ~ m t + 1 2 ~ T m t ~ m t + 1 2 tr ~ W T m t )]TJ/F29 7.9701 Tf 13.859 4.936 Td [()]TJ/F24 7.9701 Tf 6.586 0 Td [(1 m ~ W m t ; where 1 2 min 1 ; min )]TJ/F29 7.9701 Tf 7.314 4.936 Td [()]TJ/F24 7.9701 Tf 6.586 0 Td [(1 m k Z m t k 2 V m Z m t 1 2 max 1 ; max )]TJ/F29 7.9701 Tf 7.315 4.936 Td [()]TJ/F24 7.9701 Tf 6.586 0 Td [(1 m k Z m t k 2 and Z m t 2 R 7+6+6 L isastackederrorvectordenedas Z m t , ~ T m t ~ T m t vec ~ W m t T T : Lemma5.1. Theobserverdesignsin5-5ensurethestackederrorin Z m t isexponentiallyboundedwhilefeedbackfromtheobjectisavailablei.e.,objectisinthe FOV,themotioneigenvalueconditionsaresatised,andthehistorystackeigenvalue conditionsareunsatised, O t = a ^ v m t = a ^ m 1 t = a ^ t< m 1 ^ t< m ,i.e., t 2 a j ; u j t< m . Proof. Takingthetimederivativeof5,substitutingtheerrorderivativein5-5– 35,theupdatelawsin5-5,andupperboundingusingtheboundsin5 yields d dt V m Z m t c 1 V m Z m t + c 2 ; whichinvokingtheComparisonLemma[116,Lemma3.4]implies V m Z m t V m )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(Z m )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [( a j + c 2 c 1 exp )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(c 1 )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a j )]TJ/F25 11.9552 Tf 13.151 8.088 Td [(c 2 c 1 ; where c 1 , max )]TJ/F24 7.9701 Tf 13.649 -4.977 Td [(1 2 + ! c + 1 2 W m m; m ; )]TJ/F15 11.9552 Tf 5.479 -9.684 Td [(1+ 1 2 m + 1 2 W m m; m + W m m; m ; 1 2 m 1 2 min f 1 ; min f )]TJ/F29 7.9701 Tf 7.314 3.454 Td [()]TJ/F24 7.9701 Tf 6.586 0 Td [(1 m gg 115

PAGE 116

and c 2 , 1 2 " m 2 : Lemma5.2. Theobserverdesignsin5-5ensurethestackederror Z m t exponentiallydecayswhilefeedbackfromtheobjectisavailablei.e.,objectisinthe FOV,themotioneigenvalueconditionsaresatised,andthehistorystackeigenvalue conditionsaresatised, O t = a ^ v m t = a ^ m 1 t = a ^ t m 1 ^ t m ,i.e., t 2 a j ; u j t m . Proof. Takingthetimederivativeof5,substitutingtheerrorderivativein5-5– 35,theupdatelawsin5-5,usingtheeigenvalueconditionsinAssumptions 5.1and5.6,andupperboundingusingtheboundsin5yields d dt V m Z m t )]TJ/F25 11.9552 Tf 21.918 0 Td [(c 3 V m Z m t + c 4 ; whichinvokingtheComparisonLemma[116,Lemma3.4]implies V m Z m t V m )]TJ/F25 11.9552 Tf 5.479 -9.683 Td [(Z m )]TJ/F25 11.9552 Tf 5.479 -9.683 Td [( a j )]TJ/F25 11.9552 Tf 13.151 8.087 Td [(c 4 c 3 exp )]TJ/F28 11.9552 Tf 5.479 -9.683 Td [()]TJ/F25 11.9552 Tf 9.299 0 Td [(c 3 )]TJ/F25 11.9552 Tf 5.479 -9.683 Td [(t )]TJ/F25 11.9552 Tf 11.955 0 Td [( a j + c 4 c 3 ; where c 3 , min min f K m g ; 1 2 min f K m g ; 1 2 min f K W m g 1 2 max f 1 ; max f )]TJ/F29 7.9701 Tf 7.314 3.453 Td [()]TJ/F24 7.9701 Tf 6.586 0 Td [(1 m gg and c 4 , " m 2 2 min f K m g + max f K W m g p max f Y m g N W " m & 2 2 min f K W m g : AsdescribedinRemark5.4, " m decreasesas L increasesimplying c 4 c 3 decreasesasthe numberofdifferentbasisfunctionsincreases. Lemma5.3. Thepredictordesignsin5-5ensurethestackederror Z m t exponentiallygrowswhilefeedbackfromtheobjectisunavailablei.e.,the 116

PAGE 117

Figure5-1.Examplegeometryforasimpliedcamerawithoriginat c ,angle c ,and FOV V c .ThemaximumradiusofaninscribedsphereintheFOVata distance d is R , d sin c . objectisoutsidetheFOVorthemotioneigenvalueconditionsareunsatised, O t = u _ v m t = u _ m 1 t = u ,i.e., t 2 u j ; a j +1 . Proof. Takingthetimederivativeof5andsubstitutingtheerrorderivativein5– 33-5,theupdatelawsin5-5,andupperboundingusingtheboundsin 5yields d dt V m Z m t c 5 V m Z m t + c 2 ; whichinvokingtheComparisonLemma[116,Lemma3.4]implies V m Z m t V m )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(Z m )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [( u j + c 2 c 5 exp )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(c 5 )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(t )]TJ/F25 11.9552 Tf 11.955 0 Td [( u j )]TJ/F25 11.9552 Tf 13.151 8.088 Td [(c 2 c 5 ; where c 5 , max f c 5 ; m ;c 5 ; m ; 1 2 m g 1 2 min f 1 ; min f )]TJ/F30 5.9776 Tf 5.289 3.576 Td [()]TJ/F18 5.9776 Tf 5.756 0 Td [(1 m gg , c 5 ; m , m m;q m=c + 1 2 + ! c + ! c m;q m=c + 1 2 W m m; m , and c 5 ; m , )]TJ/F15 11.9552 Tf 5.48 -9.684 Td [(1+ 1 2 m + 1 2 W m m; m + W m m; m . 5.6ObjectDwell-TimeAnalysis Asshownin5and5,theobserverdesignsbeforeAssumptions5.1and 5.6andthepredictordesignsarealwaysexponentiallygrowing.Sincetheobjectiveis totrackthemovingobject,dwell-timesmustbedevelopedtoensurethe Z m t remains 117

PAGE 118

boundedduringperiodsoftimewhere Z m t grows.Theanalysisin[38]assumesthe structureoftheobjectisknownandmeasurableandshowsthatanestimatordesign i.e.,switchingbetweenanobserverandpredictorisstableprovideddwell-timescan upperboundthetotaltimespentintheunstableperiodsoveraconstantnumberof cycles;however,growthoftheestimationerrorbeyondsomethresholdisnotalways possible.Specically,asdescribedin[63],feedbackregionsareniteinsizeand constrainedbysensormodalitye.g.,thesizeofaFOVorregionswhereapositioning systemisaccurate.Furthermore,theobject'sstructureisunknownwhile t< m implyingtheassumptionsrequiredin[38]areunsatised. Forthisimage-basedtargettrackingobjective,thefeedbackregionisdenedbythe FOV, V c .AsshowninthesimpliedcameramodelinFigure5-1,thelargestinscribed spherethatcantwithintheFOVisdenedby R , d sin c ,where d isthemaximum distanceacameracanreasonablyestimateand c 2 R > 0 istheminimumangleofthe FOV V c .Specically,thesetoffeaturesontheobjectmustbecapturedwithintheFOV forfeedbacktobeavailable, O t = a .ThisrequirementiscoveredbyAssumption2.8 wherethesetoffeaturesonthetarget, O m ,areassumedtotwithintheFOV, MV c . Let D M , max p m i =m 1 2M f m i 2O m g n i =2 representthemaximumdistancebetweenthe originoftheobject, m 1 ,andanotherfeature, m i ,and D M 2 R >R M representaknown boundon D M .Toensuretheestimationerrorremainsbounded,themaximumvaluefor 5mustbeboundedas V m Z m t < 1 2 ~ m 2 ; where ~ m , R )]TJETq1 0 0 1 166.137 209.348 cm[]0 d 0 J 0.478 w 0 0 m 20.296 0 l SQBT/F25 11.9552 Tf 166.137 199.505 Td [(D M isthemaximumerrorfortheFOV, R > D M ,and k ~ m t k ~ m . Ensuringthemaximumtimewherefeedbackisunavailablei.e.,maximumdwelltimealsoguaranteesthecamerawillhavefeedbackandtheestimatorwillremain stablewhenAssumptions5.1and5.6aresatised.Thisimpliesthataninitialminimum dwell-timemustreectthelearningobjective,specically,Assumptions5.1and5.6are satisedafter t m 1 t m i.e.,thetimesthehistorystacksfortheinitialstructure 118

PAGE 119

oftheoriginandaccelerationmodelhavesufcientdata.Bydesign, m > m 1 implying theinitialminimumdwell-timemustinitiallyexceed m otherwiseitisnotpossibleto guaranteetheobjectiscapturedwithintheFOV.Specically,themaximumamountof timeitcantaketolearn, t m 2 R > 0 mustbegreaterthantheniteexcitationconditionin Assumptions5.1and5.6implying t m > m : Ifthisisnottrue,thenitisnotpossibletoensurestabilityusingtheproposedobserver andpredictordesign. Forthesubsequentdevelopment,let a j , u j )]TJ/F25 11.9552 Tf 12.017 0 Td [( a j and u j , a j +1 )]TJ/F25 11.9552 Tf 12.018 0 Td [( u j represent thetimespentwithfeedbackavailableandunavailableoverthe j thcycle. Theorem5.1. Theswitchedsystemdenedbytheswitchingsignals O t , v m t ,and m 1 t ,andtheboundsin5and5ensuretheestimationerror Z m remains boundedwhile t< m provided m < min 1 c 1 ln 1 2 ~ m 2 + c 2 c 1 V m Z m a 1 + c 2 c 1 ! ; 1 c 5 ln 1 2 ~ m 2 + c 2 c 5 V m Z m a 1 + c 2 c 5 ! : Proof. Using5,toensurestabilityofthesystem, V m Z m m < 1 2 ~ m 2 .Consideringtheboundsin5and5,theinitialminimumdwell-timemustsatisfythe minimumof V m Z m a 1 + c 2 c 1 exp )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(c 1 t m )]TJ/F25 11.9552 Tf 13.151 8.088 Td [(c 2 c 1 1 2 ~ m 2 and V m Z m a 1 + c 2 c 5 exp )]TJ/F25 11.9552 Tf 5.479 -9.684 Td [(c 5 t m )]TJ/F25 11.9552 Tf 13.15 8.088 Td [(c 2 c 5 1 2 ~ m 2 : Solving5and5for t m andsubstituting5yields5. Remark 5.8 . Thelearningconditionin5requiresreasonableinitialvaluestobe knownfortheestimates;however,thisisageneralrequirementtoguaranteethemotion modelislearnedwhenatargetintermittentlyleavestheFOV.Furthermore,reasonable initialvaluesfortheestimatesareoftenavailable. 119

PAGE 120

Theorem5.2. Theswitchedsystemdenedbytheswitchingsignals O t , v m t ,and m 1 t ,andtheboundsin5and5ensuretheestimationerror Z m remains GUUBwhile t m providedthe j thcyclealwayssatisesthelossoffeedbackdwelltimecondition u j 1 c 5 ln 0 @ 1 2 ~ m 2 + c 2 c 5 1 2 ~ m 2 )]TJ/F26 7.9701 Tf 13.151 4.813 Td [(c 4 c 3 exp )]TJ/F28 11.9552 Tf 5.48 -9.684 Td [()]TJ/F25 11.9552 Tf 9.298 0 Td [(c 3 a j + c 4 c 3 + c 2 c 5 1 A : Proof. Using5,toensurestabilityofthesystem, V m )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(Z m )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [( a j < 1 2 ~ m 2 .Consideringtheobserverboundin5,theworstcaseforeachcycle j istheestimationerror growingtothemaximumduringthepreviouscycleimplyingwhen t = u j , V m )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(Z m )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [( u j 1 2 ~ m 2 )]TJ/F25 11.9552 Tf 13.15 8.088 Td [(c 4 c 3 exp )]TJ/F28 11.9552 Tf 5.479 -9.684 Td [()]TJ/F25 11.9552 Tf 9.299 0 Td [(c 3 a j + c 4 c 3 : Usingtheworstcase, V m )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(Z m )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [( a j +1 < 1 2 ~ m 2 forthepredictorboundin5and solvingforthe u j yields u j 1 c 5 ln 1 2 ~ m 2 + c 2 c 5 V m )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [(Z m )]TJ/F25 11.9552 Tf 5.48 -9.684 Td [( u j + c 2 c 5 ! : Substituting5into5yieldsthelossoffeedbackdwell-timeconditionin 5. Remark 5.9 . Aminimumfeedbackdwell-timeconditionisnotdevelopedheresincethe objectiveistoalwaystrackthetarget. 5.7Summary Inthischapter,anovelapproachtoestimatingthepose,velocity,andacceleration ofatargetisdevelopedwhileconsideringintermittentfeedback.Thisapproachutilizes anewapproachtoimagegeometrythatrelaxestherequirementtohavecontinuous observationofthetarget,toknowstructure,velocity,oraccelerationofthetarget,and doesnotrequirethepersistenceofexcitationassumptionorpositivedepthconstraint. 120

PAGE 121

CHAPTER6 CONCLUSIONS Inapplicationswhereagentsarerequiredtotrackamovingtargetthroughuncertain environments,itisnecessarytoestimatethestructureoflocalfeaturesintheenvironmente.g.,relativepositionsofobjectsintheimmediatesurroundingenvironment, theposeofanagenti.e.,positionandorientation,andtheposeandvelocityofthe target.Manyoftheseapplicationsrequiretravelingoverlargedistancesimplyingthe localenvironmentforanagentisalwayschangingintroducingfurtherdifculty.Itisoften onlypossibletointermittentlysensethetargete.g.,environmentalobstructionsorpath constraintsoftheagentmaycauseocclusionsofthetarget.Atypicalassumptionis thatglobalsensingisavailabletomeasurethestateofanagent.However,statefeedbackgenerallyrequiresasensorthatcanrelateallthestatestoacommoncoordinate systeme.g.,globalpositioningsystemGPS.However,GPSmaybeunavailablee.g., agentscouldoperateinenvironmentswhereGPSisrestrictedordenied.Assuming thattheentireenvironmentisknownandstateinformationfromthetargetisavailable isarestrictiveassumptionsincetargetsarenotlikelytocommunicatesuchinformation anddirectlysensingtheposeandvelocityofatargetischallengingandnotpossiblein manyscenarios.Thesechallengesmotivatethedevelopmentoftechniquesthatrelyon localsensingbutstillallowagentstoestimatetheirownstatei.e.,poseaswellasthe stateofatargeti.e.,poseandvelocity.Additionally,effortsaremotivatedbythefact thatlocalsensingoftenhasintermittentavailability. Inthisdissertation,camerasareproventobeasensorthatcanprovidelocal feedbackoftheenvironmentwherecoordinatesofthetargetcanberelatedtoa commonreferenceframe.Numerousestimatorsaredevelopedthatenableamonocular camerasystemtoestimatethestateofanagentandtargetdespitenothavingtheability toinherentlymeasurescale,havealimitedFOV,andbeingsusceptibletointermittent sensinge.g.,duetoocclusions.Specically,novelestimatorsusingasinglecamera 121

PAGE 122

andSfMtheoryaredevelopedtoestimatetheEuclideandistancetofeaturesona stationaryobjectsandtheEuclideantrajectorythecameratakeswhiletrackingthe target.Theseestimatorsareextendedtodevelopanovelestimatorthatusedasingle cameraandSaMfMtheorytoestimatetheposeofthetargetrelativetotheagent andthevelocityofthetarget.Unlikepreviousresultsthatestimatetheinversedepth tofeatures,thedevelopedobserversdonotrequirethepositivedepthconstraint, allowingformoregeneraltrajectoriestobetakenbyanagent.InChapter1,thetarget trackingproblemisdiscussedandasurveyofpreviousworkonusingcamerasystems waspresented.InChapter2,thedynamicsforamovingmonocularcameratracking stationaryfeaturesandamovingtarget'sfeaturesisdeveloped.Thedynamicspresent auniqueapproachtotheSfMandSaMfMwhererelationshipsaredevelopedshowing howtheEuclideandistancetostationaryfeaturesrelatestotheposeoftheagentand theEuclideandistancetomovingfeaturesrelatestotheposeandvelocityofthemoving target. InChapter3,aglobalexponentiallystableobserverforfeaturescaleisdeveloped underaniteexcitationconditionthroughtheuseofICL.Sincetheobserveronly requiresniteexcitationtobegloballyexponentiallystable,theobserverismore generalthanpreviousresults.TheresultindicatesthattheEuclideandistancetoaset offeaturesonastationaryobjectandthepaththecameratravelswhileviewingthat objectareestimatedexponentiallyfastimplyingthestructurei.e.,Euclideancoordinates ofthetrackedfeaturesandpatharereconstructedexponentially.Furthermore,the developedestimationmethoddoesnotrequirethefeaturesontheobjectstobeplanar anddoesnotrequirethepositivedepthconstraint.Anexperimentalstudyispresented whichcomparesthedevelopedEuclideandistanceobservertopreviousobservers demonstratingtheeffectivenessofthisresult. InChapter4,anextensiontothelearningapproachesinChapter3isdeveloped thatappliesanewlearningstrategythatmaintainsacontinuousestimateoftheposition 122

PAGE 123

ofthecameraandestimatesthestructureoffeaturesastheycomeintotheFOV. Furthermore,thedevelopedlearningstrategyallowssimulatedmeasurementsof featuresfromobjectsthatarenolongerintheFOVenablingacontinuousestimate ofthedistancetofeatureswithrespecttothecamera.Additionally,thisapproach showshowtheextendedobserverremovesthepositivedepthconstraintrequiredby allpreviousSfMapproaches.Usingthisapproach,acameramaytraveloverlarge distanceswithoutkeepingspecicfeaturesintheFOVforalltimeandallowobjectsto permanentlyleavetheFOVifnecessary.ALyapunovbasedstabilityanalysisproves thattheobserversforestimatingthepathofthecameraaswellasthestructureofeach setofobjectsaregloballyexponentiallystablewhilefeaturesareintheFOV.Aswitched systemsanalysisisusedtodevelopdwell-timeconditionstoindicatehowlongafeature mustbetrackedtoensurethedistanceestimationerrorisbelowathreshold.Afterthe distanceestimateshaveconvergedbelowthethreshold,thefeaturemaybeusedto updatethepositionofthecamera.Ifafeaturedoesnotsatisfythedwell-timecondition, itisneverusedtoupdatethepositionoftheagent.Furthermore,theapproachdoesnot requireanewsetoffeaturestobeinthecamera'sFOVwhenolderfeaturesleavethe camera'sFOV.Finally,ifarecognizedlandmarkentersthecamera'sFOV,thefeedback isusedtocompensatefordrifterror. InChapter5,theapproachinChapter4isusedtoprovideposeestimatesof thecameraandanextensionofChapter3isdevelopedtoexponentiallyestimatethe poseandvelocityofthemovingtarget.Specically,usingtheposeandvelocityof thecamera,theestimationerroroftheEuclideantrajectoryofthetargetaswellas thestructureofthetarget,isgloballyexponentiallyconvergenttoanultimatebound assumingthetargetvelocityandaccelerationareboundedanddwell-timeconditions aresatised.Thedevelopedestimatorrelaxestherequirementtohavecontinuous observationofthetarget,toknowstructureorvelocityofthetarget,anddoesnotrequire thepersistenceofexcitationassumptionorpositivedepthconstraint. 123

PAGE 124

Themonocularcameraestimatorsdevelopedinthisdissertationonlyconsidera singleagenttrackingasingletargetandassumethatacontrollerexiststosatisfythe trackingobjective.Futureworkmayincludeincorporatingtheestimatorsinthisdissertationintoaguidanceandcontrolframeworkthatinformstheagentaboutpotential trajectoryissuese.g.,abuildingmayblockthepathoroccludethetargetandestimate theoptimaltrajectorytotrackthatenableslearningthetargetstructure.Additionally,this workcouldbethefoundationforacooperativenetworktargettrackingsystemwherea multipleagentsaretrackingmultipletargets.Futureworkcanalsofocusonextending thisresulttoconsiderdisturbancesinthedynamicsanddevelopingabundleadjustment strategythatisproventobestableusingaLyapunov-basedanalysisenablingimproved estimatesofthepaththroughthefeedback-deniedregionwithoutsacricingstability guarantees.Theextendedresultwouldbethefoundationofanovelsimultaneouslocalizationandmappingalgorithmthatimprovestheobserverandpredictorstrategywhile operatinginfeedback-deniedenvironmentsandensuresstability. 124

PAGE 125

REFERENCES [1]R.Sim,P.Elinas,andJ.J.Little,“Astudyoftherao-blackwellisedparticlelterfor efcientandaccuratevision-basedslam,” Int.J.Comput.Vision ,vol.74,no.3,pp. 303,2007. [2]T.Lemaire,C.Berger,I.-K.Jung,andS.Lacroix,“Vision-basedslam:Stereoand monocularapproaches,” Int.J.Comput.Vision ,vol.74,no.3,pp.343,Sep. 2007. [3]A.J.Davison,I.D.Reid,N.D.Molton,andO.Stasse,“Monoslam:Real-time singlecameraslam,” IEEETrans.PatternAnal.Mach.Intell. ,vol.29,no.6,pp. 1052,Jun.2007. [4]F.Bonin-Font,A.Ortiz,andG.Oliver,“Visualnavigationformobilerobots:A survey,” J.Intell.Rob.Syst. ,vol.53,no.3,pp.263,Nov.2008. [5]J.Sola,A.Monin,M.Devy,andT.Vidal-Calleja,“Fusingmonocularinformationin multicameraSLAM,” IEEETrans.Robot. ,vol.24,no.5,pp.958,Oct.2008. [6]S.Y.Chen,“Kalmanlterforrobotvision:Asurvey,” IEEETrans.Ind.Electron. , vol.59,no.11,pp.4409,Aug.2011. [7]G.P.Huang,A.I.Mourikis,andS.I.Roumeliotis,“Aquadratic-complexity observability-constrainedunscentedKalmanlterforSLAM,” IEEETrans.Robot. , vol.29,no.5,pp.1226,Oct.2013. [8]J.Engel,T.Schps,andD.Cremers,“Lsd-slam:Large-scaledirectmonocular slam,”in ComputerVision–ECCV2014 ,2014,pp.834. [9]G.DubbelmanandB.Browning,“Cop-slam:Closed-formonlinepose-chain optimizationforvisualslam,” IEEETransactionsonRobotics ,vol.31,no.5,pp. 1194,Oct2015. [10]R.Mur-Artal,J.M.M.Montiel,andJ.D.Tards,“Orb-slam:Aversatileand accuratemonocularslamsystem,” IEEETransactionsonRobotics ,vol.31,no.5, pp.1147,Oct2015. [11]R.Mur-ArtalandJ.D.Tards,“Orb-slam2:Anopen-sourceslamsystemfor monocular,stereo,andrgb-dcameras,” IEEETransactionsonRobotics ,vol.33, no.5,pp.1255,Oct2017. [12]T.Taketomi,H.Uchiyama,andS.Ikeda,“Visualslamalgorithms:Asurveyfrom 2010to2016,” IPSJTransactionsonComputerVisionandApplications ,vol.9, no.1,p.16,2017. [13]M.Karrer,P.Schmuck,andM.Chli,“Cvi-slamcollaborativevisual-inertialslam,” IEEERoboticsandAutomationLetters ,vol.3,no.4,pp.2762,Oct2018. 125

PAGE 126

[14]R.HartleyandA.Zisserman, MultipleViewGeometryinComputerVision . CambridgeUniversityPress,2003. [15]Y.Ma,S.Soatto,J.Kosecka,andS.Sastry, AnInvitationto3-DVision .Springer, 2004. [16]L.Matthies,T.Kanade,andR.Szeliski,“Kalmanlter-basedalgorithmforestimatingdepthfromimagesequences,” Int.J.Comput.Vision ,vol.3,pp.209, 1989. [17]M.JankovicandB.Ghosh,“Visuallyguidedrangingfromobservationspoints, linesandcurvesviaanidentierbasednonlinearobserver,” Syst.ControlLett. , vol.25,no.1,pp.63,1995. [18]S.Soatto,R.Frezza,andP.Perona,“Motionestimationviadynamicvision,” IEEE Trans.Autom.Control ,vol.41,no.3,pp.393,1996. [19]H.Kano,B.K.Ghosh,andH.Kanai,“Singlecamerabasedmotionandshape estimationusingextendedKalmanltering,” Math.Comput.Modell. ,vol.34,pp. 511,2001. [20]A.Chiuso,P.Favaro,H.Jin,andS.Soatto,“Structurefrommotioncausally integratedovertime,” IEEETrans.PatternAnal.Mach.Intell. ,vol.24,no.4,pp. 523,Apr.2002. [21]W.E.Dixon,Y.Fang,D.M.Dawson,andT.J.Flynn,“Rangeidenticationfor perspectivevisionsystems,” IEEETrans.Autom.Control ,vol.48,pp.2232, 2003. [22]X.ChenandH.Kano,“Stateobserverforaclassofnonlinearsystemsandits applicationtomachinevision,” IEEETrans.Autom.Control ,vol.49,no.11,pp. 2085,2004. [23]D.KaragiannisandA.Astol,“Anewsolutiontotheproblemofrangeidenticationinperspectivevisionsystems,” IEEETrans.Autom.Control ,vol.50,no.12, pp.2074,2005. [24]D.Braganza,D.M.Dawson,andT.Hughes,“Euclideanpositionestimationof staticfeaturesusingamovingcamerawithknownvelocities,”in Proc.IEEEConf. Decis.Control ,NewOrleans,LA,USA,Dec.2007,pp.2695. [25]A.DeLuca,G.Oriolo,andP.RobuffoGiordano,“Featuredepthobservation forimage-basedvisualservoing:Theoryandexperiments,” Int.J.Robot.Res. , vol.27,no.10,pp.1093,2008. [26]G.Hu,D.Aiken,S.Gupta,andW.Dixon,“Lyapunov-basedrangeidentication foraparacatadioptricsystem,” IEEETrans.Autom.Control ,vol.53,no.7,pp. 1775,2008. 126

PAGE 127

[27]A.Mourikis,N.Trawny,S.Roumeliotis,A.Johnson,A.Ansar,andL.Matthies, “Vision-aidedinertialnavigationforspacecraftentry,descent,andlanding,” IEEE Trans.Robot. ,vol.25,no.2,pp.264,Apr.2009. [28]F.MorbidiandD.Prattichizzo,“Rangeestimationfromamovingcamera:an immersionandinvarianceapproach,”in Proc.IEEEInt.Conf.Robot.Autom. , Kobe,Japan,May2009,pp.2810. [29]N.Zarrouati,E.Aldea,andP.Rouchon,“So-invariantasymptoticobserversfor densedeptheldestimationbasedonvisualdataandknowncameramotion,” in Proc.Am.ControlConf. ,FairmontQueenElizabeth,Montreal,Canada,Jun. 2012,pp.4116. [30]A.Dani,N.Fischer,Z.Kan,andW.E.Dixon,“Globallyexponentiallystable observerforvision-basedrangeestimation,” Mechatron. ,vol.22,no.4,pp. 381,SpecialIssueonVisualServoing2012. [31]A.Dani,N.Fischer,andW.E.Dixon,“Singlecamerastructureandmotion,” IEEE Trans.Autom.Control ,vol.57,no.1,pp.241,Jan.2012. [32]Z.I.Bell,H.-Y.Chen,A.Parikh,andW.E.Dixon,“Singlesceneandpathreconstructionwithamonocularcamerausingintegralconcurrentlearning,”in Proc. IEEEConf.Decis.Control ,2017,pp.3670. [33]J.Oliensis,“Acritiqueofstructure-from-motionalgorithms,” Comput.Vis.Image. Understand. ,vol.80,pp.172,2000. [34]J.OliensisandR.Hartley,“Iterativeextensionsofthestrum/triggsalgorithm: convergenceandnonconvergence,” IEEETrans.PatternAnal.Mach.Intell. , vol.29,no.12,pp.2217,2007. [35]F.KahlandR.Hartley,“Multiple-viewgeometryunderthe L 1 -norm,” IEEETrans. PatternAnal.Mach.Intell. ,vol.30,no.9,pp.1603,Sep.2008. [36]A.Parikh,T.-H.Cheng,H.-Y.Chen,andW.E.Dixon,“Aswitchedsystemsframeworkforguaranteedconvergenceofimage-basedobserverswithintermittent measurements,” IEEETrans.Robot. ,vol.33,no.2,pp.266,April2017. [37]A.Parikh,T.-H.Cheng,R.Licitra,andW.E.Dixon,“Aswitchedsystemsapproach toimage-basedlocalizationoftargetsthattemporarilyleavethecameraeldof view,” IEEETrans.ControlSyst.Technol. ,vol.26,no.6,pp.2149,2018. [38]A.Parikh,R.Kamalapurkar,andW.E.Dixon,“Targettrackinginthepresence ofintermittentmeasurementsviamotionmodellearning,” IEEETrans.Robot. , vol.34,no.3,pp.805,2018. [39]M.Boutayeb,H.Rafaralahy,andM.Darouach,“Convergenceanalysisofthe extendedKalmanlterusedasanobserverfornonlineardeterministicdiscrete -timesystems,” IEEETrans.onAutom.Control ,vol.42,no.4,pp.581,1997. 127

PAGE 128

[40]K.ReifandR.Unbehauen,“Theextendedkalmanlterasanexponentialobserverfornonlinearsystems,” IEEETrans.SignalProcess. ,vol.47,no.8,pp. 2324,1999. [41]G.V.ChowdharyandE.N.Johnson,“Theoryandight-testvalidationofa concurrent-learningadaptivecontroller,” J.Guid.ControlDynam. ,vol.34,no.2, pp.592,Mar.2011. [42]G.Chowdhary,M.Mhlegg,J.How,andF.Holzapfel,“Concurrentlearningadaptivemodelpredictivecontrol,”in AdvancesinAerospaceGuidance,Navigation andControl ,Q.Chu,B.Mulder,D.Choukroun,E.-J.vanKampen,C.deVisser, andG.Looye,Eds.SpringerBerlinHeidelberg,2013,pp.29. [43]G.Chowdhary,T.Yucelen,M.Mhlegg,andE.N.Johnson,“Concurrentlearning adaptivecontroloflinearsystemswithexponentiallyconvergentbounds,” Int.J. Adapt.ControlSignalProcess. ,vol.27,no.4,pp.280,2013. [44]R.Kamalapurkar,P.Walters,andW.E.Dixon,“Model-basedreinforcement learningforapproximateoptimalregulation,” Automatica ,vol.64,pp.94, 2016. [45]Z.Bell,P.Deptula,H.-Y.Chen,E.Doucette,andW.E.Dixon,“Velocityandpath reconstructionofamovingobjectusingamovingcamera,”in Proc.Am.Control Conf. ,2018,pp.5256. [46]R.Licitra,Z.I.Bell,E.Doucette,andW.E.Dixon,“Singleagentindirectherdingof multipletargets:Aswitchedadaptivecontrolapproach,” IEEEControlSyst.Lett. , vol.2,no.1,pp.127,January2018. [47]R.Licitra,Z.Bell,andW.Dixon,“Singleagentindirectherdingofmultipletargets withunknowndynamics,” IEEETrans.Robotics ,vol.35,no.4,pp.847,2019. [48]A.Parikh,R.Kamalapurkar,andW.E.Dixon,“Integralconcurrentlearning: Adaptivecontrolwithparameterconvergenceusingniteexcitation,” IntJAdapt ControlSignalProcess ,toappear. [49]Z.Bell,J.Nezvadovitz,A.Parikh,E.Schwartz,andW.Dixon,“Globalexponential trackingcontrolforanautonomoussurfacevessel:Anintegralconcurrentlearning approach,” IEEEJ.OceanEng. ,toappear. [50]S.Hutchinson,G.Hager,andP.Corke,“Atutorialonvisualservocontrol,” IEEE Trans.Robot.Autom. ,vol.12,no.5,pp.651,Oct.1996. [51]N.R.Gans,P.I.Corke,andS.A.Hutchinson,“Performancetestspartitioned approachestovisualservocontrol,”in Proc.IEEEInt.Conf.Robot.Autom. ,2002. [52]J.Chen,D.M.Dawson,W.E.Dixon,andA.Behal,“Adaptivehomography-based visualservotrackingforxedandcamera-in-handcongurations,” IEEETrans. ControlSyst.Technol. ,vol.13,pp.814,2005. 128

PAGE 129

[53]J.Chen,D.M.Dawson,W.E.Dixon,andV.Chitrakaran,“Navigationfunction basedvisualservocontrol,” Automatica ,vol.43,pp.1165,2007. [54]N.R.GansandS.A.Hutchinson,“Astablevision-basedcontrolschemefor nonholonomicvehiclestokeepalandmarkintheeldofview,”in Proc.IEEEInt. Conf.Robot.Autom. ,Roma,Italy,Apr.2007,pp.2196. [55]N.R.Gans,G.Hu,andW.E.Dixon,“Keepingmultipleobjectsintheeldofview ofasingleptzcamera,”in Proc.Am.ControlConf. ,St.Louis,Missouri,Jun.2009, pp.5259. [56]G.Hu,W.Mackunis,N.Gans,W.E.Dixon,J.Chen,A.Behal,andD.Dawson, “Homography-basedvisualservocontrolwithimperfectcameracalibration,” IEEE Trans.Autom.Control ,vol.54,no.6,pp.1318,2009. [57]G.Hu,N.Gans,N.Fitz-Coy,andW.E.Dixon,“Adaptivehomography-based visualservotrackingcontrolviaaquaternionformulation,” IEEETrans.Control Syst.Technol. ,vol.18,no.1,pp.128,2010. [58]G.Hu,N.Gans,andW.E.Dixon,“Quaternion-basedvisualservocontrolinthe presenceofcameracalibrationerror,” Int.J.RobustNonlinearControl ,vol.20, no.5,pp.489,2010. [59]G.Lopez-Nicolas,N.R.Gans,S.Bhattacharya,C.Sagues,J.J.Guerrero, andS.Hutchinson,“Homography-basedcontrolschemeformobilerobotswith nonholonomicandeld-of-viewconstraints,” IEEETrans.Syst.ManCybern. , vol.40,no.4,pp.1115,Aug.2010. [60]N.Gans,G.Hu,J.Shen,Y.Zhang,andW.E.Dixon,“Adaptivevisualservocontrol tosimultaneouslystabilizeimageandposeerror,” Mechatron. ,vol.22,no.4,pp. 410,2012. [61]G.Palmieri,M.Palpacelli,M.Battistelli,andM.Callegari,“Acomparisonbetween position-basedandimage-baseddynamicvisualservoingsinthecontrolofa translatingparallelmanipulator,” J.Robot. ,vol.2012,2012. [62]H.-Y.Chen,Z.I.Bell,P.Deptula,andW.E.Dixon,“Aswitchedsystemsframework forpathfollowingwithintermittentstatefeedback,” IEEEControlSyst.Lett. ,vol.2, no.4,pp.749,Oct.2018. [63]H.-Y.Chen,Z.Bell,P.Deptula,andW.E.Dixon,“Aswitchedsystemsapproachto pathfollowingwithintermittentstatefeedback,” IEEETrans.Robot. ,vol.35,no.3, pp.725,2019. [64]K.GranstromandU.Orguner,“APHDlterfortrackingmultipleextended targetsusingrandommatrices,” IEEETrans.SignalProcess. ,vol.60,no.11,pp. 5657,Nov.2012. 129

PAGE 130

[65]K.WyffelsandM.Campbell,“Negativeobservationsformultiplehypothesis trackingofdynamicextendedobjects,”in Proc.Am.ControlConf. ,2014. [66]——,“Negativeinformationforocclusionreasoningindynamicextended multiobjecttracking,” IEEETrans.Robot. ,vol.31,no.2,pp.425,Apr.2015. [67]A.Andriyenko,S.Roth,andK.Schindler,“Ananalyticalformulationofglobal occlusionreasoningformulti-targettracking,”in Proc.IEEEInt.Conf.Comput.Vis. Workshop ,2011,pp.1839. [68]M.K.C.TayandC.Laugier, FieldandServiceRobotics ,ser.SpringerTractsin AdvancedRobotics.SpringerBerlinHeidelberg,2008,vol.42,ch.Modelling SmoothPathsUsingGaussianProcesses,pp.381. [69]J.KoandD.Fox,“Gp-bayeslters:Bayesianlteringusinggaussianprocess predictionandobservationmodels,” Auton.Robot. ,vol.27,no.1,pp.75,2009. [70]D.Ellis,E.Sommerlade,andI.Reid,“Modellingpedestriantrajectorypatternswith gaussianprocesses,”in Proc.IEEEInt.Conf.Comput.Vis.Workshops ,2009,pp. 1229. [71]W.Lu,G.Zhang,S.Ferrari,andI.Palunko,“Aninformationpotentialapproach fortrackingandsurveillingmultiplemovingtargetsusingmobilesensoragents,”in Proc.SPIE8045UnmannedSyst.Technol.XIII ,2011. [72]J.Joseph,F.Doshi-Velez,A.S.Huang,andN.Roy,“Abayesiannonparametric approachtomodelingmotionpatterns,” Auton.Robot. ,vol.31,no.4,pp.383, 2011. [73]H.Wei,W.Lu,P.Zhu,S.Ferrari,R.H.Klein,S.Omidshaei,andJ.P.How, “Cameracontrolforlearningnonlineartargetdynamicsviabayesiannonparametric dirichlet-processgaussian-processdp-gpmodels,”in Proc.IEEE/RSJInt.Conf. Intell.Robot.Syst. ,2014,pp.95. [74]G.Hu,S.Mehta,N.Gans,andW.E.Dixon,“Daisychainingbasedvisualservo controlpartI:Adaptivequaternion-basedtrackingcontrol,”in IEEEMulti-Conf. Syst.andContr. ,SuntecCity,Singapore,Oct.2007,pp.1474. [75]G.Hu,N.Gans,S.Mehta,andW.E.Dixon,“Daisychainingbasedvisualservo controlpartII:Extensions,applicationsandopenproblems,”in Proc.IEEEConf. ControlAppl. ,SuntecCity,Singapore,Oct.2007,pp.729. [76]S.S.Mehta,T.Burks,andW.E.Dixon,“Atheoreticalmodelforvision-based localizationofawheeledmobilerobotingreenhouseapplications:Adaisychaining approach,” ComputersElectron.Agric. ,vol.63,pp.28,2008. [77]K.Dupree,N.R.Gans,W.MacKunis,,andW.E.Dixon,“Euclideancalculation offeaturepointsofarotatingsatellite:Adaisychainingapproach,” AIAAJ.Guid. ControlDyn. ,vol.31,pp.954,2008. 130

PAGE 131

[78]M.Kaiser,N.Gans,andW.Dixon,“Vision-basedestimationforguidance, navigation,andcontrolofanaerialvehicle,” IEEETransactionsonaerospaceand electronicsystems ,vol.46,no.3,2010. [79]T.Wang,C.Wang,J.Liang,Y.Chen,andY.Zhang,“Vision-aidedinertial navigationforsmallunmannedaerialvehiclesingps-deniedenvironments,” Int.J. Adv.Robot.Syst. ,vol.10,no.6,2013. [80]D.Lee,Y.Kim,andH.Bang,“Vision-basedterrainreferencednavigationfor unmannedaerialvehiclesusinghomographyrelationship,” J.Intell.Robot.Syst. , vol.69,no.1,pp.489,Jan2013. [81]A.Dani,Z.Kan,N.Fischer,andW.E.Dixon,“Structureandmotionestimationofa movingobjectusingamovingcamera,”in Proc.Am.ControlConf. ,Baltimore,MD, 2010,pp.6962. [82]R.Horaud,B.Conio,O.Leboulleux,andB.Lacolle,“Ananalyticsolutionforthe perspective4-pointproblem,”in Proc.IEEEConf.Comput.Vis.PatternRecog. , 1989,pp.500. [83]D.DeMenthonandL.S.Davis,“Exactandapproximatesolutionsofthe perspective-three-pointproblem,” IEEETrans.PatternAnal.Mach.Intell. ,vol.14, no.11,pp.1100,1992. [84]B.M.Haralick,C.-N.Lee,K.Ottenberg,andM.Nlle,“Reviewandanalysisof solutionsofthethreepointperspectiveposeestimationproblem,” Int.J.Comput. Vision ,vol.13,no.3,pp.331,Dec.1994. [85]D.DeMenthonandL.Davis,“Model-basedobjectposein25linesofcode,” Int.J. Comput.Vision ,vol.15,pp.123,1995. [86]T.Q.Phong,R.Horaud,A.Yassine,andP.D.Tao,“Objectposefrom2-dto 3-dpointandlinecorrespondences,” Int.J.Comput.Vision ,vol.15,no.3,pp. 225,1995. [87]L.QuanandZ.Lan,“LinearN-pointcameraposedetermination,” IEEETrans. PatternAnal.Mach.Intell. ,vol.21,no.8,pp.774,1999. [88]X.-S.Gao,X.-R.Hou,J.Tang,andH.-F.Cheng,“Completesolutionclassication fortheperspective-three-pointproblem,” IEEETrans.PatternAnal.Mach.Intell. , vol.25,no.8,pp.930,2003. [89]N.R.Gans,A.Dani,andW.E.Dixon,“Visualservoingtoanarbitraryposewith respecttoanobjectgivenasingleknownlength,”in Proc.Am.ControlConf. , Seattle,WA,USA,Jun.2008,pp.1261. [90]S.AvidanandA.Shashua,“Trajectorytriangulation:3Dreconstructionofmoving pointsfromamonocularimagesequence,” IEEETrans.PatternAnal.Mach.Intell. , vol.22,no.4,pp.348,Apr.2000. 131

PAGE 132

[91]J.KaminskiandM.Teicher,“Ageneralframeworkfortrajectorytriangulation,” J. Math.Imag.Vis. ,vol.21,no.1,pp.27,2004. [92]A.BartoliandP.Sturm,“Structure-from-motionusinglines:Representation, triangulation,andbundleadjustment,” Comput.Vis.Image.Understand. ,vol.100, no.3,pp.416,2005. [93]A.Bartoli,“Thegeometryofdynamicscenes–oncoplanarandconvergentlinear motionsembeddedin3Dstaticscenes,” Comput.Vis.Image.Understand. ,vol.98, no.2,pp.223,2005. [94]V.Chitrakaran,D.M.Dawson,W.E.Dixon,andJ.Chen,“Identicationofamoving object'svelocitywithaxedcamera,” Automatica ,vol.41,pp.553,2005. [95]M.Fujita,H.Kawai,andM.W.Spong,“Passivity-baseddynamicvisualfeedback controlforthree-dimensionaltargettracking:Stabilityand l _{ 2 }-gainperformance analysis,” IEEETransactionsonControlSystemsTechnology ,vol.15,no.1,pp. 40,2007. [96]T.Ibuki,T.Hatanaka,andM.Fujita,“Passivity-basedvisualposeregulationfor amovingtargetobjectinthreedimensions:Structuredesignandconvergence analysis,”in Proc.IEEEConf.Decis.Control .IEEE,2012,pp.5655. [97]A.Dani,Z.Kan,N.Fischer,andW.E.Dixon,“Structureestimationofamoving objectusingamovingcamera:Anunknowninputobserverapproach,”in Proc. IEEEConf.Decis.Control ,Orlando,FL,2011,pp.5005. [98]S.Jang,A.Dani,C.Crane,andW.E.Dixon,“Experimentalresultsformoving objectstructureestimationusinganunknowninputobserverapproach,”in Proc. ASMEDyn.Syst.ControlConf. ,FortLauderdale,Florida,Oct.2012. [99]D.Chwa,A.Dani,H.Kim,andW.E.Dixon,“Cameramotionestimationfor3-d structurereconstructionofmovingobjects,”in Proc.oftheIEEEInt.Conf.on Systems,Man,andCybernetics ,2012,pp.1788. [100]D.Chwa,A.Dani,andW.E.Dixon,“Rangeandmotionestimationofamonocular camerausingstaticandmovingobjects,” IEEETrans.ControlSyst.Tech. ,vol.24, no.4,pp.1174,July2016. [101]Z.I.Bell,C.Harris,R.Sun,andW.E.Dixon,“Structureandvelocityestimationof amovingobjectviasyntheticpersistencebyanetworkofstationarycameras,”in Proc.IEEEConf.Decis.Control ,Nice,Fr,Dec.2019. [102]J.Lewis,“Fasttemplatematching,” Vis.Interface ,pp.120,1995. [103]D.Lowe,“Distinctiveimagefeaturesfromscale-invariantkeypoints,”in IJCV ,2004. [104]H.Bay,A.Ess,T.Tuytelaars,andL.VanGool,“Speeded-uprobustfeaturessurf,” Computervisionandimageunderstanding ,vol.110,no.3,pp.346,2008. 132

PAGE 133

[105]T.TuytelaarsandK.Mikolajczyk, LocalInvarientFeatureDetectors:ASurvey ,ser. FoundationsandTrendsinComputerGraphicsandVision.NowPublishersInc, 2008,vol.10. [106]E.Rublee,V.Rabaud,K.Konolige,andG.R.Bradski,“Orb:Anefcientalternative tosiftorsurf.”in Proc.IEEEInt.Conf.Comput.Vision ,2011,pp.2564. [107]Z.Kalal,K.Mikolajczyk,andJ.Matas,“Tracking-learning-detection,” IEEETrans. PatternAnal.Mach.Intell. ,vol.34,no.7,pp.1409,Jul.2012. [108]O.Russakovsky,J.Deng,H.Su,J.Krause,S.Satheesh,S.Ma,Z.Huang, A.Karpathy,A.Khosla,M.Bernstein,A.Berg,andL.Fei-Fei,“Imagenetlarge scalevisualrecognitionchallenge,” Int.J.Comput.Vision ,pp.1,2015. [109]B.LucasandT.Kanade,“Aniterativeimageregistrationtechniquewithan applicationtostereovision,”in Proc.Int.JointConf.Artif.Intell. ,1981,pp. 674. [110]C.HarrisandM.Stephens,“Acombinedcornerandedgedetector,”in AlveyVis. Conf. ,vol.15,1988,pp.147. [111]J.ShiandC.Tomasi,“Goodfeaturestotrack,”in Proc.IEEEConf.Comput.Vis. PatternRecognit. ,1994,pp.593. [112]J.-Y.Bouguet,“Pyramidalimplementationoftheafnelucaskanadefeaturetracker descriptionofthealgorithm,” IntelCorporation ,vol.5,no.1-10,p.4,2001. [113]K.Fathian,J.P.Ramirez-Paredes,E.A.Doucette,J.W.Curtis,andN.R.Gans, “Quest:Aquaternion-basedapproachforcameramotionestimationfromminimal featurepoints,” IEEERoboticsandAutomationLetters ,vol.3,no.2,pp.857, April2018. [114]G.ChowdharyandE.Johnson,“Asingularvaluemaximizingdatarecording algorithmforconcurrentlearning,”in Proc.Am.ControlConf. ,2011,pp.3547– 3552. [115]R.Kamalapurkar,B.Reish,G.Chowdhary,andW.E.Dixon,“Concurrentlearning forparameterestimationusingdynamicstate-derivativeestimators,” IEEETrans. Autom.Control ,vol.62,no.7,pp.3594,July2017. [116]H.K.Khalil, NonlinearSystems ,3rded.UpperSaddleRiver,NJ:PrenticeHall, 2002. [117]G.Guennebaud,B.Jacob etal. ,“Eigenv3,”http://eigen.tuxfamily.org,2010. [118]G.Bradski,“TheOpenCVLibrary,” Dr.Dobb'sJournalofSoftwareTools ,2000. 133

PAGE 134

[119]M.Quigley,K.Conley,B.Gerkey,J.Faust,T.Foote,J.Leibs,R.Wheeler,andA.Y. Ng,“Ros:anopen-sourcerobotoperatingsystem,”in ICRAworkshoponopen sourcesoftware ,vol.3,no.3.2.Kobe,Japan,2009,p.5. [120]M.H.Stone,“Thegeneralizedweierstrassapproximationtheorem,” Math.Mag. , vol.21,no.4,pp.167,1948. 134

PAGE 135

BIOGRAPHICALSKETCH ZacharyIanBellreceivedhisbachelor'sdegreeinmechanicalengineeringin 2015fromtheUniversityofFlorida.Thatsameyear,ZacharyjoinedtheNonlinear ControlsandRoboticsgroupundertheadvisementofDr.WarrenE.Dixontopursue hisdoctoralstudies.Hereceivedhismaster'sdegreeinmechanicalengineeringfrom theUniversityofFloridain2017.In2018,hereceivedtheScience,Mathematics,and ResearchforTransformationSMARTScholarship.HereceivedhisPh.D.inmechanical engineeringfromtheUniversityofFloridain2019.Hisresearchinterestsincludeimagebasedsensing,switchedsystems,networkcontrol,andLyapunov-basednonlinearand adaptivecontrol. 135