
Vision-Based Estimation, Localization, and Control of an Unmanned Aerial Vehicle

Permanent Link: http://ufdc.ufl.edu/UFE0022131/00001

Material Information

Title: Vision-Based Estimation, Localization, and Control of an Unmanned Aerial Vehicle
Physical Description: 1 online resource (103 p.)
Language: english
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2008

Subjects

Subjects / Keywords: control, estimation, flight, homography, lyapunov, nonlinear, robust, vision, visual
Mechanical and Aerospace Engineering -- Dissertations, Academic -- UF
Genre: Aerospace Engineering thesis, Ph.D.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, territorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

Notes

Abstract: Given the advancements in computer vision and estimation and control theory, monocular camera systems have received growing interest as a local alternative/collaborative sensor to GPS systems. One issue that has inhibited the use of a vision system as a navigational aid is the difficulty in reconstructing inertial measurements from the projected image. Current approaches to estimating the aircraft state through a camera system utilize the motion of feature points in an image. One geometric approach proposed in this dissertation uses a series of homography relationships to estimate position and orientation with respect to an inertial pose. This approach creates a series of 'daisy-chained' pose estimates in which the current feature points can be related to previously viewed feature points to determine the current coordinates between each successive image. Because this technique relies on the accuracy of a depth estimate, a Lyapunov-based range identification method is developed that is intended to enhance and complement the homography-based method. The nature of the noise associated with using a camera as a position and orientation sensor is distinctly different from that of legacy-type sensors used for air vehicles, such as accelerometers, rate gyros, attitude resolvers, etc. In order to fly an aircraft in a closed-loop sense using a camera as a primary sensor, the controller will need to be robust not only to parametric uncertainties, but to system noise of the kind uniquely characteristic of camera systems. A novel nonlinear controller, capable of achieving asymptotic stability while rejecting a broad class of uncertainties, is developed as a plausible answer to such anticipated issues. A commercially available vehicle platform is selected to act as a testbed for evaluating a host of image-based methodologies as well as evaluating advanced control concepts.
To enhance the vision-based analysis as well as control system design analysis, a simulation of this particular aircraft is also constructed. The simulation is intended to be used as a tool to provide insight into algorithm feasibility as well as to support algorithm development, prior to physical integration and flight testing. The dissertation focuses on three problems of interest: 1) vehicle state estimation and control using a homography-based daisy-chaining approach; 2) Lyapunov-based nonlinear state estimation and range identification using a pinhole camera; and 3) robust aerial vehicle control in the presence of structured and unstructured uncertainties.
General Note: In the series University of Florida Digital Collections.
General Note: Includes vita.
Bibliography: Includes bibliographical references.
Source of Description: Description based on online resource; title from PDF title page.
Source of Description: This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Thesis: Thesis (Ph.D.)--University of Florida, 2008.
Local: Adviser: Dixon, Warren E.
Electronic Access: RESTRICTED TO UF STUDENTS, STAFF, FACULTY, AND ON-CAMPUS USE UNTIL 2010-05-31

Record Information

Source Institution: UFRGP
Rights Management: Applicable rights reserved.
Classification: lcc - LD1780 2008
System ID: UFE0022131:00001




VISION-BASED ESTIMATION, LOCALIZATION, AND CONTROL OF AN UNMANNED AERIAL VEHICLE

By

MICHAEL KENT KAISER

A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2008

Copyright 2008 by Michael Kent Kaiser

To my wife, Brigita, our daughters, Mary Ellen Ann and Elika Gaynor, and my mother, Carole Campbell Kaiser.

ACKNOWLEDGMENTS

First and foremost, I would like to express my gratitude to Dr. Warren Dixon for not only providing me with exceptional guidance, encouragement, knowledge, and friendship, but also serving as an example of a quality person of high character, worthy of emulating!

I would also like to thank my committee members: Dr. Peter Ifju, Dr. Thomas Burks, Dr. Norman Fitz-Coy, and Dr. Naira Hovakimyan. It was a great honor to have them on my committee, each one carrying a specific meaning for me.

The NCR lab team is composed of a higher concentration of pure and lean talent than any single group of people I have ever worked with, which is why it was such a great privilege to be a member, albeit a little intimidating! I should point out that those individuals in the NCR group who made my degree possible are: Will MacKunis, Dr. Nick Gans, Dr. Guoqiang Hu, Parag Patre, and Sid Mehta.

And I would be remiss to not acknowledge the stabilizing "third leg" in my schooling endeavors; namely, the Air Force Research Laboratory Munitions Directorate (AFRL/RW) that provided the push and continued support for my education.

Also of note, the following individuals whom I have had the opportunity to work with as an engineer and who were either mentors or engineers to aspire to, or both, are: Henry Yarborough, Walter J. "Jake" Klinar, Dr. Karolos Grigoriadis, Bob Loschke, Dr. Paul Bevilaqua, Dr. Leland Nicolai, John K. "Jack" Buckner, Dr. James Cloutier, and Dr. Anthony Calise.

Finally, I cannot forget to mention Ms. Judi Shivers at the REEF and all of the favors I owe her that I cannot possibly repay.

TABLE OF CONTENTS

ACKNOWLEDGMENTS
LIST OF FIGURES
ABSTRACT

CHAPTER

1 INTRODUCTION
  1.1 Motivation
  1.2 Dissertation Overview
  1.3 Research Plan

2 AIRCRAFT MODELING AND SIMULATION
  2.1 Introduction
  2.2 Aerodynamic Characterization
  2.3 Mass Property Estimation
  2.4 Simulink Effort
  2.5 Conclusions

3 AUTONOMOUS CONTROL DESIGN
  3.1 Introduction
  3.2 Baseline Controller
  3.3 Robust Control Development
  3.4 Control Development
    3.4.1 Error System
    3.4.2 Stability Analysis
  3.5 Simulation Results
  3.6 Conclusion

4 DAISY-CHAINING FOR STATE ESTIMATION
  4.1 Introduction
  4.2 Pose Reconstruction From Two Views
    4.2.1 Euclidean Relationships
    4.2.2 Projective Relationships
    4.2.3 Chained Pose Reconstruction for Aerial Vehicles
    4.2.4 Simulation Results

    4.2.5 Experimental Results
  4.3 Conclusions

5 LYAPUNOV-BASED STATE ESTIMATION
  5.1 Introduction
  5.2 Affine Euclidean Motion
  5.3 Object Projection
  5.4 Range Identification for Affine Systems
    5.4.1 Objective
    5.4.2 Estimator Design and Error System
  5.5 Analysis
  5.6 Conclusion

6 CONTRIBUTIONS AND FUTURE WORK

APPENDIX

A (CHAPTER 3) INTEGRATION OF THE AUXILIARY FUNCTION
B (CHAPTER 4) VIDEO EQUIPMENT USED ON BOARD THE AIRCRAFT
C (CHAPTER 4) GROUND SUPPORT EQUIPMENT

REFERENCES

BIOGRAPHICAL SKETCH

LIST OF FIGURES

2-1 LinAir wireframe representation of the Osprey airframe
2-2 Simulink modeling of aerodynamic coefficients
2-3 Measurable inertia values of the Osprey airframe
2-4 Simulink representation of the aircraft equations of motion
2-5 Simulink aircraft equations of motion sub-block: Rotational EOM
2-6 Simulink aircraft equations of motion sub-block: Body Rates to Euler Angle Rates
2-7 Simulink aircraft equations of motion sub-block: Translational EOM
3-1 Photograph of the Osprey aircraft testbed
3-2 Rudimentary control system used for proof of concept for vision-based estimation algorithms
3-3 Plot of the discrete vertical (upward) wind gust used in the controller simulation
3-4 Illustration of uncoupled velocity and pitch rate response during closed-loop longitudinal controller operation
3-5 Illustration of uncoupled roll rate and yaw rate response during closed-loop lateral controller operation
4-1 Euclidean relationships between two camera poses
4-2 Illustration of pose estimation chaining
4-3 Depth estimation from altitude
4-4 Actual translation versus estimated translation where the estimation is not part of the closed-loop control
4-5 Actual attitude versus estimated attitude where the estimation is not part of the closed-loop control
4-6 Autonomous control architecture

4-7 Actual translation versus estimated translation where the estimated value is used within the closed-loop control
4-8 Actual attitude versus estimated attitude where the estimated value is used within the closed-loop control
4-9 Overview of the flight test system and component interconnections
4-10 Single video frame with GPS overlay illustrating landmarks placed along inside edge of the runway
4-11 Experimental flight test results: estimated position compared to two GPS signals
4-12 Technique for achieving accurate landmark relative placement distances
4-13 Single video frame from second flight test experiment illustrating the effect of the more forward looking, wider field of view camera
4-14 Single video frame from second flight test experiment with the lens distortion removed
4-15 Example of a 3D color contour plot generated from the matrix designed as a nonlinear combination of the red and green color space matrices
4-16 Image plane trajectories made by the landmarks from patch 1 entering and exiting the field of view
4-17 Basic concept of geometric position reconstruction from known landmark locations
4-18 Illustration of how four tetrahedrons are used to estimate the length of each edge three times
4-19 Second experimental flight test results: estimated position compared to more accurate position from geometric reconstruction technique
5-1 Moving camera stationary object scenario
5-2 Euclidean point projected onto image plane of a pinhole camera
B-1 Sony Color Exview Super HAD (480 lines of resolution)
B-2 Panasonic AG-DV1 Digital Video Recorder
B-3 Garmin GPS 35 OEM GPS Receiver
B-4 Intuitive Circuits, LLC - OSD-GPS Overlay Board
B-5 12 V, 2.4 GHz, 100 mW video transmitter and antennae

B-6 Eagle Tree, Seagull, Wireless Dashboard Flight System - Pro Version: (1) Wireless Telemetry Transmitter, (2) Eagle Tree Systems G-Force Expander, and (3) Eagle Tree Systems GPS Expander
C-1 RX-2410 2.4 GHz Wireless 4-channel Audio/Video Selectable Receiver
C-2 Sony GV-D1000 Portable MiniDV Video Walkman
C-3 Eagle Tree, Seagull, Real-time Data Dashboard, Wireless Telemetry Data, Receiver Model STR-01
C-4 Leica DISTO A5 (measuring range up to 650 ft, accuracy +/- 0.06 inches)

Abstract of Dissertation Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy

VISION-BASED ESTIMATION, LOCALIZATION, AND CONTROL OF AN UNMANNED AERIAL VEHICLE

By

Michael Kent Kaiser

May 2008

Chair: Dr. Warren E. Dixon
Major: Aerospace Engineering

Given the advancements in computer vision and estimation and control theory, monocular camera systems have received growing interest as a local alternative/collaborative sensor to GPS systems. One issue that has inhibited the use of a vision system as a navigational aid is the difficulty in reconstructing inertial measurements from the projected image. Current approaches to estimating the aircraft state through a camera system utilize the motion of feature points in an image. One geometric approach proposed in this dissertation uses a series of homography relationships to estimate position and orientation with respect to an inertial pose. This approach creates a series of daisy-chained pose estimates in which the current feature points can be related to previously viewed feature points to determine the current coordinates between each successive image. Because this technique relies on the accuracy of a depth estimate, a Lyapunov-based range identification method is developed that is intended to enhance and complement the homography-based method.

The nature of the noise associated with using a camera as a position and orientation sensor is distinctly different from that of legacy-type sensors used for air vehicles such as accelerometers, rate gyros, attitude resolvers, etc. In order to fly an aircraft in a closed-loop sense, using a camera as a primary sensor, the controller will need to be robust not only to parametric uncertainties, but to system noise of the kind uniquely characteristic of camera systems. A novel nonlinear controller, capable of achieving asymptotic stability while rejecting a broad class of uncertainties, is developed as a plausible answer to such anticipated issues.

A commercially available vehicle platform is selected to act as a testbed for evaluating a host of image-based methodologies as well as evaluating advanced control concepts. To enhance the vision-based analysis as well as control system design analysis, a simulation of this particular aircraft is also constructed. The simulation is intended to be used as a tool to provide insight into algorithm feasibility as well as to support algorithm development, prior to physical integration and flight testing.

The dissertation will focus on three problems of interest: 1) vehicle state estimation and control using a homography-based daisy-chaining approach; 2) Lyapunov-based nonlinear state estimation and range identification using a pinhole camera; and 3) robust aerial vehicle control in the presence of structured and unstructured uncertainties.

CHAPTER 1
INTRODUCTION

1.1 Motivation

Feedback linearization is a general control method where the nonlinear dynamics of a system are canceled by state feedback, yielding a residual linear system. Dynamic inversion is a concept similar to feedback linearization that is commonly used within the aerospace community to replace linear aircraft dynamics with a reference model [11]. For example, a general dynamic inversion approach is presented in [4] for a reference tracking problem for a minimum-phase and left-invertible linear system. A dynamic inversion controller is designed for a nonminimum-phase hypersonic aircraft system in [2], which utilizes an additional controller to stabilize the zero dynamics. A finite-time stabilization design is proposed in [3], which utilizes dynamic inversion given a full rank input matrix. Typically, dynamic inversion methods (e.g., [1,2]) assume the corresponding plant models are exactly known. However, parametric uncertainty, additive disturbances, and unmodeled plant dynamics are always present in practical systems.

Motivated by the desire to improve the robustness to uncertainty over traditional methods, adaptive dynamic inversion (ADI) was developed as a method to compensate for parametric uncertainty (cf. [4,6,7,10]). Typically, ADI methods exploit model reference adaptive control (MRAC) techniques where the desired input-output behavior of the closed-loop system is given via the corresponding dynamics of a reference model [5,7,12]. Therefore, the basic task is to design a controller which will ensure minimal error between the reference model and the plant outputs despite uncertainties in the plant parameters and working conditions.
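The exact-model cancellation described above can be made concrete with a minimal scalar sketch (the plant functions, reference-model pole, and numbers below are invented for illustration; this is not a controller from the cited works):

```python
import numpy as np

# Toy scalar plant xdot = f(x) + g(x) u, with f and g assumed exactly
# known -- the assumption the text attributes to dynamic inversion.
f = lambda x: -x**3 + np.sin(x)   # hypothetical nonlinear dynamics
g = lambda x: 2.0 + np.cos(x)     # input effectiveness, never zero

am, r = 4.0, 1.0                  # reference-model pole and command
dt = 1e-3
x = 0.0
for _ in range(int(5.0 / dt)):    # 5 s of Euler integration
    # Invert the known dynamics so the closed loop obeys the linear
    # reference model xdot = am * (r - x).
    u = (-f(x) + am * (r - x)) / g(x)
    x += dt * (f(x) + g(x) * u)
# x has converged to the command r
```

With exact cancellation the closed loop is exactly the reference model, so x converges to the command; any mismatch in f or g reappears as an uncancelled disturbance, which is the robustness gap the ADI and RISE literature discussed below addresses.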

Several efforts (e.g., [8,13]) have been developed for the more general problem where the uncertain parameters or the inversion mismatch terms do not satisfy the linear-in-the-parameters assumption (i.e., non-LP). One method to compensate for non-LP uncertainty is to exploit a neural network as an on-line function approximation method as in [135]; however, all of these results yield uniformly ultimately bounded stability due to the inherent function reconstruction error.

In contrast to neural network-based methods to compensate for the non-LP uncertainty, a robust control approach was recently developed in [17] (coined RISE control in [18]) to yield an asymptotic stability result. The RISE-based control structure has been used for a variety of fully actuated systems in [175]. The contribution in this result is the use of the RISE control structure to achieve asymptotic output tracking control of a model reference system, where the plant dynamics contain a bounded additive disturbance (e.g., potential disturbances include gravity, inertial coupling, nonlinear gust modeling, etc.). This result represents the first ever application of the RISE method where the controller is multiplied by a non-square matrix containing parametric uncertainty. To achieve the result, the typical RISE control structure and closed-loop error system development is modified by adding a robust control term, which is designed to compensate for the uncertainty in the input matrix. The result is proven via Lyapunov-based stability analysis and demonstrated through numerical simulation.

GPS (Global Positioning System) is the primary navigational sensor modality used for vehicle guidance, navigation, and control. However, a comprehensive study referred to as the Volpe Report [26] indicates several vulnerabilities of GPS associated with signal disruptions. The Volpe Report delineates the sources of interference with the GPS signal into two categories: unintentional and deliberate disruptions. Some of the unintentional disruptions include ionosphere interference (also known as ionospheric scintillation) and radio frequency interference (broadcast television, VHF, cell phones, two-way pagers); whereas some of the intentional disruptions involve jamming, spoofing, and meaconing. Some of the ultimate recommendations of this report were to "create awareness among members of the domestic and global transportation community of the need for GPS backup systems" and to "conduct a comprehensive analysis of GPS backup navigation", which included ILS (Instrument Landing Systems), LORAN (LOng RAnge Navigation), and INS (Inertial Navigation Systems) [26].

The Volpe report acted as an impetus to investigate mitigation strategies for the vulnerabilities associated with the current GPS navigation protocol, nearly all following the suggested GPS backup methods that revert to archaic/legacy methods. Unfortunately, these navigational modalities are limited by the range of their land-based transmitters, which are expensive and may not be feasible for remote or hazardous environments. Based on these restrictions, researchers have investigated local methods of estimating position when GPS is denied.

Given the advancements in computer vision and estimation and control theory, monocular camera systems have received growing interest as a local alternative/collaborative sensor to GPS systems. One issue that has inhibited the use of a vision system as a navigational aid is the difficulty in reconstructing inertial measurements from the projected image. Current approaches to estimating the aircraft state through a camera system utilize the motion of feature points in an image. A geometric approach is proposed in this dissertation that uses a series of homography relationships to estimate position and orientation with respect to an inertial pose. This approach creates a series of daisy-chained pose estimates [27,28] in which the current feature points can be related to previously viewed feature points to determine the current coordinates between each successive image. Through these relationships, previously recorded GPS data can be linked with the image data to provide measurements of position and attitude (i.e., pose) in navigational regions where GPS is denied. The method also delivers an accurate estimation of vehicle attitude, which is an open problem in aerial vehicle control. The estimation method can be executed in real time, making it amenable for use in closed-loop guidance control of an aircraft.

The concept of vision-based control for a flight vehicle has been an active area of research over the last decade. Recent research literature on the subject of vision-based state estimation for use in control of a flight vehicle can be categorized by several distinctions. One distinction is that some methods rely heavily on simultaneous sensor fusion [29] while other methods rely solely on camera feedback [30]. Research can further be categorized into methods that require a priori knowledge of landmarks (such as pattern or shape [31], light intensity variations [37], runway edges or lights [38-40]) versus techniques that do not require any prior knowledge of landmarks [41]. Another category of research includes methods that require the image features to remain in the field-of-view [41] versus methods that are capable of acquiring new features [42]. Finally, methods can be categorized according to the vision-based technique for information extraction, such as optic flow [48], simultaneous localization and mapping (SLAM) [43], stereo vision [49], and epipolar geometry [34,41,45,46]. This last category might also be delineated between methods that are more computationally intensive and therefore indicative of the level of real-time on-board computational feasibility.

Methods using homography relationships between images to estimate the pose of an aircraft are presented by Caballero et al. [46] and Shakernia et al. [41] (where it is referred to as the planar essential matrix). The method presented by Caballero et al. is limited to flying above a planar environment and creates an image mosaic, which can be costly in terms of memory. Shakernia's approach does not account for feature points entering and exiting the camera field of view. The method introduced in this dissertation proposes a solution which allows points to continuously move into and out of the camera field of view. The requirement of flying over a constant planar surface is also relaxed to allow flight over piecewise planar patches, more characteristic of real-world scenarios.

Reconstructing the Euclidean coordinates of observed feature points is a challenging problem of significant interest, because range information (i.e., the distance from the imaging system to the feature point) is lost in the image projection. Different tools (e.g., the extended Kalman filter, nonlinear observers) have been used to address the structure and/or motion recovery problem from different points of view. Some researchers (e.g., see [50-53]) have applied the extended Kalman filter (EKF) to address the structure/motion recovery problem. In order to use the EKF method, a priori knowledge of the noise distribution is required, and the motion recovery algorithm is developed based on the linearization of the nonlinear vision-based motion estimation problem.

Due to restrictions with linear methods, researchers have developed various nonlinear observers (e.g., see [54]). For example, several researchers have investigated the range identification problem for conventional imaging systems when the motion parameters are known. In [57], Jankovic and Ghosh developed a discontinuous observer, known as the Identifier Based Observer (IBO), to exponentially identify range information of features from successive images of a camera where the object model is based on known skew-symmetric affine motion parameters. In [55], Chen and Kano generalized the object motion beyond the skew-symmetric form of [57] and developed a new discontinuous observer that exponentially forced the state observation error to be uniformly ultimately bounded (UUB). In comparison to the UUB result of [55], a continuous observer was constructed in [56] to asymptotically identify the range information for a general affine system with known motion parameters. That is, the result in [56] eliminated the skew-symmetric assumption and yielded an asymptotic result with a continuous observer. More recently, a state estimation strategy was developed in [59,60] for affine systems with known motion parameters where only a single homogeneous observation point is provided (i.e., a single image coordinate). In [58], a reduced order observer was developed to yield a semi-global asymptotic stability result for a fixed camera viewing a moving object with known motion parameters for a pinhole camera.

1.2 Dissertation Overview

In this dissertation, vision-based estimation, localization, and control methodologies are proposed for an autonomous air vehicle flying over what are nominally considered planar patches of feature points. The dissertation will focus on three problems of interest: 1) develop a robust control system resulting in a semi-globally asymptotically stable system for an air vehicle with structured and unstructured uncertainties; 2) provide a means of state estimation where feature points can continuously enter and exit the field-of-view, as would nominally be the case for a fixed-wing vehicle, via a novel daisy-chaining approach; and 3) introduce a vision-based altimeter which seeks to resolve the depth ambiguity, a current issue with the homography-based daisy-chaining method that uses an altimeter to provide a depth measurement.

1.3 Research Plan

This chapter serves as an introduction. The motivation, problem statement, and the proposed research plan of the dissertation are provided in this chapter.

Chapter 2 describes a from-the-ground-up simulation development of a research air vehicle specifically selected for its performance capabilities for flight testing of vision-based estimation, localization, and control methodologies. An outcome from this chapter is a fully nonlinear simulation of a commercially available mini-aircraft that can be used for a wide range of analysis and design purposes.
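The daisy-chaining bookkeeping described in Section 1.1 amounts to composing relative pose estimates between successive images so the current frame stays referenced to an inertial (e.g., GPS-anchored) pose. A minimal sketch of that composition, assuming the relative rotations and translations have already been recovered (the numeric poses below are made up; the dissertation obtains them from homography decomposition):

```python
import numpy as np

def compose(pose_a, pose_b):
    """Chain two poses (R, t): inertial <- frame a <- frame b."""
    Ra, ta = pose_a
    Rb, tb = pose_b
    return Ra @ Rb, Ra @ tb + ta

def rot_z(psi):
    c, s = np.cos(psi), np.sin(psi)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Hypothetical relative pose estimates between successive images, as a
# homography decomposition might return them (rotation + translation).
relative = [(rot_z(0.1), np.array([1.0, 0.0, 0.0])),
            (rot_z(0.2), np.array([1.0, 0.1, 0.0])),
            (rot_z(-0.1), np.array([0.9, 0.0, 0.05]))]

# Daisy-chain: start at the inertial anchor pose and fold in each
# successive relative estimate to localize the current camera frame.
pose = (np.eye(3), np.zeros(3))
for rel in relative:
    pose = compose(pose, rel)
R, t = pose   # current pose relative to the inertial anchor
```

The sketch also shows the method's main sensitivity: each relative estimate multiplies into the chain, so scale (depth) errors accumulate, which motivates the range identification work in Chapter 5.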

Chapter 3 presents an inner-loop robust control method, providing mathematical theory and simulation results. The contribution of the development in this chapter is a controller that is asymptotically stable for a broad class of model uncertainties as well as bounded additive disturbances.

Chapter 4 involves the development of the daisy-chaining method as a viable GPS backup technology as well as a state estimation procedure. Results are demonstrated via simulation as well as flight testing. The contribution from this chapter is in developing a means for an aircraft to perform position and orientation estimation from planar feature point patches that enter and leave the field of view, indefinitely.

Chapter 5 investigates a nonlinear estimator that can provide an alternate means of altitude estimation as well as alternate state estimation. The result of this chapter is a Lyapunov-based nonlinear state estimator using a pinhole camera that could work in symbiosis with the homography-based daisy-chaining technique; it also suggests how, at least notionally, the camera can therefore be used as a sole sensor on board an aircraft.
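The premise behind Chapter 5's range identification, that perspective projection discards range, can be seen in a few lines (illustrative numbers, not the dissertation's estimator):

```python
import numpy as np

def project(point, focal=1.0):
    """Ideal pinhole projection: (x, y, z) -> (f*x/z, f*y/z)."""
    x, y, z = point
    return np.array([focal * x / z, focal * y / z])

# Two Euclidean points on the same ray through the pinhole, at ranges
# z = 2 and z = 10.  Both land on the identical image coordinate, so
# range is unobservable from a single static view; recovering it needs
# motion plus an observer, or a homography with a depth/altitude aid.
near = np.array([0.4, 0.2, 2.0])
far = 5.0 * near
assert np.allclose(project(near), project(far))
```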

CHAPTER 2
AIRCRAFT MODELING AND SIMULATION

2.1 Introduction

A vehicle simulation has been developed to investigate the feasibility of the proposed vision-based state estimation and guidance method. The Osprey fixed-wing aerial vehicle, by Air and Sea Composites, Inc. (see Figure 4-3), was selected for evaluating a host of image-based methodologies as well as for potentially evaluating advanced control concepts. This particular aircraft was chosen for several reasons, chiefly: low cost, a pusher prop amenable to forward-looking camera placement, and payload capability. A fully nonlinear model of the equations of motion and aerodynamics of the Osprey is constructed within the Simulink framework. A nonlinear model, as opposed to a linear model, is preferred in this analysis as it better represents the coupled dynamics and camera kinematics, which could potentially stress the performance and, hence, feasibility of the pose estimation algorithm.

The first undertaking of the dissertation is to develop a fully nonlinear, six degrees-of-freedom model of the Osprey aircraft. The simulation will provide a means to test proof-of-concept methodologies prior to testing on the actual Osprey testbed. For example, a specific maneuver can be created within the simulation environment to perform a simultaneous rolling, pitching, and yawing motion of the aircraft combined with a fixed mounted camera to test the robustness of the vision-based algorithm.

A commercially available vehicle platform is selected to act as a testbed for evaluating a host of image-based methodologies as well as evaluating advanced control concepts.


Figure 2-1: LinAir wireframe representation of the Osprey airframe.

2.2 Aerodynamic Characterization

The development of the vehicle simulation entailed two primary tasks: estimating the aerodynamic characteristics and evaluating the mass properties. The aerodynamic characterization of the Osprey aircraft was computed using LinAir, by Desktop Aeronautics, Inc., which employs the discrete vortex Weissinger method (i.e., extended lifting line theory) to solve the subsonic, inviscid, irrotational Prandtl-Glauert equation [61]. Lifting surfaces are modeled by discrete horseshoe vortices, where each makes up one panel, panels make up an element, and elements are grouped to make up the aircraft geometry as shown in Figure 2-1. The resulting nondimensional aerodynamic coefficients are implemented via Simulink's multi-dimensional lookup tables as illustrated in Figure 2-2.

2.3 Mass Property Estimation

The inertia and mass properties of the aircraft were measured using a precision mass, center of gravity, and moment of inertia (MOI) instrument. The instrument is comprised of a table levitated on a gas bearing pivot and a torsional rod connected to the center of the table, resulting in a torsion pendulum for MOI measurement. Because the vehicle could not be mounted on its nose or tail, an


Figure 2-2: Simulink modeling of aerodynamic coefficients.

alternate method was devised to estimate the vehicle's roll inertia, I_xx. Given that the angular momentum in one frame, H = I\omega, is related to the angular momentum in a second frame via a simple coordinate transformation T (read as the transformation from the body frame to the measurement frame), the following series of relationships can be written:

\[ H' = T H = T I \omega = T I T^T \omega', \quad \text{where} \quad \omega = T^T \omega'. \]

The inertia relationship between two frames is therefore given by I' = T I T^T, which for this particular case of estimating the roll inertia (a rotation through the angle \theta about the y-axis) is expressed as

\[
\begin{bmatrix} I'_{xx} & 0 & I'_{xz} \\ 0 & I'_{yy} & 0 \\ I'_{xz} & 0 & I'_{zz} \end{bmatrix}
=
\begin{bmatrix} \cos\theta & 0 & -\sin\theta \\ 0 & 1 & 0 \\ \sin\theta & 0 & \cos\theta \end{bmatrix}
\begin{bmatrix} I_{xx} & 0 & I_{xz} \\ 0 & I_{yy} & 0 \\ I_{xz} & 0 & I_{zz} \end{bmatrix}
\begin{bmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{bmatrix}
\tag{2-1}
\]


Expanding this equation,

\[
\begin{bmatrix} I'_{xx} & 0 & I'_{xz} \\ 0 & I'_{yy} & 0 \\ I'_{xz} & 0 & I'_{zz} \end{bmatrix}
=
\begin{bmatrix}
I_{xx}\cos^2\theta - I_{xz}\sin 2\theta + I_{zz}\sin^2\theta & 0 & \tfrac{I_{xx}-I_{zz}}{2}\sin 2\theta + I_{xz}\cos 2\theta \\
0 & I_{yy} & 0 \\
\tfrac{I_{xx}-I_{zz}}{2}\sin 2\theta + I_{xz}\cos 2\theta & 0 & I_{xx}\sin^2\theta + I_{xz}\sin 2\theta + I_{zz}\cos^2\theta
\end{bmatrix}
\tag{2-2}
\]

From this, it is noted that I'_{yy} = I_{yy}, as expected. Furthermore, because the airframe is mostly symmetrical about its x-z plane, see Figure 2-1, it is reasonable to assume that the remaining cross products of inertia are approximately zero, which gives the inertia matrices in (2-1) their sparse structure. Notice also that since the matrix in (2-2) is symmetric, it is only necessary to look at the upper or lower triangular elements. Separating the known terms on the right-hand side and the unknown terms on the left-hand side gives the following equalities:

\[
\begin{aligned}
I_{xx}\cos^2\theta - I_{xz}\sin 2\theta &= I'_{xx} - I_{zz}\sin^2\theta \\
I_{xx}\sin^2\theta + I_{xz}\sin 2\theta &= I'_{zz} - I_{zz}\cos^2\theta \\
\tfrac{1}{2}I_{xx}\sin 2\theta + I_{xz}\cos 2\theta &= I'_{xz} + \tfrac{1}{2}I_{zz}\sin 2\theta
\end{aligned}
\tag{2-3}
\]

Rewriting in matrix form, the unknown terms are solved for accordingly:

\[
\begin{bmatrix} I_{xx} \\ I_{xz} \end{bmatrix}
=
\begin{bmatrix} \cos^2\theta & -\sin 2\theta \\ \tfrac{1}{2}\sin 2\theta & \cos 2\theta \end{bmatrix}^{-1}
\begin{bmatrix} I'_{xx} - I_{zz}\sin^2\theta \\ I'_{xz} + \tfrac{1}{2}I_{zz}\sin 2\theta \end{bmatrix}
\tag{2-4}
\]

The known terms in (2-4) are the measured inertias, I_{zz}, and \theta, where \theta is as depicted in Figure 2-3. Therefore, the complete inertia properties can be calculated from the only three measurements possible, as illustrated in Figure 2-3.
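As a self-consistency check of the relationships above, the following sketch rotates a known inertia tensor through the tilt angle and then recovers the body-axis roll inertia and cross product of inertia via the 2x2 solve of (2-4). The numeric inertia values and tilt angle are illustrative assumptions, not the measured Osprey properties.

```python
import numpy as np

def rotate_inertia(I, theta):
    """Similarity transform I' = T I T^T for a frame rotated by theta about the y-axis, Eq. (2-1)."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.array([[c, 0.0, -s],
                  [0.0, 1.0, 0.0],
                  [s, 0.0, c]])
    return T @ I @ T.T

def recover_roll_inertia(Ixx_m, Ixz_m, Izz, theta):
    """Solve the 2x2 system of Eq. (2-4) for the body-axis Ixx and Ixz,
    given inertias measured in the tilted frame plus the measurable Izz."""
    s2, s2t, c2t = np.sin(theta)**2, np.sin(2*theta), np.cos(2*theta)
    M = np.array([[np.cos(theta)**2, -s2t],
                  [0.5*s2t, c2t]])
    b = np.array([Ixx_m - Izz*s2, Ixz_m + 0.5*Izz*s2t])
    return np.linalg.solve(M, b)  # [Ixx, Ixz]

# Illustrative inertia values (kg*m^2), NOT the measured Osprey properties.
Ixx, Iyy, Izz, Ixz = 1.2, 2.0, 2.9, 0.15
I_body = np.array([[Ixx, 0.0, Ixz],
                   [0.0, Iyy, 0.0],
                   [Ixz, 0.0, Izz]])
theta = np.radians(20.0)
I_tilt = rotate_inertia(I_body, theta)  # plays the role of the tilted-table measurement
Ixx_est, Ixz_est = recover_roll_inertia(I_tilt[0, 0], I_tilt[0, 2], Izz, theta)
```

Round-tripping through the rotation and back recovers the original values exactly, which confirms that (2-2) through (2-4) are algebraically consistent.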


Figure 2-3: Measurable inertia values of the Osprey airframe.


2.4 Simulink Effort

The core of the simulation uses the aforementioned aerodynamics and mass properties along with the following nonlinear translational and rotational rigid body equations of motion derived in the vehicle body-axis system:

\[ \dot{u} = \frac{1}{m}(X_A + X_T) - g\sin\theta + rv - qw \tag{2-5} \]
\[ \dot{v} = \frac{1}{m}(Y_A + Y_T) + g\cos\theta\sin\phi + pw - ru \tag{2-6} \]
\[ \dot{w} = \frac{1}{m}(Z_A + Z_T) + g\cos\theta\cos\phi + qu - pv \tag{2-7} \]

and

\[ \dot{p} = \frac{I_{zz}(L_A + L_T) + I_{xz}(N_A + N_T) + I_{xz}\{I_{xx} - I_{yy} + I_{zz}\}pq - \{I_{zz}(I_{zz} - I_{yy}) + I_{xz}^2\}qr}{I_{xx}I_{zz} - I_{xz}^2} \tag{2-8} \]
\[ \dot{q} = \frac{1}{I_{yy}}\left[(M_A + M_T) + (I_{zz} - I_{xx})pr - I_{xz}(p^2 - r^2)\right] \tag{2-9} \]
\[ \dot{r} = \frac{I_{xz}(L_A + L_T) + I_{xx}(N_A + N_T) + \{I_{xx}(I_{xx} - I_{yy}) + I_{xz}^2\}pq - I_{xz}\{I_{xx} - I_{yy} + I_{zz}\}qr}{I_{xx}I_{zz} - I_{xz}^2} \tag{2-10} \]

where (L, M, N) and (X, Y, Z) represent the aerodynamic and propulsive moments and forces in the body axis, given in x, y, & z components; I_{ij} and m represent the vehicle's inertia tensor values and mass; V and \rho are relative velocity and air density; \alpha and \beta are angle of attack and sideslip angle; p, q, and r are angular body rates; u, v, and w are translational velocities in the body frame; and \delta_i represents the individual


control deflections. The corresponding kinematic relationships are given by

\[
\begin{bmatrix} \dot{x} \\ \dot{y} \\ \dot{z} \end{bmatrix}
= R_b^i(\phi, \theta, \psi) \begin{bmatrix} u \\ v \\ w \end{bmatrix}
\tag{2-11}
\]

and

\[
\begin{bmatrix} \dot{\phi} \\ \dot{\theta} \\ \dot{\psi} \end{bmatrix}
=
\begin{bmatrix}
1 & \sin\phi\tan\theta & \cos\phi\tan\theta \\
0 & \cos\phi & -\sin\phi \\
0 & \sin\phi\sec\theta & \cos\phi\sec\theta
\end{bmatrix}
\begin{bmatrix} p \\ q \\ r \end{bmatrix}
\tag{2-12}
\]

where c_\phi and s_\phi denote \cos(\phi) and \sin(\phi), respectively (similarly for \theta and \psi).

The equations of motion and all simulation subsystems are constructed using standard Simulink library blocks, where no script files are incorporated, as shown in Figure 2-4. The sub-blocks of interest depicted in the equations of motion model given in Figure 2-4 are: Rotational EOM, Body Rates to Euler Angle Rates, and Translational EOM, and are illustrated in Figures 2-5, 2-6, and 2-7, respectively.

Besides using the model given in Figure 2-4 for cases where a fully nonlinear simulation is required, it can also be used to generate linearized representations of the Osprey by utilizing Matlab's linearizing capability. For example, the trimmed vehicle at a 60 meter altitude at 25 meters/sec would have the corresponding state space representation

\[
\dot{x}_{lon} =
\begin{bmatrix}
-0.15 & 11.08 & -0.08 & -9.81 \\
-0.03 & -7.17 & 0.83 & 0 \\
0.37 & -35.9 & -9.96 & 0 \\
0 & 0 & 1 & 0
\end{bmatrix} x_{lon}
+
\begin{bmatrix}
-3.3 & 0.06 \\
-1.5 & 1.4 \\
-0.98 & 0 \\
0 & 0
\end{bmatrix} u_{lon}
\tag{2-13}
\]
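The body-rate-to-Euler-rate map in (2-12) is a small enough computation to sketch directly; the following stand-in for the Body Rates to Euler Angle Rates sub-block uses an illustrative function name.

```python
import numpy as np

def euler_rates(pqr, phi, theta):
    """Map body angular rates [p, q, r] to Euler-angle rates, cf. Eq. (2-12).
    Singular at theta = +/-90 deg, where tan(theta) and sec(theta) blow up."""
    s_phi, c_phi = np.sin(phi), np.cos(phi)
    t_th, sec_th = np.tan(theta), 1.0/np.cos(theta)
    L = np.array([[1.0, s_phi*t_th, c_phi*t_th],
                  [0.0, c_phi, -s_phi],
                  [0.0, s_phi*sec_th, c_phi*sec_th]])
    return L @ np.asarray(pqr, dtype=float)
```

At wings-level flight (phi = theta = 0) the matrix reduces to the identity, so Euler-angle rates equal body rates, which is a convenient sanity check on the implementation.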


Figure 2-4: Simulink representation of the aircraft equations of motion.

Figure 2-5: Simulink aircraft equations of motion sub-block: Rotational EOM.


Figure 2-6: Simulink aircraft equations of motion sub-block: Body Rates to Euler Angle Rates.

Figure 2-7: Simulink aircraft equations of motion sub-block: Translational EOM.
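The content of the Translational EOM and Rotational EOM sub-blocks can be written compactly in vector form. The sketch below is a simplified stand-in for the block diagrams (function and variable names are illustrative): translational acceleration is specific force plus gravity resolved into body axes minus the omega-cross-v term of (2-5) through (2-7), and angular acceleration is the inverse-inertia form of (2-8) through (2-10).

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def uvw_dot(forces, mass, pqr, uvw, phi, theta):
    """Translational EOM, Eqs. (2-5)-(2-7):
    v_dot = F/m + g_body - omega x v, all in body axes."""
    gravity_body = G*np.array([-np.sin(theta),
                               np.cos(theta)*np.sin(phi),
                               np.cos(theta)*np.cos(phi)])
    return np.asarray(forces, dtype=float)/mass + gravity_body - np.cross(pqr, uvw)

def pqr_dot(torques, inertia, pqr):
    """Rotational EOM, Eqs. (2-8)-(2-10), in the compact form
    omega_dot = I^{-1} (M - omega x (I omega)) used by the Rotational EOM block."""
    pqr = np.asarray(pqr, dtype=float)
    return np.linalg.solve(inertia,
                           np.asarray(torques, dtype=float) - np.cross(pqr, inertia @ pqr))
```

With zero forces and zero rates, the translational equation returns the body-axis gravity vector, and with a diagonal inertia the rotational equation reproduces the familiar gyroscopic coupling terms.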


And the corresponding lateral state-space representation is computed to be

\[
\dot{x}_{lat} =
\begin{bmatrix}
-0.69 & 0.03 & -0.99 & 0.39 \\
-3.13 & -12.92 & 1.10 & 0 \\
17.03 & 0.10 & -0.97 & 0.01 \\
0 & 1 & 0.03 & 0
\end{bmatrix} x_{lat}
+
\begin{bmatrix}
0 & 0 \\
1.50 & 0.02 \\
0.09 & -0.17 \\
0 & 0
\end{bmatrix} u_{lat}
\tag{2-14}
\]

where the states and units are

u (m/sec), \alpha (rad), q (rad/sec), \theta (rad), \beta (rad), p (rad/sec), r (rad/sec), \phi (rad)

and the controls are

\delta_{elev} (deg), \delta_{throt} (N), \delta_{ail} (deg), \delta_{rud} (deg)

This model would provide useful information in regards to intermediate feasibility of the vision-based method, such as providing insight into motion and frequency issues which can severely affect the performance of the vision-based methods. It can also serve as a basis for rudimentary control design in the case where the flight regime is benign and, hence, does not require emphasizing the effect of vision-based state estimation in regards to vehicle control.

2.5 Conclusions

The efforts in this chapter illustrated the ground-up development of a fully nonlinear simulation representing an aircraft testbed to host flight testing of image-based estimation and control algorithms. A method was devised to estimate the vehicle's roll inertia properties based upon the fact that the vehicle, unlike most airframes, is relatively symmetrical. Finally, while the aerodynamic parameter estimates are derived from commercially available vortex lattice software, this type of code is very beneficial in time and cost savings, but it comes at the cost of model uncertainty; which yet again calls for control algorithms that are inherently robust.


CHAPTER 3
AUTONOMOUS CONTROL DESIGN

3.1 Introduction

A robust control approach was recently developed in [17] that exploits a unique property of the integral of the sign of the error (coined RISE control in [18]) to yield an asymptotic stability result. The RISE-based control structure has been used for a variety of fully actuated systems in [17], [18], [62]. The contribution of this result is the ability to achieve asymptotic tracking control of a model reference system for not only a broad class of model uncertainties, but also for where the plant dynamics contain a bounded additive disturbance (e.g., potential disturbances include: dynamic inversion mismatch, wind gusts, nonlinear dynamics, etc.). In addition, this result represents the first ever application of the RISE method where the controller is multiplied by a non-square matrix containing parametric uncertainty and nonlinear, non-LP disturbances. The feasibility of this technique is proven through a Lyapunov-based stability analysis and through numerical simulation results.

Figure 3-1: Photograph of the Osprey aircraft testbed.


3.2 Baseline Controller

As mentioned, the vision-based estimation method that will be discussed further in Chapter 4 was experimentally demonstrated by flight testing with an Osprey aircraft. Prior to performing the experiment, the aircraft was modelled and the estimation method was tested in simulation. A Simulink modeling effort has been undertaken to develop a fully nonlinear, six degrees-of-freedom model of an Osprey aircraft. A simplified autopilot design is constructed, with inputs complementary to the outputs from the estimation method, and a specific maneuver is created to perform a simultaneous rolling, pitching, and yawing motion of the aircraft combined with a fixed-mounted camera. The aircraft/autopilot modeling effort and maneuver are intended to test the robustness of the vision-based algorithm as well as to provide proof-of-concept in using the camera as the primary sensor for achieving closed-loop autonomous flight.

With the vehicle model as described, a baseline autopilot is incorporated to allow the vehicle to perform simple commanded maneuvers that an autonomous aircraft would typically be expected to receive from an on-board guidance system. The autopilot architecture, given in Figure 3-2, is specifically designed to accept inputs compatible with the state estimates coming from the vision-based algorithms. Preliminary modal analysis of the Osprey vehicle flying at a 60 meter altitude at 25 meters/sec indicated a short-period frequency \omega_{sp} = 10.1 rad/sec and damping \zeta_{sp} = 0.85; a phugoid mode frequency \omega_{ph} = 0.34 rad/sec and damping \zeta_{ph} = 0.24; a dutch-roll frequency \omega_{dr} = 4.20 rad/sec and damping \zeta_{dr} = 0.19; a roll subsidence time constant of \tau_r = 0.08 sec; and a spiral mode time-to-double of t_2 = 44.01 sec. These values, which correspond to (2-13) and (2-14), are crucial for the autopilot design as well as in determining what, if any, of the state estimation values coming from the camera and proposed technique are favorable to


Figure 3-2: Rudimentary control system used for proof of concept for vision-based estimation algorithms.

be used in a closed-loop sense, as video frame rate and quantization noise become integral to the controller design from a frequency standpoint.

As the aircraft, with an integrated vision-based system, is required to fly in less benign regimes, such as maneuvering in and around structures, it becomes evident that simplistic classical control methods will be limited in performance capabilities. The aircraft system under consideration can be modeled via the following state space representation [2, 6, 11, 63, 64]:

\[ \dot{x} = Ax + Bu + f(x, t) \tag{3-1} \]
\[ y = Cx \tag{3-2} \]
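The modal quantities quoted in Section 3.2 can be cross-checked against the eigenvalues of the Chapter 2 longitudinal state matrix. In the sketch below the entry magnitudes follow (2-13), but the signs are reconstructed assumptions based on standard stability-derivative conventions, so this is a consistency check rather than a definitive computation.

```python
import numpy as np

# Longitudinal state matrix, cf. Eq. (2-13); signs assumed per standard
# stability-derivative conventions (states: u, alpha, q, theta).
A_lon = np.array([[-0.15, 11.08, -0.08, -9.81],
                  [-0.03, -7.17, 0.83, 0.0],
                  [0.37, -35.9, -9.96, 0.0],
                  [0.0, 0.0, 1.0, 0.0]])

eigs = np.linalg.eigvals(A_lon)
sp = max(eigs, key=abs)          # short period: the fastest eigenvalue pair
omega_sp = abs(sp)               # natural frequency, rad/s
zeta_sp = -sp.real / abs(sp)     # damping ratio
```

With these signs, the short-period pair lands near 10 rad/sec with damping near 0.85, in agreement with the modal analysis quoted above; that agreement is what motivated the assumed sign pattern.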


where A \in R^{n\times n} denotes the state matrix, B \in R^{n\times m} for m \le n represents the input matrix, C \in R^{p\times n} is the known output matrix, u(t) \in R^m is a vector of control inputs, and f(x, t) \in R^n represents an unknown, nonlinear disturbance.

Assumption 1: The A and B matrices given in (3-1) contain parametric uncertainty.

Assumption 2: The nonlinear disturbance f(x, t) and its first two time derivatives are assumed to exist and be bounded by a known constant.

3.3 Robust Control Development

In this section, it is described how a specific aircraft can be related to (3-1). Based on the standard assumption that the longitudinal and lateral modes of the aircraft are decoupled, the state space model for the Osprey aircraft testbed can be represented using (3-1) and (3-2), where the state matrix A \in R^{8\times 8} and input matrix B \in R^{8\times 4} given in Chapter 2 are expressed as

\[ A = \begin{bmatrix} A_{lon} & 0_{4\times 4} \\ 0_{4\times 4} & A_{lat} \end{bmatrix}, \qquad B = \begin{bmatrix} B_{lon} & 0_{4\times 2} \\ 0_{4\times 2} & B_{lat} \end{bmatrix} \tag{3-3} \]

and the output matrix C \in R^{4\times 8} is designed as

\[ C = \begin{bmatrix} C_{lon} & 0_{2\times 4} \\ 0_{2\times 4} & C_{lat} \end{bmatrix} \tag{3-4} \]

where A_{lon}, A_{lat} \in R^{4\times 4}, B_{lon}, B_{lat} \in R^{4\times 2}, and C_{lon}, C_{lat} \in R^{2\times 4} denote the state matrices, input matrices, and output matrices, respectively, for the longitudinal and lateral subsystems, and the notation 0_{m\times n} denotes an m\times n matrix of zeros. The state vector x(t) \in R^8 is given as

\[ x = \begin{bmatrix} x_{lon}^T & x_{lat}^T \end{bmatrix}^T \tag{3-5} \]


where x_{lon}(t), x_{lat}(t) \in R^4 denote the longitudinal and lateral state vectors defined as

\[ x_{lon} \triangleq \begin{bmatrix} u & \alpha & q & \theta \end{bmatrix}^T \tag{3-6} \]
\[ x_{lat} \triangleq \begin{bmatrix} \beta & p & r & \phi \end{bmatrix}^T \tag{3-7} \]

where the state variables are the forward velocity u, angle of attack \alpha, pitch rate q, pitch angle \theta, sideslip angle \beta, roll rate p, yaw rate r, and roll angle \phi; and the control input vector is defined as

\[ u \triangleq \begin{bmatrix} \delta_{elev} & \delta_{throt} & \delta_{ail} & \delta_{rud} \end{bmatrix}^T \tag{3-8} \]

In (3-8), \delta_{elev}(t) \in R denotes the elevator deflection angle, \delta_{throt}(t) \in R is the control thrust, \delta_{ail}(t) \in R is the aileron deflection angle, and \delta_{rud}(t) \in R is the rudder deflection angle.

The disturbance f(x, t) introduced in (3-1) can represent several bounded nonlinearities. The more promising example of disturbances that can be represented by f(x, t) is the nonlinear form of a selectively extracted portion of the state space matrix A_{lon} \in R^{4\times 4} that would normally be linearized. This nonlinearity would then be added to the new state space plant by superposition, resulting in the following quasi-linear plant model:

\[ \dot{x}_{lon} = A_{0,lon}x_{lon} + B_{lon}u_{lon} + f_{lon}(x_{lon}) \tag{3-9} \]
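To make the extraction concrete: in the longitudinal model the gravity term in the u-dot row is the linearization of a sine, so it can be zeroed out of A and restored in nonlinear form inside f. The sketch below uses the Chapter 2 longitudinal matrix with assumed signs (a hedged reconstruction, not the exact dissertation values); for small pitch angles the quasi-linear model matches the linear one.

```python
import numpy as np

A_lon = np.array([[-0.15, 11.08, -0.08, -9.81],
                  [-0.03, -7.17, 0.83, 0.0],
                  [0.37, -35.9, -9.96, 0.0],
                  [0.0, 0.0, 1.0, 0.0]])

# A0: the linearized gravity entry (-9.81 * theta) removed from the state matrix.
A0_lon = A_lon.copy()
A0_lon[0, 3] = 0.0

def f_lon(x):
    """Nonlinear disturbance restoring gravity exactly: f = [-9.81 sin(theta), 0, 0, 0]."""
    theta = x[3]
    return np.array([-9.81*np.sin(theta), 0.0, 0.0, 0.0])

def xdot(x, Bu):
    """Quasi-linear plant of Eq. (3-9); Bu is the already-formed input product B u."""
    return A0_lon @ x + Bu + f_lon(x)
```

For small theta, sin(theta) is approximately theta, so the quasi-linear right-hand side agrees with the fully linear model to within the cubic remainder of the sine expansion.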


where A_{0,lon} \in R^{4\times 4} is the state space matrix A_{lon} with the linearized portion removed, and f_{lon}(x_{lon}) \in R^4 denotes the nonlinear disturbances present in the longitudinal dynamics. Some physical examples of f_{lon}(x_{lon}) would be the selective nonlinearities that cannot be ignored, such as when dealing with supermaneuvering vehicles, where post-stall angles of attack and inertia coupling, for example, are encountered. Given that the Osprey is a very benign maneuvering vehicle, f_{lon}(x_{lon}) in this chapter will represent less rigorous nonlinearities for illustrative purposes. A similar technique can be followed with the lateral-direction state space representation, where the nonlinear component of A_{lat} is extracted, and a new quasi-linear model for the lateral dynamics is developed as

\[ \dot{x}_{lat} = A_{0,lat}x_{lat} + B_{lat}u_{lat} + f_{lat}(x_{lat}) \tag{3-10} \]

where A_{0,lat} \in R^{4\times 4} is the new lateral state matrix with the linearized components removed, and f_{lat}(x_{lat}) \in R^4 denotes the nonlinear disturbances present in the lateral dynamics. Another example of bounded nonlinear disturbances, which can be represented by f(x, t) in (3-1), is a discrete vertical gust. The formula given in [65], for example, defines such a bounded nonlinearity in the longitudinal axis as

\[ U_{gust}(x) = \frac{U_{de}}{2}\left(\frac{H}{106.68}\right)^{1/6}\left[1 - \cos\left(\frac{\pi x}{H}\right)\right] \tag{3-11} \]

where H denotes the distance (between 10.67 and 106.68 meters) along the airplane's flight path for the gust to reach its peak velocity, V_0 is the forward velocity of the aircraft when it enters the gust, x \in [0, 2H] represents the distance penetrated into the gust (e.g., x = \int_0^t V_0(\tau)\,d\tau), and U_{de} is the design gust velocity as specified in [65]. This regulation is intended to be used to evaluate both vertical and lateral gust loads, so a similar representation can be developed for the lateral


dynamics. Another source of bounded nonlinear disturbances that could be represented by f(x, t) is network delay from communication with a ground station.

3.4 Control Development

To facilitate the subsequent control design, a reference model can be developed as:

\[ \dot{x}_m = A_m x_m + B_m \delta_m \tag{3-12} \]
\[ y_m = C x_m \tag{3-13} \]

with A_m \in R^{8\times 8} and B_m \in R^{8\times 4} designed as

\[ A_m = \begin{bmatrix} A_{m,lon} & 0_{4\times 4} \\ 0_{4\times 4} & A_{m,lat} \end{bmatrix}, \qquad B_m = \begin{bmatrix} B_{m,lon} & 0_{4\times 2} \\ 0_{4\times 2} & B_{m,lat} \end{bmatrix} \tag{3-14} \]

where A_m is Hurwitz, \delta_m(t) \in R^4 is the reference input, x_m(t) \in R^8 represents the reference states, y_m(t) \in R^4 are the reference outputs, and C was defined in (3-2). The lateral and longitudinal reference models were chosen with the specific purpose of decoupling the longitudinal-mode velocity and pitch rate as well as decoupling the lateral-mode roll rate and yaw rate. In addition to this criterion, the design is intended to exhibit favorable transient response characteristics and to achieve zero steady-state error. Simultaneous and uncorrelated commands are input into each of the longitudinal and lateral model simulations to illustrate that each model indeed behaves as two completely decoupled second order systems.

The contribution in this control design is a robust technique to yield asymptotic tracking for an aircraft in the presence of parametric uncertainty in a non-square input authority matrix and an unknown nonlinear disturbance. To this end, the control law is developed based on the output dynamics, which enables us to transform the uncertain input matrix into a square matrix. By utilizing a


feedforward (best guess) estimate of the input uncertainty in the control law in conjunction with a robust control term, one is able to compensate for the input uncertainty. Specifically, based on the assumption that an estimate of the uncertain input matrix can be selected such that a diagonal dominance property is satisfied in the closed-loop error system, asymptotic tracking is proven.¹

3.4.1 Error System

The control objective is to ensure that the system outputs track desired time-varying reference outputs despite unknown, nonlinear, non-LP disturbances in the dynamic model. To quantify this objective, a tracking error, denoted by e(t) \in R^4, is defined as

\[ e \triangleq y - y_m = Cx - Cx_m \tag{3-15} \]

To facilitate the subsequent analysis, a filtered tracking error [66], denoted by r(t) \in R^4, is defined as:

\[ r \triangleq \dot{e} + \alpha e \tag{3-16} \]

where \alpha \in R^{4\times 4} denotes a matrix of positive, constant control gains.

Remark 3.1: It can be shown that the system in (3-1) and (3-2) is bounded-input bounded-output (BIBO) stable in the sense that the unmeasurable states \eta(t) and the corresponding time derivatives are bounded as

\[ \|\eta\| \le c_1\|z\| + \zeta_1 \tag{3-17} \]
\[ \|\dot{\eta}\| \le c_2\|z\| + \zeta_2 \tag{3-18} \]

where z(t) \in R^8 is defined as

\[ z \triangleq \begin{bmatrix} e^T & r^T \end{bmatrix}^T \tag{3-19} \]

¹Preliminary simulation results show that this assumption is mild in the sense that a wide range of estimates satisfy this requirement.


and c_1, c_2, \zeta_1, \zeta_2 \in R are known positive bounding constants, provided the control input u(t) remains bounded during closed-loop operation.

The open-loop tracking error dynamics can be developed by taking the time derivative of (3-16) and utilizing the expressions in (3-1), (3-2), (3-12), and (3-13) to obtain the following expression:

\[ \dot{r} = N_d + \tilde{N} + \Omega\dot{u} \tag{3-20} \]

where the auxiliary function N_d(t) \in R^4 is defined as

\[ N_d \triangleq CA\dot{x}_m + C\dot{f}_d(t) - \ddot{y}_m + \alpha\left[CAx_m + Cf_d(t) - \dot{y}_m\right] \tag{3-21} \]

the auxiliary function \tilde{N}(x, \dot{x}, t) \in R^4 is defined as

\[ \tilde{N} \triangleq CA(\dot{x} - \dot{x}_m) + C\dot{\bar{f}}(x) + \alpha\left[CA(x - x_m) + C\bar{f}(x) + CBu\right] \tag{3-22} \]

and the constant, unknown matrix \Omega \in R^{4\times 4} is defined as

\[ \Omega \triangleq CB \tag{3-23} \]

In (3-21) and (3-22), \bar{f}(x), \dot{\bar{f}}(x) \in R^4 contain the portions of f(x, t) and \dot{f}(x, t), respectively, that can be upper bounded by functions of the states, while f_d(t), \dot{f}_d(t) \in R^4 contain the portions of f(x, t) and \dot{f}(x, t) that can be upper bounded by known constants (i.e., f(x, t) = \bar{f}(x) + f_d(t)); \bar{x}(t) contains the measurable states, and \bar{x}_m(t) contains the reference states corresponding to the measurable states \bar{x}(t). The quantities \tilde{N}(x, \dot{x}, t) and N_d(t) and the derivative \dot{N}_d(t) can be upper bounded


as follows:

\[ \|\tilde{N}\| \le \rho(\|z\|)\|z\| \tag{3-24} \]
\[ \|N_d\| \le \zeta_{N_d}, \qquad \|\dot{N}_d\| \le \zeta_{\dot{N}_d} \tag{3-25} \]

where \zeta_{N_d}, \zeta_{\dot{N}_d} \in R are known positive bounding constants, and the function \rho(\|z\|) is a positive, globally invertible, nondecreasing function. Based on the expression in (3-20) and the subsequent stability analysis, the control input is designed as

\[ u(t) = -\hat{\Omega}^{-1}\left[(k_s + 1)e(t) - (k_s + 1)e(0) + \int_0^t\left[(k_s + 1)\alpha e(\tau) + \beta\,\mathrm{sgn}(\dot{e}(\tau))\right]d\tau\right] \tag{3-26} \]

where k_s, \beta \in R^{4\times 4} are diagonal matrices of positive, constant control gains, \alpha was defined in (3-16), and the constant feedforward estimate \hat{\Omega} \in R^{4\times 4} is defined as

\[ \hat{\Omega} \triangleq C\hat{B} \tag{3-27} \]

To simplify the notation in the subsequent stability analysis, the constant auxiliary matrix \Lambda \in R^{4\times 4} is defined as

\[ \Lambda \triangleq \Omega\hat{\Omega}^{-1} \tag{3-28} \]

where \Lambda can be separated into diagonal and off-diagonal components as

\[ \Lambda = \Lambda_d + \Lambda_{od} \tag{3-29} \]

where \Lambda_d \in R^{4\times 4} contains only the diagonal elements of \Lambda, and \Lambda_{od} \in R^{4\times 4} contains the off-diagonal elements.
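The structure of the control law, a proportional-like term plus an integral of a linear feedback and a signum term, can be illustrated on a scalar toy system. The simulation below is a simplified sketch of the RISE mechanism, not the aircraft controller: the plant (x-dot = u + d), the gains, and the disturbance are all illustrative assumptions, and for simplicity the signum term here acts on the error itself.

```python
import numpy as np

# Toy scalar plant x_dot = u + d(t): drive x -> 0 despite the disturbance.
ks, alpha, beta = 10.0, 2.0, 3.0     # illustrative RISE-style gains
dt, T = 1e-3, 10.0
x, u = 1.0, 0.0                      # initial state and control

for k in range(int(T/dt)):
    t = k*dt
    d = 0.5*np.sin(2.0*t)            # bounded disturbance with bounded derivatives
    e = x                            # tracking error (zero reference)
    edot = u + d                     # available here because the toy plant is simulated
    r = edot + alpha*e               # filtered tracking error, cf. Eq. (3-16)
    u += dt*(-((ks + 1.0)*r + beta*np.sign(e)))   # derivative form of the control law
    x += dt*edot

final_error = abs(x)
```

Despite the time-varying disturbance, the integral-of-signum term continuously adjusts the control so the error is driven toward zero, which is the qualitative behavior the Lyapunov analysis below establishes for the full design.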


After substituting the time derivative of (3-26) into (3-20), the following closed-loop error system is obtained:

\[ \dot{r} = N_d + \tilde{N} - \Lambda\left[(k_s + 1)r + \beta\,\mathrm{sgn}(\dot{e})\right] \tag{3-30} \]

Assumption 3: The constant estimate \hat{B} given in (3-27) is selected such that the following condition is satisfied:

\[ \lambda_{\min}(\Lambda_d) - \|\Lambda_{od}\| \ge \varepsilon \tag{3-31} \]

where \varepsilon \in R is a known positive constant, and \lambda_{\min}(\cdot) denotes the minimum eigenvalue of the argument. Preliminary testing results show this assumption is mild in the sense that (3-31) is satisfied for a wide range of \hat{B} \ne B.

Remark 3.2: A possible deficit of this control design is that the acceleration-dependent term \mathrm{sgn}(\dot{e}(t)) appears in the control input given in (3-26); since the output error e(t) contains rate states, \dot{e}(t) contains accelerations. This is undesirable from a controls standpoint; however, many aircraft controllers are designed based on the assumption that acceleration measurements are available [67]. Further, from (3-26), the sign of the acceleration is all that is required for measurement in this control design.

3.4.2 Stability Analysis

Theorem 3.1: The controller given in (3-26) ensures that the output tracking error is regulated in the sense that

\[ \|e(t)\| \to 0 \quad \text{as} \quad t \to \infty \tag{3-32} \]

provided the control gain k_s introduced in (3-26) is selected sufficiently large (see the subsequent stability proof), and \beta and \alpha are selected according to the


following sufficient conditions:

\[ \beta > \frac{\zeta_{N_d} + \frac{1}{\lambda_{\min}(\alpha)}\zeta_{\dot{N}_d}}{\lambda_{\min}(\Lambda_d) - \|\Lambda_{od}\|} \tag{3-33} \]
\[ \lambda_{\min}(\alpha) > \frac{1}{2} \tag{3-34} \]

where \zeta_{N_d} and \zeta_{\dot{N}_d} were introduced in (3-25), \Lambda_d and \Lambda_{od} were defined in (3-29), and \lambda_{\min}(\cdot) was introduced in (3-31).

The following lemma is utilized in the proof of Theorem 3.1.

Lemma 3.1: Let D \subset R^9 be a domain containing y(t) = 0, where y(t) \in R^9 is defined as

\[ y \triangleq \begin{bmatrix} z^T & \sqrt{P(t)} \end{bmatrix}^T \tag{3-35} \]

and the auxiliary function P(t) \in R is defined as

\[ P(t) \triangleq \beta\|e(0)\|_1 - e(0)^T N_d(0) - \int_0^t L(\tau)\,d\tau \tag{3-36} \]

The auxiliary function L(t) \in R in (3-36) is defined as

\[ L(\tau) \triangleq r^T(\tau)\left(N_d(\tau) - \beta\,\mathrm{sgn}(\dot{e}(\tau))\right) \tag{3-37} \]

Provided the sufficient condition in (3-33) is satisfied, the following inequality can be obtained:

\[ \int_0^t L(\tau)\,d\tau \le \beta\|e(0)\|_1 - e(0)^T N_d(0) \tag{3-38} \]

Hence, (3-38) can be used to conclude that P(t) \ge 0.


Proof: (See Theorem 3.1.) Let V(y, t): D \times [0, \infty) \to R be a continuously differentiable, positive definite function defined as

\[ V \triangleq \frac{1}{2}e^T e + \frac{1}{2}r^T r + P \tag{3-39} \]

where e(t) and r(t) are defined in (3-15) and (3-16), respectively, and the positive definite function P(t) is defined in (3-36). The positive definite function V(y, t) satisfies the inequality

\[ U_1(y) \le V(y, t) \le U_2(y) \tag{3-40} \]

provided the sufficient condition introduced in (3-33) is satisfied. In (3-40), the continuous, positive definite functions U_1(y), U_2(y) \in R are defined as

\[ U_1 \triangleq \frac{1}{2}\|y\|^2, \qquad U_2 \triangleq \|y\|^2 \tag{3-41} \]

After taking the derivative of (3-39) and utilizing (3-16), (3-30), (3-36), and (3-37), \dot{V}(y, t) can be expressed as

\[ \dot{V} = e^T r - e^T\alpha e + r^T\tilde{N} - (k_s + 1)r^T\Lambda_d r - (k_s + 1)r^T\Lambda_{od} r + \beta r^T(I - \Lambda)\mathrm{sgn}(\dot{e}) \tag{3-42} \]

By utilizing (3-24), \dot{V}(y, t) can be upper bounded as

\[ \dot{V} \le -\lambda_{\min}(\alpha)\|e\|^2 - (k_s + 1)\left[\lambda_{\min}(\Lambda_d) - \|\Lambda_{od}\|\right]\|r\|^2 + \rho(\|z\|)\|r\|\|z\| + \|e\|\|r\| \tag{3-43} \]

Clearly, if (3-31) is satisfied, the bracketed term in (3-43) is negative, and \dot{V} can be upper bounded using the squares of the components of z(t) as follows:

\[ \dot{V} \le -\lambda_{\min}(\alpha)\|e\|^2 - \|r\|^2 + \rho(\|z\|)\|r\|\|z\| - k_s\varepsilon\|r\|^2 \tag{3-44} \]


Completing the squares for the bracketed terms in (3-44) yields

\[ \dot{V} \le -\lambda_3\|z\|^2 + \frac{\rho^2(\|z\|)\|z\|^2}{4k_s\varepsilon} \tag{3-45} \]

where \lambda_3 \triangleq \min\{\lambda_{\min}(\alpha) - \tfrac{1}{2}, \tfrac{1}{2}\}, and \rho(\|z\|) is introduced in (3-24). The following expression can be obtained from (3-45):

\[ \dot{V} \le -U(y) \tag{3-46} \]

where U(y) = c\|z\|^2, for some positive constant c \in R, is a continuous, positive semi-definite function that is defined on the following domain:

\[ D \triangleq \left\{ y \in R^9 \;\middle|\; \|y\| \le \rho^{-1}\!\left(2\sqrt{\lambda_3 k_s\varepsilon}\right) \right\} \tag{3-47} \]

The inequalities in (3-40) and (3-46) can be used to show that V(y, t) \in L_\infty in D; hence e(t), r(t) \in L_\infty in D. Given that e(t), r(t) \in L_\infty in D, standard linear analysis methods can be used to prove that \dot{e}(t) \in L_\infty in D from (3-16). Since e(t), \dot{e}(t) \in L_\infty in D, the assumption that y_m(t), \dot{y}_m(t) \in L_\infty in D can be used along with (3-15) to prove that y(t), \dot{y}(t) \in L_\infty in D. Given that r(t) \in L_\infty in D, the assumption that \hat{\Omega}^{-1} \in L_\infty in D can be used along with the time derivative of (3-26) to show that \dot{u}(t) \in L_\infty in D. Further, Equation 2.78 of [72] can be used to show that \|x(t)\| can be upper bounded as \|x(t)\| \le c_0\|z(t)\| + \varepsilon_0, where c_0, \varepsilon_0 \in R^+ are bounding constants. Theorem 1.1 of [73] can then be utilized to show that \dot{x}(t) \in L_\infty in D. Hence, (3-20) can be used to show that \dot{r}(t) \in L_\infty in D. Since \dot{e}(t), \dot{r}(t) \in L_\infty in D, the definitions for U(y) and z(t) can be used to prove that U(y) is uniformly continuous in D. Let S \subset D denote a set defined as follows:

\[ S \triangleq \left\{ y(t) \in D \;\middle|\; U_2(y(t)) < \frac{1}{2}\left(\rho^{-1}\!\left(2\sqrt{\lambda_3 k_s\varepsilon}\right)\right)^2 \right\} \tag{3-48} \]

Theorem 8.4 of [74] can now be invoked to state that

\[ c\|z(t)\|^2 \to 0 \quad \text{as} \quad t \to \infty, \quad \forall\, y(0) \in S \tag{3-49} \]


Based on the definition of z(t), (3-49) can be used to show that

\[ \|e(t)\| \to 0 \quad \text{as} \quad t \to \infty, \quad \forall\, y(0) \in S \tag{3-50} \]

3.5 Simulation Results

A numerical simulation was created to test the efficacy of the proposed controller. The simulation is based on the aircraft state space system given in (3-1) and (3-2), where the state matrix A, input authority matrix B, and nonlinear disturbance function f(x, t) are given by the state space model for the Osprey aircraft given in (3-3)-(3-11). The reference model for the simulation is represented by the state space system given in (3-12)-(3-14), with state matrices A_{m,lon} and A_{m,lat}, input matrices B_{m,lon} and B_{m,lat}, and output matrices C_{lon} and C_{lat} selected as

\[ A_{m,lon} = \begin{bmatrix} -0.6 & 1.1 & 0 & 0 \\ -2.0 & -2.2 & 0 & 0 \\ 0 & 0 & -4.0 & 6.0 \\ 0 & 0 & 0 & -1.1 \end{bmatrix} \tag{3-51} \]

\[ A_{m,lat} = \begin{bmatrix} -4.0 & 6.0 & 0 & 0 \\ 0 & -1.1 & 0 & 0 \\ 0 & 0 & -0.6 & 1.1 \\ 0 & 0 & -2.0 & -2.2 \end{bmatrix} \tag{3-52} \]

\[ B_{m,lon} = \begin{bmatrix} 0 & 0 \\ 5 & 0 \\ 0 & 1 \\ 0 & 0 \end{bmatrix}, \qquad B_{m,lat} = \begin{bmatrix} 1 & 0 \\ 0 & 0 \\ 0 & 5 \\ 0 & 0 \end{bmatrix} \tag{3-53} \]
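The decoupling designed into the reference model can be verified directly: the block-diagonal structure means a command into one channel produces no response in the other. The sketch below uses the longitudinal reference model with the entry signs reconstructed as assumptions chosen so that each block is Hurwitz, as the development requires.

```python
import numpy as np

# Longitudinal reference model, cf. Eqs. (3-51) and (3-53); signs assumed (Hurwitz).
Am = np.array([[-0.6, 1.1, 0.0, 0.0],
               [-2.0, -2.2, 0.0, 0.0],
               [0.0, 0.0, -4.0, 6.0],
               [0.0, 0.0, 0.0, -1.1]])
Bm = np.array([[0.0, 0.0],
               [5.0, 0.0],
               [0.0, 1.0],
               [0.0, 0.0]])

# Forward-Euler step response, 5 s, with a command on input 1 only.
xm = np.zeros(4)
dt = 1e-3
for _ in range(5000):
    xm = xm + dt*(Am @ xm + Bm @ np.array([1.0, 0.0]))
```

Because the second 2x2 block never sees input 1, states 3 and 4 remain identically zero, while the first block settles to its DC gain; this mirrors the "simultaneous and uncorrelated commands" check described in the text.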


and

\[ C_{lon} = \begin{bmatrix} 0 & 0 & 1 & 0 \\ 1 & 0 & 0 & 0 \end{bmatrix}, \qquad C_{lat} = \begin{bmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \tag{3-54} \]

The longitudinal and lateral dynamic models for the Osprey aircraft flying at 25 m/s at an altitude of 60 meters are represented using (3-9) and (3-10), where A_{0,lon}, A_{0,lat}, B_{lon}, and B_{lat} are given as

\[ A_{0,lon} = \begin{bmatrix} -0.15 & 11.08 & -0.08 & 0 \\ -0.03 & -7.17 & 0.83 & 0 \\ 0.37 & -35.9 & -9.96 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \tag{3-55} \]

\[ A_{0,lat} = \begin{bmatrix} -0.69 & 0.03 & -0.99 & 0 \\ -3.13 & -12.92 & 1.10 & 0 \\ 17.03 & 0.10 & -0.97 & 0.01 \\ 0 & 1 & 0.03 & 0 \end{bmatrix} \tag{3-56} \]

\[ B_{lon} = \begin{bmatrix} -3.3 & 0.06 \\ -1.5 & 1.4 \\ -0.98 & 0 \\ 0 & 0 \end{bmatrix}, \qquad B_{lat} = \begin{bmatrix} 0 & 0 \\ 1.50 & 0.02 \\ 0.09 & -0.17 \\ 0 & 0 \end{bmatrix} \tag{3-57} \]

respectively. The nonlinear disturbance terms f_{lon}(x_{lon}) and f_{lat}(x_{lat}) introduced in (3-9) and (3-10), respectively, are defined as

\[ f_{lon} = \begin{bmatrix} -9.81\sin\theta & 0 & 0 & 0 \end{bmatrix}^T + F_g(x, t) \tag{3-58} \]

\[ f_{lat} = \begin{bmatrix} 0.39\sin\phi & 0 & 0 & 0 \end{bmatrix}^T \tag{3-59} \]

where F_g(x, t) represents a disturbance due to a discrete vertical wind gust as defined in (3-11), where H = 10.12 m, U_{de} = 15.24 m/s, and V_0 = 25 m/s


Figure 3-3: Plot of the discrete vertical (upward) wind gust used in the controller simulation.

(cruise velocity). Figure 3-3 shows a plot of the wind gust used in the simulation. The remainder of the additive disturbances in (3-58) and (3-59) represent nonlinearities not captured in the linearized state space model (e.g., due to small angle assumptions). All states and control inputs were initialized to zero for the simulation.

The feedforward estimates \hat{B}_{lon} and \hat{B}_{lat} were selected as

\[ \hat{B}_{lon} = \begin{bmatrix} 0.01 & 0.1 \\ 0 & 0 \\ 1.4 & 0 \\ 0 & 0 \end{bmatrix}, \qquad \hat{B}_{lat} = \begin{bmatrix} 0 & 0 \\ 1.7 & 0.05 \\ 0.1 & 0.25 \\ 0 & 0 \end{bmatrix} \tag{3-60} \]

Remark 3.3: For the choices for \hat{B}_{lon} and \hat{B}_{lat} given in (3-60), the inequality in (3-31) is satisfied. Specifically, the choice for \hat{B}_{lon} yields the following:

\[ \lambda_{\min}(\Lambda_d) = 0.6450 > 0.0046 = \|\Lambda_{od}\| \tag{3-61} \]

and the choice for \hat{B}_{lat} yields

\[ \lambda_{\min}(\Lambda_d) = 0.6828 > 0.0842 = \|\Lambda_{od}\| \tag{3-62} \]
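The gust input plotted in Figure 3-3 can be generated directly from the one-minus-cosine profile of (3-11). The sketch below is hedged: the (H/106.68)^(1/6) gradient-distance scaling follows the FAR 25.341-style formulation that (3-11) appears to use, and the function name is illustrative.

```python
import numpy as np

def gust_velocity(x, H, Ude):
    """Discrete '1 - cosine' vertical gust, cf. Eq. (3-11).
    Assumes a FAR 25.341-style (H/106.68)^(1/6) scaling of the design
    gust velocity Ude to the gradient distance H (meters)."""
    x = np.clip(x, 0.0, 2.0*H)
    return 0.5*Ude*(H/106.68)**(1.0/6.0)*(1.0 - np.cos(np.pi*x/H))

H, Ude = 10.12, 15.24               # values used in the simulation (m, m/s)
xs = np.linspace(0.0, 2.0*H, 201)
U = gust_velocity(xs, H, Ude)
```

The profile rises from zero, peaks at the gradient distance x = H, and returns to zero at 2H; with these parameters the peak is roughly 10.3 m/s, consistent with the magnitude shown in Figure 3-3.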


Table 3-1: Parameters used in the controller simulation.

Sampling Time: 0.01 sec
Pitch Rate Sensor Noise: 1.7 deg/sec
Velocity Sensor Noise: 0.4 m/sec
Roll Rate Sensor Noise: 1.7 deg/sec
Yaw Rate Sensor Noise: 1.7 deg/sec
Control Thrust Saturation Limit: 200 N
Control Thrust Rate Limit: 200 N/sec
Elevator Saturation Limit: 30 deg
Elevator Rate Limit: 300 deg/sec
Aileron Saturation Limit: 30 deg
Aileron Rate Limit: 300 deg/sec
Rudder Saturation Limit: 30 deg
Rudder Rate Limit: 300 deg/sec

In order to develop a realistic stepping stone to an actual experimental demonstration of the proposed aircraft controller, the simulation parameters were selected based on detailed data analyses and specifications. The sensor noise values are based upon Cloud Cap Technology's Piccolo Autopilot and analysis of data logged during straight and level flight. These values are also corroborated with the specifications given for Cloud Cap Technology's Crista Inertial Measurement Unit (IMU). The thrust limit and estimated rate limit were measured via a static test using a fish scale. The control surface rate and position limits were determined via the geometry of the control surface linkages in conjunction with the detailed specifications sheet given with the Futaba S3010 standard ball bearing servo. The simulation parameters are summarized in Table 3-1.

The objectives for the longitudinal controller simulation are to track pitch rate and forward velocity commands. Figure 3-4 shows the simulation results of


Figure 3-4: Illustration of uncoupled velocity and pitch rate response during closed-loop longitudinal controller operation. (Figure annotation: tracking a zero pitch rate command through a large gust results in a large residual angle of attack.)

the closed-loop longitudinal system with control gains selected as follows (e.g., see (3-16) and (3-26))²:

\[ k_s = \mathrm{diag}\{0.11,\ 30\}, \quad \alpha = \mathrm{diag}\{0.21,\ 60\}, \quad \beta = \mathrm{diag}\{0.7,\ 0.1\}, \quad \gamma = 0.1\,I_{2\times 2} \]

where the notation I_{n\times n} denotes the n\times n identity matrix. Figure 3-4 shows the actual responses versus the reference commands for velocity and pitch rate. Note that the uncontrolled states remain bounded. For the lateral controller simulation, the objectives are to track roll rate and yaw rate commands. Figure 3-5 shows the simulation results of the closed-loop lateral system with control gains selected as

²The \beta used in the longitudinal controller simulation does not satisfy the sufficient condition given in (3-33); however, this condition is not necessary for stability; it is sufficient for the Lyapunov stability proof.


Figure 3-5: Illustration of uncoupled roll rate and yaw rate response during closed-loop lateral controller operation.

follows:

\[ k_s = \mathrm{diag}\{0.2,\ 0.6\}, \quad \alpha = \mathrm{diag}\{0.2,\ 3\}, \quad \beta = \mathrm{diag}\{1.0,\ 0.2\}, \quad \gamma = I_{2\times 2} \]

Figure 3-5 shows the actual responses versus the reference commands for roll rate and yaw rate. Note that the uncontrolled states remain bounded.

3.6 Conclusion

An aircraft controller is presented, which achieves asymptotic tracking control of a model reference system where the plant dynamics contain input uncertainty and a bounded non-LP disturbance. The developed controller exhibits the desirable characteristic of tracking the specified decoupled reference model. An example of such a decoupling is demonstrated by examining the aircraft response to tracking a roll rate command while simultaneously tracking a completely unrelated yaw rate command. This result represents the first ever application of a continuous


control strategy in a DI and MRAC framework to a nonlinear system with additive, non-LP disturbances, where the control input is multiplied by a non-square matrix containing parametric uncertainty. To achieve the result, a novel robust control technique is combined with a RISE control structure. A Lyapunov-based stability analysis is provided to verify the theoretical result, and simulation results demonstrate the robustness of the controller to sensor noise, exogenous perturbations, parametric uncertainty, and plant nonlinearities, while simultaneously exhibiting the capability to emulate a reference model designed offline. Future efforts will focus on eliminating the acceleration-dependent term from the control input and designing adaptive feedforward estimates of the uncertainties.


CHAPTER 4
DAISY-CHAINING FOR STATE ESTIMATION

4.1 Introduction

While a Global Positioning System (GPS) is the most widely used sensor modality for aircraft navigation, researchers have been motivated to investigate other navigational sensor modalities because of the desire to operate in GPS-denied environments. Due to advances in computer vision and control theory, monocular camera systems have received growing interest as an alternative/collaborative sensor to GPS systems. Cameras can act as navigational sensors by detecting and tracking feature points in an image. Current methods have a limited ability to relate feature points as they enter and leave the camera field of view.

This chapter details a vision-based position and orientation estimation method for aircraft navigation and control. This estimation method accounts for a limited camera field of view by releasing tracked features that are about to leave the field of view and tracking new features. At each time instant that new features are selected for tracking, the previous pose estimate is updated. The vision-based estimation scheme can provide input directly to the vehicle guidance system and autopilot. Simulations are performed wherein the vision-based pose estimation is integrated with a new, nonlinear flight model of an aircraft. Experimental verification of the pose estimation is performed using the modelled aircraft.

The efforts in this chapter (and our preliminary results [76, 77]) explore the use of a single camera as a sole sensor to estimate the position and orientation of an aircraft through use of the Euclidean homography. The method is designed for use with a fixed-wing aircraft, thus the method explicitly acquires new feature points when the current features risk leaving the image, and no target model is needed,


as compared to other methods [31]-[40]. The contribution of this chapter is the use of homographic relationships that are linked in a unique way through a novel daisy-chaining method.

4.2 Pose Reconstruction From Two Views

4.2.1 Euclidean Relationships

Consider a body-fixed coordinate frame F_c that defines the position and attitude of a camera with respect to a constant world frame F_w. The world frame could represent a departure point, destination, or some other point of interest. The rotation and translation of F_c with respect to F_w are defined as R(t) \in R^{3\times 3} and x(t) \in R^3, respectively. The camera rotation and translation from F_c(t_0) to F_c(t_1) between two sequential time instances, t_0 and t_1, are denoted by R_{01}(t_1) and x_{01}(t_1). During the camera motion, a collection of k (where k \ge 4) coplanar and non-colinear static feature points is assumed to be visible in a plane \pi. The assumption of four coplanar and non-colinear feature points is only required to simplify the subsequent analysis and is made without loss of generality. Image processing techniques can be used to select coplanar and non-colinear feature points within an image. However, if four coplanar target points are not available, then the subsequent development can also exploit a variety of linear solutions for eight or more non-coplanar points (e.g., the classic eight points algorithm [78, 79]), or nonlinear solutions for five or more points [80].

A feature point p_i(t) has coordinates \bar{m}_i(t) = [x_i(t), y_i(t), z_i(t)]^T \in R^3, \forall i \in \{1, \ldots, k\}, in F_c. Standard geometric relationships can be applied to the coordinate systems depicted in Figure 4-1 to develop the following relationships:

\[
\bar{m}_i(t_1) = R_{01}(t_1)\,\bar{m}_i(t_0) + x_{01}(t_1)
= \underbrace{\left(R_{01}(t_1) + \frac{x_{01}(t_1)}{d(t_0)}\,n^T(t_0)\right)}_{H(t_1)}\bar{m}_i(t_0)
\tag{4-1}
\]
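The identity in (4-1) can be verified numerically: for points lying on the plane (where n-transpose times m equals d), multiplying by H reproduces the rigid-motion result exactly. All numeric values in the sketch below are synthetic, illustrative choices.

```python
import numpy as np

# Plane in F_c(t0): unit normal n at distance d; four coplanar points (z = d here).
n = np.array([0.0, 0.0, 1.0])
d = 5.0
P0 = np.array([[0.4, 0.1, 5.0],
               [-0.7, 0.3, 5.0],
               [0.2, -0.6, 5.0],
               [-0.1, -0.2, 5.0]])    # rows: mbar_i(t0)

# Relative camera motion from F_c(t0) to F_c(t1): yaw by 0.2 rad plus a translation.
a = 0.2
R01 = np.array([[np.cos(a), -np.sin(a), 0.0],
                [np.sin(a), np.cos(a), 0.0],
                [0.0, 0.0, 1.0]])
x01 = np.array([0.3, -0.1, 0.4])

H = R01 + np.outer(x01, n)/d          # Euclidean homography, cf. Eq. (4-1)

P1 = (R01 @ P0.T).T + x01             # rigid-motion coordinates mbar_i(t1)
HP0 = (H @ P0.T).T                    # homography-mapped coordinates
```

Because n-transpose times each point equals d, the extra rank-one term contributes exactly the translation, so P1 and HP0 agree to machine precision; for points off the plane the two would differ, which is why the development restricts attention to coplanar features.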


Figure 4-1: Euclidean relationships between two camera poses.

where H(t_1) is the Euclidean homography matrix, n(t_0) is the constant unit vector normal to the plane \pi from F_c(t_0), and d(t_0) is the constant distance between the plane \pi and F_c(t_0) along n(t_0). After normalizing the Euclidean coordinates as

\[ m_i(t) = \frac{\bar{m}_i(t)}{z_i(t)} \tag{4-2} \]

the relationship in (4-1) can be rewritten as

\[ m_i(t_1) = \underbrace{\frac{z_i(t_0)}{z_i(t_1)}}_{\alpha_i} H(t_1)\, m_i(t_0) \tag{4-3} \]

where \alpha_i \in R, \forall i \in \{1, \ldots, k\}, is a scaling factor.

4.2.2 Projective Relationships

Using standard projective geometry, the Euclidean coordinate \bar{m}_i(t) can be expressed in image-space pixel coordinates as p_i(t) = [u_i(t), v_i(t), 1]^T. The projected pixel coordinates are related to the normalized Euclidean coordinates m_i(t) by the pin-hole camera model as [81]

\[ p_i = A\, m_i \tag{4-4} \]
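Combining (4-4) with the homography relationship gives a pixel-space (projective) homography between the two views, developed as Eq. (4-6) in the next subsection; given four or more correspondences it can be estimated up to scale with a standard direct linear transform (DLT). The sketch below is one common way to do this, with synthetic, illustrative homography and pixel values.

```python
import numpy as np

def estimate_homography(p0, p1):
    """Direct linear transform: solve p1 x (G p0) = 0 for G up to scale.
    p0, p1: (N, 2) arrays of corresponding pixel coordinates, N >= 4."""
    rows = []
    for (u0, v0), (u1, v1) in zip(p0, p1):
        rows.append([u0, v0, 1, 0, 0, 0, -u1*u0, -u1*v0, -u1])
        rows.append([0, 0, 0, u0, v0, 1, -v1*u0, -v1*v0, -v1])
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    G = Vt[-1].reshape(3, 3)          # null-space vector = vec(G) up to scale
    return G / G[2, 2]                # fix the scale with the (3,3) entry

# Synthetic projective homography and feature points (illustrative numbers).
G_true = np.array([[1.02, 0.01, 4.0],
                   [-0.02, 0.98, -3.0],
                   [1e-4, -2e-4, 1.0]])
pts0 = np.array([[100.0, 120.0], [400.0, 80.0], [320.0, 360.0],
                 [60.0, 300.0], [220.0, 210.0]])
h = np.column_stack([pts0, np.ones(len(pts0))]) @ G_true.T
pts1 = h[:, :2] / h[:, 2:]            # perspective division to pixel coordinates
G_est = estimate_homography(pts0, pts1)
```

With noise-free correspondences the 10x9 system has an exact one-dimensional null space and the homography is recovered essentially to machine precision; with real image data, more points and a least-squares fit absorb the tracking noise.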


where A is an invertible, upper triangular camera calibration matrix defined as

\[
A \triangleq
\begin{bmatrix}
\lambda_1 & -\lambda_1\cot\varphi & u_0 \\
0 & \lambda_2/\sin\varphi & v_0 \\
0 & 0 & 1
\end{bmatrix}
\tag{4-5}
\]

In (4-5), u_0 and v_0 \in R denote the pixel coordinates of the principal point (the image center as defined by the intersection of the optical axis with the image plane), \lambda_1, \lambda_2 \in R represent scaling factors of the pixel dimensions, and \varphi \in R is the skew angle between the camera axes.

By using (4-4), the Euclidean relationship in (4-3) can be expressed as

\[ p_i(t_1) = \alpha_i \underbrace{\left(A H(t_1) A^{-1}\right)}_{G(t_1)} p_i(t_0) \tag{4-6} \]

Sets of linear equations can be developed from (4-6) to determine the projective and Euclidean homography matrices G(t) and H(t) up to a scalar multiple. Given images of four or more feature points taken at F_c(t_0) and F_c(t_1), various techniques [82, 83] can be used to decompose the Euclidean homography to obtain \alpha_i, n(t_0), x_{01}(t_1)/d(t_0), and R_{01}(t_1). The distance d(t_0) must be separately measured (e.g., through an altimeter or radar range finder) or estimated using a priori knowledge of the relative feature point locations, stereoscopic cameras, or as an estimator signal in a feedback control.

4.2.3 Chained Pose Reconstruction for Aerial Vehicles

Consider an aerial vehicle equipped with a GPS and a camera capable of viewing a landscape. A technique is developed in this section to estimate the position and attitude using camera data when the GPS signal is denied. A camera has a limited field of view, and motion of a vehicle can cause observed feature points to leave the image. The method presented here chains together pose estimations from sequential sets of tracked points. This approach allows the system to halt tracking a set of image features if it is likely to leave the image and


begin tracking a new set of features while maintaining the pose estimate. Thus, the estimation can continue indefinitely and is not limited by the camera's field of view.

The subsequent development assumes that the aerial vehicle begins operating in a GPS-denied environment at time $t_0$, where the translation and rotation (i.e., $x_0(t_0)$ and $R_0(t_0)$ in Figure 4-1) between $\mathcal{F}(t_0)$ and $\mathcal{F}^*$ are known. The rotation between $\mathcal{F}(t_0)$ and $\mathcal{F}^*$ can be determined through the bearing information of the GPS along with other sensors such as a gyroscope and/or compass. Without loss of generality, the GPS unit is assumed to be fixed to the origin of the aerial vehicle's coordinate frame, and the constant position and attitude of the camera frame are known with respect to the position and attitude of the aerial vehicle coordinate frame. The subsequent development further assumes that the GPS is capable of delivering altitude, perhaps in conjunction with an altimeter, so that the altitude $h(t_0)$ is known.

As illustrated in Figure 4-1, the initial set of tracked coplanar and non-colinear feature points is contained in the plane $\pi_a$. These feature points have Euclidean coordinates $m_{ai}(t_0) \in \mathbb{R}^3$, $\forall i \in \{1, \dots, N\}$, in $\mathcal{F}(t_0)$. The plane $\pi_a$ is perpendicular to the unit vector $n_a(t_0)$ in the camera frame $\mathcal{F}(t_0)$ and lies at a distance $d_a(t_0)$ from the camera frame origin. At time $t_1$, the vehicle has some rotation $R_{01}(t_1)$ and translation $x_{01}(t_1)$ that can be determined from the images by decomposing the relationships given in (4-6). For notational simplicity, the subscript $i$ is omitted in the subsequent development.

As described earlier, $R_{01}(t_1)$ and $\frac{x_{01}(t_1)}{d_a(t_0)}$ can be solved from two corresponding images of the feature points $p_a(t_0)$ and $p_a(t_1)$. A measurement or estimate for $d_a(t_0)$ is required to recover $x_{01}(t_1)$. This estimation is possible with distance sensors or with a priori knowledge of the geometric distances between the points in $\pi_a$. However, with an additional assumption, it is possible to estimate $d_a(t_0)$ geometrically using altitude information from the last GPS reading and/or an


Figure 4-2: Illustration of pose estimation chaining.

altimeter. From the illustration in Figure 4-3, if $h(t_0)$ is the height above $\pi_a$ (e.g., the slope of the ground is constant between the feature points and the projection of the plane's location to the ground), then the distance $d_a(t_0)$ can be determined as

$d_a(t_0) = h(t_0)\, n_{a3}(t_0),$   (4-7)

where $n_{a3}(t_0)$, the vertical component of the unit normal $n_a(t_0)$, is known from the homography decomposition.

Once $\frac{x_{01}(t_1)}{d_a(t_0)}$ and $R_{01}(t_1)$ have been determined, the rotation $R_1(t_1)$ and translation $x_1(t_1)$ can be determined with respect to $\mathcal{F}^*$ as

$R_1 = R_{01} R_0, \qquad x_1 = R_{01} x_0 + x_{01}.$

As illustrated in Figure 4-2, a new collection of feature points $p_b(t)$ can be obtained that corresponds to a collection of points on a planar patch denoted by $\pi_b$. At time $t_2$, the sets of points $p_b(t_1)$ and $p_b(t_2)$ can be used to determine $R_{12}(t_2)$ and $\frac{x_{12}(t_2)}{d_b(t_1)}$, which provides the rotation and scaled translation of $\mathcal{F}(t_2)$ with respect to $\mathcal{F}(t_1)$. If


Figure 4-3: Depth estimation from altitude.

$\pi_a$ and $\pi_b$ are the same plane, then $d_b(t_1)$ can be determined as

$d_b(t_1) = d_a(t_1) = d_a(t_0) + n_a^T(t_0)\, x_{01}(t_1).$   (4-8)

When $\pi_a$ and $\pi_b$ are the same plane, $x_{12}(t_2)$ can be correctly scaled, and $R_2(t_2)$ and $x_2(t_2)$ can be computed in a similar manner as described for $R_1(t_1)$ and $x_1(t_1)$. Estimates can be propagated by chaining them together at each time instance without further use of GPS.

In the general case, $\pi_a$ and $\pi_b$ are not coplanar and (4-8) cannot be used to determine $d_b(t_1)$. If $\pi_a$ and $\pi_b$ are both visible for two or more frames, it is still possible to calculate $d_b(t)$ through geometric means. Let $t_{-1}$ denote some time before the daisy-chain operation is performed, when both $\pi_a$ and $\pi_b$ are visible in the image. At time $t_1$, an additional set of homography equations for the points $p_a$ and $p_b$ at times $t_{-1}$ and $t_1$ can be solved for


$\bar{m}_a(t_1) = \left( \bar{R} + \frac{\bar{x}(t_1)}{d_a(t_{-1})}\, n_a^T(t_{-1}) \right) \bar{m}_a(t_{-1}),$   (4-9)

$\bar{m}_b(t_1) = \left( \bar{R} + \frac{\bar{x}(t_1)}{d_b(t_{-1})}\, n_b^T(t_{-1}) \right) \bar{m}_b(t_{-1}),$   (4-10)

where $\bar{R}(t_1)$ and $\bar{x}(t_1)$ denote the rotation and translation from $\mathcal{F}(t_{-1})$ to $\mathcal{F}(t_1)$. Note that $\bar{R}(t_1)$ and $\bar{x}(t_1)$ have the same values in equations (4-9) and (4-10), but the distance and normal to the plane are different for the two sets of points. The distance $d_a(t_{-1})$ is known from using (4-8). Define

$x_a(t_1) \triangleq \frac{\bar{x}(t_1)}{d_a(t_{-1})} \quad \text{and} \quad x_b(t_1) \triangleq \frac{\bar{x}(t_1)}{d_b(t_{-1})}.$

The translation $\bar{x}(t_1)$ is solved as $\bar{x} = d_a(t_{-1})\, x_a(t_1)$, and $d_b(t_{-1})$ is then determined as

$d_b(t_{-1}) = \frac{\left\| \bar{x} \right\|}{\left\| x_b(t_1) \right\|}.$

$d_b(t_1)$ can then be found by using (4-8) with $d_b(t_{-1})$ in place of $d_a(t_0)$. Additional sensors, such as an altimeter, can provide an additional estimate of the change in altitude. These estimates can be used in conjunction with (4-8) to update the depth estimates.

Image-Based Rate Gyro

Additional uses are found for the homography decomposition in feedback control. As an example, Poisson's kinematic differential equation for the direction cosine matrix $R(t)$ states

$\dot{R} = -\begin{bmatrix} 0 & -r & q \\ r & 0 & -p \\ -q & p & 0 \end{bmatrix} R,$

which can also be expressed as


$\begin{bmatrix} 0 & -r & q \\ r & 0 & -p \\ -q & p & 0 \end{bmatrix} = -\dot{R} R^T.$

Hence, the aircraft body rates $p(t)$, $q(t)$, and $r(t)$ can be estimated via $R(t)$ from the homography decomposition along with the time derivative of $R(t)$.

4.2.4 Simulation Results

In the simulations, five patches of 4 feature points are manually placed along a 500 meter ground track, which the vehicle flies over. For simplicity, all planar patches lie in the same plane. The task is to perform the state estimation during a maneuver. The commanded maneuver is to simultaneously perform a 10 meter lateral shift to the right and a 10 meter longitudinal increase in altitude. This particular maneuver results in the vehicle simultaneously pitching, rolling, and yawing, while translating. For simulation purposes, the camera is mounted underneath the fuselage looking downwards. The camera model for this exercise is intended to be representative of a typical 640x480 lines of resolution CCD camera equipped with a 10 mm lens. To more accurately capture true system performance, pixel coordinates were rounded to the nearest integer to model errors due to camera pixelation effects (i.e., quantization noise); furthermore, a 5% error was added to the estimated vehicle altitude to test robustness.

The first simulation was designed to test the accuracy of the vision-based estimation. Vision was not used in the feedback in this maneuver, and the estimated pose is compared to the true pose. The results of this preliminary analysis are given in Figures 4-4 and 4-5. The effects of noise are visible but the estimated pose is accurate.

The second simulation was intended to examine the effects of using the vision-based estimate as a sensor in closed-loop control. This simulation involved


Figure 4-4: Actual translation versus estimated translation where the estimation is not part of the closed-loop control.

Figure 4-5: Actual attitude versus estimated attitude where the estimation is not part of the closed-loop control.


replacing the perfect position and attitude measurements, used in the guidance system and autopilot, with position and attitude estimates determined from the vision-based method. The resulting control architecture and sensor suite for this autonomous air vehicle is given in Figure 4-6. The noise content of the estimated position and attitude required filtering prior to being used by the autopilot, to prevent the high frequency noise from being passed to the aircraft actuators. As expected, the noise occurs at 30 Hz and corresponds to the frame rate of the camera. First-order, low-pass filters (cutoff frequency as low as 4 rad/sec) were used to filter the noise. The noise also prevented effective differentiation of the position and attitude and necessitated the use of rate gyros for yaw and roll damping, as depicted in Figure 4-6. The air data system is also included, as shown in Figure 4-6, for the initial altitude measurement, since it is more accurate for altitude than current GPS solutions. The results of the camera-in-the-loop system performing the same guidance-commanded autonomous maneuver are given in Figures 4-7 and 4-8.

The simulation results indicate that a camera supplemented with minimal sensors such as rate gyros and barometric altitude can be used for completely autonomous flight of a fixed-wing vehicle; however, some residual oscillation effects due to noise are present in the vehicle attitude response. A majority of the noise can directly be attributed to camera pixelation effects and the corresponding phase lag introduced by the first-order filtering.

4.2.5 Experimental Results

Based on the results of the simulation, a flight test experiment was conducted to establish the feasibility of the proposed vision-based state estimation method. Artificial features were placed along a stretch of the runway. A radio controlled aircraft with an onboard camera was flown over the runway. The video was overlaid with GPS data from a Garmin GPS 35 receiver. An example of a single frame of


Figure 4-6: Autonomous control architecture.

Figure 4-7: Actual translation versus estimated translation where the estimated value is used within the closed-loop control.


Figure 4-8: Actual attitude versus estimated attitude where the estimated value is used within the closed-loop control.

this video is given in Figure 4-10. A second GPS unit (manufactured by Eagle Tree Systems, LLC) was also onboard to test inter-GPS accuracy. The use of two GPS units provides comparison for the vision-based method, which is intended to compute GPS-like information. Video data was captured using a DV tape recorder and analyzed offline. A basic descriptive pictorial of the equipment that was used (minus the Eagle Tree system) and the corresponding signal flow is given in Figure 4-9. See the Appendix for a more detailed description of the ground and airborne equipment. Due to poor image quality, including focus, motion blur, and interlacing of the DV video, it became necessary to extract features by hand from individual frames. Features were extracted every sixth frame, resulting in a 5 Hz input signal.

Results of the experiment are given in Figure 4-11. In the legend for Figure 4-11, GPS2 represents the overlaid GPS data, and GPS1 represents the onboard data logger GPS values. A * in the plot indicates a time when daisy-chaining was performed and pose reconstruction is performed using a new set of feature points.


Figure 4-9: Overview of the flight test system and component interconnections.

Figure 4-10: Single video frame with GPS overlay illustrating landmarks placed along the inside edge of the runway.


Figure 4-11: Experimental flight test results. Estimated position compared to two GPS signals. A * represents where a daisy-chain handoff occurs.

The results from this test appear to be very promising. Significant mismatch exists between the two GPS measurements, and the vision-based estimation remains proportionate to the two GPS measurements. Furthermore, the estimates agree closely with GPS2 for downrange and crossrange translation, and with GPS1 for altitude translation. There is no discernible discontinuity or increased error at the daisy-chain handoff times. Note that the resolution of the vision-based estimation (5 Hz) is also higher than that of both GPS units (1 Hz). The pose estimation code can be executed in real time (30 Hz) on a typical laptop.

The accuracy of the GPS data, as recorded, is arguably dubious; therefore, an alternate flight test was designed that was intended to enable the vehicle position to be determined with greater accuracy than what could be achieved with the inexpensive GPS units used in the first round of testing. In addition to the issues with the GPS output being used as "truth data," it was also of interest to investigate whether using a wide-angle field of view lens on a more forward


Figure 4-12: Technique for achieving accurate landmark relative placement distances.

pointing camera could improve the results. The fundamental assumption of this additional testing was that if the locations of the ground targets were known precisely, then the position of the vehicle could be ascertained more accurately through geometric reconstruction than what could be achieved with low cost GPS units.

Similar to before, plates/landmarks were placed along 274.32 meters (900 feet) of the runway. Unlike before, they were placed at 15.24 meter (50 foot) intervals, where the accuracy of each plate location was known to within a few inches along the entirety of the 274.32 meter (900 foot) span. The two rows of plates, left and right of the runway centerline, were 18.59 meters (61 feet) apart. The plates were painted red so that a simplistic feature tracking algorithm could be used to locate and track their location in the image plane. The equipment that was used for surveying the landmarks is shown in Figure 4-12.

By using a wider field of view lens, the resulting video image exhibited severe lens distortion effects that required correcting in order for the homography to


Figure 4-13: Single video frame from the second flight test experiment illustrating the effect of the more forward looking, wider field of view camera.

operate correctly. The distortion can be seen in Figure 4-13, where the four red landmarks in the lower portion of the frame appear to be located on a spherical surface as opposed to a plane; again, a fundamental requirement in the development of the homography method. This same image with the distortion removed is given in Figure 4-14. The interlacing effect, prevalent in Figure 4-13, has also been removed. As mentioned, the features were tracked and their trajectories in the image plane recorded using a simplistic ad-hoc scheme in Matlab. The pseudocode for tracking the four features of a single patch is given as:

Manually determine the location to initialize a 40x40 pixel window over each of the four landmarks in the first image
FOR all images containing the same patch of four landmarks
    Read the next image in the sequence and assign it to an RGB color space array


Figure 4-14: Single video frame from the second flight test experiment with the lens distortion removed.

    Create a matrix equal to the element-by-element multiplication of the red space matrix with itself, minus one tenth the element-by-element multiplication of the green space matrix with itself
    FOR each of the four windows
        Within the current window, threshold all values of the above-defined matrix that are above 96% of the maximum value within the current window
        The center of mass of these thresholded values is chosen to be the pixel location representing the center of the landmark as it projects onto the image plane
        Update the 40x40 pixel window to be centered over this pixel location
    ENDFOR
    Record the four pixel locations
ENDFOR

A 3-dimensional plot of the matrix defined in the above pseudocode is given in Figure 4-15. Note that the large spikes correspond to the location of the red plates.
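The pseudocode above can be sketched in Python (using NumPy in place of the original Matlab scheme). The frame, window placement, and plate location below are synthetic stand-ins for the flight video, and the intensity-weighted centroid is one reasonable reading of the "center of mass" step:

```python
import numpy as np

def track_landmark(rgb, center, win=40, frac=0.96):
    """One step of the ad-hoc red-plate tracker: score the window with
    R.*R - 0.1*G.*G, threshold at 96% of the window maximum, and return
    the center of mass of the surviving pixels."""
    r = rgb[:, :, 0].astype(float)
    g = rgb[:, :, 1].astype(float)
    score = r * r - 0.1 * g * g
    half = win // 2
    r0, c0 = int(center[0]) - half, int(center[1]) - half
    window = score[r0:r0 + win, c0:c0 + win]
    mask = window >= frac * window.max()
    rows, cols = np.nonzero(mask)
    weights = window[rows, cols]
    # Weighted centroid of the thresholded spike -> sub-pixel estimate.
    row = r0 + (rows * weights).sum() / weights.sum()
    col = c0 + (cols * weights).sum() / weights.sum()
    return row, col

# Synthetic 240x320 frame: dull green background with a red plate at (100, 150).
frame = np.zeros((240, 320, 3), dtype=np.uint8)
frame[:, :, 1] = 60
frame[97:104, 147:154, 0] = 255
row, col = track_landmark(frame, center=(102, 152))
print(round(row, 1), round(col, 1))
```

The per-frame loop and window update from the pseudocode simply wrap this single-step function; the 96% threshold and 40-pixel window follow the values quoted in the text.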


Figure 4-15: Example of a 3D color contour plot generated from the matrix designed as a nonlinear combination of the red and green color space matrices.

Values that are above 96% of the maximum value of each spike are thresholded, and the centers of mass of these thresholded values are used to determine the predicted centers of the landmarks in the image frame. By following this procedure, it was predicted that sub-pixel accuracy could be achieved over simply selecting the pixel location of the maximum value, or tip, of the spike.

An example of the output of the above tracking algorithm is also given in Figure 4-16. The four trajectories in this particular case represent the trace of the landmarks from the first patch as it enters and exits the field of view (from top to bottom). As a point of interest, note that the center of the image plane does not correspond to the location of the optical axis. This apparent oddity is in keeping with the camera calibration results and is most likely congruous with a low-end imager and lens.

With the trajectories of the four landmarks of each patch recorded, vehicle localization can then be performed offline for analysis purposes. It should be pointed


Figure 4-16: Image plane trajectories made by the landmarks from patch 1 entering and exiting the field of view.

out that the 40x40 pixel window is designed such that it is large enough to contain the landmark in the current frame and the subsequent frame. Smaller windows tended to not encapsulate both the current and subsequent frame landmarks due to vehicle motion between frames.

For a given set of three feature points, and knowledge of where those landmarks are in the earth frame, geometric reconstruction can be used to back out the location of the aircraft. This is illustrated in Figure 4-17, where in this particular illustration, a tetrahedron is used to determine the $x_0$, $y_0$, and $z_0$ location of the camera in the earth frame. $l_1$, $l_2$, and $l_3$ are known a priori relative distances between the 3 landmarks. Vertex angles $\gamma_1$, $\gamma_2$, and $\gamma_3$ at the apex of the tetrahedron are determined by the focal length of the camera lens in pixel coordinates in conjunction with the pixel locations of where the landmarks corresponding to line segments $l_1$, $l_2$, and $l_{12}$ occur in the image plane. The value for the skew, $\kappa$, is used to account for the fact that the four landmarks most likely form a parallelogram,


since it was somewhat difficult to create a perfect rectangle in the field. In fact, $l_{12}$ depends upon the skew, $\kappa$, and is calculated according to

$l_{12} = \sqrt{l_1^2 \sec^2(\kappa) + l_2^2 + 2\, l_1 l_2 \sec(\kappa)\sin(\kappa)}.$

Presuming $a$, $b$, and $c$ can be computed, then $x_0$, $y_0$, and $z_0$ of the camera can be determined via the following series of calculations:

$\beta_1 = \cos^{-1}\left( \frac{b^2 + l_{12}^2 - c^2}{2\, b\, l_{12}} \right), \qquad \beta_2 = \cos^{-1}\left( \frac{b^2 + l_1^2 - a^2}{2\, b\, l_1} \right),$

and $h_0$, shown in Figure 4-17, is calculated from the following:

$h_0 = \sqrt{b^2 \sec^2(\beta_2) + b^2 - 2\, b^2 \cos(\beta_1)\left| \sec(\beta_2) \right|}.$

If $h_0$ is considered as a vector, the $x$ and $y$ direction cosines are given by, respectively,

$\cos\theta_1 = \frac{b^2 + h_0^2 - c^2}{2\, b\, h_0}, \qquad \cos\theta_2 = \frac{b^2 \tan^2(\beta_2) + h_0^2 - c^2}{2\, b^2 \tan(\beta_2)}.$

Finally, the components of the position vector from the origin of the earth frame to the origin of the camera frame, in earth frame coordinates, are simply

$x_0 = h_0 \cos(\theta_1), \qquad y_0 = h_0 \cos(\theta_2), \qquad z_0 = h_0 \sqrt{\sin^2(\theta_1) - \cos^2(\theta_2)}.$

The difficulty of this method for pose estimation is in determining $a$, $b$, and $c$, and more specifically, the edge lengths of the tetrahedron in Figure 4-17. The reason is that for given values of $l_1$, $l_2$, and $l_3$ and $\gamma_1$, $\gamma_2$, and $\gamma_3$, there can be as few as zero physically possible solutions and as many as three distinctly different


Figure 4-17: Basic concept of geometric position reconstruction from known landmark locations.

physically possible tetrahedrons. The possible values of $a$, $b$, and $c$ are determined by simultaneously solving the three law-of-cosines rules for the three triangular facets making up the upright sides of the tetrahedron. A fourth landmark is needed to resolve the correct $a$, $b$, and $c$, via constructing multiple tetrahedrons, clocking around the four landmarks as shown in Figure 4-18, and selecting the $a$, $b$, and $c$ that are common to all four tetrahedrons. Using this technique, each edge length is calculated three times. A merit of how reliable or accurate the position estimate is, and hence, how well the proper pixels representing the landmark centers were selected, is in how close each of the three edge length calculations of $a$, $b$, and $c$ are to each other. A quick convincing argument would be that in the event $a$, $b$, and $c$ are exactly the same in all 3 calculations, one could be very certain that they would consistently project and reproject back and forth between the camera and a surface containing 4 points with the same edge lengths $l_1$, $l_2$, $l_3$, and $l_4$.
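The law-of-cosines relation underlying each upright facet of the tetrahedron can be checked numerically. The sketch below places a camera apex and two landmarks at illustrative positions (not surveyed plate locations), computes the vertex angle between the two rays, and confirms that the base length recovered from $l^2 = a^2 + b^2 - 2ab\cos\gamma$ matches the true landmark separation:

```python
import math

def dot(u, v): return sum(x * y for x, y in zip(u, v))
def norm(u): return math.sqrt(dot(u, u))
def sub(u, v): return [x - y for x, y in zip(u, v)]

# Camera apex and two landmarks in an earth-fixed frame (illustrative).
apex = [10.0, 5.0, 30.0]
p1 = [0.0, 0.0, 0.0]
p2 = [15.24, 0.0, 0.0]        # 50 ft plate spacing, as in the experiment

ray1, ray2 = sub(p1, apex), sub(p2, apex)
a, b = norm(ray1), norm(ray2)                  # tetrahedron edge lengths
gamma = math.acos(dot(ray1, ray2) / (a * b))   # vertex angle at the apex

# Law of cosines for this facet: base^2 = a^2 + b^2 - 2 a b cos(gamma).
base = math.sqrt(a * a + b * b - 2 * a * b * math.cos(gamma))
assert abs(base - norm(sub(p2, p1))) < 1e-9
print(round(base, 2))
```

In the flight experiment the angles $\gamma_i$ come from pixel coordinates and the lens focal length rather than known geometry, and the edge lengths $a$, $b$, $c$ are the unknowns solved from three such facet equations.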


Figure 4-18: Illustration of how four tetrahedrons are used to estimate the length of each edge, $a$, $b$, and $c$, three times.

If this surface can be shown to be a plane, then intuitively, this plane and four points would, with high confidence, be the earth plane, and the points would be the centers of the landmarks. Results of this flight experiment are given in Figure 4-19. Because time was not recorded for this experiment, the ground track trajectory is given as opposed to the downrange and crossrange time responses given in Figure 4-11. The color variation in the geometric reconstruction represents the individual patches used along the length of the runway. The origin of the earth frame is placed at the first plate on the lower right that appears when the vision estimation begins. Also noteworthy, because this experiment used a somewhat forward looking camera, the camera position starts in the negative x-direction because the first plate is ahead of the camera; conversely, the camera is behind the first plate.

4.3 Conclusions

The efforts in this chapter integrated new vision-based pose estimation methods with the flight controls of an aerial vehicle in a guidance task. This method is based on epipolar geometry, with a novel daisy-chaining approach


Figure 4-19: Second experimental flight test results. Estimated position compared to a more accurate position from the geometric reconstruction technique.

allowing image features to enter and leave the field of view while maintaining the pose estimate. Furthermore, no known target is required.

The vision-based pose estimation method was verified experimentally with a radio controlled Osprey aircraft. Because the accuracy of the onboard GPS measurements was considered questionable, an alternate effort involving geometric reconstruction for position determination was undertaken to better represent truth data for validating the daisy-chaining pose estimation technique. To facilitate the experiments, a nonlinear aircraft flight model for the Osprey was developed to allow extensive simulation testing. Simulations include testing the pose estimation method in closed-loop control of the aircraft through an intentionally rigorous maneuver to evaluate the robustness of the technique.

The ultimate goal of this research is closed-loop control using camera data in place of GPS. To this end, future research will target online feature extraction, tracking, and pose estimation. Furthermore, a future task would be to investigate the use of nonlinear control techniques, such as discussed in the next chapter, to eliminate or reduce the amount of a priori known information required in the estimation strategy. Additional sensor data such as IMUs and intermittent GPS could also be fused with the vision-based pose estimation to enhance performance.
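The daisy-chained pose propagation at the heart of this chapter can be sketched as a product of homogeneous transforms. The convention below (each handoff yields the pose of the new camera frame relative to the previous one, with the scaled translation from the decomposition multiplied by the estimated plane distance) is one consistent choice, and the numeric poses are illustrative:

```python
import numpy as np

def se3(R, x):
    """Pack a rotation and translation into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = x
    return T

def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Incremental poses, as produced by successive homography decompositions.
# Each scaled translation x/d is multiplied by the estimated plane
# distance d before chaining (Eqs. 4-7 and 4-8 supply d).
d = [10.0, 9.5]
scaled_x = [np.array([0.05, 0.0, 0.02]), np.array([0.04, 0.01, 0.0])]
increments = [se3(rot_z(0.1), d[0] * scaled_x[0]),
              se3(rot_z(-0.05), d[1] * scaled_x[1])]

# Chain without further GPS: pose after k handoffs is the running product.
T = np.eye(4)
for T_inc in increments:
    T = T @ T_inc

position = T[:3, 3]
print(np.round(position, 3))
```

An altimeter update between handoffs would simply refresh the distance $d$ used for the corresponding increment, which is exactly the correction suggested for reducing accumulated position error.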


CHAPTER 5
LYAPUNOV-BASED STATE ESTIMATION

5.1 Introduction

Many applications require the interpretation of the Euclidean coordinates of features of a 3-dimensional (3D) object through 2D images. In this chapter, the relative range and the Euclidean coordinates of a camera undergoing general affine motion are determined for pin-hole camera systems via a nonlinear observer. The nonlinear observer asymptotically determines the range information provided that the motion parameters are known. The observer is developed through a Lyapunov-based design and stability analysis, and simulation results are provided that illustrate the performance of the state estimator. The contributions of this chapter are that the developed observer can be applied to both affine and non-affine systems, is continuous, and yields an asymptotic result.

5.2 Affine Euclidean Motion

For the development in this dissertation, the scenario of a moving camera viewing a static scene to recover the structure of the scene and/or the motion of the camera (cf. [53, 84, 85], and the references therein) is considered. The affine motion of the camera dynamics can be expressed as


$\begin{bmatrix} \dot{x}_1 \\ \dot{x}_2 \\ \dot{x}_3 \end{bmatrix} = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} + \begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix},$   (5-1)

where $x(t) = [x_1(t), x_2(t), x_3(t)]^T \in \mathbb{R}^3$ denotes the unmeasurable Euclidean coordinates of the moving camera along the $x$, $y$, and $z$ axes of a camera-fixed reference frame, respectively, where the $z$ axis is colinear with the optical axis of the camera. In (5-1), the parameters $a_{ij}(t) \in \mathbb{R}$, $i, j = 1, 2, 3$, of the matrix $A(t) \in \mathbb{R}^{3 \times 3}$ and $b(t) = [b_1, b_2, b_3]^T \in \mathbb{R}^3$ denote the motion parameters. The affine motion dynamics introduced in (5-1) are expressed in a general form that describes an object motion consisting of a rotation, translation, and linear deformation [86].

Assumption 5.1: The motion parameters in $A(t)$ and $b(t)$ introduced in (5-1) are assumed to be known, bounded functions of time that are second-order differentiable (cf. [51, 55, 87]).

To illustrate how the affine dynamics in (5-1) represent a moving camera viewing a stationary object (see Figure 5-1), consider a feature point attached to a stationary object as in Figure 5-1. In Figure 5-1, $\mathcal{F}_c$ denotes a body-fixed coordinate frame attached to the camera, $\mathcal{I}$ denotes an inertial coordinate frame, and $x(t)$ (expressed in $\mathcal{F}_c$) denotes the coordinates of the target feature point. The linear and angular velocities of the target (i.e., $v(t)$ and $\omega(t)$) with respect to the camera (expressed in $\mathcal{F}_c$) can be written as

$v = -R\, v_c, \qquad \omega = -R\, \omega_c,$   (5-2)

where $R(t) \in \mathbb{R}^{3 \times 3}$ denotes the corresponding rotation between $\mathcal{F}_c$ and $\mathcal{I}$, and $v_c(t)$ and $\omega_c(t)$ denote the linear and angular velocity of the camera, respectively. Based on (5-2), the time derivative of $x(t)$ can be expressed as

$\dot{x} = \left[ \omega \right]_{\times} x + v = A x + b.$   (5-3)

Potential applications for this scenario under the restriction of Assumption 5.1 include examples where the camera is moving with a known/measurable linear and angular velocity and the goal is to estimate the Euclidean position of the moving camera in time, such as inertial navigation in GPS-denied environments and simultaneous localization and mapping (SLAM).
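The affine form (5-3) can be exercised numerically: the sketch below integrates $\dot{x} = [\omega]_{\times} x + v$ with a small Euler step and checks a property the dynamics must satisfy, namely that the skew-symmetric (rotation) term alone preserves the range $\|x\|$, while the translation term changes it. The rates and initial conditions are illustrative.

```python
import numpy as np

def skew(w):
    """Matrix form of the cross product: skew(w) @ x == cross(w, x)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def simulate(x0, w, v, dt=1e-3, steps=10_000):
    """Euler integration of the affine camera dynamics (5-3):
    x_dot = [w]x x + v, with constant motion parameters."""
    x = np.array(x0, dtype=float)
    S = skew(w)
    for _ in range(steps):
        x = x + dt * (S @ x + v)
    return x

x0 = np.array([1.0, -2.0, 5.0])
w = np.array([0.0, 0.0, 0.5])      # rad/s about the optical axis

# Pure rotation: [w]x is skew-symmetric, so the range ||x|| is preserved.
x_rot = simulate(x0, w, v=np.zeros(3))
assert abs(np.linalg.norm(x_rot) - np.linalg.norm(x0)) < 0.05

# Adding a translation b = v changes the range (and the depth x3).
x_tr = simulate(x0, w, v=np.array([0.0, 0.0, -0.2]))
print(round(np.linalg.norm(x_rot), 3), round(x_tr[2], 3))
```

The residual drift in the first check is purely the Euler discretization error, which shrinks with the step size.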


Figure 5-1: Moving camera, stationary object scenario.

Figure 5-2: Euclidean point projected onto the image plane of a pin-hole camera.


5.3 Object Projection

The projection of the coordinates $x(t)$ onto an image plane with its focus at the origin (see Figure 5-2) can be expressed as

$y(t) \triangleq \frac{f}{x_3} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix},$   (5-4)

where $f \in \mathbb{R}$ denotes the constant known distance between the focal point and the image plane.

The expression for the time rate of change of the pixel location is derived by taking the time derivative of (5-4) and utilizing (5-1) as

$\dot{y}(t) = \Omega_1 + \phi,$   (5-5)

where $\Omega_1(t) \in \mathbb{R}^2$ denotes a vector of measurable and known signals defined by

$\Omega_1(t) \triangleq \begin{bmatrix} a_{11} y_1 + a_{12} y_2 + a_{13} f - \frac{y_1}{f}\left( a_{31} y_1 + a_{32} y_2 + a_{33} f \right) \\ a_{21} y_1 + a_{22} y_2 + a_{23} f - \frac{y_2}{f}\left( a_{31} y_1 + a_{32} y_2 + a_{33} f \right) \end{bmatrix},$   (5-6)

and the unmeasurable signal $\phi(t) \triangleq [\phi_1(t), \phi_2(t)]^T \in \mathbb{R}^2$ is defined as

$\phi(t) \triangleq \frac{1}{x_3} \begin{bmatrix} f b_1 - y_1 b_3 \\ f b_2 - y_2 b_3 \end{bmatrix}.$   (5-7)

From the above definition for $\phi(t)$, the following expression can be written:

$\phi_1^2 + \phi_2^2 = \left( \frac{f b_1 - y_1 b_3}{x_3} \right)^2 + \left( \frac{f b_2 - y_2 b_3}{x_3} \right)^2 = \frac{\left( f b_1 - y_1 b_3 \right)^2 + \left( f b_2 - y_2 b_3 \right)^2}{x_3^2}.$   (5-8)

Based upon (5-8), the time-varying depth estimation can be expressed as


$x_3 = \sqrt{\frac{\left( f b_1 - y_1 b_3 \right)^2 + \left( f b_2 - y_2 b_3 \right)^2}{\phi_1^2 + \phi_2^2}}.$   (5-9)

Assumption 5.2: The image-space feature coordinates $y_1(t)$, $y_2(t)$ are bounded functions of time.

Assumption 5.3: If $\phi(t)$ can be identified, then $x_3(t)$ can be determined from (5-7), provided $b_1$, $b_2$, $b_3$ do not equal zero simultaneously. This observability assumption physically means that the object must translate in at least one direction.

Remark 5.1: Based on Assumptions 5.1-5.3, the expressions given in (5-5)-(5-8) can be used to determine that $\phi(t)$, $\Omega_1(t)$, and $\dot{y}(t) \in \mathcal{L}_\infty$. Given that these signals are bounded, Assumptions 5.1-5.3 can be used to prove that

$\left\| \phi(t) \right\| \leq \zeta_1, \qquad \left\| \dot{\phi}(t) \right\| \leq \zeta_2, \qquad \left\| \ddot{\phi}(t) \right\| \leq \zeta_3,$   (5-10)

where $\zeta_1$, $\zeta_2$, and $\zeta_3 \in \mathbb{R}$ denote known positive constants.

5.4 Range Identification for Affine Systems

5.4.1 Objective

The objective of this section is to extract the Euclidean coordinate information of the object feature from its image-based projection. From (5-4) and the fact that $y_1(t)$ and $y_2(t)$ are measurable, if $x_3(t)$ could be identified, then the complete Euclidean coordinates of the feature can be determined. To achieve this objective, an estimator is constructed based on the unmeasurable image-space dynamics for $\dot{y}(t)$. To quantify the objective, a measurable estimation error, denoted by $e(t) \triangleq [e_1(t), e_2(t)]^T \in \mathbb{R}^2$, is defined as follows:

$e \triangleq y - \hat{y},$   (5-11)
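The depth recovery in (5-9) can be verified directly: the sketch below constructs $y$ and $\phi$ from their definitions for a ground-truth state and recovers $x_3$ exactly. The state, translation parameters, and focal length are illustrative values.

```python
import math

# Ground-truth Euclidean state and known translation parameters (illustrative).
x1, x2, x3 = 2.0, -1.0, 8.0
b1, b2, b3 = 0.3, -0.1, 0.4
f = 1.0                      # focal distance (normalized)

# Projection (5-4) and unmeasurable signal (5-7).
y1, y2 = f * x1 / x3, f * x2 / x3
phi1 = (f * b1 - y1 * b3) / x3
phi2 = (f * b2 - y2 * b3) / x3

# Depth recovery (5-9): once phi is identified, x3 follows from measurables.
x3_hat = math.sqrt(((f * b1 - y1 * b3) ** 2 + (f * b2 - y2 * b3) ** 2)
                   / (phi1 ** 2 + phi2 ** 2))
assert abs(x3_hat - x3) < 1e-9

# The full Euclidean coordinates then follow by inverting (5-4).
x1_hat, x2_hat = y1 * x3_hat / f, y2 * x3_hat / f
print(x3_hat, round(x1_hat, 6), round(x2_hat, 6))
```

Note that the chosen $b$ satisfies the observability condition of Assumption 5.3; with $b_1 = b_2 = b_3 = 0$ the numerator and denominator of (5-9) both vanish and the depth is unrecoverable.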


where $\hat{y}(t) \triangleq [\hat{y}_1(t), \hat{y}_2(t)]^T \in \mathbb{R}^2$ denotes a subsequently designed estimate. An unmeasurable¹ filtered estimation error, denoted by $r(t) \triangleq [r_1(t), r_2(t)]^T \in \mathbb{R}^2$, is also defined as

$r \triangleq \dot{e} + \alpha e,$   (5-12)

where $\alpha \in \mathbb{R}^{2 \times 2}$ denotes a diagonal matrix of positive constant gains $\alpha_1, \alpha_2 \in \mathbb{R}$. Motivation for the development of the filtered estimation error in (5-12) is that the subsequent observer is based on the equation (5-5). If $\phi(t)$ in (5-5) can be identified, the fact that the feature point coordinates $y_i(t)$, $i = 1, 2$, are measurable can be used along with (5-9) to compute $x_3(t)$, provided the observability condition in Assumption 5.3 is satisfied.

5.4.2 Estimator Design and Error System

Based on (5-5) and the subsequent analysis, the following estimation signals are defined:

$\dot{\hat{y}}(t) = \Omega_1 + \hat{\phi},$   (5-13)

where $\hat{\phi}(t) \triangleq [\hat{\phi}_1(t), \hat{\phi}_2(t)]^T \in \mathbb{R}^2$ denotes a subsequently designed estimate for $\phi(t)$. The following error dynamics are obtained after taking the time derivative of $e(t)$ and utilizing (5-5) and (5-13):

$\dot{e} = \phi - \hat{\phi}.$   (5-14)

Based on the structure of (5-13) and (5-14), $\hat{\phi}(t)$ is designed as follows [56]:

$\hat{\phi} = \left( K + \alpha \right) e + \int_{t_0}^{t} \left[ \left( K + \alpha \right) \alpha\, e(\tau) + \beta\, \mathrm{sgn}\!\left( e(\tau) \right) \right] d\tau,$   (5-15)

where $K, \beta \in \mathbb{R}^{2 \times 2}$ denote diagonal matrices of positive constant estimation gains, and the notation $\mathrm{sgn}(\cdot)$ is used to indicate a vector with the standard signum

¹The filtered estimation signal is unmeasurable due to a dependence on the unmeasurable terms $\dot{y}_1(t)$, $\dot{y}_2(t)$.


function applied to each element of the argument. The structure of the estimator in (5-15) contains discontinuous terms; however, as discussed in [56], the overall structure of the estimator is continuous (i.e., $\hat{\phi}(t)$ is continuous). After using (5-12), (5-14), and (5-15), the following expression can be obtained:

$\dot{r} = N - \left( K + \alpha \right) r - \beta\, \mathrm{sgn}(e),$   (5-16)

where $N(t) \triangleq [N_1, N_2]^T \in \mathbb{R}^2$ is defined as

$N \triangleq \dot{\phi} + \alpha \left( \phi - \hat{\phi} \right).$   (5-17)

Based on (5-10) and (5-17), the following inequalities can be developed:

$\left| N_i(t) \right| \leq \zeta_4, \qquad \left| \dot{N}_i(t) \right| \leq \zeta_5,$   (5-18)

where $\zeta_4$ and $\zeta_5 \in \mathbb{R}$ denote known positive constants.

Remark 5.2: Considering (5-7), the unmeasurable signal $\phi(t)$ can be identified if $\hat{y}(t)$ approaches $y(t)$ as $t \to \infty$ (i.e., $\hat{y}(t)$ and $\dot{\hat{y}}(t)$ approach $y(t)$ and $\dot{y}(t)$ as $t \to \infty$), since the parameters $b_i(t)$, $i = 1, 2, 3$, are assumed to be known, and $y_1(t)$ and $y_2(t)$ are measurable. After $\phi(t)$ is identified, (5-4) can be used to extract the 3D Euclidean coordinates of the object feature (i.e., determine the range information). To prove that $\hat{y}(t)$ approaches $y(t)$ as $t \to \infty$, the subsequent development will focus on proving that $\| e(t) \| \to 0$ and $\| \dot{e}(t) \| \to 0$ as $t \to \infty$ based on (5-11) and (5-12).

5.5 Analysis

The following theorem and associated proof can be used to conclude that the observer design of (5-13) and (5-15) can be used to identify the unmeasurable signal $\phi(t)$.

Theorem 5.1: For the system in (5-5)-(5-7), the unmeasurable signal $\phi(t)$ (and hence, the Euclidean coordinates of the object feature) can be asymptotically


determined from the estimator in (5-13) and (5-15), provided the elements of the constant diagonal matrix $\beta$ introduced in (5-15) are selected according to the sufficient condition

$\beta_i \geq \zeta_4 + \frac{1}{\alpha_i}\, \zeta_5,$   (5-19)

$i = 1, 2$, where $\zeta_4$, $\zeta_5$ are defined in (5-18).

Proof: Consider a non-negative function $V(t) \in \mathbb{R}$ as follows (i.e., a Lyapunov function candidate):

$V \triangleq \frac{1}{2} r^T r.$   (5-20)

After taking the time derivative of (5-20) and substituting for the error system dynamics given in (5-16), the following expression can be obtained:

$\dot{V} = -r^T \left( K + \alpha \right) r + \left( \dot{e} + \alpha e \right)^T \left( N - \beta\, \mathrm{sgn}(e) \right).$   (5-21)

After integrating (5-21) and exploiting the fact that $\xi^T \mathrm{sgn}(\xi) = \sum_i \left| \xi_i \right|$, $\forall \xi \in \mathbb{R}^2$, the following inequality can be obtained:

$V(t) \leq V(t_0) - \int_{t_0}^{t} r^T(\tau) \left( K + \alpha \right) r(\tau)\, d\tau + \sum_{i=1}^{2} \int_{t_0}^{t} \alpha_i \left| e_i(\tau) \right| \left( \left| N_i(\tau) \right| - \beta_i \right) d\tau + \sum_{i=1}^{2} L_i,$   (5-22)

where the auxiliary terms $L_i(t) \in \mathbb{R}$ are defined as

$L_i \triangleq \int_{t_0}^{t} \dot{e}_i(\tau)\, N_i(\tau)\, d\tau - \beta_i \int_{t_0}^{t} \dot{e}_i(\tau)\, \mathrm{sgn}\!\left( e_i(\tau) \right) d\tau,$   (5-23)

$i = 1, 2$. The integral expression in (5-23) can be evaluated as

$L_i = \left. e_i(\tau)\, N_i(\tau) \right|_{t_0}^{t} - \int_{t_0}^{t} e_i(\tau)\, \dot{N}_i(\tau)\, d\tau - \beta_i \left. \left| e_i(\tau) \right| \right|_{t_0}^{t}$
$\quad\; = e_i(t)\, N_i(t) - \int_{t_0}^{t} e_i(\tau)\, \dot{N}_i(\tau)\, d\tau - \beta_i \left| e_i(t) \right| - e_i(t_0)\, N_i(t_0) + \beta_i \left| e_i(t_0) \right|,$   (5-24)


$i = 1, 2$. Substituting (5-24) into (5-22) and performing some algebraic manipulation yields

$V(t) \leq V(t_0) - \int_{t_0}^{t} r^T(\tau) \left( K + \alpha \right) r(\tau)\, d\tau + \Delta_3 + \Delta_0,$

where the auxiliary terms $\Delta_3(t), \Delta_0 \in \mathbb{R}$ are defined as

$\Delta_3 = \sum_{i=1}^{2} \int_{t_0}^{t} \alpha_i \left| e_i(\tau) \right| \left( \left| N_i(\tau) \right| + \frac{1}{\alpha_i} \left| \dot{N}_i(\tau) \right| - \beta_i \right) d\tau + \sum_{i=1}^{2} \left| e_i(t) \right| \left( \left| N_i(t) \right| - \beta_i \right),$

$\Delta_0 = \sum_{i=1}^{2} \left( \beta_i \left| e_i(t_0) \right| - e_i(t_0)\, N_i(t_0) \right).$

Provided $\beta_i$, $i = 1, 2$, are selected according to the inequality introduced in (5-19), $\Delta_3(t)$ will always be negative or zero; hence, $V(t)$ can be upper bounded as

$V(t) \leq V(t_0) - \int_{t_0}^{t} r^T(\tau) \left( K + \alpha \right) r(\tau)\, d\tau + \Delta_0.$   (5-25)

From (5-20) and (5-25), the following inequalities can be determined:

$V(t_0) + \Delta_0 \geq V(t) \geq 0;$

hence, $r(t) \in \mathcal{L}_\infty$. The expression in (5-25) can be used to determine that

$\int_{t_0}^{t} r^T(\tau) \left( K + \alpha \right) r(\tau)\, d\tau \leq V(t_0) + \Delta_0.$   (5-26)

By definition, (5-26) can now be used to prove that $r(t) \in \mathcal{L}_2$. From the fact that $r(t) \in \mathcal{L}_\infty$, (5-11) and (5-12) can be used to prove that $e(t)$, $\dot{e}(t)$, $\hat{y}(t)$, and $\dot{\hat{y}}(t) \in \mathcal{L}_\infty$. The expressions in (5-13) and (5-15) can be used to determine that $\hat{\phi}(t)$ and $\dot{\hat{y}}(t) \in \mathcal{L}_\infty$. Based on (5-10), the expressions in (5-16) and (5-17) can be used to prove that $N(t)$, $\dot{N}(t)$, $\dot{r}(t) \in \mathcal{L}_\infty$. Based on the fact that $r(t)$, $\dot{r}(t) \in \mathcal{L}_\infty$ and that $r(t) \in \mathcal{L}_2$, Barbalat's Lemma [90] can be used to prove that $\| r(t) \| \to 0$ as $t \to \infty$; hence, standard linear analysis can be used to prove that $\| e(t) \| \to 0$ and $\| \dot{e}(t) \| \to 0$ as $t \to \infty$. Based on the fact that $\| e(t) \| \to 0$ and $\| \dot{e}(t) \| \to 0$ as $t \to \infty$, the expression given in (5-11) can be used to determine that $\hat{y}(t)$ and


$\dot{\hat{y}}(t)$ approach $y(t)$ and $\dot{y}(t)$ as $t \to \infty$, respectively. Therefore, the expression in (5-5) can be used to determine that $\hat{\phi}(t)$ approaches $\phi(t)$ as $t \to \infty$. The result that $\hat{\phi}(t)$ approaches $\phi(t)$ as $t \to \infty$, the fact that the parameters $b_i(t)$, $i = 1, 2, 3$, are assumed to be known, and the fact that the image-space signals $y_1(t)$ and $y_2(t)$ are measurable can be used to identify the unknown signal $x_3(t)$ from (5-9). Once $x_3(t)$ is identified, the complete Euclidean coordinates of the object feature can be determined using (5-4).

With the Euclidean coordinates of the object known, a simplistic method for determining aircraft altitude above the ground could employ the following relationship:

$h = x_1 \sin\phi\cos\theta + x_2 \left\{ \cos\delta\cos\phi\cos\theta + \sin\delta\sin\theta \right\} + x_3 \left\{ \cos\phi\cos\theta\sin\delta - \sin\theta\cos\delta \right\},$

where $\delta$ is the downward look angle of the camera with respect to the aircraft, and $\phi$ and $\theta$ represent the aircraft roll and pitch angles, respectively.

5.6 Conclusion

The results in this chapter focus on the use of a nonlinear estimator to determine the range and the Euclidean coordinates of an object feature with respect to the camera coordinate system undergoing general affine motion. The nonlinear estimator is proven, via a Lyapunov-based analysis, to asymptotically determine the range information for a camera system with known motion parameters. If specific observability conditions are satisfied, the identified range can be used to reconstruct the Euclidean coordinates of the moving aircraft with respect to a fixed object on the ground.
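As a numerical companion to this chapter, the sketch below simulates the estimation objective for a scalar version of (5-13)-(5-14). For clarity it replaces the $\mathrm{sgn}(\cdot)$ robust term of (5-15) with a plain proportional-integral update, which suffices when $\phi$ is constant; the continuous sgn-based design in the chapter extends the same idea to time-varying $\phi$ with the asymptotic guarantee of Theorem 5.1. Gains and signals are illustrative.

```python
# Scalar toy version of the image-space observer: y_dot = Omega1 + phi,
# with Omega1 known and phi unknown. The estimator integrates
# y_hat_dot = Omega1 + phi_hat and updates phi_hat from e = y - y_hat.
# A PI update (k*e + gamma*integral(e)) stands in for the sgn law (5-15).
dt, steps = 1e-3, 10_000
k, gamma = 4.0, 4.0          # estimation gains (illustrative)
phi_true = 2.0               # unknown constant signal to identify
omega1 = 0.5                 # known measurable drift

y = y_hat = 0.0
integral_e = 0.0
for _ in range(steps):
    e = y - y_hat
    integral_e += dt * e
    phi_hat = k * e + gamma * integral_e
    y += dt * (omega1 + phi_true)      # true image-space dynamics (5-5)
    y_hat += dt * (omega1 + phi_hat)   # observer copy (5-13)

e = y - y_hat
print(round(phi_hat, 3), round(e, 4))
```

With these gains the error dynamics are critically damped, so $\hat{\phi}$ settles to $\phi$ well within the simulated 10 seconds; once $\hat{\phi}$ is identified, (5-9) recovers the depth as in the previous sketch.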


CHAPTER 6
CONTRIBUTIONS AND FUTURE WORK

A novel vision-based estimation, localization, and control methodology is presented for enabling autonomous flight of a fixed-wing air vehicle, providing it with the capability of flying indefinitely over a region composed of planar patches of feature points. The proposed daisy-chaining approach is distinct from the majority of the current vision-based methods of state estimation in that current methods usually require some degree of a priori knowledge of landmarks or are specifically designed for hovering vehicles and therefore are not devised to handle feature points entering and leaving the field-of-view of the camera. One contribution of this dissertation is that it is the first occasion that a homography-based state estimation method is presented to handle feature points entering and leaving the camera field-of-view. As a result, this is also the first time in which a camera has been demonstrated to act as the sole sensor, with the exception of an altimeter for approximating height above a given planar patch, for estimating aircraft position during flight into a GPS-denied environment. As a complement to these results and to address the requirement for an estimate of the height above a given planar patch, a Lyapunov-based state estimator, used in combination with the daisy-chaining method, is the first instance of illustrating the plausibility of autonomous air vehicle flight over an indefinite distance with the camera truly acting as the sole vehicle sensor.

Using a video camera as the primary sensor in aircraft control requires special consideration from a controls standpoint. In addition to such presented concerns, as the air vehicle is required to fly in less benign regimes, such as with agile maneuvering, it becomes evident that simplistic classical control methods will be

PAGE 85

74 limitedincapabilityandperformance.Therefore,inorderto yanaircraftina closed-loopsenseusingacameraasaprimarysensor,thecontrollerneedstobe robusttonotonlyparametricuncertainties,buttosystemnoisethatisofthe kinduniquelycharacteristicofcamerasystems.Anovelnonlinearcontrolleris presentedasacredibleanswertosuchanticipatedissues,resultinginthe rstcase ofdevelopinganaircraftcontrollerforanuncertainsystemthatprovidesasemiglobalasymptoticallystableresult,wherethereexistsinputuncertaintyaswellas generalizedadditivenonlineardisturbancesthatarestateandtimevarying.Future workonthiscontrolapproachshouldattempttoeliminatetherequirementthat the lteredtrackingerrorneedstobemeasured,asopposedtothemoredesirable situationofsimplymeasu ringthetrackingerror. Otherpossiblefutureworkthatbuildsuponwhatispresentedinthisdissertationwouldbetoinvestigatemethodsthatwouldallowtherelaxationofthe coplanarrequirement,i.e.navigationoveranonatearth,tofusethehomographybaseddaisy-chainestimateswithadditionalsensorsforimprovedaccuracy,and toperformacomparisonanalysisbyapplyingestimationmethods(i.e.Kalman ltering)tobothfeaturepointtrackingintheimageplaneaswellasvehiclestate estimation.Potentialfutureworkmightalsobetoinvestigatemultiplevehicles foruseinthehomography-baseddaisy-chainingmethodasananaloguetowhatis currentlybeingresearchedwithinthe eldofcooperative-SLAM. Becauseerrorfromthehomography-baseddaisy-chainestimateaccumulates witheachhando ,itwouldalsobeofinteresttoperform ightexperimentsthat combinetheLyapunov-basedstateestimatorwiththedaisy-chainingmethod. Indoingso,onecouldinvestigateusinganupdatedheightaboveplanarpatch measurementbetweeneachhand-o toreducethee ectofaccumulatingposition error.Thisimprovementwouldbeexpectedsincepositionerrorscaleswiththe errorintheestimatedmagnitudeofthevectorthatisde nedbybeingbothnormal

PAGE 86

75 totheplanarpatchandextendingfromtheplanarpatchtothevehicle.The currentmethodusesheightabovegroundderivedfromanaltimetertoapproximate thismagnitude,whichultimatelyresultsinanerrorinestimatedposition.Finally, theultimategoalofsuchablendingofmethodologieswouldbeindemonstrating autonomous ightviathecameraasthesolesensor. Futuree ortsshouldalsoattempttomitigatetheestimationerrorsthat candirectlybeattributedtothe delityoftheequipmentused.Toaddressthis issueandtosupportfuturework,recentpurchasesofthefollowinghigherquality equipmenthavebeenmade: SolidLogicC156Mini-ITXSystem,EPIAMII12000GMainboard,fordirectto-hard-diskvideorecordingandpossiblereal-timeonboardprocessing. PixelinkPL-B958F2.0megapixel(1600x1200)colorCCDcamerabased upontheSonyICX274progressivescansensorwitha1 2opticalformat, globalshutter,variableshutterspeed andframerate,andstandardFireWire (1394A). HauppageWinTV-PVR-USB2,direct-to-hard-diskvideorecorderforrecordingtheair-vehicleto-groundtelemeteredvideosignal.
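The daisy-chaining and hand-off behavior discussed in this chapter can be sketched in two pieces: composing relative (R, t) estimates into a pose with respect to the initial frame, and a toy one-dimensional model of how a constant depth-estimate error compounds across hand-offs unless the height-above-patch measurement is refreshed. Both functions are illustrative assumptions, not the dissertation's implementation; in practice each (R, t) pair would come from decomposing the homography between successive images, which this sketch does not attempt.

```python
import numpy as np

def chain_poses(relative_poses):
    """Compose relative camera motions (R_i, t_i) -- e.g. recovered from
    successive homography decompositions -- into a cumulative pose
    expressed in the initial reference frame."""
    R_total, t_total = np.eye(3), np.zeros(3)
    for R, t in relative_poses:
        # new translation: previous position plus the new step rotated
        # into the reference frame
        t_total = t_total + R_total @ np.asarray(t, dtype=float)
        R_total = R_total @ np.asarray(R, dtype=float)
    return R_total, t_total

def estimated_track_length(true_legs, depth_ratios, refresh=False):
    """Toy 1-D model of hand-off scale drift: each leg's translation is
    scaled by the estimated/true patch-distance ratio measured at the
    hand-off; with refresh=True the ratio is re-measured every hand-off,
    otherwise the per-leg ratios compound multiplicatively."""
    pos, scale = 0.0, 1.0
    for leg, ratio in zip(true_legs, depth_ratios):
        scale = ratio if refresh else scale * ratio
        pos += leg * scale
    return pos

drift = estimated_track_length([1.0] * 3, [1.1] * 3)            # compounding
refreshed = estimated_track_length([1.0] * 3, [1.1] * 3, True)  # bounded per leg
```

With three unit legs and a constant 10% depth error, the compounded estimate is about 3.641 versus 3.3 when the scale is refreshed (true length 3.0), mirroring the motivation given above for an updated height-above-patch measurement between hand-offs.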


APPENDIX A
(CHAPTER 3) INTEGRATION OF THE AUXILIARY FUNCTION L(t)

(See Lemma 1.)

Integrating both sides of (3–37) yields

∫₀ᵗ L(τ) dτ = ∫₀ᵗ r(τ) (N_d(τ) − β sgn(e(τ))) dτ.   (A–1)

Substituting (3–16) into (A–1), utilizing the definition of the filtered tracking error, and rearranging yields

∫₀ᵗ L(τ) dτ = ∫₀ᵗ (de(τ)/dτ) N_d(τ) dτ − ∫₀ᵗ (de(τ)/dτ) β sgn(e(τ)) dτ + ∫₀ᵗ α e(τ) (N_d(τ) − β sgn(e(τ))) dτ.   (A–2)

Integrating the first integral in (A–2) by parts,

∫₀ᵗ L(τ) dτ = e(τ) N_d(τ)|₀ᵗ − ∫₀ᵗ e(τ) (dN_d(τ)/dτ) dτ − ∫₀ᵗ (de(τ)/dτ) β sgn(e(τ)) dτ + ∫₀ᵗ α e(τ) (N_d(τ) − β sgn(e(τ))) dτ.   (A–3)

From (A–3), the following bound can be obtained:

∫₀ᵗ L(τ) dτ ≤ ∫₀ᵗ α ‖e(τ)‖ (‖N_d(τ)‖ + (1/α) ‖Ṅ_d(τ)‖ − λ_min(β)) dτ + ‖e(t)‖ (‖N_d(t)‖ − λ_min(β)) + ‖β‖ ‖e(0)‖ − e(0) N_d(0),   (A–4)

where the bounding terms were defined in (3–1). Thus, it is clear from (A–4) that if β satisfies the gain condition of Chapter 3, then (3–8) holds.
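The bound established in this appendix can be spot-checked numerically. The sketch below is illustrative only: it assumes the scalar setting with filtered error r = ė + αe, and picks e(t) = (1 + t)e^(−t) and N_d(t) = 0.5 sin(t) so that β = 1.1 satisfies the sufficient gain condition β ≥ ‖N_d‖∞ + (1/α)‖Ṅ_d‖∞ with α = 1; these signals and the constant ζ_b = β|e(0)| − e(0)N_d(0) follow the standard form of this lemma and are assumptions for illustration.

```python
import math

def integral_of_L(alpha, beta, e, e_dot, N_d, T, n=100000):
    """Midpoint-rule integration of L = r*(N_d - beta*sgn(e)),
    with the filtered tracking error r = e_dot + alpha*e."""
    dt = T / n
    total = 0.0
    for i in range(n):
        t = (i + 0.5) * dt
        r = e_dot(t) + alpha * e(t)
        total += r * (N_d(t) - beta * math.copysign(1.0, e(t))) * dt
    return total

# illustrative signals (assumed, not from the dissertation)
e = lambda t: (1.0 + t) * math.exp(-t)   # stays positive, so sgn(e) = +1
e_dot = lambda t: -t * math.exp(-t)
N_d = lambda t: 0.5 * math.sin(t)        # |N_d| <= 0.5, |dN_d/dt| <= 0.5

alpha, beta = 1.0, 1.1                   # beta >= 0.5 + 0.5/1.0 holds
zeta_b = beta * abs(e(0.0)) - e(0.0) * N_d(0.0)   # = 1.1
val = integral_of_L(alpha, beta, e, e_dot, N_d, T=10.0)
```

For these signals the integral evaluates well below ζ_b, consistent with the inequality proved above; choosing β below the gain condition can break the bound.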


APPENDIX B
(CHAPTER 4) VIDEO EQUIPMENT USED ON BOARD THE AIRCRAFT

The items listed in this appendix represent the equipment that was flown on the aircraft as part of the flight experiment:

Figure B1 is of the 640x480 pixel CCD bullet camera that was used for the video collection. The output signal from this camera was split with a Y-cable and sent to two different devices.

Figure B2 shows the miniDV recorder that was used. It was determined, through several unsuccessful attempts to gather video data, that it was necessary to have the ability to record onboard in order to capture video data that was noise-free. The video from the camera in Figure B1 was split with a Y-cable, as previously stated, with one of the video signals going directly into this recorder.

Figure B3 is of the self-contained GPS antenna/receiver. This was used to determine the location of the aircraft, which in turn was used initially as truth data. The daisy-chain results were compared with the GPS data for validation purposes. The output from this device went to the GPS overlay board shown in Figure B4.

Figure B1: Sony Color Exview Super HAD (480 Lines of Resolution)

Figure B2: Panasonic AG-DV1 Digital Video Recorder

Figure B3: Garmin GPS 35 OEM GPS Receiver

Figure B4 is of the GPS overlay board. The other video signal from the Y-cable coming from the camera in Figure B1 went into this board, which overlaid the video with the GPS data from the receiver shown in Figure B3. From here, the overlaid video was sent to the transmitter shown in Figure B5.

Figure B5 is of the transmitter that was used to send the GPS overlaid video signal to the ground receiver.


Figure B4: Intuitive Circuits, LLC - OSD-GPS Overlay Board

Figure B5: 12V, 2.4GHz, 100mW, Video Transmitter and Antenna


Figure B6: Eagle Tree, Seagull, Wireless Dashboard Flight System - Pro Version: (1) Wireless Telemetry Transmitter, (2) Eagle Tree Systems G-Force Expander, and (3) Eagle Tree Systems GPS Expander

Figure B6 is of the secondary GPS receiver that was also used for comparison purposes. This device telemetered the data down on a separate frequency, where it was recorded on a ground station.


APPENDIX C
(CHAPTER 4) GROUND SUPPORT EQUIPMENT

The items listed in this appendix correspond to the ground equipment used in the flight experiment:

Figure C1 is of the video receiver that received the GPS overlaid video.

Figure C2 is of the second video recorder that was used in the testing. In this case, it was used to record the transmitted GPS overlaid video. The reason for requiring a second video recorder was that it was necessary to have a clean video for performing the analysis (see Figure B2), yet a method was needed to correlate each frame of the clean video with a GPS location. Having two recorders operating at once was the simplest solution. The video that was captured on this device could not be used for analysis due to the RF noise and the overlay text that took up much of the image. By doing a frame-by-frame comparison with the clean video recorded by the recorder in Figure B2, it was possible to ascertain the position of the vehicle to within 1 second (the GPS update period) for every frame of the clean video.

Figure C3 is the ground station interface for the secondary GPS receiver that was used, shown in Figure B6.

Figure C1: RX-2410 2.4GHz Wireless 4-channel Audio/Video Selectable Receiver
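The frame-to-GPS correlation described above amounts to a nearest-timestamp lookup once both recordings share a time base. The sketch below is hypothetical: the function, its arguments, and the 1 Hz (time, lat, lon) fix format are assumptions; the actual experiment correlated the two tapes frame by frame.

```python
def gps_fix_for_frame(frame_index, fps, gps_fixes, video_start_time=0.0):
    """Return the GPS fix (time, lat, lon) nearest in time to a video
    frame, assuming 1 Hz fixes and a known video start time."""
    t = video_start_time + frame_index / fps
    return min(gps_fixes, key=lambda fix: abs(fix[0] - t))

# hypothetical 1 Hz fixes (coordinates are placeholders)
fixes = [
    (0.0, 29.64, -82.35),
    (1.0, 29.65, -82.35),
    (2.0, 29.66, -82.36),
]
```

For example, frame 40 of a 30 fps recording falls at t ≈ 1.33 s and maps to the fix at t = 1.0 s, i.e., the vehicle position is resolved to within the 1-second GPS update period, as described above.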


Figure C2: Sony GV-D1000 Portable MiniDV Video Walkman

Figure C3: Eagle Tree, Seagull, Real-time Data Dashboard, Wireless Telemetry Data, Receiver Model STR-01


Figure C4: Leica DISTO A5 (Measuring Range up to 650 ft, Accuracy +/- 0.06 inches)

Figure C4 is of the laser rangefinder, shown in Figure 4, that was used in the accurate placement of the landmarks.

BIOGRAPHICAL SKETCH

Kent Kaiser received a B.S. degree in aerospace engineering from Auburn University in 1990, an M.S. degree in mechanical engineering from the University of Houston in 1995, and completed coursework towards an additional M.S. degree in aerodynamics and propulsion engineering from Cal Poly Pomona. He has nearly 20 years of work experience within the aerospace industry, which includes his current employment with the Air Force Research Labs (AFRL), as well as previous employment with Lockheed Martin Corporation in Houston, TX, Palmdale, CA, and Marietta, GA. Some of his research interests include system identification, aerodynamic-performance-based design, flight dynamics and control, neural-net-based adaptive control, proper orthogonal decomposition (POD) for fluidic modeling and flow control, robust control, micro-UAVs, linear matrix inequality (LMI) control, nonlinear dynamic inversion methods for aircraft control, vision-based estimation, guidance, and control, and cooperative control.