
A B-Spline Model for Camera Calibration

Permanent Link: http://ufdc.ufl.edu/UFE0044320/00001

Material Information

Title: A B-Spline Model for Camera Calibration
Physical Description: 1 online resource (36 p.)
Language: english
Creator: Xu, Yiming
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2012

Subjects

Subjects / Keywords: approximation -- b-spline -- camera-calibration -- image-processing
Mechanical and Aerospace Engineering -- Dissertations, Academic -- UF
Genre: Mechanical Engineering thesis, M.S.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, territorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

Notes

Abstract: This thesis presents a general B-spline surface model for camera calibration. The model implicitly characterizes lens distortion by directly relating positions on the image plane to lines of sight in the world reference frame. Resolution for the model can be adjusted by the numbers of control vertices. Orders for the B-spline surfaces can also be chosen for improvement of accuracy without increasing degrees of freedom in the model. A calibration grid with known position and orientation is used for data acquisition, and calibration results are compared with actual measurements. An auxiliary method is provided to recover the relationship between the camera model and the world reference frame after the camera is moved. Both numerical and practical tests are conducted for demonstration purposes.
General Note: In the series University of Florida Digital Collections.
General Note: Includes vita.
Bibliography: Includes bibliographical references.
Source of Description: Description based on online resource; title from PDF title page.
Source of Description: This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Statement of Responsibility: by Yiming Xu.
Thesis: Thesis (M.S.)--University of Florida, 2012.
Local: Adviser: Crane, Carl D.

Record Information

Source Institution: UFRGP
Rights Management: Applicable rights reserved.
Classification: lcc - LD1780 2012
System ID: UFE0044320:00001

A B-SPLINE MODEL FOR CAMERA CALIBRATION

By

YIMING XU

A THESIS PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE

UNIVERSITY OF FLORIDA

2012

© 2012 Yiming Xu

I dedicate this to my parents.

ACKNOWLEDGMENTS

I would like to thank all my friends, family, and colleagues for their support, which made this work possible. I owe my greatest gratitude to my advisor Carl Crane for his guidance and encouragement. I am thankful for the constructive feedback from Warren Dixon, John Schueller, Prabir Barooah, and Mrinal Kumar that helped make this a stronger thesis. Many thanks to my friends and colleagues for the numerous helpful discussions and suggestions: Alan Hamlet, Darsan Patel, Joseph Osentowski, Karl Brandt, Ke Huo, Olugbenga Moses Anubi, Sujin Jang, Taeho Kim, Yilun Liu, Youngjin Moon, Yungsheng Chang, and others. This is dedicated to my parents for their invaluable love and support.

TABLE OF CONTENTS

ACKNOWLEDGMENTS
LIST OF TABLES
LIST OF FIGURES
ABSTRACT

CHAPTER

1 INTRODUCTION
  1.1 Existing Calibration Methods
  1.2 Development of B-spline Model

2 CAMERA MODEL
  2.1 B-Spline Surfaces
  2.2 Surface Fitting
  2.3 Formulation of B-spline Camera Model

3 CALIBRATION TEST
  3.1 Data Acquisition
  3.2 Model Formulation
  3.3 Error Analysis
  3.4 Image Rectification

4 CAMERA POSE ESTIMATION
  4.1 Least Squares Pose Estimation
  4.2 Numerical Test

5 TARGET EXTRACTION

6 CONCLUSION

REFERENCES

BIOGRAPHICAL SKETCH

LIST OF TABLES

3-1 Summary of data points in the captured images
3-2 Surface fitting error of image 2 versus surface order
3-3 Surface fitting error of image 5 versus surface order
3-4 Surface fitting error summary
3-5 Calibration error summary
4-1 Comparison between the true and false estimation results

LIST OF FIGURES

2-1 Formulation of a two-surface model by fitting lines to vertices
3-1 Schematic diagram of the calibration grid setup
3-2 Process of corner detection (marked as white crosses) with image 5 (Photo courtesy of Yiming Xu)
3-3 B-spline surface fitted to data points in image 5; circles represent control vertices of the B-spline surface
3-4 Surface fitting error in image 5, with 20 times of original magnitudes
3-5 B-spline surfaces fitted for image 1, 2, 4, 5, 7, and 8, from right to left
3-6 Two-surface camera model resulting from fitting lines to control vertices
3-7 Vectors of calibration error for image 3, with 20 times of original magnitudes
3-8 Vectors of calibration error for image 6, with 20 times of original magnitudes
3-9 Rectification of image 3 with B-spline camera model (Photo courtesy of Yiming Xu)
4-1 Target points sampled from image 3 and 6
4-2 A true estimation of target points
4-3 A false estimation of target points
5-1 Characteristic object for feature extraction (Photo courtesy of Yiming Xu)
5-2 Color range of target on H-V plane
5-3 Color range of target on S-V plane
5-4 Feature extraction result for the target vehicle

Abstract of Thesis Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Master of Science

A B-SPLINE MODEL FOR CAMERA CALIBRATION

By

Yiming Xu

May 2012

Chair: Carl Crane
Major: Mechanical Engineering

This thesis presents a general B-spline surface model for camera calibration. The model implicitly characterizes lens distortion by directly relating positions on the image plane to lines of sight in the world reference frame. Resolution for the model can be adjusted by the numbers of control vertices. Orders for the B-spline surfaces can also be chosen for improvement of accuracy without increasing degrees of freedom in the model. A calibration grid with known position and orientation is used for data acquisition, and calibration results are compared with actual measurements. An auxiliary method is provided to recover the relationship between the camera model and the world reference frame after the camera is moved. Both numerical and practical tests are conducted for demonstration purposes.

CHAPTER 1
INTRODUCTION

1.1 Existing Calibration Methods

Camera calibration has always been an important procedure for extracting accurate metric information from images: it obtains the optical characteristics of the lens along with the relative position and orientation of the camera with respect to some global reference frame. Over the years, considerable ingenuity has been devoted to this subject, and an extensive number of calibration methods and camera models were developed to achieve satisfactory accuracy with manageable computation. Detailed reviews can be found in [21][23][27][7].

Calibration methods can generally be characterized by the camera models they adopt. Some of the models, usually earlier ones, ignore lens distortion so that a simple solution for camera parameters can be rapidly generated. The direct linear transformation (DLT), developed by Abdel-Aziz and Karara [1], uses a 3-by-3 transformation matrix to describe the perspective projection from the world coordinate frame to the image plane. A similar calibration model is presented by Hall et al. [14], which uses a 3-by-4 matrix to transform the homogeneous coordinates between the frames. Further development and application of the method can be found in [10]. Another example of a non-distortion camera model is the two-plane method [18], where the rays of light corresponding to positions on the image plane are uniquely determined by their intersection points with two imaginary planes placed within the view of the camera. In these calibration methods, it is assumed that the perspective projection relationship is linear, therefore only simple, linear equations need to be solved for the camera model. However, for applications where greater precision is required, inclusion of lens distortion is necessary and thus nonlinear projection models need to be formulated.

Models that include lens distortion may in turn fall into two categories, namely, explicit and implicit ones. Calibration methods with explicit models utilize explicit, physically interpretable parameters to describe the camera's optical characteristics. In computer vision, camera models for popular calibration methods are usually derived from the pinhole model and compensate lens distortions with additional terms in the model. For example, in Tsai's work [27], a second-order radial distortion is modeled based on disparities between the linear projection and the real one. The method developed by Heikkila and Silven [15] first obtains initial estimates with the DLT calculation, and then obtains the coefficients for radial and decentering distortion using a nonlinear least squares technique. Zhang's method [30] considers third and fifth order radial distortion, whereas Weng et al. [28] consider radial distortion, decentering distortion, and thin prism distortion in their model. These calibration methods, by using a handful of parameters in the camera models, are proven to be efficient (or even automatic) ways to achieve adequate precision in calibrating off-the-shelf cameras. Some other examples can be found in [2][26].

Another group of explicit camera models appears in the calibration of catadioptric imaging systems, where images are formed through a combination of refraction and reflection via lenses (dioptrics) and curved mirrors (catoptrics). Depending on design intention and assembly accuracy, the rays of light entering such systems may not necessarily intersect at a single point, but rather pass through a locus in three-dimensional space, referred to as a caustic [26]. Studies of such imaging systems can be found in [25][24][29], and will not be discussed extensively here.

Generally, calibration with explicit models requires some knowledge of the geometry and optical characteristics of the cameras in order to obtain an accurate outcome. Although high precision and some automation may have been achieved with these approaches, they are restricted to certain imaging systems and are inherently limited in their ability to correct irregular or local distortions.

On the contrary, implicit calibration methods consider the imaging system as a 'black box' and formulate the camera model without explicitly computing its physical and optical parameters [23]. Models of this type are therefore not suited to extraction of camera parameters. However, exclusion of explicit physical parameters also relaxes the requirement for prior knowledge of the cameras, and allows for more flexibility and generality of the calibration methods. Implicit models seek to directly relate the input and output of the imaging systems, which are essentially the rays of light in three-dimensional space and the two-dimensional positions on the image plane. A generic imaging model developed by Grossberg and Nayar [13] allocates optical properties to every pixel on the image plane, so that light rays can be determined from positions and directions on a caustic surface. Because implicit models are 'black boxes' that relax the constraints on the camera's optical properties, the extra flexibility can make it more difficult to achieve automation in the calibration process. Despite this, Ramalingam et al. [20] presented an autonomous calibration method for central or slightly non-central cameras that is able to associate projection rays directly with image pixel indices, and to estimate positions and orientations of overlapping calibration grids with bundle adjustment. These two examples of implicit camera calibration methods both relate rays of light to discrete pixels on the image plane, which may be helpful for rapid computation and for cameras with discontinuous properties (for example a compound camera). However, for continuous imaging systems, since the target points from image processing might have sub-pixel level positions on the image plane, it might be favorable to adopt a continuous model to exploit better accuracy from interpolation.

1.2 Development of B-spline Model

The original two-plane method can be regarded as an implicit model that does not rely on knowledge of optical lens characteristics, but it is limited in accuracy by its linear mapping. The method is extended by Gremban et al. [12] using quadratic transformation or triangular patches, in order to carry distortion information.

In [5] and [6], Champleboux et al. propose a mathematical model called n-planes B-spline (NPBS) using bi-cubic B-spline surface functions, which relate two-dimensional coordinates on the image plane to three-dimensional coordinates in the world frame. The NPBS method solves for the coefficients of the cubic spline components of the surfaces by minimizing a linearized cost function that has one term to interpolate the data and another to smooth the surface. Thanks to the flexibility of bi-cubic B-spline surfaces, the model can theoretically compensate any continuous lens distortion up to the number of subdivisions chosen in its surface functions. Since it only accounts for the correspondence between image positions and rays of light, the imaging system is not restricted to have a single viewpoint.

This thesis presents a model using general B-spline surfaces, which have tunable parameters such as the number of vertices and the order of the surfaces, so that the model can be more flexible without the necessity to increase the degrees of freedom.

The contents of this thesis are presented as follows. Chapter 1 serves as an introduction. Chapter 2 describes the general B-spline camera model. An experimental test for camera calibration is conducted in Chapter 3. An auxiliary method for obtaining the position and orientation of the camera is presented in Chapter 4, together with a numerical test in Chapter 4 and a practical test in Chapter 5 to verify the practicality of the method. Chapter 6 presents a conclusion and further discussion.

CHAPTER 2
CAMERA MODEL

2.1 B-Spline Surfaces

Similar to the NPBS method, multiple surfaces are introduced that intersect the rays of light entering the camera, and the points of intersection are associated with positions on the image plane through bi-parametric surface functions. The positions and shapes of the surfaces can be chosen arbitrarily, as long as they cover the field of view of the camera. For any position on the image plane, a point can be located on each of the surfaces, so that the points from multiple surfaces together determine the ray of light corresponding to that image position. In this approach, the functions are general B-spline surfaces defined as [22][19]

Q(u, w) = Σ_{i=1}^{n+1} Σ_{j=1}^{m+1} B_{i,j} N_{i,k}(u) M_{j,l}(w),    (2-1)

which maps parameter pairs (u, w) to coordinates of points on the surfaces. The basis functions N_{i,k}(u) and M_{j,l}(w) are defined over the knot vectors [x_i] and [y_j] by

N_{i,1}(u) = 1 if x_i ≤ u < x_{i+1}, and 0 otherwise,
M_{j,1}(w) = 1 if y_j ≤ w < y_{j+1}, and 0 otherwise,
N_{i,k}(u) = (u - x_i) N_{i,k-1}(u) / (x_{i+k-1} - x_i) + (x_{i+k} - u) N_{i+1,k-1}(u) / (x_{i+k} - x_{i+1}),    (2-2)

with M_{j,l}(w) defined by the same recursion over [y_j]. Here the B_{i,j} are the control vertices of the polygonal control net, and k and l are the orders of the surface in the u and w directions.
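
As an illustrative sketch (not part of the original implementation), the basis functions and a surface point Q(u, w) can be evaluated with the standard Cox-de Boor recursion; the helper names below (basis, surface_point) are hypothetical and the indexing follows the 1-based notation above.

import numpy as np

def basis(i, k, t, knots):
    # Order-k B-spline basis N_{i,k}(t) by the Cox-de Boor recursion.
    # i is 1-based to match the thesis notation; knots is the knot vector [x_1, x_2, ...].
    if k == 1:
        return 1.0 if knots[i - 1] <= t < knots[i] else 0.0
    left = right = 0.0
    d1 = knots[i + k - 2] - knots[i - 1]      # x_{i+k-1} - x_i
    d2 = knots[i + k - 1] - knots[i]          # x_{i+k} - x_{i+1}
    if d1 > 0:
        left = (t - knots[i - 1]) / d1 * basis(i, k - 1, t, knots)
    if d2 > 0:
        right = (knots[i + k - 1] - t) / d2 * basis(i + 1, k - 1, t, knots)
    return left + right

def surface_point(u, w, B, k, l, x_knots, y_knots):
    # Q(u, w) = sum_i sum_j B[i, j] * N_{i,k}(u) * M_{j,l}(w); B has shape (n+1, m+1, 3).
    n1, m1, _ = B.shape
    Nu = np.array([basis(i, k, u, x_knots) for i in range(1, n1 + 1)])
    Mw = np.array([basis(j, l, w, y_knots) for j in range(1, m1 + 1)])
    return np.einsum('i,j,ijc->c', Nu, Mw, B)
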
Unlike the NPBS model, which is restricted to bi-cubic surfaces, this general formulation makes it possible to choose different orders for the basis functions, so that better accuracy can be obtained without increasing the degrees of freedom of the model.

2.2 Surface Fitting

As mentioned earlier, the surfaces are to represent the mapping from position (u, w) on the image plane to point (x, y, z) in the world frame. For any given position (u, w), at least two B-spline surfaces are necessary to locate the corresponding line of sight in the world reference frame, by fitting a straight line to the points of intersection on the surfaces. The remaining task before the surfaces are available is to find the coordinates of each B_{i,j}, the vertices of the polygonal control net of a B-spline surface. Since the basis functions are scalars, the three components of the coordinates can be treated separately. For example, the x-coordinates of the vertices, x_{i,j}, can be found using a linear least squares approximation [16][4] by minimizing

x_{i,j} = argmin Σ_p ( x̂_p - Σ_{i=1}^{n+1} Σ_{j=1}^{m+1} x_{i,j} N_{i,k}(u_p) M_{j,l}(w_p) )²,    (2-3)

where x̂_p is the x-coordinate of the p-th data point from measurement, and u_p and w_p are the corresponding image positions. It should be noted that there are no smoothing constraints in the cost function, so that the calculation is linear and the surface can be fitted as closely to the data points as possible. Consequently, sufficient data points are required for the approximation to be solvable; that is, at least one data point should be registered within the control region of every vertex.

2.3 Formulation of B-spline Camera Model

With the help of a calibration grid, much efficiency can be achieved by simultaneously registering an array of points. For each position and orientation of the calibration grid, a B-spline surface can be fitted to the data. After at least two surfaces are defined, the line of sight for any position (u, w) on the image can be located from the intersection points (x, y, z) on each of the surfaces. If there are more than two B-spline surfaces in the model, integrating them into two surfaces that uniquely determine the lines can reduce the workload for online computation.
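
Each of these per-position surface fits is the linear least-squares problem of Equation (2-3); a minimal sketch (again illustrative only, reusing the hypothetical basis helper above) solves all three coordinate components at once.

import numpy as np

def fit_control_vertices(uv, xyz, k, l, x_knots, y_knots, n1, m1):
    # Fit an (n+1) x (m+1) net of control vertices to measured data.
    # uv:  (P, 2) image positions (u_p, w_p); xyz: (P, 3) measured world coordinates.
    P = uv.shape[0]
    A = np.zeros((P, n1 * m1))
    for p, (u, w) in enumerate(uv):
        Nu = [basis(i, k, u, x_knots) for i in range(1, n1 + 1)]
        Mw = [basis(j, l, w, y_knots) for j in range(1, m1 + 1)]
        A[p] = np.outer(Nu, Mw).ravel()       # one row of basis-function products
    # No smoothing term: plain least squares, solved for x, y, z simultaneously.
    b, *_ = np.linalg.lstsq(A, xyz, rcond=None)
    return b.reshape(n1, m1, 3)               # control vertices B_{i,j}

For this system to be solvable, A must have full column rank, which corresponds to the condition stated above that at least one data point falls within the control region of every vertex.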

Since the coordinates of points on the surfaces are linear functions of the coordinates of the control vertices, for an ideal camera model with no error, the vertex points at the same control knot in different surfaces should be collinear. Therefore, one can fit straight lines to the corresponding vertices, intersect them with two B-spline surfaces, and formulate the two-surface model that uniquely defines the line of sight for any image position (Figure 2-1). Here a parametric expression for a straight line is adopted to avoid scaling errors between coordinate components:

x = at + b,  y = ct + d,  z = et + f.    (2-4)

Each line corresponds to the trace of a control vertex. Coordinates of the vertices of the two resulting B-spline surfaces can be found by specifying two distinct parameters t_1 and t_2. Again, linear least squares approximation is useful for fitting the straight lines.

Figure 2-1. Formulation of a two-surface model by fitting lines to vertices.
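
A sketch of the two-surface construction just described (illustrative only; assigning a parameter value to each calibration position, here for instance its Z distance, is an assumption of this example, not a prescription from the thesis):

import numpy as np

def fit_vertex_line(points, t):
    # Least-squares fit of x = a t + b, y = c t + d, z = e t + f (Equation 2-4)
    # to one control vertex traced across several surfaces.
    # points: (S, 3) vertex coordinates; t: (S,) parameter value per surface.
    A = np.column_stack([t, np.ones_like(t)])
    coeffs, *_ = np.linalg.lstsq(A, points, rcond=None)   # rows: [a, c, e] and [b, d, f]
    return coeffs

def two_surface_model(vertex_nets, t, t1, t2):
    # vertex_nets: (S, n+1, m+1, 3) control nets fitted at S calibration positions.
    # Returns the control nets of the two collapsed surfaces at parameters t1 and t2.
    S, n1, m1, _ = vertex_nets.shape
    B1 = np.zeros((n1, m1, 3))
    B2 = np.zeros((n1, m1, 3))
    for i in range(n1):
        for j in range(m1):
            coeffs = fit_vertex_line(vertex_nets[:, i, j, :], t)
            B1[i, j] = coeffs[0] * t1 + coeffs[1]
            B2[i, j] = coeffs[0] * t2 + coeffs[1]
    return B1, B2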

CHAPTER 3
CALIBRATION TEST

An experimental test is carried out using a calibration grid and a slideway. B-spline surfaces of different orders are fitted to the data points, and the surface order with the smallest error is chosen to formulate the camera model. The calibration model is compared with data point measurements that are not used in the modeling process. At the end of this chapter, the calibration model is then used to rectify one of the captured images.

3.1 Data Acquisition

In this calibration test, a CMOS digital video camera was used that transmits VGA (640 × 480) graphics. A 50 × 50 mm grid was printed on an A0 paper (1189 × 841 mm), which was pasted onto a wooden drawing board. The board was then installed onto a calibrated slideway for perpendicular movements (ranging from 800 to 1500 mm in front of the camera), as illustrated in Figure 3-1. The world coordinate frame was aligned to the device for convenience, and eight images were captured in accordance as the calibration board traveled to eight different positions. Intersections of the grid were detected from the images as data points [9] (Figure 3-2). The obtained data points are summarized in Table 3-1.

Table 3-1. Summary of data points in the captured images

Image index   Z distance (mm)   Number of data points
1             579               330
2             659               325
3             739               302
4             819               257
5             899               224
6             979               188
7             1059              165
8             1139              130
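
The corner-detection step cited above ([9], Figure 3-2) could be realized with OpenCV along the following lines; this is an assumption on my part, not the author's implementation, and the file name is hypothetical.

import cv2
import numpy as np

img = cv2.imread('grid_image_5.png')                       # hypothetical file name
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Harris-style corner metric followed by sub-pixel refinement of the strongest corners.
corners = cv2.goodFeaturesToTrack(gray, maxCorners=400, qualityLevel=0.01,
                                  minDistance=10, useHarrisDetector=True)
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 40, 1e-3)
corners = cv2.cornerSubPix(gray, np.float32(corners), (5, 5), (-1, -1), criteria)
print(corners.reshape(-1, 2))                              # (u, w) positions with sub-pixel accuracy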

Figure 3-1. Schematic diagram of the calibration grid setup.

Figure 3-2. Process of corner detection (marked as white crosses) with image 5: A) original image, B) corner metric, C) detected corner points. (Photo courtesy of Yiming Xu.)

3.2 Model Formulation

B-spline surfaces were subsequently fitted to the acquired data points. In order to decide the best order for the surfaces, a fitting test was carried out with all 8 images using different orders. The number of vertices was defined to be

(n+1) × (m+1) = 7 × 6.    (3-1)

Table 3-2. Surface fitting error of image 2 versus surface order

Order   Maximum error (mm)   Average error (mm)   Error variance (mm²)
3       1.988                0.481                0.099
4       1.830                0.464                0.092
5       1.801                0.463                0.091
6       1.779                0.461                0.090

Table 3-3. Surface fitting error of image 5 versus surface order

Order   Maximum error (mm)   Average error (mm)   Error variance (mm²)
3       1.427                0.312                0.046
4       1.318                0.297                0.041
5       1.253                0.298                0.040
6       1.213                0.298                0.038

The fitting errors of image 2 and image 5 with different orders are displayed in Table 3-2 and Table 3-3, respectively. In this particular case, the orders of 4 and 5 were found to have the least fitting errors. Comparing the two, a majority of the 5th order surfaces had less average fitting error (for images 2, 3, 4, 7, and 8) and less maximum error (for images 2, 5, 6, 7, and 8). However, the 4th order surfaces had the most significant error decrease from the neighboring lower order, greater than that of the 5th order surfaces. Since the error differences between orders 4 and 5 are comparatively small, an order of 4 was used for constructing the B-spline camera model for less computation. Nevertheless, it should be noted that although this model has the same order as the NPBS method, it is always beneficial to have more choices for the model. In fact, based on different requirements and preferences, one may model this camera using a different order.

Thus, except for images 3 and 6, which were reserved for error analysis, 4th order B-spline surfaces were fitted to the data from each of the remaining pictures: k = l = 4. The knot vectors are

x_i: [0, 0, 0, 0, 160, 320, 480, 640, 640, 640, 640],
y_j: [0, 0, 0, 0, 160, 320, 480, 480, 480, 480].    (3-2)

The fitting result for image 5 and its error vectors are shown in Figure 3-3 and Figure 3-4. Results for the six images are shown in Figure 3-5, and the fitting error is summarized in Table 3-4.

Figure 3-3. B-spline surface fitted to data points in image 5. Circles represent control vertices of the B-spline surface.

Figure 3-4. Surface fitting error in image 5, with 20 times of original magnitudes.

From the resulting six B-spline surfaces, a straight line was fitted to the vertices at each corresponding control knot. The lines were intercepted at the two planes that contain the data points from the reserved images (images 3 and 6), so that the resulting two-surface camera model (Figure 3-6) could be compared directly with the measured data in images 3 and 6.

Figure 3-5. B-spline surfaces fitted for image 1, 2, 4, 5, 7, and 8, from right to left.

Table 3-4. Surface fitting error summary

Image index   Maximum error (mm)   Average error (mm)   Error variance (mm²)
1             1.742                0.475                0.083
2             1.830                0.464                0.093
4             2.086                0.347                0.076
5             1.318                0.297                0.041
7             0.882                0.226                0.020
8             0.413                0.169                0.008

3.3 Error Analysis

Since the two surfaces are placed so that they coincide with the planes of data points in images 3 and 6, the calibration error can be evaluated by comparing the coordinates of the measured data points on the grid with the coordinates of the detected corner points in the images, which are obtained by substituting the corresponding image positions into the B-spline surface functions.

The calibration error is defined by vectors from the measured positions of the corner points to those calculated using the camera model. In Figure 3-7 and Figure 3-8, these vectors are displayed at 20 times their original magnitudes.

Figure 3-6. Two-surface camera model resulting from fitting lines to control vertices.

Table 3-5. Calibration error summary

Picture index   Maximum error (mm)   Average error (mm)   Error variance (mm²)
3               3.16                 0.73                 0.11
6               2.21                 0.90                 0.19

The lengths of the error vectors, or scalar calibration error, are presented in Table 3-5.

There are numerous sources of error that may contribute to the disparities between the outcomes from calibration and measurement. First, the number of control vertices and the orders of the surfaces in the camera model may be chosen differently to yield better accuracy, although this is unlikely because the fitting error (Table 3-4) does not appear to be comparatively significant. Second, the resolution of the camera as well as the precision of the corner detection techniques may affect accuracy. As one can see in Figure 3-2, lines of the calibration grid at the corners of the image appear fuzzy, and the extracted corner metric at the corresponding places is vague. These all bring difficulties, and of course inaccuracy, to image processing. Last but not least, the quality of the camera model heavily depends on the accuracy of the acquired data.

Figure 3-7. Vectors of calibration error for image 3, with 20 times of original magnitudes.

Figure 3-8. Vectors of calibration error for image 6, with 20 times of original magnitudes.

Flatness of the calibration board, printing precision of the grid, and accuracy of coordinate measurements, to name a few, are key factors for the overall performance of the calibration method. Unfortunately, the device available for the experimental test has only limited accuracy. Considerable error was introduced because the grid board could not be rigidly positioned on the slideway, so the planes of data points were not as precisely parallel to each other as they were registered to be.

A rough estimate of the overall error due to imprecise measurements would be around 0.5 mm. In comparison, the calibration error is adequately small for objects within the calibrated range. Predictably, a better camera model requires a better slideway and calibration board.

3.4 Image Rectification

With this camera model, the original image can be projected along the lines of sight onto any surface to form a new image. If the surface is defined to be a flat plane, the resulting image will be devoid of the distortions compensated by the camera model. To demonstrate this, image 3 was projected onto the corresponding plane of the calibration grid from which it originated, and a rectified image was generated, as shown in Figure 3-9.

Figure 3-9. Rectification of image 3 with B-spline camera model: A) original image, B) rectified image. (Photo courtesy of Yiming Xu.)
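
A straightforward (if slow) way to realize this rectification is sketched below; it is illustrative only, reusing the hypothetical surface_point helper and the collapsed control nets B1 and B2 from the earlier sketches, and the output canvas size and scale are arbitrary assumptions.

import numpy as np

def rectify(img, B1, B2, z_plane, k, l, x_knots, y_knots, scale=0.5):
    # Project each pixel along its line of sight onto the plane z = z_plane.
    # Forward (splatting) mapping only; holes are left unfilled in this sketch.
    h, w_img = img.shape[:2]
    out = np.zeros((1200, 1600, 3), dtype=img.dtype)        # assumed output canvas
    for v in range(h):
        for u in range(w_img):
            p1 = surface_point(u, v, B1, k, l, x_knots, y_knots)
            p2 = surface_point(u, v, B2, k, l, x_knots, y_knots)
            s = (z_plane - p1[2]) / (p2[2] - p1[2])          # where the ray meets the plane
            x, y = p1[:2] + s * (p2[:2] - p1[:2])
            col, row = int(round(x * scale)), int(round(y * scale))
            if 0 <= row < out.shape[0] and 0 <= col < out.shape[1]:
                out[row, col] = img[v, u]
    return out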

CHAPTER 4
CAMERA POSE ESTIMATION

Up to this point, a B-spline camera model has been formulated with respect to some fixed reference frame. The reference frame of the camera model (denoted as Frame C) coincides with the fixed world frame (denoted as Frame W) if the camera remains fixed after its calibration. However, as the camera moves, Frame C is moved away from Frame W. A possible way to recover the calibration result is to find the transformation matrix [8] from Frame C to the current Frame W, which can be accomplished by using the camera model and a characteristic object with known geometry and known position and orientation in Frame W.

4.1 Least Squares Pose Estimation

Suppose the object has n (n ≥ 3) non-collinear target points that have known positions in Frame W and are easily recognizable from the image. Based on the image position (u_k, w_k) of the k-th target point, the corresponding line of sight with respect to Frame C can be located using the camera model. Since the target points are located on the corresponding lines of sight, their coordinates in Frame C can be written in the parametric expressions of the lines defined in Equation (2-4). The position vector of the k-th target point in Frame C, ^C P_k, can be written in an expression with known parameters a_k through f_k and an unknown variable t_k. The position vector of the point with respect to Frame W, ^W P_k, is known as assumed.

^C P_k = [x_k, y_k, z_k]^T = [a_k t_k + b_k,  c_k t_k + d_k,  e_k t_k + f_k]^T,    ^W P_k = [p_k, q_k, r_k]^T.    (4-1)

Since the distances between any combination of the target points should be identical in both coordinate frames, the parameters t_k can be determined by minimizing the following cost function with a nonlinear least squares approximation [17]:

t_k = argmin (1/2) Σ_{i≠j} ( ||^C V_{ij}||² - ||^W V_{ij}||² )²,    (4-2)

where the vectors ^C V_{ij} and ^W V_{ij} are vectors between point i and point j:

^C V_{ij} = ^C P_i - ^C P_j,    ^W V_{ij} = ^W P_i - ^W P_j.    (4-3)

Substituting t_k into Equation (4-1) gives the coordinates of the k-th target point in Frame C. Since the coordinates of the target points in both frames are then available, the transformation matrix between Frame W and Frame C can be obtained by linear least squares techniques. The rotation matrix ^C_W R is estimated using the vectors between the target points:

^C_W R = argmin (1/2) Σ_{i≠j} || ^C V_{ij} - ^C_W R ^W V_{ij} ||².    (4-4)

The vector from the origin of Frame W to the origin of Frame C can be calculated as

^C_W V = (1/n) Σ_k ^C P_k - ^C_W R (1/n) Σ_k ^W P_k.    (4-5)

Thus the transformation matrix can be written as

^C_W T = [ ^C_W R   ^C_W V
           0^T      1      ].    (4-6)

One should note that this algorithm for finding the transformation matrix does not in itself guarantee the orthogonality of the rotation matrix; but for the overall result to be accurate, the input data should at least be accurate, which in the meantime ensures the validity of the outcome's properties.
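
The estimation chain of Equations (4-2) to (4-6) can be sketched as follows (an illustration under my own assumptions, not the thesis code: the per-point line coefficients come from the camera model as in Equation (2-4), and SciPy's least_squares routine stands in for the L-M/Q-N hybrid described in Section 4.2).

import numpy as np
from scipy.optimize import least_squares

def estimate_pose(lines, WP, t0):
    # lines: (n, 2, 3) per target point, rows [a, c, e] and [b, d, f] of Equation (2-4).
    # WP: (n, 3) known target coordinates in Frame W. t0: initial guess for the t_k.
    def pairwise(P):
        i, j = np.triu_indices(len(P), k=1)
        return P[i] - P[j]

    def residual(t):
        CP = lines[:, 0] * t[:, None] + lines[:, 1]         # target points in Frame C
        return np.linalg.norm(pairwise(CP), axis=1) ** 2 - \
               np.linalg.norm(pairwise(WP), axis=1) ** 2    # Equation (4-2)

    t = least_squares(residual, t0).x
    CP = lines[:, 0] * t[:, None] + lines[:, 1]

    # Rotation from the pairwise vectors (Equation 4-4), as a linear least-squares
    # problem; orthogonality is not enforced, as noted in the text.
    WV, CV = pairwise(WP), pairwise(CP)
    X, *_ = np.linalg.lstsq(WV, CV, rcond=None)             # CV ≈ WV @ X, with X = R^T
    R = X.T
    V = CP.mean(axis=0) - R @ WP.mean(axis=0)               # Equation (4-5)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = V                                            # Equation (4-6)
    return T, residual(t)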

4.2 Numerical Test

To evaluate the algorithm for finding transformation matrices, a numerical test was conducted using the data points from the reserved images 3 and 6. Three points were randomly selected as target points from each image, as shown in Figure 4-1, and their coordinates (in Frame C) were transformed into a new reference frame (denoted as Frame W) with a transformation matrix defined as

^C_W T_0 = [  0.7036   0.7036  -0.0998   20.0000
             -0.6547   0.6964   0.2940   18.0000
              0.2764  -0.1415   0.9506   -5.0000
              0.0000   0.0000   0.0000    1.0000 ],    (4-7)

which represents an XYZ rotation from Frame W to Frame C by angles of 17.2°, 5.73°, and 45.0°, followed by a translation of [20, 18, -5]^T. The corresponding positions on the images were substituted into the B-spline camera model to locate the lines of sight with respect to Frame C. Then, using the aforementioned method, the coordinates of the target points with respect to Frame C were estimated with a nonlinear least squares approximation. A hybrid algorithm of Levenberg-Marquardt (L-M) and Quasi-Newton (Q-N) was used, for fast convergence and globally robust iterations [17].

Due to the fact that the rays of light are likely to be in similar directions (or roughly parallel), an ambiguous result from the estimation is possible. Depending on the initial guesses for t_k, one may obtain a true estimate, or a false estimate that 'mirrors' the true one along the rays of light, as shown in Figure 4-2 and Figure 4-3, respectively. It is true in this case that a good initial guess for the parameters t_k is important. However, a false result can be easily identified, and disambiguation is possible. Essentially, the false result represents a local minimum of the cost function, whose value is significantly greater than that for the true estimate (Table 4-1). Therefore, even if a good initial guess is not available, one can always try another 'mirror' guess and compare the results.

Table 4-1. Comparison between the true and false estimation results

Estimate   Maximum error (mm)   Average error (mm)   Error variance (mm²)   Cost function
1          3.5802e+2            2.3593e+2            1.4775e+4              62.9069
2          1.0899               0.6643               0.0593                 9.2942e-4
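
To illustrate the disambiguation just described (a hedged example built on the hypothetical estimate_pose sketch above, assuming lines and WP have been prepared as in that sketch; the two starting depths are placeholders, not values from the thesis): run the estimation from two initial guesses on either side of the ambiguity and keep the lower-cost result.

import numpy as np

n_targets = 6                                              # e.g. three points from each reserved image
t_near = np.full(n_targets, 700.0)                         # placeholder initial guesses for t_k
t_far = np.full(n_targets, 1400.0)

candidates = [estimate_pose(lines, WP, t0) for t0 in (t_near, t_far)]
T_best, res = min(candidates, key=lambda c: 0.5 * np.sum(c[1] ** 2))   # smaller cost = true estimate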

Figure 4-1. Target points sampled from image 3 and 6.

Figure 4-2. A true estimation of target points.

Figure 4-3. A false estimation of target points.

The transformation matrix was subsequently computed based on the global result:

^C_W T = [  0.7015   0.7058  -0.0991   19.9941
           -0.6563   0.6959   0.2918   18.1914
            0.2743  -0.1400   0.9513   -5.0534
            0.0000   0.0000   0.0000    1.0000 ],    (4-8)

which represents an XYZ rotation from Frame W to Frame C by angles of 17.1°, 5.69°, and 45.2°, followed by a translation of [19.9, 18.2, -5.1]^T. The transformation matrix is a good estimate of the one in Equation (4-7), considering that modeling error was also introduced in the calculations. With an ideal camera model (with no calibration error), the method could yield an estimation result with an error on the level of floating point relative accuracy.

CHAPTER 5
TARGET EXTRACTION

In this chapter, image processing techniques are demonstrated for extraction of target points from a characteristic object, for completeness of the calibration method. A remote control vehicle was used as the characteristic object, as shown in Figure 5-1. Color segmentation was performed in the first place to obtain the target from the input image. The image was transformed into HSV space, in which the color region for the target is more regular than in RGB space. The color region is shown in Figure 5-2 and Figure 5-3, and can be described by the following functions:

|h - 0.04| ≤ 0.09,    s ≥ 0.03,    v ≥ 0.1467 s² - 0.873 s + 0.9413.    (5-1)

The largest connected component in the image that satisfies this description was considered to be the target vehicle.

Figure 5-1. Characteristic object for feature extraction. (Photo courtesy of Yiming Xu.)

With binary morphological operations [3][11], parts of the vehicle such as the windscreen, engine cover, and side windows can be extracted from the main body and separated by their morphological features.
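
The color segmentation and largest-component selection could be implemented along the following lines (an OpenCV sketch under my own assumptions: the thresholds of Equation (5-1) are taken to be on a 0-1 scale and are rescaled to OpenCV's 8-bit HSV ranges, hue wrap-around is ignored, and the file name is hypothetical).

import cv2
import numpy as np

img = cv2.imread('vehicle.png')                            # hypothetical file name
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV).astype(np.float64)
h = hsv[..., 0] / 180.0                                    # OpenCV hue range is 0..180
s = hsv[..., 1] / 255.0
v = hsv[..., 2] / 255.0

# Threshold region of Equation (5-1).
mask = (np.abs(h - 0.04) <= 0.09) & (s >= 0.03) & \
       (v >= 0.1467 * s ** 2 - 0.873 * s + 0.9413)
mask = mask.astype(np.uint8)

# Keep the largest connected component as the target vehicle.
count, labels, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])       # label 0 is the background
target = (labels == largest).astype(np.uint8) * 255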

Figure 5-2. Color range of target on H-V plane.

Figure 5-3. Color range of target on S-V plane.

Image positions for these target components were represented with a polar coordinate system originating from the centroid of the main body. The relative position and orientation of these components in the polar reference frame are thus useful for further determining the direction of the target vehicle. Each individual component can then be recognized based on its relative location. For this test, six feature points were chosen as target points, namely, four corner points on the windscreen and two corner points on the engine cover.

The image processing result for the original image (Figure 5-1) is shown in Figure 5-4.

Figure 5-4. Feature extraction result for the target vehicle.

This feature extraction process requires that the background have a different color than the target vehicle and that the camera maintain a downward-looking perspective. The image processing strategy should of course accommodate the application requirements, and the one used here has proved to be adequately robust. However, a precise measurement of the feature points on the vehicle is not available at the moment, so it is not possible to integrate this with the pose estimation algorithm and form an error analysis for the entire calibration process.

CHAPTER 6
CONCLUSION

In this thesis, a general B-spline camera model is defined, which implicitly embodies lens distortions and directly relates points on the image plane to lines of sight in the world reference frame, without prior knowledge of the optical characteristics of the camera. It extends the camera model of the NPBS method to one with the freedom to choose the orders of the B-spline surfaces, which introduces more flexibility without necessarily increasing the degrees of freedom in the model. Accuracy of the calibration depends on measurement and on a good choice for the number of vertices and the orders of the B-spline surfaces. The experimental tests have shown adequate accuracy, given that the precision of the input data is limited by the device that was used. For relocated cameras, an auxiliary method is also presented to recover the transformation relationship between the camera frame and the world frame, with the help of a characteristic object with known geometry and pose in the scene.

Although high accuracy can be achieved with the B-spline camera model, calculations with B-spline surfaces are certainly not advantageous over linear camera models. If computing speed is of overwhelming importance, it is necessary to pre-calculate the lines of sight for an adequate number of sample positions, thus saving the need for online calculation of the B-spline surfaces. Furthermore, the generality and flexibility of the camera model allow for fewer constraints on the imaging systems, which also brings difficulties for developing an autonomous calibration technique. Even though the auxiliary method is provided to recover the camera model, so that calibration only needs to be performed once, the calibration process is still cumbersome and requires high-accuracy measurements. Therefore, the B-spline camera model needs further development toward an easier, more automatic calibration method.
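
One way to realize the pre-calculation suggested above (a sketch, not part of the thesis, reusing the hypothetical surface_point helper from Chapter 2): evaluate the two model surfaces once on a coarse grid of image positions and store a ray per sample, so that online queries reduce to a table lookup or interpolation.

import numpy as np

def precompute_rays(B1, B2, k, l, x_knots, y_knots, width=640, height=480, step=4):
    # Build a lookup table of rays (origin, unit direction) on a step-by-step pixel grid.
    us = np.arange(0, width, step)
    ws = np.arange(0, height, step)
    origins = np.zeros((len(ws), len(us), 3))
    dirs = np.zeros_like(origins)
    for a, w in enumerate(ws):
        for b, u in enumerate(us):
            p1 = surface_point(u, w, B1, k, l, x_knots, y_knots)
            p2 = surface_point(u, w, B2, k, l, x_knots, y_knots)
            d = p2 - p1
            origins[a, b], dirs[a, b] = p1, d / np.linalg.norm(d)
    return us, ws, origins, dirs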

REFERENCES

[1] Abdel-Aziz, Y. I. and Karara, H. M. Direct linear transformation from comparator coordinates into object space coordinates. Proceedings of American Society for Photogrammetry and Remote Sensing 1 (1971).

[2] Basu, A. and Ravi, K. Active camera calibration using pan, tilt, and roll. Proceedings of IEEE International Conference on Robotics and Automation 3 (1995): 2961.

[3] Born, M. and Wolf, E. Principles of Optics. Pergamon Press, 1965.

[4] Buchanan, J. and Turner, P. Numerical Methods and Analysis. McGraw-Hill, 1992.

[5] Champleboux, G., Lavallee, S., Sautot, P., and Cinquin, P. Accurate calibration of cameras and range imaging sensor: the NPBS method. Proceedings of IEEE International Conference on Robotics and Automation (1992): 1552.

[6] Champleboux, G., Lavallee, S., Szeliski, R., and Brunie, L. From accurate range imaging sensor calibration to accurate model-based 3-D object localization. Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition (1992): 83.

[7] Clarke, T. A. and Fryer, J. G. The development of camera calibration methods and models. The Photogrammetric Record 16(91) (1998): 51.

[8] Crane, C. and Duffy, J. Kinematic Analysis of Robot Manipulators. Cambridge University Press, 2008.

[9] Davies, E. R. Machine Vision: Theory, Algorithms, Practicalities. Morgan Kaufmann, 2005, 3rd ed.

[10] Faugeras, O. D. and Toscani, G. The calibration problem for stereo. Proceedings of IEEE Computer Vision and Pattern Recognition (1986): 15.

[11] Gonzalez, R. and Woods, R. Digital Image Processing. Prentice Hall, 2007, 3rd ed.

[12] Gremban, K. D., Thorpe, C. E., and Kanade, T. Geometric camera calibration using systems of linear equations. Proceedings of IEEE International Conference on Robotics and Automation 1 (1988): 947-951.

[13] Grossberg, M. D. and Nayar, S. K. A general imaging model and a method for finding its parameters. Proceedings of the 8th International Conference on Computer Vision 2 (2001): 108.

[14] Hall, E. L., Tio, J. B. K., McPherson, C. A., and Sadjadi, F. A. Measuring curved surfaces for robot vision. Computer 15(12) (1982): 42.

[15] Heikkila, J. and Silven, O. A four-step camera calibration procedure with implicit image correction. Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition (1997): 1106-1112.

[16] Hoffman, J. Numerical Methods for Engineers and Scientists. Marcel Dekker, 2001, 2nd ed.

[17] Madsen, K., Nielsen, H. B., and Tingleff, O. Methods for Nonlinear Least Squares Problems. Informatics and Mathematical Modeling, Technical University of Denmark, 2004.

[18] Martins, H., Birk, J., and Kelley, R. Camera models based on data from two calibration planes. Computer Graphics and Image Processing 17 (1981): 173.

[19] Piegl, L. A. and Tiller, W. The NURBS Book. Springer, 1997.

[20] Ramalingam, S., Sturm, P., and Lodha, S. K. Towards complete generic camera calibration. Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition 1 (2005): 1093.

[21] Remondino, F. and Fraser, C. Digital camera calibration methods: considerations and comparisons. International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences 36(5) (2006): 266.

[22] Rogers, D. An Introduction to NURBS: With Historical Perspective. Morgan Kaufmann, 2001.

[23] Salvi, J., Armangue, X., and Batlle, J. A comparative review of camera calibrating methods with accuracy evaluation. Pattern Recognition 35(7) (2002): 1617.

[24] Sturm, P. Multi-view geometry for general camera models. Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition (2005): 206.

[25] Swaminathan, R., Grossberg, M. D., and Nayar, S. K. Non-single viewpoint catadioptric cameras: geometry and analysis. International Journal of Computer Vision 66(3) (2006): 211.

[26] Triggs, B. Autocalibration from planar scenes. Proceedings of the 5th European Conference on Computer Vision (1998): 89.

[27] Tsai, R. Y. A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses. IEEE Journal of Robotics and Automation 3(4) (1987): 323.

[28] Weng, J., Cohen, P., and Herniou, M. Camera calibration with distortion models and accuracy evaluation. IEEE Transactions on Pattern Analysis and Machine Intelligence (1992): 965.

[29] Ying, X. and Hu, Z. Catadioptric camera calibration using geometric invariants. IEEE Transactions on Pattern Analysis and Machine Intelligence 26(10) (2004): 1260.

[30] Zhang, Z. A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence 22(11) (2000): 1330.

BIOGRAPHICAL SKETCH

Yiming Xu received his B.S. degree in 2010 from the Department of Control Science and Engineering at Zhejiang University. He is currently a graduate student in the Department of Mechanical and Aerospace Engineering at the University of Florida. His research interests include camera calibration, image processing, and dynamic system control.