Citation
Utilizing Glint Phenomenology to Perform Classification of Civilian Vehicle Using Synthetic Aperture Radar

Material Information

Title:
Utilizing Glint Phenomenology to Perform Classification of Civilian Vehicle Using Synthetic Aperture Radar
Creator:
Paulson, Christopher R
Publisher:
University of Florida
Publication Date:
Language:
English

Thesis/Dissertation Information

Degree:
Doctorate (Ph.D.)
Degree Grantor:
University of Florida
Degree Disciplines:
Electrical and Computer Engineering
Committee Chair:
Wu, Dapeng
Committee Members:
Harris, John Gregory
Ifju, Peter G
Banks, Scott Arthur
Graduation Date:
5/4/2013

Subjects

Subjects / Keywords:
Bandwidth ( jstor )
Civilian personnel ( jstor )
Geometric angles ( jstor )
Geometrical optics ( jstor )
Image classification ( jstor )
Pixels ( jstor )
Radar ( jstor )
Sensors ( jstor )
Signals ( jstor )
Term weighting ( jstor )
classification
cvdomes
database
glrt
gotcha
optics
pose
sar

Notes

General Note:
This dissertation proposes an innovative automatic target recognition (ATR) technique to generate synthetic aperture radar (SAR) templates, perform pose estimation on civilian vehicles using wide angle SAR, and match civilian vehicles. These methods are proposed to improve upon current techniques that are computationally or economically expensive, or not directly applicable to this problem. The proposed approaches introduce geometric optics to generate the SAR templates, exploit glint phenomenology to extract angle features to estimate the pose of the vehicle, and match test images to templates using a novel radar motivated similarity measure. SAR ATR requires accurate pose estimation because the signature can change significantly with aspect angle. In template matching approaches, the pose estimation limits the number of templates that are used, thereby increasing the accuracy of the recognition algorithms. In this dissertation, a method is proposed in which the change in the signature with respect to the aspect angle is used to improve the pose estimation. Specifically, the pose is estimated by utilizing the glint phenomenology that is prevalent when imaging civilian vehicles with SAR. The box-like shape of the vehicle provides large glints at the sides, back, and front of the vehicle, providing improved recognition. Tests of the pose estimation algorithm were performed with both measured and simulated data as a function of a number of radar parameters. Aspect angle estimates were within 5 degrees, modulo 90 degrees, which significantly reduced the number of templates needed in later ATR stages. Training data is needed to calibrate the recognition algorithms in the SAR ATR. Traditional approaches use measured or physical optics generated data, but these approaches are computationally and economically expensive. This dissertation instead proposes to develop the training data using geometrical optics. This reduces cost and time because the necessary data can be generated on a standard computer using Matlab or C, and the mathematical complexities of physical optics are avoided. To verify that this approach can be used to classify targets, a radar motivated similarity measure was used to confirm that the test and template images can be matched properly using the geometric optic templates.

Record Information

Source Institution:
UFRGP
Rights Management:
All applicable rights reserved by the source institution and holding location.
Embargo Date:
5/31/2015

Full Text


ACKNOWLEDGMENTS

First of all I would like to say thank you to my advisor, Prof. Dapeng Wu, for all his support, guidance, patience, knowledge, and assistance throughout the process of my research. I am grateful that he saw something special in me and decided to give me an opportunity when I came up to talk to him about a course project. His funding offer was totally unexpected. I would not be here today and would not have received the SMART fellowship or a PhD from the University of Florida if it were not for him. For that reason I would like to express my deepest gratitude to Prof. Dapeng Wu.

In addition to Dr. Wu, I would like to thank my committee members, Dr. Peter Ifju, Dr. John Harris, and Dr. Scott Banks for serving on my committee, giving me direction to be successful as a PhD candidate, and giving up time in their busy schedules to assist me on my journey.

The SMART scholarship provided the necessary funding needed to finish my PhD. The grant under which I was funded concluded, and I needed funding to continue my studies towards my degree. Thank you for selecting me in 2010 to be a recipient of this wonderful scholarship, without which I would not be where I am today.

Also I would like to say thank you to Edmund Zelnio for his continuous guidance and direction as I was pursuing my dissertation. He helped me in more ways than one, and without him it would have been extremely difficult to finish my PhD. Thank you for all your time, effort, lengthy phone conversations, and guidance. I am looking forward to working with and continuing to learn from you as I progress as a researcher under you.

Next I would like to thank Olga Mendoza-Schrock and LeRoy Gorham for being my mentors. Thank you Olga for being there from the beginning of my career as a PhD student, for your guidance as I pursued my PhD, for always making sure that I was taken care of, and that I was heading in the right direction. Also, thank you Roy for being willing to step in when the focus shifted to SAR during my PhD career and helping me to better understand SAR and image formation techniques.

TABLE OF CONTENTS

ACKNOWLEDGMENTS
LIST OF TABLES
LIST OF FIGURES
ABSTRACT

CHAPTER

1 INTRODUCTION
  1.1 Overview of Automatic Target Recognition
  1.2 Physics
    1.2.1 Physical Optics Approximations
    1.2.2 Geometric
    1.2.3 Diffuse vs Specular
  1.3 Targets
  1.4 Sensor
  1.5 Feature Extractor
  1.6 Database
  1.7 Classifier
  1.8 Importance of Database Generation
  1.9 Importance of Pose Estimation
  1.10 Contributions

2 BACKGROUND
  2.1 Previous Work on SAR ATR for Civilian Vehicles
  2.2 Previous Work on Wide Angle Image Formation
  2.3 Previous Work on Pose Estimation
    2.3.1 Approaches Requiring Training Data
    2.3.2 Approaches Assuming a Crisp Object Boundary
  2.4 Overview of SAR Database Collection Modes
    2.4.1 Measured Data Collection Modes
      2.4.1.1 Airborne Platform Data Collections
      2.4.1.2 Ground-Based Platform Data Collections
      2.4.1.3 Scale Model Data Collections
    2.4.2 Synthetic Data Generation
  2.5 Previous Work on Object Classification
  2.6 Previous Work on SAR Target Classifiers

  3.1 Technical Approach for Pose Estimation
    3.1.1 Proof of Concept
    3.1.2 Overview of Algorithm
    3.1.3 Image Formation
      3.1.3.1 Windows
    3.1.4 GLRT
    3.1.5 Spokes Filter
  3.2 Results
  3.3 Summary

4 DATABASE GENERATION OF CIVILIAN VEHICLES USING GEOMETRIC OPTICS
  4.1 Technical Approach
    4.1.1 Single Bounce
    4.1.2 Double Bounce
    4.1.3 Bounce Filter
  4.2 Results
  4.3 Summary

5 UTILIZING GEOMETRIC OPTICS AND POSE ESTIMATION TO CLASSIFY CIVILIAN VEHICLE SAR IMAGES
  5.1 Technical Approach
    5.1.1 Database Generation
    5.1.2 Pose Estimation
    5.1.3 Template Selection
    5.1.4 Similarity Measure
  5.2 Results
  5.3 Summary

6 CONCLUSION
  6.1 Summary of Dissertation
  6.2 Novel Concepts
  6.3 Future Work

APPENDIX

A DEFINITION OF MATHEMATICAL SYMBOLS
B DEFINITION OF ACRONYMS

REFERENCES

BIOGRAPHICAL SKETCH

LIST OF TABLES

1-1 EO Visible Spectrum versus SAR
1-2 Results of Structure Scattering Response
1-3 Results of Structure Scattering Response
1-4 Military Vehicles versus Civilian Vehicles Using Different Angle Apertures
2-1 Summary of ATR for SAR Data Collection of Civilian Vehicles
2-2 Summary of Research in Pose Estimation
2-3 Functionality of the Different Types of Data Collection Methods

LIST OF FIGURES

1-1 Detailed Overview of ATR
1-2 Overview of ATR
1-3 Geometry of Flat Plate
1-4 Radar Cross Section of Flat Plate
1-5 Example of Diffuse Scattering
1-6 Types of Scattering that Exist in EO and SAR Sensors
1-7 Example of Specular Reflection
1-8 Example of Specular Scattering from 88° to 92° for Vehicle
1-9 Example of Specular Scattering from 178° to 182° for Vehicle
1-10 Example of Signal Reflectivity off Basic Structures
1-11 Example of Signal Reflectivity off Flat Plate Structures
1-12 Example of when the Signal will be Reflected Back to the Radar for a Flat Plate
1-13 Examples of Different Angle Apertures for Different Vehicles
3-1 Angle and Amplitude Value of Civilian Vehicles for Different Datasets
3-2 Proof of Concept using CVDomes at Cardinal Headings
3-3 Proof of Concept using CVDomes at Noncardinal Headings
3-4 Proof of Concept using Gotcha at the Cardinal Headings
3-5 Proof of Concept using Gotcha at the Noncardinal Headings
3-6 Flowchart of Pose Estimation
3-7 Creating Range Profile
3-8 Collection Geometry
3-9 An Example of a Subaperture
3-10 An Example of a Subaperture at 5°, 90°, 30°, fc = 9.6 GHz
3-11 Process of Creating a Subaperture
3-12 GLRT Images Using Different Weighting

3-14 GLRT Images
3-15 Flowchart of GLRT
3-16 GLRT Indicator Matrix for Jeep93 with p = 5%
3-17 Effect of p% on 1_G(x,y)
3-18 Visualization of Eqn. 3
3-19 Spokes Filter Plot of Jeep93
3-20 MAF of Spokes Filter Plot of Fig. 3-19
3-21 Effect of m on Spoke Filter for Integration Angle 5°
3-22 Effect of m on Spoke Filter for Integration Angle 10°
3-23 The Effect of Bandwidth and Depression Angle During Image Formation
3-24 Pose Estimation Results for each Weighting Technique
3-25 Pose Estimation Average Results of Uniform Weighting at Integration Angle 2°
3-26 Pose Estimation Average Results of Uniform Weighting at Integration Angle 5°
3-27 Pose Estimation Average Results of Uniform Weighting at Integration Angle 10°
3-28 MAD of Pose Estimation for Gotcha Data
4-1 Single Bounce Radar Signal
4-2 Double Bounce Return
4-3 Flowchart of Database Generation
4-4 Surface Normal of a Facet
4-5 Facet Model
4-6 Radar Vector v
4-7 Effects of the Value
4-8 Geometry of the Layover
4-9 Filters to Create Mask
4-10 Mask to Find Outside Edges
4-11 Single and Double Bounce Combined Together

4-13 Example of a Template Angle Matrix
4-14 Geometric Optic Generated Images of a Camry at Different Elevation Angles with No Filtering
4-15 Physical Optics Images of CVDomes at 60°
4-16 No Filter Applied to Geometric Optic Image of CVDomes at 60°
4-17 Outer Edge Filter Applied to Geometric Optic Image at 60°
4-18 Magnified Physical Optic Image of Toyota Tacoma
5-1 General Flowchart of Classification Algorithm
5-2 Eqn. 5 in Image Form
5-3 Accounting for Modulo Property of Angles in D(x,y)
5-4 Winsorizing D(x,y)
5-5 Top p% of I_G(x,y)
5-6 Quantized Image I_Q(x,y)
5-7 Eqn. 5 in Image Form
5-8 Comparing S(x,y)
5-9 S(x,y)
5-10 Best Weighting to Use for Classification
5-11 Rendered Images of Civilian Vehicles
5-12 Hamming Weighting Using Outer Edge Filter Score
5-13 Template Images of Civilian Vehicles

ABSTRACT

This dissertation proposes an innovative automatic target recognition (ATR) technique to generate synthetic aperture radar (SAR) templates, perform pose estimation on civilian vehicles using wide angle SAR, and match civilian vehicles. These methods are proposed to improve upon current techniques that are computationally or economically expensive, or not directly applicable to this problem. The proposed approaches introduce geometric optics to generate the SAR templates, exploit glint phenomenology to extract angle features to estimate the pose of the vehicle, and match test images to templates using a novel radar motivated similarity measure.

SAR ATR requires accurate pose estimation because the signature can change significantly with aspect angle. In template matching approaches, the pose estimation will limit the number of templates that are used, thereby increasing the accuracy of the recognition algorithms. In this dissertation, a method is proposed in which the change in the signature with respect to the aspect angle is used to improve the pose estimation. Specifically, the pose is estimated by utilizing the glint phenomenology that is prevalent when imaging civilian vehicles with SAR. The box-like shape of the vehicle provides large glints at the sides, back, and front of the vehicle, providing improved recognition. Tests of the pose estimation algorithm were performed with both measured and simulated data as a function of a number of radar parameters. Aspect angle estimates were within 5 degrees, modulo 90 degrees, which significantly reduced the number of templates needed in later ATR stages.

Training data is needed to calibrate the recognition algorithms in the SAR ATR. Traditional approaches use measured or physical optics generated data, but these approaches are computationally and economically expensive. This dissertation instead proposes to develop the training data using geometrical optics. This reduces cost and time because the necessary data can be generated on a standard computer using Matlab or C, and the mathematical complexities of physical optics are avoided. To verify that this approach can be used to classify targets, a radar motivated similarity measure was used to confirm that the test and template images can be matched properly using the geometric optic templates.

1 INTRODUCTION

This dissertation focuses on the design and evaluation of Automatic Target Recognition (ATR) techniques for Synthetic Aperture Radar (SAR) signal observations. Automatic Target Recognition (ATR) is the process by which computer-based algorithms operate on sensor observations to make decisions regarding the identity, type, or class of objects that may be present in the sensed region.

Because of the typically high data rate of observed signals and the typically high computational burden of recognition techniques, many ATR systems employ a prescreener. The prescreener is used to identify potential regions of interest (ROI) for further processing and to disregard large regions containing only processing artifacts such as noise and clutter. The prescreener is often followed by a discriminator that examines the ROIs to separate the signals produced by natural objects (trees, rocks, etc.) from those produced by man-made objects, which are of ultimate interest. Finally, the more computationally intensive classification process is applied only to the ROIs that potentially contain man-made objects. The classification process is designed to extract identifiers based on scattering, such as the shape, orientation, class, and identity of the object.

This research focuses primarily on the classification process. In particular, a novel, computationally efficient technique is developed and evaluated for classifying vehicles commonly found in civilian parking lots. Vehicles are considered a particularly difficult family of objects to differentiate due to their high levels of similarity. The particular sensor of interest for this research is a radar system operating in Synthetic Aperture Radar (SAR) mode. When operating in this mode, radar systems produce a 2-D image of the backscattered energy.

An outline of this dissertation is graphically presented in Fig. 1-1. The first chapter serves as an introduction to the problem of ATR using SAR. In Chapter 2, previous work is reviewed. Chapter 3 introduces and details the capabilities of the proposed techniques for estimating the orientation of a civilian vehicle. The effectiveness of a proposed technique for generating training images is evaluated in Chapter 4. Finally, in Chapter 5 a process is proposed for combining these proposed techniques to achieve a novel classification methodology. Chapter 6 summarizes the overall significance of this work and provides rationale for future work.

Figure 1-1. Detailed Overview of ATR

An overview of ATR is given in Fig. 1-2, with an emphasis on the classification stage. The classification stage of an ATR identifies various targets in an image by exploiting differences in scattering phenomenology.
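The staged screening just described (prescreener, then discriminator, then classifier) can be sketched as a simple cascade. All stage logic, field names, and thresholds below are illustrative placeholders, not the dissertation's algorithms; they only show how each stage cheaply prunes the workload of the next.

```python
# Minimal sketch of the staged ATR cascade: a cheap prescreener discards
# most of the scene, a discriminator separates natural from man-made
# returns, and only the survivors reach the expensive classifier.

def prescreen(chips):
    # Keep only candidate regions whose peak amplitude clears a
    # (hypothetical) noise/clutter threshold.
    return [c for c in chips if max(c["pixels"]) > 0.5]

def discriminate(rois):
    # Pass only ROIs flagged as man-made; the flag stands in for a
    # real natural-vs-man-made feature test.
    return [r for r in rois if r["man_made"]]

def classify(roi):
    # Placeholder for the template-matching stage developed later in
    # the dissertation.
    return "vehicle" if max(roi["pixels"]) > 0.8 else "unknown"

def atr(chips):
    return [classify(r) for r in discriminate(prescreen(chips))]

scene = [
    {"pixels": [0.1, 0.2], "man_made": False},  # noise: dropped by prescreener
    {"pixels": [0.6, 0.4], "man_made": False},  # tree: dropped by discriminator
    {"pixels": [0.9, 0.7], "man_made": True},   # vehicle: reaches classifier
]
decisions = atr(scene)
```

Only one of the three candidate regions survives to the classification stage, which is the point of the cascade: the costly stage runs on a small fraction of the data.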

Figure 1-2. Overview of ATR

Civilian vehicles consist of flat surface regions and singly curved surface regions. The smooth, aerodynamic design of most passenger vehicles implies that they, unlike military vehicles, lack dihedrals and trihedrals. It is precisely this relative lack of dihedral and trihedral reflectors that makes meaningful image generation, and subsequent classification, very difficult for a narrow illumination angle (NIA), which is the angle used to produce a SAR image. The Electro-Optic (EO) and SAR sensor modes are the two sensor modes that are typically used to collect the imagery data.

A brief overview of the differences between the two sensors is given in Table 1-1. These differences will be discussed in further detail in Chapter 1.4.

Table 1-1. EO Visible Spectrum versus SAR

Category              SAR                       EO Visible Spectrum
Illumination          Active                    Passive
Resolution            Independent of Range      Degrades as Function of Range
Projection Plane      Range-Crossrange          Angle-Angle
No Resolution         Cross the Line of Sight   Along the Line of Sight
Aperture              Adaptive                  Fixed
Projection Artifacts  Layover                   Parallax

Since all classification techniques operate on representations of measured data, known as features, it follows that the characteristics of the resulting features largely determine the function and performance of the classifier. For example, in EO imagery the scale invariant feature transform (SIFT) [1] and sped up robust features (SURF) [2] are the current technologies used to perform feature extraction by identifying strong point features. However, SIFT and SURF are not suitable approaches for the wide illumination angle (WIA) SAR of civilian vehicles because the image is not densely filled, as illustrated in Fig. 1-13C. Since the image is not densely filled, the ability to extract strong point features from the image is extremely difficult due to the lack of distinct corners or edges in the image. To improve on this inability to extract salient features, this research proposes several innovative techniques for extracting features from WIA SAR of civilian vehicles.

Most useful classification approaches for SAR explicitly account for the variability of the SAR signature as a function of aspect angle due to specular scattering. These include both model-based and template-based approaches, each of which requires a database to account for the variability of the SAR signature. On the other hand, the target signature sensed by an EO sensor does not change dramatically with respect to the aspect angle, as the scattering is diffuse. Therefore, a larger variety of approaches have been successful, such as statistical classification, neural networks, and syntactic or structural matching, among others.

… [3].

In this dissertation, a different approach is developed which offers computational efficiency and acceptable classification performance. In particular, a Geometric-Optics approximation is used that exploits the glint phenomenology prevalent in SAR. Glints are, in general, only visible over a narrow range of angles. Since the vehicles consist of flat plates, this glint knowledge is exploited to create an innovative pose estimator.

Figure 1-4 shows the radar cross section of a flat plate as a function of incidence angle. The radar cross section is proportional to the signal strength. This plot was generated using the code and equations provided by Mahafza [4]. For this plot, the two sides of the flat plate were 5 and 8 meters and the frequency was 9.6 GHz. According to the glint phenomenology and the orientation of the flat plate (shown in Fig. 1-3), the strongest scattering should occur at an azimuth angle of 0° because this is the location at which the dot product between the transmitted and reflected signal is 1. This can be observed in Fig. 1-4, where the strongest scatterers are located within the first few degrees of zero azimuth.

The targets studied in this research are civilian vehicles, which consist of flat plates and cylindrical structures; therefore the geometric optics approach only considers flat plate, dihedral, and cylindrical structures. In addition, the geometric optics prediction approach only considers single and double bounce phenomenology, but this is suitable for the targets studied in this research. This process can be extended if necessary to include corner reflectors, which use the triple bounce phenomenology.

Figure 1-3. Geometry of Flat Plate

Figure 1-4. Radar Cross Section of Flat Plate

… [5]. Diffuse scattering is illustrated in Fig. 1-6A, where the sun is the radiation source, and the object reflects the incident light equally in all directions.
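The flat-plate glint behavior described above can be reproduced with a standard physical-optics approximation for the radar cross section of a rectangular plate: a peak of 4πa²b²/λ² at normal incidence, falling off as sinc² in the azimuth cut. This is a textbook form in the spirit of Mahafza's equations, not necessarily the exact code behind Fig. 1-4.

```python
import math

def flat_plate_rcs(a, b, freq_hz, az_rad):
    # Physical-optics RCS of an a-by-b flat plate in an azimuth cut:
    # peak 4*pi*(a*b)^2/lambda^2 at broadside, sinc^2 falloff off-axis.
    lam = 3e8 / freq_hz
    k = 2 * math.pi / lam
    peak = 4 * math.pi * (a * b) ** 2 / lam ** 2
    x = k * a * math.sin(az_rad)
    sinc = 1.0 if x == 0 else math.sin(x) / x
    return peak * sinc ** 2 * math.cos(az_rad) ** 2

# Plate from the text: 5 m x 8 m sides at 9.6 GHz.
broadside = flat_plate_rcs(5.0, 8.0, 9.6e9, 0.0)
off_axis = flat_plate_rcs(5.0, 8.0, 9.6e9, math.radians(5.0))
```

Even 5° off broadside the return collapses by several orders of magnitude, which is exactly the narrow glint persistence the pose estimator exploits.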

Figure 1-5. Example of Diffuse Scattering

Figure 1-6. Types of Scattering that Exist in EO and SAR Sensors: A) EO Scattering, B) SAR Scattering

… (Fig. 1-7) and causes the surface of the object to act like a mirror. The location of the sensor determines whether or not the sensor receives the signal. For example, in Fig. 1-6B, the source and sensor 1 do not receive the reflected signal because the specular spike of the reflected signal is not in view of sensor 1. However, in the case of sensor 2, the specular spike of the reflected signal is in view of sensor 2. Therefore, sensor 2 will record data. The figures in Fig. 1-6 are similar to the figure in [3].

Figure 1-7. Example of Specular Reflection

The ratio of the standard deviation of the surface height, h, to the wavelength, λ, as shown in Eqn. 1, determines whether the surface scattering is diffuse or specular. Note that the surface roughness, g, is also a function of the incidence and reflected angles of the radiation. If g ≪ 1, g ≈ 1, or g ≫ 1, the surface is smooth, moderately rough, or extremely rough, respectively. When g ≪ 1 the surface acts as a mirror and contributes to the specular spike component of the scattering (Fig. 1-6).

When g increases, the scattering becomes less specular and more diffuse, as illustrated in Fig. 1-6. The specular spike spreads out into a specular lobe and eventually becomes totally diffuse, where the scattering is omnidirectional.

For SAR X-band radars (used for this research), the frequency ranges from 8-12 GHz [6]; therefore λ ranges from 2.5 cm to 3.7 cm. For visible band EO, λ is 0.43 µm to 0.79 µm [7]. For EO systems, g ≫ 1 due to the wavelength being small in Eqn. 1.
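The wavelength figures quoted above follow from λ = c/f, and the same height-to-wavelength ratio argument can be checked numerically. The 1 mm height deviation below is an illustrative value, and the bare ratio h/λ is used as a stand-in for the full roughness parameter g, which also depends on the incidence and reflection angles.

```python
# Wavelengths from lambda = c / f, and a rough smooth-vs-rough check.
c = 3e8                      # speed of light, m/s

lam_x_short = c / 12e9       # X-band upper frequency -> 2.5 cm
lam_x_long = c / 8e9         # X-band lower frequency -> 3.75 cm
lam_visible = 0.6e-6         # mid-visible wavelength, ~0.6 micrometers

sigma_h = 1e-3               # 1 mm surface height deviation (illustrative)
ratio_sar = sigma_h / lam_x_short   # << 1: surface acts smooth -> specular
ratio_eo = sigma_h / lam_visible    # >> 1: surface acts rough -> diffuse
```

The same painted car body sits far below one X-band wavelength of roughness but thousands of visible wavelengths, which is why the SAR return is specular while the EO return is diffuse.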

Specular scattering presents unique challenges for classification of SAR images, partly because the angle at which the signal is reflected is the same as the incidence angle (Fig. 1-7). In this case, the location of the sensor determines whether a particular surface is observed [8], [3]. In order for the SAR to receive the reflected signal, it must be normal to the incident surface. Another challenging aspect of specular scattering is that only a few surfaces on the target give bright returns, whereas for EO imagery, diffuse scattering causes all the target surfaces within the line of sight to be visible.

In Fig. 1-8A, the aperture ranges from 88° to 92°. In Fig. 1-9A, the aperture ranges from 178° to 182°. The red pixels are the geometric prediction of the scattering when the full 360 degrees are used to develop the image. As can be seen from these images, the scattering is significantly different from angle to angle, due to the fact that the radar only receives the signal that is normal to the surface. Also, note that there are only a few dominant scattering centers that appear in the image. These images illustrate the challenges that arise when classifying objects using specular scattering (SAR images).

Figure 1-8. Example of Specular Scattering from 88° to 92° for Vehicle: A) SAR Image from 88°-92°, B) Top Down View of Signal Reflectivity at 90°

Figure 1-9. Example of Specular Scattering from 178° to 182° for Vehicle: A) SAR Image from 178°-182°, B) Top Down View of Signal Reflectivity at 180°

Scattering phenomenology is based on vehicle structures. It is important to understand the basic scattering models of flat plates (Fig. 1-10A), horizontal cylinders (Fig. 1-10B), spheres (Fig. 1-10C), and other structures such as dihedrals (Fig. 1-11A) and trihedrals (Fig. 1-11B). Tables 1-2 and 1-3 provide a brief overview of scattering phenomenology using the work of Akyildiz and Moses [9]:

Eqn. 1:  E(f, φ) = A (jf/f_c)^α exp(−j(4πf/c)(x cos φ + y sin φ)) sinc((2πf/c) L sin(φ − φ̄))

The parameters A, (jf/f_c)^α, and exp(−j(4πf/c)(x cos φ + y sin φ)) are the scattering center amplitude, frequency, and position response components, respectively. The fourth term, sinc((2πf/c) L sin(φ − φ̄)), captures the dependence on the scatterer length L and the aspect angle.

Figure 1-10. Example of Signal Reflectivity off Basic Structures: A) Flat Plate, B) Cylinder, C) Sphere

The strength of the received signal critically depends on α. For structures such as the flat plate, dihedral, and trihedral, α = 1; for singly curved surfaces such as the cylinder, α = 1/2; and for doubly curved surfaces like the sphere, α = 0. Eqn. 1 shows that as α decreases, the amplitude of E(f, φ) also decreases. Therefore, flat plate structures have a larger amplitude than curved surfaces, whereas doubly curved surfaces have the smallest amplitude values. This phenomenology will be exploited in Chapter 4.1.3.

Figure 1-11. Example of Signal Reflectivity off Flat Plate Structures: A) Dihedral, B) Trihedral

Flat plate structures are the least stable since scattering persists only over a narrow azimuth and elevation angle. Reflections occur when the angle between the surface normal of the flat plate and the radar is zero, as depicted in Fig. 1-12. Otherwise the signal is not directed back to the radar, as shown in Fig. 1-10A.

Figure 1-12. Example of when the Signal will be Reflected Back to the Radar for a Flat Plate

Dihedral and cylindrical structures have more stability than the flat plate structures, but are not as stable as trihedral and spherical structures. Horizontal cylindrical structures exhibit persistence in the vertical direction but not the horizontal direction. Trihedral and spherical structures are the most stable because the signal persists over wide azimuth and elevation angles. Hence, military vehicles are able to use NIA since
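The scattering-center model and the role of α can be evaluated numerically. The sketch below uses the common form of the attributed scattering-center model quoted above; the constants and sign conventions may differ in detail from the dissertation's equation, and the frequency and length values are illustrative.

```python
import cmath
import math

def scattering_center(f, phi, A, fc, alpha, L, phi_bar, x=0.0, y=0.0):
    # One term of the attributed scattering-center model: amplitude A,
    # frequency dependence (jf/fc)^alpha, position phase, and a sinc
    # factor giving the aspect persistence of a length-L scatterer.
    c = 3e8
    freq_term = (1j * f / fc) ** alpha
    pos_term = cmath.exp(-1j * 4 * math.pi * f / c
                         * (x * math.cos(phi) + y * math.sin(phi)))
    arg = 2 * math.pi * f / c * L * math.sin(phi - phi_bar)
    sinc = 1.0 if arg == 0 else math.sin(arg) / arg
    return A * freq_term * pos_term * sinc

f, fc = 12e9, 9.6e9
# Flat plate (alpha = 1, finite length) vs sphere (alpha = 0, L = 0).
plate_peak = abs(scattering_center(f, 0.0, 1.0, fc, 1.0, 2.0, 0.0))
plate_off = abs(scattering_center(f, math.radians(10), 1.0, fc, 1.0, 2.0, 0.0))
sphere_any = abs(scattering_center(f, math.radians(10), 1.0, fc, 0.0, 0.0, 0.0))
```

At its glint angle the plate out-returns the sphere (larger α), but 10° off-glint its sinc factor collapses while the zero-length sphere term persists unchanged, which is exactly the stability-versus-strength trade summarized in Tables 1-2 and 1-3.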

Table 1-2. Results of Structure Scattering Response

Structure   Azimuth Angle  Elevation Angle  Stability  Strength
Flat Plate  Narrow         Narrow           Least      High
Cylinder    Narrow         Wide             More       Medium
Sphere      Wide           Wide             Most       Low

Table 1-3. Results of Structure Scattering Response

Structure   Azimuth Angle  Elevation Angle  Stability  Strength
Flat Plate  Narrow         Narrow           Least      High
Dihedral    Narrow         Wide             More       High
Trihedral   Wide           Wide             Most       High

There has been a substantial amount of work using SAR to perform processing on military vehicles, largely due to the MSTAR database [10]. This work includes classification [11-17], detection [18-21], feature extraction [12, 15, 22], and pose estimation, which is discussed in Chapter 2.3.

Unlike military vehicles, there has been less analysis performed on civilian vehicles. Some examples of work on civilian vehicles include: image formation [23-27], 3-D imaging [28-31], feature extraction [31-37], and classification [32, 36, 38].

The work cited for military vehicles used a 3° NIA (Fig. 1-13A), but the difficulty of extracting useful information from NIA for civilian vehicles is illustrated in Fig. 1-13B. NIA can be used effectively for military vehicles because the dihedral and trihedral responses produce a reasonable image (refer to Table 1-4 for a brief overview). However, for civilian vehicles, this is not the case because they consist of mostly flat plates and a few virtual dihedral responses. To overcome this difficulty, a version of WIA SAR called Circular SAR [39] is employed (Fig. 1-13C).

Using WIA for military vehicles produces an excellent image, and for civilian vehicles using a WIA of 360 degrees produces a good image. One challenge with WIA SAR is that the ability to have a sufficiently wide viewing angle may not be feasible in all situations. But for now it is assumed that this data is available.

Figure 1-13. Examples of Different Angle Apertures for Different Vehicles: A) Tank Narrow Angle Aperture, B) 3 Degrees Aperture, C) 360 Degrees Aperture

Table 1-4. Military Vehicles versus Civilian Vehicles Using Different Angle Apertures

Vehicle   NIA         WIA
Military  Reasonable  Excellent
Civilian  Poor        Good

… Table 1-1. In this section, each category will be discussed briefly to clearly show the difference between the two sensors.

The resolution of SAR is not primarily a function of the distance to the target. This independence of resolution from distance is due to two factors. The first factor is that the range resolution is a function of bandwidth; hence, the resolution is not a function of distance to the target. The second factor is that the azimuth resolution is a function of the size of the synthetic aperture. Hence, if the distance to the target from the SAR is large, then a larger synthetic aperture is allowable. Thus, to keep the resolution independent of distance, the length of the synthetic aperture is made proportional to the distance to the target.
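The two factors above can be written as standard formulas: slant-range resolution depends only on bandwidth, and cross-range resolution depends only on wavelength and integration angle, so neither depends on range. The numbers below are illustrative, not from the dissertation.

```python
import math

c = 3e8  # speed of light, m/s

def range_resolution(bandwidth_hz):
    # Slant-range resolution: c / (2 * B); no range term appears.
    return c / (2 * bandwidth_hz)

def cross_range_resolution(center_freq_hz, integration_angle_rad):
    # Small-angle cross-range resolution: lambda / (2 * delta_theta).
    lam = c / center_freq_hz
    return lam / (2 * integration_angle_rad)

def synthetic_aperture_length(center_freq_hz, slant_range_m, cross_range_res_m):
    # To hold cross-range resolution fixed, the aperture must grow in
    # proportion to range: L = lambda * R / (2 * rho_az).
    lam = c / center_freq_hz
    return lam * slant_range_m / (2 * cross_range_res_m)

rho_r = range_resolution(600e6)               # 600 MHz -> 0.25 m
rho_az_2deg = cross_range_resolution(9.6e9, math.radians(2))
rho_az_5deg = cross_range_resolution(9.6e9, math.radians(5))
L_near = synthetic_aperture_length(9.6e9, 10e3, 0.25)   # 10 km standoff
L_far = synthetic_aperture_length(9.6e9, 20e3, 0.25)    # 20 km standoff
```

Doubling the standoff range doubles the required aperture length while leaving both resolution numbers untouched, which is the claim in the paragraph above.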

An important advantage of SAR, due to its longer wavelength, is that its energy penetrates clouds, unlike the shorter wavelength EO, which cannot penetrate clouds. Also, SAR provides its own energy, as it is an active sensor. The combined effect of the active sensor and the longer wavelength results in SAR being able to image in day or night and in many types of weather.

SAR and EO are both 2-D sensors, but their different projections have different dimensions of ambiguity. EO sensors have no resolution in range, or along the line of sight, and SAR has no resolution in elevation, or orthogonal to the line of sight. SAR is able to resolve how far an object is from the sensor, but is not able to put the top of the target in the proper location because it does not have resolution in elevation. The EO visible spectrum, in contrast, lacks resolution along the line of sight; therefore, it is not able to determine the distance of an object from the sensor. However, the EO visible spectrum is able to properly place the top of the target in the correct location since it has resolution in elevation.

In SAR, the aperture is adaptive because the aperture size can be controlled even after the phase history data has been collected, by varying the integration angle. This allows the desired image quality to be achieved when the image is being formed. However, for EO, the aperture is fixed after the data has been collected. The disadvantage with the fixed aperture is that the image quality cannot be adjusted after the data has been collected.

Each sensor type produces a distinctive artifact: SAR produces layover and EO produces parallax. Since layover is dependent on the elevation angle, this also contributes to variability of the signature. Parallax, on the other hand, is caused by the apparent displacement of an object at different lines of sight. This causes difficulties for using EO imagery in image registration, classification, change detection, and fusion, among others.

Also, feature extraction was used for classification by having an angle and amplitude associated with each pixel of the template. The angle of the pixel was the direction of greatest scattering or, in this case, the maximum amplitude. The amplitudes of the pixels were also used as features, but were not weighted as heavily as the angle due to the large variability of amplitudes in SAR as a result of specular scattering and scintillation.
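The per-pixel angle/amplitude feature just described reduces to an argmax over aspect angle: given one pixel's amplitude across a sweep of viewing angles, the angle feature is the aspect of greatest scattering and the amplitude feature is that maximum. The sample data below are made up for illustration.

```python
def pixel_features(aspect_angles_deg, amplitudes):
    # Pair each aspect angle with the amplitude observed there, then
    # keep the angle at which the pixel's return was strongest.
    best_angle, best_amp = max(zip(aspect_angles_deg, amplitudes),
                               key=lambda pair: pair[1])
    return best_angle, best_amp

# One pixel's amplitude over a coarse aspect sweep; the glint sits
# near 90 degrees (e.g., broadside to a vehicle side panel).
angles = [0, 30, 60, 90, 120]
amps = [0.02, 0.10, 0.05, 0.90, 0.04]
angle_feature, amp_feature = pixel_features(angles, amps)
```

The sharp peak is why the angle is the more reliable feature: its location is stable from pass to pass even when specular scintillation perturbs the amplitude value itself.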

SAR databases are built from measured (MSTAR [10], Gotcha [40]) and electromagnetic (EM) simulated (Backhoe [41], CVDomes [42]) sources.

In simplistic terms, measured data is obtained by sending successive EM pulses and measuring the time that it takes for the signal to return to the radar. SAR utilizes the projection slice theorem to create the data. This measurement is saved in the phase history data (PHD), which is stored as pulses (slow time) and frequency samples (fast time). To convert the PHD into an image, an image formation technique such as the Polar Format Algorithm [43], the range migration algorithm [44, 45], or backprojection [26, 46], among others, is used.

Measured data is considered more realistic, as it most closely resembles the data that will be used in the application setting. However, the field collection of realistic measurement data can be time consuming and economically expensive. Therefore, it is unlikely that the measurement data will cover a sufficient number of conditions to effectively train a classification algorithm.

… [42]; therefore, the database has 3,600 templates. Pose estimation is used to narrow the number of templates needed to perform classification from 3,600 templates to 40 templates. Both the amplitude and angle features are required to differentiate the vehicles from one another.
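The fast-time/slow-time structure of the PHD can be illustrated with a one-pulse toy: across fast-time frequency samples, a scatterer at range R contributes a linear phase exp(−j4πfR/c), so an inverse DFT over frequency turns one pulse into a range profile with a peak at that range. This is a minimal sketch, not any of the cited image-formation algorithms; the radar parameters are illustrative.

```python
import cmath
import math

c = 3e8  # speed of light, m/s

def pulse_phase_history(freqs, scatterer_range_m):
    # Fast-time samples for one pulse and one ideal point scatterer.
    return [cmath.exp(-1j * 4 * math.pi * f * scatterer_range_m / c)
            for f in freqs]

def range_profile(samples):
    # Plain inverse DFT of the fast-time samples (magnitude only).
    n = len(samples)
    return [abs(sum(s * cmath.exp(2j * math.pi * k * m / n)
                    for m, s in enumerate(samples)) / n)
            for k in range(n)]

n = 64
f0, df = 9.3e9, 10e6                  # 640 MHz swept around X-band
freqs = [f0 + m * df for m in range(n)]
bin_m = c / (2 * n * df)              # range-bin spacing for this sweep
target_bin = 5
profile = range_profile(pulse_phase_history(freqs, target_bin * bin_m))
peak_bin = max(range(n), key=lambda k: profile[k])
```

The scatterer lands in exactly the predicted bin; a full image former (polar format, backprojection, etc.) extends this idea across the slow-time pulses to resolve cross-range as well.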

… [19, 47-51]. These algorithms use CAD models of the targets and the theoretical principles of physical optics to recreate the data. This is a relatively new advance, since the SAR community had not previously been able to utilize these EM simulators due to a lack of sufficient computing power [52].

EM simulators take a significant amount of time to generate the data, but this approach is still faster than obtaining measured data. For example, generating the CVDomes dataset took 72,000 CPU hours on a cluster of AMD Opteron (2.8 GHz) processors with 4 GB per node, 4 TB total memory, an Infiniband interconnect, and a 97 terabyte workspace [42]. Also, most of the commercial EM simulators are restricted, and a large portion of the research community does not have access to these tools [52].

In this research, a new approach to generating the training data is proposed using geometric optics. This will reduce the amount of time required to generate the database because the EM properties do not need to be calculated. Also, this approach is widely available for others to use because the algorithm is written in Matlab, but it could easily be translated to other computer languages as required.

… [11, 53-56]. SAR template matching approaches use a large dictionary of templates. Each template is built from images over a narrow range of aspect angles due to the aspect angle sensitivity of the SAR signatures. Without a reliable pose estimation step in the template matching algorithm, each template would have to be matched to the sensed SAR image. This results in a very large computational burden. An accurate pose estimator significantly reduces this computation time, as only a handful of templates need to be matched to the SAR images rather than the entire dictionary.
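The geometric-optics alternative proposed above ultimately rests on a per-facet visibility test rather than on field calculations: a flat facet produces a single-bounce glint only when its surface normal points back along the radar's line of sight to within a small tolerance. The sketch below shows that test only; the tolerance value and vector routines are illustrative, not the dissertation's Matlab implementation.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def unit(u):
    length = math.sqrt(dot(u, u))
    return tuple(a / length for a in u)

def facet_glints(facet_normal, to_radar, tol_deg=2.0):
    # to_radar points from the scene toward the radar; the facet
    # glints when the angle between it and the facet normal falls
    # within the (assumed) tolerance.
    cos_angle = dot(unit(facet_normal), unit(to_radar))
    cos_angle = max(-1.0, min(1.0, cos_angle))
    return math.degrees(math.acos(cos_angle)) <= tol_deg

to_radar = (1.0, 0.0, 0.0)     # radar broadside to a vehicle side panel
door = (1.0, 0.0, 0.0)         # door panel normal faces the radar: glint
hood = (0.0, 0.0, 1.0)         # hood normal points up: no return
windshield = (math.cos(math.radians(60)), 0.0, math.sin(math.radians(60)))
```

Sweeping `to_radar` around 360° of azimuth and recording which facets pass this test at each angle is, in spirit, how a glint template can be predicted without computing any EM properties.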


In fact, the novelty of the pose estimation approach in this dissertation has three essential elements. This work uses circular SAR to estimate the pose of civilian vehicles using glint phenomenology. These three elements are synergistically used to turn the normally difficult radar imagery conditions, i.e., low returns of civilian vehicles due to smooth surfaces and specular returns due to glints, from a disadvantage to an advantage by using signal processing strategies that take advantage of the radar geometry and scattering physics.

1. A novel and very accurate pose estimation technique for civilian vehicles using circular SAR and glint phenomenology (Chapter 3)

2. A new method for generating templates for SAR classification that generates the position of angle features without requiring computationally intensive ray tracing or surface integration (Chapter 4)

3. A new method for matching the templates with specially processed SAR imagery that weights and combines the angle features and the amplitude in the SAR imagery to establish the feasibility of civilian vehicle classification using SAR (Chapter 5)


The task of designing techniques capable of performing ATR has attracted attention for EO as well as SAR sensor systems. The primary focus of this dissertation is the formulation and evaluation of ATR processes for SAR. In this chapter, a representative summary of previous work in SAR ATR is presented, followed by a discussion of the novelty and impact of the contributions in this dissertation. Finally, the primary contributions of this research are outlined with respect to pose estimation, database generation, and classification algorithm design, and the advantages of these approaches are described.

[10] since 1996.

Recent events, along with the release of datasets, have both motivated and facilitated the consideration of SAR ATR for civilian vehicles. The 2006 release of the Gotcha dataset [40], consisting of measured phase-history data of civilian vehicles, spawned a number of investigations including [23, 35, 37, 38]. The more recently released CVDome dataset [42] also resulted in a number of investigations [26, 35, 38]. Table 2-1 summarizes the details of the more pertinent investigations.

A number of previous investigations have focused on feature extraction, which is generally applicable to other SAR processing systems such as 3-D imaging and pose estimation, as well as classification. A powerful feature extraction technique is the identification of attributed scattering centers [31–35]. An alternate technique bases feature extraction on local features [36]. Lastly, Pena, et al. [37] considers the extraction of vehicle edges using Yanik's, et al. [57] edge detection filter.


[32–35] were the first to approach this problem for civilian vehicles. In [32–34], a point pattern matching was employed using the Directed Partial Hausdorff Distance along with a particle swarm optimization. An alternative approach, considered in [33], uses extracted attributed scattering centers and measures dissimilarities between test and training data through the partial Hausdorff distance. The final step involves mapping the scattering centers to the Euclidean space and performing classification using support vector machines and linear discriminant analysis. In another work, [35] considers a three-step approach consisting of mapping attributed scattering centers to rotationally and translationally invariant sets, applying pyramid match hashing, and performing fusion on the results for the scatterers. Jackson, et al. [38] also approached this problem by using morphological operators to enhance the image quality, followed by template matching by means of the normalized correlation coefficient to classify the vehicles. Lastly, Paulson, et al. [36] uses local feature extraction algorithms and nearest neighbor techniques to classify the vehicles.

Table 2-1. Summary of ATR for SAR Data Collection of Civilian Vehicles

Dungan [33] X X
Dungan [34] X X
Dungan [35] X X
Jackson [38] X
Paulson [36] X X
Demanet [58] X
Ferrara [25] X
Gorham [26] X
Moses [23] X
Moses [24] X
Vu [27] X
Ertin [28] X
Austin [29] X
Austin [30] X
Dungan [31] X X
Pena [37] X
Paulson X X X X


[23, 24] performs image formation using recursive processing techniques to decrease computation time as compared to other algorithms that are typically used, such as convolution backprojection or the polar format algorithm. Alternatively, Ferrara, et al. [25] and Vu, et al. [27] approach the image formation problem by enhancing the image through creating sparse images. Ferrara, et al. [25] forms images using a Non-Uniform Fast Fourier Transform and evaluates two approaches to enhance the quality of the image: the ℓ1-regularized least squares and the Compressive Sampling-based Matching Pursuit. Alternatively, Vu, et al. [27] uses an Iterative Adaptive Approach to form the image and then uses the maximum a posteriori (MAP) approach to promote sparsity. Lastly, Gorham, et al. [26] uses backprojection to form the image. Since backprojection is a fundamental technique for many image formation algorithms, a detailed explanation of the technique is given in Section 3.1.3.

Another body of work focuses on 3-D imaging, since the additional information available in the height of the object can be valuable in SAR ATR. In this area, Ertin, et al. [28] uses two different algorithms to obtain 3-D images: a 3-D filtered backprojection reconstruction and model-based deconvolution. Austin, et al. [29, 30] considered the ℓp-norm least squares processing as well as nonuniform multipass interferometric SAR. Lastly, Dungan's, et al. work [31] extracts the attributed scattering centers from the image and utilizes the phenomenology of the odd and even bounces to create 3-D images.


Table 2-2, where the column label abbreviations correspond to: Appr = Approaches, TS = Target Segmentation, FE = Feature Extraction, HT = Hough Transform, AED = Angular Energy Density, CWT = Continuous Wavelet Transform, NN = Neural Networks, SVR = Support Vector Regression, AER = Angle Entropy of Radon Transform, AS = Axis of Symmetry, LF = Linear Fit, BB = Bounding Box, S = Statistics, T = Training, SB = Shape-based, NAA = Narrow Illumination Angle, WIA = Wide Illumination Angle, MV = Military Vehicles, and CV = Civilian Vehicles.

Table 2-2. Summary of Research in Pose Estimation

Zhao [59] X X X X X X
Prin [60] X X X X X X
Meth [61] X X X X X X
Voic [62] X X X X X X X
Kapl [63] X X X X X X X
Xin [64] X X X X X X X X
Sun [65] X X X X X X X
McFa [66] X X X X X X X
Han [67] X X X X X X X X X X
Duan [68] X X X X X X X
Saidi [69] X X X X X X
Si [70] X X X X X
Paulson X X X

1.3), and 2) assume either the availability of extensive training data, or a discernible object perimeter with a unique outer edge. In what follows, this latter assumption is called the crisp-boundary assumption. In this dissertation, techniques applicable to wide-angle apertures that do not rely on these assumptions are developed and analyzed.


In Zhao, et al. [59] and Principe, et al. [60], multilayer perceptrons (MLP) are used for feature extraction, followed by maximization of mutual information to estimate the pose angle. Here, Havrda-Charvat's quadratic entropy was used to calculate the mutual information as a function of pose angle. The training data is used to train the MLP to find a projection that best preserves the pose information [71].

Another approach, Kaplan, et al. [63], calculates the pose estimate by determining the orientation that maximizes the angular energy density. The algorithm uses a continuous wavelet transform to reduce computational complexity. Training data is needed for the slant plane adjustment.

In Xin, et al. [64], target segmentation and a Hough transform are used to perform the initial pose estimation. However, this technique is known to be unstable. In order to select the proper pose, a 2-D CWT is used. Lastly, slant plane adjustments must be made to compensate for the estimation errors caused by the projection of the 3-D object onto a 2-D plane.

In [66], target segmentation is used to extract several features, such as the fill, aspect, the inverse fill, and fade ratio, which are within a bounding box area of the target contour. Other features that are used in their algorithm include PCA features, robust regression features, and the lean angle features. These features are then used as inputs to two neural network systems which require significant training.

Finally, in Han, et al. [67], the target is segmented and features are extracted using long edge fitting and the Ridgelet Transform. The resulting features are used in a support vector regression, which requires training in order to converge. The final estimated pose is obtained by taking the average of the regression outputs.


[62], target segmentation allows the formation of a bounding box around the edges of the target. Then the algorithm rotates the box by an angle over the interval [0°, 90°). Several criteria are used to estimate the pose angle. These include the target to background ratio, perimeter, edge pixel-count, and Hough transform.

Meth, et al. [61] also uses target segmentation, followed by a search for straight lines and a linear fit (LF) of regions corresponding to orthogonal segments. Pose is estimated by computing the length of both segments. The smaller of the two segments is associated with the front or the back of the vehicle.

Sun, et al. [65] approaches pose estimation by using target segmentation followed by edge detection using a Sobel filter. In an exhaustive search over pose angles, the algorithm draws a bounding box in that orientation so that the sides of the rectangle are tangent to the target contour. Next, the rectangle is dilated to a preset pixel value, and the number of pixels along the longer sides that belong to both the target contour and the rectangle is determined. The maximum of these is used as the edge weight at that angle. The next step is to rotate the rectangle and repeat the process of finding the edge weight over [0°, 180°]. Finally, the estimated target pose corresponds to the maximum of all the edge weights.

In Duan, et al. [68], target segmentation is followed by the use of Angle Entropy of Radon Transform (AER) minimization to estimate the pose angle. Since their dataset consists of ships that have long and narrow aspects, they assume the Radon transform integral will generate the sharpest distribution along the ship's axis, therefore causing the lowest entropy to occur at the ship's major axis.

Si's, et al. work [70] performs coarse extraction by using statistics to determine the location of the strong scatterers, and the pixels further than a preset distance from the centroid of the target are eliminated. Then the algorithm looks for the front edge


response by calculating the normal distribution of the strong scatterers and smoothing and scaling the normal distribution. To determine the pose of the target, the algorithm looks for the sharp rising edge in the smoothed and scaled version of the normal distribution, which coincides with the pose of the target.

Finally, Saidi, et al. [69] performs target segmentation on the SAR image and extracts features by using Fourier Descriptors and Moment Invariants. In order to find the pose of the target, the algorithm looks for the axis of symmetry. The pose estimate is the angle that maximizes the mutual information between the original image and the symmetric image.

Table 2-3 shows the different modes of datasets that exist in the SAR community. The values 1 through 4 refer to the best and worst qualities, respectively, for that particular category. The Realism category refers to the representation of field measurements, and the Control category refers to the ability to obtain and control the ground truth information of the data collection. Lastly, the Adaptability category refers to the ability to change the operating conditions.

Table 2-3. Functionality of the Different Types of Data Collection Methods

Type               Realism  Control  Adaptability
Synthetic             4        1          1
Scale Model           3        2          2
Ground Platform       2        3          3
Airborne Platform     1        4          4


The gold standard of airborne data collections for military targets is the MSTAR data collection. This dataset is comprised of several military vehicles with different configurations at depression angles ranging up to 40 degrees, squint angles 35 degrees off broadside, and at different target aspects [10]. This dataset has been used for the development of classifiers [11–17], detectors [19–21], feature extractors [15–22], and pose estimators [65, 71].

The premier dataset for civilian vehicles is the Gotcha dataset, which is composed of eight complete circular passes of an area containing different civilian vehicles, with elevation angles ranging from 43.7 to 45 degrees at a bandwidth of 640 MHz [28]. Image formation [25–27], 3-D imaging [30, 31], feature extraction [31–34], classification [35–38], and pose estimation [72] investigations have used the Gotcha dataset.


The Georgia Tech Research Institute (GTRI) ground-based collection system uses a 640 MHz bandwidth radar, as discussed in [73]. Examples of investigations that employ data from this system include [74, 75] for circular SAR imaging techniques and [76] for automatic target recognition. Other ground-based rotating systems used for imaging techniques are described in [43, 77].

The University of Massachusetts Lowell operates a scale-model system operating at 1.56 THz with a bandwidth of 8 GHz at 1/16th scale. This corresponds to a full-scale data collection at 96.6 GHz and a bandwidth of 500 MHz [78]. A second scale-model simulation operated at 1/35th scale, at 350 GHz and a bandwidth of 35 GHz, corresponding to a full-scale frequency of 10 GHz and a bandwidth of 1 GHz [79].


[42]. Investigations concerning image formation [26], feature extraction [35, 38], classification [35, 36, 38], and pose estimation [80] have been based on this dataset.

The Backhoe data dome is another example of synthetic data. This generation corresponds to a bandwidth ranging from 7 to 13 GHz [41]. Investigations using this dataset [30, 81, 82] emphasize image formation to improve SAR focusing.

Physical optics and geometrical optics are the two primary approximations used in generating synthetic data. Physical optics uses electromagnetic theory to determine the strength of the reflected signal and where it is located in the image plane [19, 47–51]. Geometrical optics predicts scattering in the image plane using only geometry. This dissertation represents the first known effort to explore SAR imagery formation from geometrical optics.

[83], the approaches for pattern recognition include statistical classification, syntactic or structural matching, neural networks, and template matching.

In any pattern recognition application, there exist nuisance parameters such as pose, translation, intensity, occlusion, and variation of targets that affect the recognition performance and operation. There are several approaches commonly used to address nuisance parameters. These can be categorized as sampling, invariant, robust, estimate, and integrate. Bryant [84] gives a brief overview of each method, which will be briefly discussed in this dissertation.


In the introduction of [85], the authors state that statistical classification uses training data to search for the best decision boundary in the feature space that will best differentiate the targets from one another. One way of characterizing statistical pattern recognition algorithms is to divide them into two types: discriminative and generative. Discriminative classifiers use training data to model the conditional probability P(C_i|x), which is the probability of class i given the data. Examples of discriminative classifiers include linear discriminative analysis [86–88], logistic regression [89–91], k-nearest neighbor [92–94], and support vector machines [95–97]. Generative classifiers use training data to model the likelihood P(x|C_i) and then use Bayes' rule to calculate the posterior conditional probability

P(C_i|x) = P(x|C_i) P(C_i) / P(x).

Examples of generative classifiers [98–100] include Gaussian mixture models [101–103] and hidden Markov models [104–106].

Syntactic or structural matching consists of a group of primitives that are nodes in a graph. The algorithm not only matches the primitives, but also matches the structure of the graph to classify the targets [83, 107]. Syntactic or structural matching is used in [108–110].
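As a concrete illustration of the generative route, the sketch below fits a 1-D Gaussian class-conditional to each class and applies Bayes' rule as above. The two synthetic classes, their parameters, and the Gaussian model choice are illustrative assumptions, not taken from the cited works.

```python
import numpy as np

# Minimal generative classifier sketch: fit P(x|Ci) as a 1-D Gaussian per
# class from training samples, then apply Bayes' rule to obtain
# P(Ci|x) = P(x|Ci) P(Ci) / P(x).  Data and priors are illustrative.
def fit_gaussian(samples):
    return samples.mean(), samples.std()

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def posterior(x, models, priors):
    likelihoods = np.array([gaussian_pdf(x, mu, s) for mu, s in models])
    joint = likelihoods * priors          # P(x|Ci) P(Ci)
    return joint / joint.sum()            # divide by P(x) to normalize

# Two synthetic classes: feature values clustered near 1.0 and near 4.0.
rng = np.random.default_rng(0)
class_a = rng.normal(1.0, 0.5, 200)
class_b = rng.normal(4.0, 0.5, 200)
models = [fit_gaussian(class_a), fit_gaussian(class_b)]
priors = np.array([0.5, 0.5])

post = posterior(3.8, models, priors)     # a sample near class b's cluster
print(post)
```

A test sample near 4.0 receives nearly all of its posterior mass on the second class, which is the decision a generative classifier would return.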


[83]. Examples of research using neural networks include multi-layer perceptrons [111–113], Radial Basis Functions [114–116], Gaussian mixture models [102, 117, 118], and Kohonen networks [119–121].

The process of template matching employs a search for the best match between the training samples and test samples using a match metric such as the minimum mean square error. When performing template matching for the recognition problem, multiple templates are needed for each target due to the nuisance parameters such as pose, intensity, and translation. For example, Wright, et al. [122] used the sampling method to manage the intensity variation caused by illumination. Also, Wagner, et al. [123] sampled various lighting conditions to address the illumination problem and estimated the pose to handle the misalignment and occlusion problem. Another example of addressing nuisance parameters is [124], which uses the integrate method to deal with the different configurations of the target due to scattering phenomenology.

All classification techniques require some form of feature extraction. Tuytelaars, et al. [125] provides a brief overview of techniques used to perform feature extraction of EO images. Typically, the features that are being extracted are edges, corners, blobs, or regions. To extract edges from an image, the Canny [126], Sobel [127], Prewitt [128], Roberts [129], or Marr-Hildreth [130] edge detectors are commonly used. Common ways to extract corners from EO images include the Harris corner detector [131], SUSAN [132], Harris-Laplace [133], Harris-Affine [133], and Features from Accelerated Segment Test (FAST) [134]. Techniques that extract image blobs include the salient region detector [135], Difference-of-Gaussians (DoG) [136], and Speeded-Up Robust Features (SURF) [2]. Lastly, intensity-based [137], maximally stable extremal [138], and segmentation-based [139] are some techniques used to extract regions from the data. However, the disadvantage with these techniques is that SAR images do not contain the filled-in contours that EO images contain.
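A minimal version of template matching with a normalized correlation score can be sketched as follows; the image, template size, and planting location are made up for illustration, and this generic normalized cross-correlation is only similar in spirit to the metrics used in the cited works.

```python
import numpy as np

# Template matching sketch: slide a template over an image and score each
# location with the normalized correlation coefficient; the best-scoring
# location is the match.  All data here is synthetic.
def ncc(patch, template):
    a = patch - patch.mean()
    b = template - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom else 0.0

def match(image, template):
    th, tw = template.shape
    scores = np.zeros((image.shape[0] - th + 1, image.shape[1] - tw + 1))
    for i in range(scores.shape[0]):
        for j in range(scores.shape[1]):
            scores[i, j] = ncc(image[i:i + th, j:j + tw], template)
    return np.unravel_index(scores.argmax(), scores.shape)

rng = np.random.default_rng(4)
img = rng.normal(0.0, 1.0, (40, 40))
tmpl = img[12:20, 25:33].copy()      # plant the template at row 12, col 25
loc = match(img, tmpl)
print(loc)
```

Because the template was cut directly from the image, the planted location scores a correlation of exactly 1 and is recovered.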


[140], 2DPCA [119], matrix-based complex PCA [141], bidirectional PCA [142], and ICA [143]. The disadvantage with these approaches is that abundant training data is needed.

[11, 144, 145]. To accomplish this task, a Constant False Alarm Rate (CFAR) detector [146] is commonly used. Sometimes an additional discrimination between target and natural clutter is employed by using a one-class quadratic discriminator for the discrimination stage [147]. Interestingly, most SAR ATR algorithms do not use statistical classification for recognition but rather resort to template matching. Template matching often offers better performance, but only if templates exist for each pose and elevation angle, so a brute force method can be used to find the exact template match. However, there are a few algorithms, such as [65, 148, 149], that have used statistical classification. These algorithms may improve computation time and alleviate the large-memory requirement needed to save all the templates.

Rarely is syntactic or structural matching used for SAR ATR, because the structure of the object can change significantly depending on the operating conditions; therefore, a small number of primitives will not sufficiently describe a SAR image. Syntactic or structural matching can be done for EO imagery since the features persist with respect to aspect angle, which is not the case for SAR. In EO, as long as the feature is along the line of sight from the sensor, the features will persist. However, that is not the case for SAR because not only do the features have to be along the line of sight, but the signal reflected back to the sensor must be orthogonal (discussed in detail in Chapter 4).
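A simple 1-D cell-averaging CFAR, one common variant of the CFAR family, can be sketched as follows. The training/guard-cell counts, threshold scale, and exponential clutter model are illustrative assumptions rather than the particular detector of [146].

```python
import numpy as np

# Cell-averaging CFAR sketch: each cell is declared a detection if it
# exceeds a threshold scaled from the mean of surrounding training cells,
# with guard cells around the cell under test excluded.
def ca_cfar(x, num_train=8, num_guard=2, scale=4.0):
    n = len(x)
    detections = np.zeros(n, dtype=bool)
    for i in range(n):
        left = x[max(0, i - num_guard - num_train): max(0, i - num_guard)]
        right = x[min(n, i + num_guard + 1): min(n, i + num_guard + num_train + 1)]
        train = np.concatenate([left, right])
        if train.size and x[i] > scale * train.mean():
            detections[i] = True
    return detections

rng = np.random.default_rng(1)
clutter = rng.exponential(1.0, 100)   # exponential clutter power (illustrative)
clutter[40] += 50.0                   # a bright target return at cell 40
hits = ca_cfar(clutter)
print(np.flatnonzero(hits))
```

The adaptive threshold keeps the false alarm rate roughly constant as the local clutter level varies, which is the point of CFAR over a fixed threshold.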


[12, 150, 151]. This approach uses physical optics synthetically generated data as a prediction of the test (measured) image, extracts features from both the test image and the predicted image, and uses the extracted features to classify the targets.

Neural network classifiers have been used in SAR for target discrimination [152], pose estimation [59, 60], and classification [153, 154]. Principe, et al. [152] used neural networks for discrimination, while most work at the time centered around the prescreener that used CFAR to filter out regions of clutter. Other work has focused on the prescreener stage [54], but postulated that the discriminator stage could be improved by using neural networks. Neural networks were applied to pose estimation to address the issue of computation time needed for template matching by limiting the number of templates needed to match the test target to the training targets [59, 60]. However, neural networks require training data in order to estimate the pose.

Note that some form of feature extraction may be used in the prescreener, discrimination, and classification stages of SAR ATR [22]. In the prescreener stage, useful features include edges [155], shadows [156], bright returns [54], etc. For the discriminator, useful features include textural, size, contrast, polarimetric, etc. [54]. These are explored in detail in [147, 157]. Lastly, for the classification stage, useful features include angle [72, 80], scattering centers [34, 35], and PCA-based [158, 159].


3-1, which shows that the color tends to be consistent across the cardinal headings of the vehicle. At angles between the cardinal headings (front, back, and sides), the color changes gradually, indicating that the surfaces are curved. The flat plates on the sides and front of the vehicle provide sharp glints, whereas the other angles have a gradual change in color as the angle changes. This can be verified in both the synthetic data from the CVDome (Fig. 3-1A) and measured data from the Gotcha dataset (Fig. 3-1B) at a bandwidth of 640 MHz and an elevation angle of 45 degrees. There is consistency between these two images (Fig. 3-1A and Fig. 3-1B) in that both images are green, blue, purple, and orange along the outer edge of the vehicle at the cardinal headings. Therefore, this result provides some reassurance that the proposed pose estimation approach will work for both measured and synthetic data.

This scattering behavior was verified in Chapter 1.2.1. To further validate this approach, compare the cardinal headings at 0° (Fig. 3-2A), 90° (Fig. 3-2B), 180° (Fig. 3-2C), and 270° (Fig. 3-2D) to the off-cardinal heading images shown in Fig. 3-3A, 3-3B, 3-3C, and 3-3D. It can be clearly seen that the cardinal heading images produce significantly more returns than the non-cardinal heading images.


Figure 3-1. Angle and Amplitude Value of Civilian Vehicles for Different Datasets at Bandwidth 640 MHz, depression angle 45°: A) CVDomes, B) Gotcha

Figure 3-2. Proof of Concept using CVDomes at Cardinal Headings using Bandwidth 5.35 GHz, depression angle 30°, and p = 10%: A) 359°–1°, B) 89°–91°, C) 179°–181°, D) 269°–271°


Figure 3-3. Proof of Concept using CVDomes at Noncardinal Headings using Bandwidth 5.35 GHz, depression angle 30°, and p = 10%: A) 44°–46°, B) 134°–136°, C) 224°–226°, D) 314°–316°

The cardinal headings (Fig. 3-4A, Fig. 3-4B, Fig. 3-4C, and Fig. 3-4D) also were the main contributors to the GLRT image. The non-cardinal headings (Fig. 3-5A, Fig. 3-5B, Fig. 3-5C, and Fig. 3-5D) produce significantly less scattering than the cardinal headings.

These figures, showing the angle sensitivity for both measured and synthetic data, motivate the pose estimation approach developed in this dissertation. As will be


shown in Section 3.2, an approach that sums the cardinal headings to estimate the pose angle capitalizes on the glint phenomenology to provide very accurate estimates of the vehicle's orientation.

Figure 3-4. Proof of Concept using Gotcha at the Cardinal Headings using Bandwidth of 640 MHz, depression angle 45°, and p = 20%: A) 20°–30°, B) 110°–120°, C) 200°–210°, D) 290°–300°

(Chapter 3). Next, the subapertures are combined by finding the maximum amplitude at each pixel location and storing both the amplitude and the corresponding aspect angle of the maximum subaperture at that pixel location. Since a SAR image can be viewed as a matched


filter, refer to Fig. 3-6 to visualize the flow of the algorithm; the specific steps of the algorithm are delineated in Algorithm 1. To understand each step, the following paragraphs describe the algorithm in detail.

Figure 3-5. Proof of Concept using Gotcha at the Noncardinal Headings using bandwidth of 640 MHz, depression angle 45°, and p = 20%: A) 60°–70°, B) 150°–160°, C) 240°–250°, D) 320°–330°


Figure 3-6. Flowchart of Pose Estimation

Algorithm 1 applies the subaperture image formation to calculate the GLRT image, and the final estimate φ̂ is taken modulo 90°.

3.1.3 Image Formation

Fig. 3-8A shows an example of the SAR collection geometry for two different pulses or signals, which correspond to the black dots in the figure. There are many pulses transmitted to generate the full revolution.


Once the signal has been transmitted and reflected back to the radar, the received signal along the line of sight is

r_{θ,φ}(t) = ∭ S(x, y, z) δ(t − (2/c)(x cos θ cos φ + y sin θ cos φ + z sin φ)) dx dy dz,

where S(x, y, z) is the ground reflectivity function. This measures the round trip travel time (t) of the signal, hence the factor of 2 in the above equation. To visualize this process from a top-down view of the target, refer to Fig. 3-7. This projection creates a range profile, which can be viewed in Fig. 3-10. One thing to note is that the elevation and depression angles are congruent, since the radar is parallel to the ground and the radar signal forms a transversal line between the target and radar; therefore, the elevation and depression angles can be used interchangeably.

Figure 3-7. Creating Range Profile

Most radars collect the data in the frequency domain, R_{θ,φ}(f), rather than the time domain, r_{θ,φ}(t). These form the Fourier transform pair

R_{θ,φ}(f) = ∫ r_{θ,φ}(t) e^{−j2πft} dt,    r_{θ,φ}(t) = ∫ R_{θ,φ}(f) e^{j2πft} df.


The frequency-domain samples are depicted in Fig. 3-8B, where each radial line in the figure represents a particular R_{θ,φ}(f) and each dot represents a sample frequency. The other variables in Fig. 3-8B, Δθ and f_c, which will be described later in this section, are the angular length of the subaperture, known as the integration angle, and the center frequency, respectively.

Figure 3-8. Collection Geometry: A) Circular SAR Collection Geometry, B) Data Collection for SAR in Frequency Domain


to form the subaperture I_{φc,Δθ,φ}(x, y, 0). An example of a subaperture obtained at Δθ = 5°, φ_c = 90°, φ = 30°, and f_c = 9.6 GHz can be viewed in Fig. 3-9.

Figure 3-9. An Example of a Subaperture

The subaperture equation can best be explained in two parts. The first part is the range profile (Fig. 3-10),

p_{θ,φ}(t) = ∫_{f_min}^{f_max} V(f_min, f_max) R_{θ,φ}(f) e^{j2πft} df,

where x cos θ cos φ + y sin θ cos φ + z sin φ plays the role of t in the received-signal equation. The term V(f_min, f_max) is the weighting function used to minimize the sidelobes (discussed in Chapter 3.1.3.1). Next, f_max and f_min are the maximum and minimum sampling frequencies of the signal, respectively, which can be viewed in Fig. 3-8B. These


where the bandwidth of the signal is B = f_max − f_min.

Figure 3-10. An Example of a Subaperture at Δθ = 5°, φ_c = 90°, φ = 30°, f_c = 9.6 GHz

The second part of the subaperture equation takes each range profile and effectively smears it across the image plane:

I_{φc,Δθ,φ}(x, y, 0) = Σ_{θ = φc − Δθ/2}^{φc + Δθ/2} p_{θ,φ}(x cos θ cos φ + y sin θ cos φ).

This coherently sums each range profile that is within the subaperture range of φ_c ± Δθ/2, as illustrated in Fig. 3-11. The figure on the left in Fig. 3-11 is the individual range profile smeared across the image plane for a particular pulse. The figure on the right in Fig. 3-11 shows all the smeared range profiles, starting from pulse 1 up to the current pulse, coherently summed together. The snapshot of the pulses used in


each subaperture is shown in Fig. 3-8B, where one subaperture consists of several inverse Fourier transforms of R_{θ,φ}, the number of which depends on the size of Δθ.
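The smear-and-sum operation above amounts to backprojection, which can be sketched for point scatterers as follows. The band, pulse spacing, pixel grid, and scatterer positions are illustrative stand-ins (elevation is ignored and the matched filter is applied directly in the frequency domain), not the dissertation's parameters.

```python
import numpy as np

# Backprojection sketch over a circular aperture for ground-plane point
# scatterers.  Phase history is simulated, then every pixel re-applies the
# conjugate phase and sums coherently over all pulses and frequencies.
c = 3e8
freqs = np.linspace(9.28e9, 9.92e9, 64)          # 640 MHz band around 9.6 GHz
angles = np.deg2rad(np.arange(0.0, 360.0, 4.0))  # pulse angles, full circle
targets = [(1.0, -0.5), (-1.5, 2.0)]             # point scatterers (x, y) in m

# Simulated phase history: each pulse records the summed phase delays of
# the targets projected onto the radar line of sight.
phd = np.array([[sum(np.exp(-1j * 4 * np.pi * f *
                            (x * np.cos(a) + y * np.sin(a)) / c)
                     for x, y in targets)
                 for f in freqs]
                for a in angles])

grid = np.linspace(-3.0, 3.0, 61)                # 0.1 m pixel spacing
X, Y = np.meshgrid(grid, grid)
image = np.zeros(X.shape, dtype=complex)
for a, pulse in zip(angles, phd):
    proj = (X * np.cos(a) + Y * np.sin(a)).ravel()     # per-pixel range projection
    steer = np.exp(1j * 4 * np.pi * np.outer(freqs, proj) / c)
    image += (pulse @ steer).reshape(X.shape)          # coherent smear-and-sum

peak = np.unravel_index(np.abs(image).argmax(), image.shape)
print(grid[peak[1]], grid[peak[0]])                    # (x, y) of brightest pixel
```

The brightest pixel lands on one of the planted scatterers, since only there do all pulse/frequency phase terms add in phase.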


Figure 3-11. Process of Creating a Subaperture: A) Pulse 1 Smeared into Image Plane, B) Coherent Sum of Pulse 1 to Pulse 1, C) Pulse 20 Smeared into Image Plane, D) Coherent Sum of Pulse 1 to Pulse 20, E) Pulse 40 Smeared into Image Plane, F) Coherent Sum of Pulse 1 to Pulse 40, G) Pulse 60 Smeared into Image Plane, H) Coherent Sum of Pulse 1 to Pulse 60, I) Pulse 80 Smeared into Image Plane, J) Coherent Sum of Pulse 1 to Pulse 80

From Table 1 in Harris', et al. work [160], it was determined that the Rectangle (Uniform), Hamming, Chebyshev, and Blackman-Harris windows should be used for this


Figure 3-12. GLRT Images Using Different Weighting: A) Blackman-Harris Weighting, B) Chebyshev Weighting, C) Hamming Weighting, D) Uniform Weighting

research. Uniform weighting (UW) is used because it has the best resolution due to the narrowest mainlobe of all the filters. This can be verified from Fig. 3-12, where the UW (Fig. 3-12D) has the sharpest image compared to the others. However, the sidelobes are not reduced as much as with the other weighting functions, which will impact the classification stage. The degradation of the performance of the classification algorithm is due to the sidelobes potentially causing confusion for the template matcher. On the other hand, the sidelobes could be beneficial for the pose estimation because more information is provided for the estimator to determine the orientation of the vehicle. This hypothesis will be tested and verified in Chapter 3.2.
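The sidelobe behavior motivating these window choices can be checked numerically. The window length, FFT size, and the 100 dB Chebyshev attenuation below are illustrative assumptions, not parameters from the dissertation.

```python
import numpy as np
from scipy.signal import windows

# Measure the peak sidelobe level of each candidate window from a
# zero-padded FFT: walk off the mainlobe to its first null, then take the
# maximum of the remaining spectrum.
N = 128
wins = {
    "uniform": np.ones(N),
    "hamming": windows.hamming(N),
    "blackmanharris": windows.blackmanharris(N),
    "chebyshev": windows.chebwin(N, at=100),   # 100 dB equiripple attenuation
}

def peak_sidelobe_db(w):
    spec = np.abs(np.fft.fft(w, 8192))
    spec /= spec.max()
    db = 20 * np.log10(spec + 1e-12)
    i = 1
    while i < 4096 and spec[i + 1] < spec[i]:  # descend to the first null
        i += 1
    return db[i:4096].max()

vals = {name: peak_sidelobe_db(w) for name, w in wins.items()}
for name, v in vals.items():
    print(f"{name}: {v:.1f} dB")
```

The printed levels reproduce the trade-off discussed here: uniform has the highest sidelobes (roughly −13 dB), with Hamming, Blackman-Harris, and Chebyshev progressively lower, at the cost of wider mainlobes.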


(Fig. 3-12C). This result forms a hypothesis that the HW could potentially provide an acceptable compromise between the width of the mainlobe and a reduction of the sidelobes when it comes to classification of the vehicles. This hypothesis is tested and verified in Chapter 5.2.

In order to verify whether it is more important for the sidelobes to be reduced than for the mainlobe to be narrower, Chebyshev weighting (CW) and Blackman-Harris weighting (BHW) are used, because these two weighting functions significantly reduce the sidelobes at the cost of having a wider mainlobe. If CW or BHW outperform UW and HW, then it can be concluded that filtering the sidelobes is a more important parameter for the pose estimation than the width of the mainlobe. However, if CW and BHW perform worse than UW and HW, then it can be concluded that sidelobe suppression is not as important for pose estimation. To visualize what was stated previously, refer to the normalized frequency plot of Fig. 3-13, which shows that the height of the sidelobes goes from largest to smallest (and the width of the mainlobe from smallest to largest) in the order UW, HW, BHW, CW.

[161–163] was used to extract the angle features needed for this algorithm. To obtain the angle features, the subapertures I_{φc,Δθ,φ}(x, y, 0) are the input into the GLRT process, with one subaperture being comparable to one image in EO terms. The GLRT process calculates the maximum amplitude value among all the I_{φc,Δθ,φ} at each pixel (x, y), which creates two n × n matrices. One matrix corresponds to the maximum amplitude value, denoted I_{G,Δθ,φ}(x, y), with an example being Fig. 3-14B. The other matrix is the angle value θ_{G,Δθ,φ}(x, y; φ_c) at which the maximum amplitude value occurred. Fig. 3-14A combines both the amplitude and angle GLRT images together to illustrate how the bright returns along the cardinal headings are consistent. This result is utilized for the pose


Figure 3-13. Weighting in both the Time and Frequency Domain

estimation scheme. The equations for the GLRT image are

I_{G,Δθ,φ}(x, y) = max_{φc} |I_{φc,Δθ,φ}(x, y, 0)|,    θ_{G,Δθ,φ}(x, y; φ_c) = arg max_{φc} |I_{φc,Δθ,φ}(x, y, 0)|,

where φ_c = 0, 1, ..., 359, with the index φ_c representing the angle at which the center of the aperture is located. For this research, the sizes of I_{G,Δθ,φ}(x, y) and θ_{G,Δθ,φ}(x, y; φ_c) are 501 × 501.

For this research, the GLRT was formed by finding the maximum amplitude value of 360 I_{φc,Δθ,φ}, with one degree overlap between each I_{φc,Δθ,φ} (Fig. 3-15), which gives one complete revolution around the target (Fig. 3-14B). Different sizes of the I_{φc,Δθ,φ} were tested (Δθ = 2°, 5°, 10°), as well as different elevation angles (φ = 30°, 40°, 45°, 50°, 60°).
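The per-pixel max-and-argmax combining step can be sketched as follows; the random subaperture stack stands in for the actual I_{φc,Δθ,φ} images, and the 64 × 64 size is only for illustration.

```python
import numpy as np

# GLRT combining sketch: given a stack of subaperture amplitude images
# (one per center angle, 1 degree apart), keep per pixel the maximum
# amplitude and the center angle at which it occurred.
rng = np.random.default_rng(2)
num_apertures, n = 360, 64
stack = rng.rayleigh(1.0, (num_apertures, n, n))   # stand-in for |I_{phi_c}|

amp_glrt = stack.max(axis=0)                        # I_G(x, y): max amplitude
ang_glrt = stack.argmax(axis=0).astype(float)       # theta_G(x, y): angle in degrees

print(amp_glrt.shape, ang_glrt.min(), ang_glrt.max())
```

Each pixel of the angle image then records the subaperture, and hence the aspect angle, that lit that pixel most strongly, which is exactly the feature the spokes filter consumes.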


Figure 3-14. GLRT Images: A) Amplitude Weighted GLRT Angle Image, B) I_{G,Δθ,φ}(x, y) GLRT Image

The results of these experiments can be found in Chapter 3.2. To better understand the I_{φc,Δθ,φ}, an example of the angular range of the I_{φc,Δθ,φ} is given for Δθ = 5° at φ_c = 0, 1, 2, 3, ..., 359: the ranges are 357.5° to 2.5°, 358.5° to 3.5°, 359.5° to 4.5°, 0.5° to 5.5°, ..., 356.5° to 1.5°, respectively.

Figure 3-15. Flowchart of GLRT


The indicator matrix is illustrated in Fig. 3-16 and is mathematically represented by

1_{G,Δθ,φ}(x, y; p) = 1 if I_{G,Δθ,φ}(x, y) is among the top p% of amplitude values, and 0 otherwise.

Figure 3-16. GLRT Indicator Matrix for Jeep93 with p = 5%

An example of the effect of p can be viewed in Fig. 3-17. Then, a histogram of θ_{G,Δθ,φ}(x, y; φ_c) is formed from the top p% of the data at 90° intervals to create the spokes filter (Fig. 3-21A and Fig. 3-22A), denoted s(φ̂). For example, every pixel in θ_{G,Δθ,φ}(x, y; φ_c) that contained 45°, 135°, 225°, or 315° would be combined together to create a spoke at φ̂ = 45°. The equation for the spokes filter is

s(φ̂) = Σ_{(x,y)} 1_{G,Δθ,φ}(x, y; p) · δ[(θ_{G,Δθ,φ}(x, y; φ_c) mod 90°) − φ̂].


Figure 3-17. Effect of p% on 1_{G,Δθ,φ}(x, y): A) I_{G,Δθ,φ}(x, y), B) p = 5%, C) p = 35%

The spokes filter equation can best be visualized by looking at Fig. 3-18. The Hadamard product (denoted ∘) of 1_{G,Δθ,φ}(x, y; p) (Fig. 3-18A) and θ_{G,Δθ,φ}(x, y) (Fig. 3-18B) is taken to extract the angles that correspond to the top p% amplitude values (Fig. 3-18C). The angles in Fig. 3-18C are the angles used to calculate s(φ̂) (Fig. 3-19).


Figure 3-18. Visualization of the Spokes Filter Equation: A) 1_{G,Δθ,φ}(x, y; p), B) θ_{G,Δθ,φ}(x, y), C) Extracted Angle Values that Correspond to the Top p% of the Data

Figure 3-19. Spokes Filter Plot of Jeep93


Multiple peaks appear in Fig. 3-19. These peaks are symmetric around the true angle and are due to the glint from the target appearing in multiple subapertures.

To compensate for the peak phenomenology, a moving average filter (MAF) is implemented to smooth the histogram, giving it one peak, as depicted in Fig. 3-20. The MAF of the spokes filter is

s̃(φ̂) = (1/m) Σ_{k = −⌊m/2⌋}^{⌊m/2⌋} s((φ̂ + k) mod 90°),

with m being the window parameter of the MAF and φ̂ being modulo 90°. If m is odd, then round down the half-width value; for example, if m = 5, then m/2 = 2.5, which rounds down to 2, so the window ranges from −2 to 2.

Figure 3-20. MAF of Spokes Filter Plot of Fig. 3-19

The purpose of m is to include the angles at which the glint will appear in order to smooth the histogram. As m increases, it will eventually cause the histogram to have one peak. For example, if m = 3 for an integration angle of 10°, then the glint would not be included over the full span of the MAF, causing the histogram to have multiple peaks, as shown in Fig. 3-22B. But if m = 13, then the glint is considered throughout the entire duration of the MAF, allowing the histogram to have one peak (Fig. 3-22C). However, if the MAF is


3-21C ). Itwasconcludedthatmtypicallyneedstobeslightlylargerthantheintegrationangletoincludetheglintofthevehiclethroughouttheentire.Forexample,ifthe=2andoneofthecardinalheadingofthevehicleisat0,thentheglintwillbelocatedatc=359,0,1.Thereasonwhytheglintislocatedattheseregionsisbecausec=359includeangles358,359,0,c=0includeangles359,0,1,andc=1includeangles0,1,2.Ascanbeenseenfromthisexample,theglintoccursatthecardinalheadingsofthevehicleand0islocatedateachoftheseindices.Sothesesubapertureshavetheglintcontributingtotheimage.Therefore,theidealsizeofmfor=2is3,=5is5,and=10is13whichwasdeterminedexperimentally. Thelaststepoftheposeestimationistheindex~s^atwhichthepeakvalueofMAFoccursbecomestheestimatedposedenotedas^. 40 ].Thisallowsforacomparisontobemadebetweensyntheticandmeasureddata.TheotherbandwidthswereusedtoevaluatetheperformanceoftheposeestimationalgorithmasthebandwidthincreasedtofullbandwidthoftheCVDomeswhichis5.35GHz[ 42 ]. 68


Figure 3-21. Effect of m on Spoke Filter for Integration Angle 5° of Mazda at p = 20% and Bandwidth = 640 MHz: A) Spoke Filter, B) m = 5, C) m = 7


Figure 3-22. Effect of m on Spoke Filter for Integration Angle 10° of Mazda at p = 20% and Bandwidth = 640 MHz: A) Spoke Filter, B) m = 5, C) m = 13


3-23. Note that the resolution is a function of bandwidth B due to

ρ_r = c / (2B),

with ρ_r and c being the range resolution and the speed of light, respectively. The relationship between bandwidth and range resolution is inversely proportional. Therefore, as B increases, the size of the range bins decreases, which makes it easier for the image formation algorithm to resolve the location of the scatterers in the range dimension. Fig. 3-23 illustrates the effect of decreasing the size of the range bin from the top of the figure to the bottom by looking at the main scattering of the vehicle. The width of the main scattering of the vehicle decreases as the bandwidth increases due to the range resolution being smaller. Therefore, as the bandwidth increases, the accuracy of the pose estimator increases, since the scatterers are able to be resolved.

Also, layover becomes more prominent as the depression angle increases due to the geometry of the reflection of the signal (explained in Chapter 4). As the depression angle increases, the separability between the top and the side of the vehicle increases, which causes an annulus shape to appear. The separability phenomenology due to the increased depression angle decreases the performance of the pose estimation algorithm because the box-like shape is less pronounced, causing more angles to appear and confuse the pose estimator.

To test the pose estimation algorithm, the targets were imaged every 5 degrees between 0 and 180 degrees. The reason that they were not rotated 360 degrees was that the vehicles were mostly symmetrical. The CVDomes testing data for the pose estimation algorithm simulated the vehicles on flat ground and at the center of the scene. For measured SAR data, these assumptions do not hold for all imaging conditions. The Gotcha data, which was measured SAR data, was also used to test the algorithms where these assumptions did not strictly hold.
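Plugging the two bandwidths discussed in this section into ρ_r = c/(2B) gives a quick check of the range-bin sizes (the rounded c = 3×10⁸ m/s is an approximation):

```python
# Range resolution rho_r = c / (2 B) for the Gotcha band (640 MHz) and the
# full CVDomes band (5.35 GHz).
c = 3e8  # speed of light, m/s (rounded)

for bw in (640e6, 5.35e9):
    print(f"B = {bw / 1e9:.2f} GHz -> rho_r = {c / (2 * bw):.3f} m")
```

The 640 MHz band yields range bins of roughly 0.23 m, while the full 5.35 GHz band shrinks them to under 3 cm, consistent with the sharpening seen from the top of Fig. 3-23 to the bottom.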


The Effect of Bandwidth and Depression Angle During Image Formation

The CVDomes dataset assumes that the vehicles are at scene center and on a flat, level surface. However, the Gotcha dataset is not at scene center, which tests the sensitivity of the algorithm when targets are not at scene center.

Another aspect examined for CVDomes was the effect of weighting on the pose estimation algorithm. Weighting was described in Chapter 3.1.3.1. The different weightings that were used were BHW, CW, HW, and UW. From Fig. 3-24, UW performs better than the other weighting functions. This improved performance is due to the sidelobes of the uniform weighting not being suppressed as much as with the more heavily weighted functions. Also, UW has a sharper mainlobe, which produces a higher-resolution image (Fig. 3-12). As is to be expected, BHW and CW have similar performance to one another due to their similarities, which can be seen in Fig. 3-13. In essence, this experiment confirms that the sidelobes are beneficial to pose estimation rather than a hindrance. Since UW produces the best results, these are the only results presented.


Fig. 3-25 through Fig. 3-27 show the average results obtained for the pose estimation algorithm. In these figures, the x-axis corresponds to the degree error, with the number 6 referring to 6° of error or more. A value of 0 indicates that the algorithm estimated the correct pose; 1, 2, 3, 4, and 5 indicate that the estimated pose is off by 1°, 2°, 3°, 4°, and 5°, respectively. The y-axis corresponds to the percentage of data at that particular degree error. The figures start with the lowest bandwidth in the upper right-hand corner, and the bandwidth increases as the subplots go from top to bottom. Also, the lowest depression angle starts on the left and increases as the subplots move to the right.

As can be seen from Fig. 3-25 through Fig. 3-27, the accuracy of the pose estimator is within 2 degrees 99% of the time. The worst results occur at a bandwidth of 640 MHz and an elevation angle of 60 degrees, but even then the algorithm is accurate, with the majority of the predictions being within 2°. Therefore, this algorithm is highly accurate regardless of the elevation angle or bandwidth of the signal. The integration angles of 2, 5, and 10 degrees gave similar results; therefore, it is recommended that 2 degrees be used, as it requires the least processing to estimate the pose angle. This improved processing time is due to the faster image formation that results from the smaller subapertures, and also to the smaller sliding-window filter that smooths the spokes filter. It was discovered experimentally that the best m value to use was 3 for an integration angle of 2°, 5 for 5°, and 13 for 10°.

From Dungan et al.'s work [164], this research uses fcarA1 (Car 1), fcarB1 (Car 2), fsuv1 (Car 3), mcar5 (Car 4), msuv1 (Car 5), and van1 (Car 6), because these vehicles are the closest to the scene center. This research also used orbits 214 through 232 as the test orbits for each vehicle, because the other orbits did not produce accurate images for each vehicle.


Pose Estimation Results for Each Weighting Technique: A) Integration Angle 2°, B) Integration Angle 5°, C) Integration Angle 10°


Pose Estimation Average Results of Uniform Weighting at Integration Angle 2°

To measure the statistical dispersion between the different estimated poses, the median absolute deviation (MAD) was used. The reason for using MAD rather than the standard deviation is that MAD is more robust than the standard deviation in the presence of outliers [165]. To understand this effect, it is beneficial to look at the equations:

MAD(X) = median_i( |x_i − median(X)| )

σ = sqrt( (1/N) Σ_{i=1}^{N} (x_i − μ)² )

where μ denotes the mean,


Pose Estimation Average Results of Uniform Weighting at Integration Angle 5°

with X = [x₁, x₂, ..., x_N], i = 1, 2, ..., N, and σ being the standard deviation. Since MAD is based on the median, it focuses on the data that are concentrated rather than on the outliers. The standard deviation uses the mean of the data, which can be affected by outliers. For this experiment, the MAD for integration angle 2° (Fig. 3-28A) is between 2 and 3. For integration angle 5° (Fig. 3-28B), the MAD is between 1 and 2, which indicates that the statistical variability between the estimated poses for each orbit is small. Of the two integration angles, the integration angle of 5° has the better performance due to the consistency of the estimator.
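The robustness argument above can be checked with a small numeric sketch; the sample pose values are illustrative:

```python
import statistics

def mad(xs):
    """Median absolute deviation: median of |x_i - median(X)|."""
    med = statistics.median(xs)
    return statistics.median(abs(x - med) for x in xs)

# pose estimates clustered near 90 degrees, with one bad orbit
poses = [89, 90, 90, 91, 92, 170]
print(mad(poses))                # small: the outlier barely moves it
print(statistics.pstdev(poses))  # large: the outlier inflates it
```

The single outlier orbit inflates the standard deviation by an order of magnitude while leaving the MAD essentially unchanged, which is exactly why MAD was chosen as the dispersion measure for the multi-pass Gotcha estimates.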


Pose Estimation Average Results of Uniform Weighting at Integration Angle 10°

The pose estimation algorithm was tested with both measured (Gotcha [40]) and synthetic (CVDomes [42]) data. For the CVDomes, several different depression angles and bandwidths were tested to gauge the performance of the algorithm. This algorithm is of interest to the ATR community because it can significantly reduce the computation time required to classify the vehicles. To reduce the computation time, the number of templates needed from the dictionary to perform classification is reduced. For example, after using the pose estimation algorithm on the CVDomes dataset, the number of templates needed from the dictionary for classification of the vehicles was reduced from


MAD of Pose Estimation for Gotcha Data: A) Integration Angle 2°, B) Integration Angle 5°

3600 to 40. That is a 98.89% decrease in the number of templates needed to classify the vehicles.

Accurate pose estimation results were demonstrated for both synthetic and measured data. For synthetic data, which simulated targets at scene center and on a flat surface, the performance of the algorithm was within 2° of error 99% of the time for bandwidths of 640 MHz, 1.3375 GHz, 2.675 GHz, and 5.35 GHz and depression angles of 30°, 40°, 45°, 50°, and 60°. For measured data, a robust dispersion measure, MAD, was used to determine the variability of the pose estimate when estimated over multiple passes, since the true pose was unknown. The MAD statistic for an integration angle of 5° was between 1° and 2°, indicating that the estimates were consistent across passes.




To generate templates for civilian vehicles, both the projection geometry and the scattering physics must be considered. Since civilian vehicles are comprised of smooth surfaces, the types of scattering that contribute to the template returns can be restricted to the single bounce and the double bounce return. The double bounce return consists of the virtual dihedral formed by the target side and the ground. Using these two fundamental scattering types, an effective template generation capability is developed.

Both the single bounce (Fig. 4-1) and double bounce (Fig. 4-2) are considered. Chapter 4.1 explains how templates are created utilizing both the single (Chapter 4.1.1) and double bounce (Chapter 4.1.2) phenomenology. Fig. 4-3 illustrates the algorithm, and Algorithm 2 delineates the specific steps.

The single bounce phenomenology [28-31] obeys the law of reflection, where the angle at which the radar signal strikes the surface is the same as the angle at which the radar signal is reflected off the surface. In the case of the single bounce radar signal, the angle between the transmitted signal and the surface normal of the edge of the vehicle is 0°; therefore, the reflected signal travels the same path that the transmitted signal traveled (Fig. 4-1). To create the template of the vehicles, knowledge of the surface normal, the dot product, and geometry is utilized to predict the location of the single bounce radar signal.

The first step in creating the single bounce template is to find the surface normal of each facet (Fig. 4-4), denoted as n, in the CAD model (Fig. 4-5). This is done by using


Single Bounce Radar Signal: A) Transmit Signal, B) Receive Signal

Figure 4-2. Double Bounce Return




Flowchart of Database Generation

the following equations:

n = (b − a) × (c − a)

where a, b, c ∈ R³ are the vertices of a facet in the model. The facet model is denoted as M(x, y, z).

Figure 4-4. Surface Normal of a Facet

The unit vector is then calculated by:

n̂ = n / ‖n‖


Figure 4-5. Facet Model

The radar vector, v̂_rad,φ (Fig. 4-6), is then calculated from:

v̂_rad,φ = ( cos θ cos φ_rad, cos θ sin φ_rad, sin θ )

at elevation angle θ and azimuth φ_rad. In this research, θ is fixed and φ_rad goes from 0° to 359° with a step size of 1°.

The dot product is then calculated to determine which transmitted signal will result in a direct return back to the radar. This is done by:

n̂ · v̂_rad,φ = ‖n̂‖ ‖v̂_rad,φ‖ cos ψ

where n̂ is the unit vector of the surface normal, v̂_rad,φ is the unit vector of the location of the radar, and ψ is the angle between the two unit vectors. Since n̂ and v̂_rad,φ are unit


Radar Vector v̂_rad,φ

vectors, the dot product becomes:

n̂ · v̂_rad,φ = cos ψ

One tradeoff parameter for the template generation algorithm is the angle tolerance between the radar line of sight and the surface normal. This tolerance is incorporated into the algorithm by the parameter epsilon (ε) to improve the results of the geometric prediction. The threshold criterion is:

n̂ · v̂_rad,φ ≥ 1 − ε

Higher values of ε lead to a more congested template, which may cause false classification (as shown in Fig. 4-7A). But if ε is too low, then the template will be sparse and may miss some valuable information that differentiates the vehicles from one another (Fig. 4-7B). To determine the appropriate value of ε, a qualitative comparison was made to determine the proper tolerance that would accurately represent the vehicle. It was experimentally determined that an acceptable tolerance of ε ranges from 0.005 to 0.01. When this criterion is satisfied, the algorithm predicts that the facet is seen


Effects of the ε Value: A) High ε Value, B) Low ε Value

The signal from the top of the vehicle (x_T in Fig. 4-8A) reaches the radar before the signal from the side of the vehicle returns. When projecting x_T onto the image plane, it is projected orthogonally to the line of sight of the transmitted signal, since it is the first signal to arrive at the radar. This projected point is known as the layover point x_L (Fig. 4-8A). This projection results from the SAR having resolution along the line of sight, but lacking resolution in elevation.

To accomplish the geometric prediction of the single bounce ray, the algorithm finds the line segments shown in Fig. 4-8B. These line segments determine the location of x_L in the image plane. In the equations that locate x_L, x is the distance from the x-axis to the point x_B (known as the base point) and y is the distance from the y-axis to x_B. The information that is known is x, y, and


θ (Fig. 4-8B).

Geometry of the Layover: A) 3-D View of the Geometry of the Layover, B) Top View of the Geometry of the Layover

To determine whether the point needs to be projected into the image plane, the algorithm verifies that the visibility criterion (n̂ · v̂_rad,φ ≥ 1 − ε) is satisfied. If it is satisfied, then the algorithm projects the vertex of the facet, represented by x_T in Fig. 4-8A, into the image plane. The following


equations project x_T into the image plane, shifting it toward the radar by an amount proportional to its height:

x_L = x + z tan(θ) cos(φ_rad)
y_L = y + z tan(θ) sin(φ_rad)

where x, y, z are the coordinates of x_T obtained from the facet file. Once the facet is projected into the image plane, bilinear interpolation is used to connect the vertices together in the image plane.

When the facets are projected into the image plane, the algorithm checks to make sure that the pixel is not already occupied by another facet. If the pixel location already contains information, then the algorithm checks whether the previous dot product that predicted a return at that location is lower than the current dot product at the same pixel location. If this is the case, then the algorithm saves the φ_rad of the sensor into the single bounce angle template, denoted as Θ_s(x, y), where the subscript s signifies the single bounce. Also, a 1 is stored in 1_s(x, y), an indicator template for Θ_s(x, y), where a 1 signifies that the algorithm predicts that a single bounce occurs at that location. The algorithm then continues on to the other facets until all facets have been examined.

The double bounce prediction first requires the facet to be approximately vertical. This is checked through the z-component of the unit surface normal, which becomes:

|n̂_z| ≤ γ


In this research, γ = 0.15.

If this criterion is satisfied, then the algorithm uses the unit radar vector restricted to the two-dimensional x-y field (denoted as v̂₂,rad,φ) and the x and y components of n̂ (denoted as n̂₂, with the subscript 2 signifying a two-dimensional vector) to determine whether the two vectors result in the signal being reflected back to the radar. The mathematical expression is:

n̂₂ · v̂₂,rad,φ ≥ 1 − ε

If this criterion is satisfied, then the algorithm removes the z component from the vertex of the facet to project the facet into the image plane.

As with the single bounce, the algorithm must verify that the pixel does not already contain information. To accomplish this, the algorithm compares the previous dot product at the same pixel location to the current dot product. If the current dot product is greater than, or equal to, the previous dot product, the algorithm saves the φ_rad at which the sensor is located into the double bounce angle template, denoted as Θ_d(x, y). Also, a 1 is placed in 1_d(x, y), which is an indicator template for Θ_d(x, y). This signifies that the algorithm predicts that a double bounce occurs at that location. Otherwise, the algorithm continues until all facets in the facet file have been examined.
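A minimal sketch of the single-bounce facet test and the layover projection described above. The function names and facet data are illustrative; ε, the dot-product test, and the height-proportional shift toward the radar follow the text.

```python
import math

def unit(v):
    """Normalize a 3-vector."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def facet_normal(a, b, c):
    """Unit surface normal of a triangular facet: n = (b - a) x (c - a)."""
    u = tuple(b[i] - a[i] for i in range(3))
    w = tuple(c[i] - a[i] for i in range(3))
    n = (u[1]*w[2] - u[2]*w[1], u[2]*w[0] - u[0]*w[2], u[0]*w[1] - u[1]*w[0])
    return unit(n)

def radar_vector(theta_deg, phi_deg):
    """Unit vector toward the radar at elevation theta, azimuth phi."""
    t, p = math.radians(theta_deg), math.radians(phi_deg)
    return (math.cos(t) * math.cos(p), math.cos(t) * math.sin(p), math.sin(t))

def single_bounce_visible(n_hat, v_hat, eps=0.01):
    """Predict a direct return when the facet normal is within the
    angle tolerance of the line of sight: n . v >= 1 - eps."""
    return sum(a * b for a, b in zip(n_hat, v_hat)) >= 1.0 - eps

def layover_point(x, y, z, theta_deg, phi_deg):
    """Project a vertex at height z into the ground image plane,
    shifted toward the radar by z * tan(theta)."""
    t, p = math.radians(theta_deg), math.radians(phi_deg)
    return (x + z * math.tan(t) * math.cos(p), y + z * math.tan(t) * math.sin(p))

print(facet_normal((0, 0, 0), (1, 0, 0), (0, 1, 0)))  # flat facet -> (0.0, 0.0, 1.0)
v = radar_vector(30.0, 0.0)
print(single_bounce_visible(v, v))                    # aligned with the radar -> True
print(layover_point(0.0, 0.0, 1.5, 30.0, 0.0))        # shifted toward the radar
```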


The first step of the bounce filter process is to create the mask, denoted as 1_M(x, y), by locating the minimum and maximum indices of every column (Fig. 4-9A) and row (Fig. 4-9B) of 1_d(x, y) in which a 1 is located. Once all the minimum and maximum indices have been located, a value of 0 is given to every pixel in between these values. Everywhere else, a 1 is saved, creating 1_M(x, y). Fig. 4-10 illustrates an example of a mask.

Filters to Create Mask: A) Every Column, B) Every Row
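The mask construction just described can be sketched on a small binary grid. The row/column scan follows the text; the example grid, the exact way the two scans are combined (a union of the zeroed spans), and the function name are illustrative assumptions.

```python
def build_mask(indicator):
    """Mask that is 0 between the first and last 1 of every row and
    every column of the indicator, and 1 everywhere else."""
    rows, cols = len(indicator), len(indicator[0])
    mask = [[1] * cols for _ in range(rows)]
    for r in range(rows):                       # scan every row
        hits = [c for c in range(cols) if indicator[r][c] == 1]
        if hits:
            for c in range(hits[0], hits[-1] + 1):
                mask[r][c] = 0
    for c in range(cols):                       # scan every column
        hits = [r for r in range(rows) if indicator[r][c] == 1]
        if hits:
            for r in range(hits[0], hits[-1] + 1):
                mask[r][c] = 0
    return mask

grid = [
    [0, 1, 1, 0],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
]
for row in build_mask(grid):
    print(row)
```

The resulting mask keeps only the pixels outside the box-like outline of the vehicle, which is what the later Hadamard products use to suppress interior scattering.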


Mask to Find Outside Edges

Next, the algorithm uses 1_M(x, y), Θ_s(x, y), Θ_d(x, y), 1_s(x, y), and 1_d(x, y) to create the filtered angle template (Fig. 4-12), denoted as Θ̃(x, y). The first step in creating Θ̃(x, y) is to combine Θ_s(x, y) and Θ_d(x, y) together (Fig. 4-11). This is done using 1_s(x, y) and 1_d(x, y): if 1_d(x, y) = 1, then the angle value at the same pixel location of Θ_d(x, y) is stored in Θ̃(x, y) at the corresponding pixel location. If 1_s(x, y) = 1 and 1_d(x, y) = 0, then the angle value at the same pixel location of Θ_s(x, y) is stored in Θ̃(x, y) at the corresponding pixel location. If neither condition is met, then a zero is stored at that pixel location. The mathematical expression for this process is:

Θ̃(x, y) = Θ_d(x, y) if 1_d(x, y) = 1;  Θ_s(x, y) if 1_s(x, y) = 1 and 1_d(x, y) = 0;  0 otherwise.

If both indicator templates predict a scattering at the same location, then the scattering type must be identified. Recall from Chapter 1.3 that a dihedral structure will produce a larger amplitude value compared to that of a cylindrical structure. The double bounce scattering is the result of the signal hitting the side of the civilian vehicle and the ground, forming a virtual dihedral structure. The single bounce scattering results from the top of the civilian vehicle, which is more cylindrical in structure. Due


Single and Double Bounce Combined Together

to this phenomenology, the double bounce case will always be predicted for a pixel location where both cases are predicted, because the double bounce will always have a stronger amplitude value than that of the single bounce. Therefore, whenever 1_d(x, y) = 1, the algorithm stores the angle value of Θ_d(x, y) in Θ̃(x, y).

After the single and double bounce templates have been combined, the next step is to filter the scattering that occurs within the box-like structure of the combined image. This is illustrated in Fig. 4-11. The Hadamard product of Θ̃(x, y) and 1_M(x, y) is used to filter the combined templates:

Θ̃(x, y) ← Θ̃(x, y) ∘ 1_M(x, y)

An example of Θ̃(x, y) is displayed in Fig. 4-12.

The algorithm then creates the filtered indicator template, 1̃(x, y), by first placing a 1 everywhere a 1 is located in either 1_s(x, y) or 1_d(x, y). Otherwise, the algorithm places a 0 in 1̃(x, y). The mathematical equation is:

1̃(x, y) = max( 1_s(x, y), 1_d(x, y) )


Filtered Template

Once 1̃(x, y) has been formed, the algorithm filters the scattering within the box-like region by using:

1̃(x, y) ← 1̃(x, y) ∘ 1_M(x, y)

Fig. 4-13 shows an example of this process.

Figure 4-13. Example of a Template Angle Matrix (1̃(x, y))

Lastly, Θ̃(x, y) and 1̃(x, y) are stored in the template dictionary D. D consists of the templates of the vehicles rotated through all aspect angles. For example, in this research, the CVDomes database consists of ten vehicles and there is a template for each degree. Therefore, the size of the dictionary is 501 × 501 × 7200. This is due to the resolution of the image being 501 × 501, and there being 360 templates for each vehicle to cover each degree
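The combination and filtering rules above can be sketched on small arrays (pure-Python stand-ins for the Hadamard products; the array contents are illustrative):

```python
def combine_templates(theta_s, theta_d, ind_s, ind_d, mask):
    """Merge single- and double-bounce angle templates, preferring the
    double bounce where both are predicted (virtual dihedral), then
    apply the outer-edge mask to both outputs."""
    rows, cols = len(mask), len(mask[0])
    combined = [[0] * cols for _ in range(rows)]
    indicator = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if ind_d[r][c] == 1:          # double bounce wins
                combined[r][c] = theta_d[r][c]
            elif ind_s[r][c] == 1:
                combined[r][c] = theta_s[r][c]
            indicator[r][c] = max(ind_s[r][c], ind_d[r][c])
            # Hadamard product with the outer-edge mask
            combined[r][c] *= mask[r][c]
            indicator[r][c] *= mask[r][c]
    return combined, indicator

combined, indicator = combine_templates(
    theta_s=[[10, 20, 0]], theta_d=[[0, 45, 0]],
    ind_s=[[1, 1, 0]], ind_d=[[0, 1, 0]], mask=[[1, 1, 0]])
print(combined)    # [[10, 45, 0]]
print(indicator)   # [[1, 1, 0]]
```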


The CVDomes dataset [42] consists of ten different facet models of the vehicles. These vehicles are the Camry, Honda Civic, Jeep 1993, Jeep 1999, Maxima, Mazda MPV, Mitsubishi, Sentra, Toyota Avalon, and Toyota Tacoma. To generate the fully polarimetric PHD of the vehicles, the bandwidth of the data collection was 5.35 GHz with a center frequency of 9.6 GHz. Different elevation angles are also available in this dataset, ranging from 30° to 60° at 0.5° increments. Lastly, each vehicle has a full 360° view, with the pulses obtained from 0° to 359.9375° at increments of 0.0625°. Generating the CVDomes dataset took 72,000 CPU hours on a highly configurable computing system (AMD Opteron 2.8 GHz processors, 4 GB/node, 4 TB total, Infiniband interconnect, 97 terabyte workspace) [42].

Since there are many different operating parameters available in the CVDomes dataset, there are many different ways to generate the physical optics image. For this research, the images were formed at elevation angles of 30, 40, 50, and 60 degrees, an integration angle of 2°, polarization VV, and a bandwidth of 5.35 GHz (refer to Sec. 3.1 for a detailed explanation).

For the geometric optics imagery, the parameters used are 30, 40, 50, and 60 degrees with (1 − ε) = 0.99. A detailed description of how the image was formed is given in Sec. 4.1. When creating the geometric optics imagery, it takes approximately an hour to generate the data for all ten vehicles at one elevation angle on a standard desktop computer. Therefore, the geometric optics imagery is significantly faster than the physical optics imagery, which was created on a highly configurable computing system [42]. However, there is still room for a significant decrease in computation time for geometric optics by implementing the algorithm on a highly configurable computing system with parallel computing techniques.


As can be seen qualitatively from Fig. 4-14, the geometric optics approach is able to predict where the major scattering occurs with high accuracy regardless of the elevation angle. The geometric prediction is shown in black in Fig. 4-14. As the elevation angle increases, the layover becomes more prominent due to the geometry of the scattering. This layover phenomenology is described in detail in Sec. 4.1. The reason the geometric optics prediction does not predict all of the bright returns of the physical optics approximation is that the entire facet model was not used; only the main parts of the vehicle located on the outside were used. This qualitative assessment gives reassurance that the geometric optics based template generation approach will be effective for determining the similarity between the template and a SAR image, since the template predicts where the high energy in the SAR image is located.

Fig. 4-17 shows the filtered geometric optics prediction for all the vehicles in the CVDomes dataset at an elevation angle of 60 degrees. These are displayed as black pixels. Again, a qualitative assessment indicates that the bright returns caused by the main structure in the SAR images (Fig. 4-15) are predicted by the geometric optics algorithm. The algorithm is able to predict the minor structures as well. For example, the grooves of the bed of the Toyota Tacoma truck are seen in the geometric prediction as a group of horizontal lines in Fig. 4-16J and Fig. 4-17J. The grooves can also be seen in the magnified version of the physical optics image of the Toyota Tacoma (Fig. 4-18) and are located in the same location as the geometric prediction. The importance of the filter can be validated by comparing Fig. 4-16 to Fig. 4-17. The majority of the scattering that happens inside the box-like structure in the physical optics imagery (Fig. 4-15) is due to objects inside the vehicle and will most likely not be displayed in the measured data. This can be verified by examining Fig. 3-1. Therefore, the filter eliminates the scattering inside the box-like structure to reduce false classification when geometric optics images are used as training data for the classification stage.


Geometric Optics Generated Images of a Camry at Different Elevation Angles with No Filtering: A) 30 degrees, B) 40 degrees, C) 50 degrees, D) 60 degrees


Physical Optics Images of CVDomes at 60°: A) Camry, B) Civic, C) Jeep 93, D) Jeep 99, E) Maxima, F) Mazda, G) Mitsubishi, H) Sentra, I) Toyota Avalon, J) Toyota Tacoma


No Filter Applied to Geometric Optics Images of CVDomes at 60°: A) Camry, B) Civic, C) Jeep 93, D) Jeep 99, E) Maxima, F) Mazda, G) Mitsubishi, H) Sentra, I) Toyota Avalon, J) Toyota Tacoma


Some of the vehicles, such as the Jeep 99 (Fig. 4-15D), have a significant amount of predicted returns due to the facet file including more details in the facet model. Some of the facet models have many details pertaining to the car, such as the logo, lug nuts, and side molding, but for the geometric optics imagery to function optimally it requires only the bare minimum of parts that make up the vehicle. This is because the algorithm is not sensitive to whether the surface is hidden or out of sight of the radar: if the surface normal is in the direct line of sight of the radar, it will be projected into the image plane. Even though the facet files were reduced for this research, some of the files could be reduced even more to further improve the results. These include the facet files for the Jeep 93 (Fig. 4-17C), Jeep 99 (Fig. 4-17D), and Mazda (Fig. 4-17F).

To further validate that the geometric optics prediction is an acceptable approach, a quantitative assessment must be performed. This assessment is done in Chapter 5 by using a similarity measure to examine the similarity between the test (physical optics) image and the template (geometric optics). Chapter 5 quantitatively shows the potential for using these templates in an automatic classifier of civilian vehicles using SAR.


Outer Edge Filter Applied to Geometric Optics Images at 60°: A) Camry, B) Civic, C) Jeep 93, D) Jeep 99, E) Maxima, F) Mazda, G) Mitsubishi, H) Sentra, I) Toyota Avalon, J) Toyota Tacoma


Magnified Physical Optics Image of Toyota Tacoma


This chapter quantitatively evaluates the potential of using pose estimation and the geometric optics based template generation to perform vehicle classification using the GLRT SAR imagery. The quantitative evaluation is performed by developing a classification algorithm that uses the templates to match the GLRT imagery, weighting both the angle and amplitude information in the SAR image based on the template predictions. The algorithm is described in Chapter 5.1.

The off-line portion of the algorithm generates the templates as described in Chapter 4 and stores them in the dictionary (D). After D has been formed, the on-line portion of the algorithm is ready to be implemented. The PHD is the input into the pose estimation algorithm, which is responsible for forming the image and estimating the orientation of the vehicle. The estimated pose (θ̂) is used to reduce the number of templates needed to match against the test image.

Finally, the algorithm uses a radar-motivated similarity measure to quantify the difference between the test and template images, to determine whether the algorithm will be able to properly match the correct template to the test image. Fig. 5-1 shows the top-level flow of the algorithm, and the specific steps of the algorithm are delineated in Algorithms 3, 4, 5, and 6.


General Flowchart of Classification Algorithm

5.1.1 Database Generation

To generate the database, both the single bounce (Fig. 4-1) and double bounce (Fig. 4-2) must be considered. D consists of 501 × 501 × 7200 pixels, which contain both the filtered template image (Θ̃(x, y)) and the indicator matrix of the filtered template image (1̃(x, y)). Chapter 4 provides a detailed explanation of the database formation.

The pose estimation stage first forms the subaperture images (Chapter 3). The image is then sent to the GLRT process, which saves both the amplitude and the angle value at which the maximum scattering occurs across all the images (Chapter 3). Finally, the algorithm continues on to the spokes filter process, where it takes the amplitude and angle matrices computed from the previous steps to estimate the orientation of the vehicle. Refer to Chapter 3 for a detailed explanation.
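The off-line/on-line split described above can be summarized in a short sketch. Everything here is a schematic stand-in: the function names, the dictionary layout, and the toy scoring are illustrative, not from the dissertation; only the flow (form image, estimate pose, restrict templates, pick the lowest score) follows the text.

```python
def classify(phd, dictionary, estimate_pose, form_glrt_image, score):
    """On-line portion: form the GLRT image from the phase history
    data, estimate the pose, keep only templates near that pose,
    and return the vehicle with the lowest (best) score."""
    image = form_glrt_image(phd)      # GLRT amplitude + angle image
    pose = estimate_pose(image)       # glint-based pose estimate
    best_vehicle, best_score = None, float("inf")
    for vehicle, templates in dictionary.items():
        # only templates within a few degrees of the estimated pose
        for angle in (pose - 1, pose, pose + 1):
            s = score(image, templates[angle % 360])
            if s < best_score:
                best_vehicle, best_score = vehicle, s
    return best_vehicle

# toy stand-ins: a template is (vehicle, angle); scoring rewards the
# matching vehicle type and the matching aspect angle
dictionary = {"camry": {a: ("camry", a) for a in range(360)},
              "civic": {a: ("civic", a) for a in range(360)}}
est = lambda img: img["pose"]
form = lambda phd: {"pose": phd["pose"], "kind": phd["kind"]}
sc = lambda img, tpl: (0 if tpl[0] == img["kind"] else 5) + abs(tpl[1] - img["pose"])
print(classify({"pose": 90, "kind": "civic"}, dictionary, est, form, sc))
```

Restricting the search to templates near θ̂ is what produces the 3600-to-40 reduction in template comparisons reported for the pose estimator.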


Algorithm 6: DA Part 3

Chapter 4.2 qualitatively assessed the potential of using the template generation process in a classification algorithm. In this section, a quantitative similarity measure is used to match the template with the GLRT image.

After template selection, the angle features are extracted from the test image G(x, y; φc) (Chapters 3.1.3 and 3.1.4) to be compared to the angle features in the template (Chapter 4.1). These test image angle features are extracted by taking the Hadamard product of the filtered indicator template, 1̃(x, y), with the image angle features, G(x, y; φc). To compare the extracted angle features with the predicted


angle features in the template, the angle difference image D(x, y) is computed (Fig. 5-2). To account for the cyclical property of angles, the difference is taken modulo 360°:

D(x, y) = min( |Δ(x, y)|, 360° − |Δ(x, y)| ),  where Δ(x, y) = 1̃(x, y) ∘ G(x, y) − Θ̃(x, y)

Angle Difference in Image Form: A) (1̃(x, y) ∘ G(x, y)), B) Θ̃(x, y), C) D(x, y)

This cyclic difference is illustrated in Fig. 5-3. Note the higher error when two different vehicles are compared to each other (Fig. 5-3B) as compared to when the same vehicle is compared (Fig. 5-3A).
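A small per-pixel sketch of the cyclic angle difference, with the winsorizing clamp (τ = 10, per this chapter) applied afterward; the pixel values are illustrative:

```python
def cyclic_diff(a, b):
    """Smallest difference between two angles in degrees, in [0, 180]."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def winsorize(d, tau=10):
    """Clamp a single angle difference so that one large local
    mismatch is not weighted too severely."""
    return min(d, tau)

print(cyclic_diff(359, 1))               # wraps around: 2, not 358
print(winsorize(cyclic_diff(10, 180)))   # 170 clamped to 10
```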


Accounting for the Cyclical Property of Angles in D(x, y): A) Civic Template with Civic Test Image, B) Mazda Template with Civic Test Image

To account for outlier differences, so that one large local mismatch is not weighted too severely, the angle difference D(x, y) is winsorized. The winsorization parameter τ was experimentally determined to be 10 for the CVDomes dataset.

To illustrate the effect of the outliers, consider a single pixel difference of 170 degrees compared to a difference of 17 pixels, each with an error of 10 degrees. Although these two differences would give the same score, the 17-pixel difference is most likely the much poorer match. By winsorizing these two scores, the single-pixel error becomes 10 versus 170, and the 17-pixel difference remains 170. Thus, the winsorized score protects against outliers and better reflects the differences in shape. The winsorized score is illustrated in Figure 5-4. Note that a greater amount of red is apparent in Fig. 5-4B than in Fig. 5-4A, since the incorrect template is being compared to the test image in Fig. 5-4B and the correct template is being compared in Fig. 5-4A.

In addition to the angle features, the amplitude values of the test image are also used as features to match the templates to the test images. Because of the variability of the SAR amplitude, the amplitude features are not as stable as the angle features, and


Winsorizing D(x, y): A) Civic Template with Civic Test Image, B) Mazda Template with Civic Test Image

therefore, the amplitude features are not weighted as heavily. To weight the amplitude, the dynamic range is quantized into 4 levels, and the levels are weighted accordingly. The different threshold values used to quantize the amplitude are determined by thresholding the amplitude histogram at p1 = 5%, p2 = 15%, and p3 = 30%. These percentages were chosen empirically by examining the results of different threshold values, as depicted in Fig. 5-5.

The result of applying this quantization to one of the test images is illustrated in Fig. 5-6. Note that the lowest values in the quantized test image are most likely caused by the target, and the highest values are most likely in the background, with the intermediate values potentially either target or background. The amplitude score is computed by the Hadamard product of the filtered indicator template, 1̃(x, y), and




Top p% of I_G(x, y, 0; c): A) I_G(x, y, 0; c), B) p = 5%, C) p = 6%, D) p = 7%, E) p = 8%, F) p = 9%, G) p = 10%, H) p = 15%, I) p = 20%, J) p = 25%, K) p = 30%, L) p = 35%


Quantized Image I_Q(x, y)

To compute the joint score, the first step is to weight each score (the angle score and the amplitude score) and add them:

S(x, y) = α · D(x, y) + (1 − α) · I_QH(x, y)

For the purposes of the experiments performed in this dissertation, the weighting factor was chosen as α = 0.5. Refer to Fig. 5-7 to visualize this joint score, and to visualize its discrimination potential refer to Fig. 5-8, where a Civic template is scored against a Civic test image and a Mazda template is scored against a Civic test image.

The next step is to create a normalized score image by using min-max normalization:

S'(x, y) = (S(x, y) − s_best) / (s_worst − s_best) · (s'_max − s'_min) + s'_min

where s_worst and s_best are the maximum and minimum score values in the score image, respectively, and s'_min = 0 and s'_max = 1 are the new minimum and maximum score values after min-max normalization. For this scoring strategy, the minimum score is zero, which


Joint Score in Image Form: A) D(x, y), B) I_QH(x, y), C) S(x, y)

Comparing S(x, y): A) Civic Template with Civic Test Image, B) Mazda Template with Civic Test Image
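A per-pixel sketch of the joint score with α = 0.5 and the min-max normalization described above (the angle differences and quantized amplitude levels are illustrative):

```python
def joint_score(angle_diff, amp_level, alpha=0.5):
    """Weighted sum of the winsorized angle difference and the
    quantized amplitude level at one pixel."""
    return alpha * angle_diff + (1.0 - alpha) * amp_level

def min_max_normalize(scores, new_min=0.0, new_max=1.0):
    """Min-max normalize a flat list of scores to [new_min, new_max]."""
    s_best, s_worst = min(scores), max(scores)
    span = s_worst - s_best
    return [new_min + (s - s_best) / span * (new_max - new_min)
            for s in scores]

# joint scores at three pixels, then normalized to [0, 1]
raw = [joint_score(d, q) for d, q in [(0, 0), (10, 1), (10, 3)]]
print(raw)                      # [0.0, 5.5, 6.5]
print(min_max_normalize(raw))
```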


For the final step, the score c' is computed as the mean value of the score image:

c' = (1 / N_xy) · Σ_{x,y} S'(x, y)

where N_xy equals the number of pixels in the score image. The score c' is then used to determine the similarity between the template and the test image. Fig. 5-9 shows two score images along with the computed scores for a correct and an incorrect match.

An initial test was designed to determine which weighting function (UW, BHW, CW, or HW) would perform best for discriminating between the vehicles. To perform this


test, scores were computed as shown in Fig. 5-10 for each weighting function and for both coherent integration angles of 2 degrees and 5 degrees. As can be observed from Fig. 5-10, HW gives the best results for both integration angles. Hence, HW gives the best tradeoff between mainlobe width and sidelobe suppression for this discrimination task. Recall that this result is different from the pose estimation result, where UW was the most effective weighting. It is interesting that different image formation strategies appear to be more appropriate for different stages of the classification algorithm.

Best Weighting to Use for Classification: A) Integration Angle 2°, B) Integration Angle 5°


The rendered images of the civilian vehicles are shown in Fig. 5-11. To investigate the ability of the classification approach described above to perform vehicle discrimination, a discrimination matrix was calculated. This matrix is different from a


Rendered Images of Civilian Vehicles: A) Camry (Vehicle 1), B) Civic (Vehicle 2), C) Jeep 93 (Vehicle 3), D) Jeep 99 (Vehicle 4), E) Maxima (Vehicle 5), F) Mazda (Vehicle 6), G) Mitsubishi (Vehicle 7), H) Sentra (Vehicle 8), I) Avalon (Vehicle 9), J) Tacoma (Vehicle 10)

confusion matrix, where each test image is scored and classified separately, and then the classification results are entered into the confusion matrix. In the discrimination matrix, the test scores are averaged together for each cell in the matrix and then the scores are normalized. This discrimination matrix then permits analysis to determine whether the vehicles can be classified, and to determine how similar the vehicles are to each other using the features and the scoring approach. The discrimination matrix results for both 2 degrees and 5 degrees integration


are shown in Fig. 5-12, where the rows and columns correspond to the test image and template, respectively.

AS_jk = (1/37) · Σ_{i=1}^{37} c'_ijk

Let c'_ijk be the score value, where i is the image index, j is the target type of the images, and k is the target type of the template. For example, let j = 1 and k = 1; for this case, AS_11 is the average score of the testing images of target 1 with template 1. The results are then normalized for each test image:

AS'_jk = AS_jk / AS'_j

where AS'_j is the average of AS_jk over all templates k = 1, ..., 10. Therefore, each pixel in Fig. 5-12A and Fig. 5-12B is AS'_jk. For example, let j = 1; for this case, AS'_1 is the average over all the templates from 1 to 10, and each cell AS_1k is divided by AS'_1 to get AS'_1k.

Each row of the discrimination matrix has its largest value along the diagonal, which shows that the algorithm is capable of classifying all the targets correctly. However, it can be seen that some targets are only separable by a small amount. For example, test image 1 has a high similarity score with template 1 (Fig. 5-13A) and template 5 (Fig. 5-13E). As can be seen from Fig. 5-13A and Fig. 5-13E, these two template images are very similar to one another due to the similarity of the vehicle structures, which can be viewed in Fig. 5-11A and Fig. 5-11E. It is even difficult for a human to differentiate between the actual images until a closer examination ascertains the minor differences between the vehicles. Therefore, it is understandable that the algorithm would produce a similar similarity measure for these two templates for test image 1. However, there is enough of a difference between the templates that the algorithm is able to match the correct template to the test image.

To understand why the similarity scores are similar, consider both the actual vehicle image from which the test image was generated (Fig. 5-11) and the template


Hamming Weighting Using Outer Edge Filter Score: A) Integration Angle 2°, B) Integration Angle 5°

(Fig. 5-13), as was done in the previous example. The other similarity scores that are high are: test image 2 with templates 2 (Fig. 5-13B) and 8 (Fig. 5-13H); test image 3 with templates 3 (Fig. 5-13C) and 4 (Fig. 5-13D); test image 4 with templates 3 (Fig. 5-13C) and 4 (Fig. 5-13D); test image 5 with templates 1 (Fig. 5-13A) and 5 (Fig. 5-13E); test image 6 with templates 3 (Fig. 5-13C) and 6 (Fig. 5-13F); test image 7 with templates 7 (Fig. 5-13G) and 8 (Fig. 5-13H); test image 8 with templates 1 (Fig. 5-13A) and 8 (Fig. 5-13H); and test image 9 with templates 1 (Fig. 5-13A) and 9 (Fig. 5-13I). Even though these similarity measures are close to one another, each


test image is still matched to the correct template. The Toyota Tacoma (Fig. 5-11J) is significantly different from the other vehicles used in the CVDomes dataset, and this is also apparent in the geometric optics


Template Images of Civilian Vehicles: A) Camry (Vehicle 1), B) Civic (Vehicle 2), C) Jeep 93 (Vehicle 3), D) Jeep 99 (Vehicle 4), E) Maxima (Vehicle 5), F) Mazda (Vehicle 6), G) Mitsubishi (Vehicle 7), H) Sentra (Vehicle 8), I) Avalon (Vehicle 9), J) Tacoma (Vehicle 10)


template (Fig. 5-13J). This result is also verified in the discrimination matrix (Fig. 5-12), where the highest-valued pixel is located in the correct place and all the other pixels are significantly lower. Other test images had a few darker pixels scattered through the other templates, so they had some similarities to the other vehicles. However, for the Toyota Tacoma it is apparent that there are no similarities to the other vehicles in the dataset.
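The averaging and per-row normalization used to build the discrimination matrix can be sketched as follows (37 test images per target, as in the text; the score values and function name are illustrative):

```python
def discrimination_matrix(scores, n_targets, n_images=37):
    """Average the per-image scores c'_ijk into AS_jk, then normalize
    each row by its mean over all templates (AS'_jk = AS_jk / AS'_j)."""
    avg = [[sum(scores[j][k]) / n_images for k in range(n_targets)]
           for j in range(n_targets)]
    out = []
    for row in avg:
        row_mean = sum(row) / n_targets
        out.append([v / row_mean for v in row])
    return out

# two targets, 37 scores per (test target, template) pair
scores = {
    0: {0: [0.2] * 37, 1: [0.6] * 37},
    1: {0: [0.5] * 37, 1: [0.1] * 37},
}
print(discrimination_matrix(scores, n_targets=2))
```

Each row sums out its own overall scoring level, so the normalized cells show how separable each test target is from each template rather than the absolute score magnitudes.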


In Chapter 1, an overview of the ATR process was described and the scope of the dissertation was delineated. The dissertation focused on the classification stage of the ATR process. The scope included the classification of civilian vehicles using wide angle SAR. The physics and geometry of the SAR imaging process were contrasted with electro-optical imaging to motivate the approach taken in the dissertation. In addition, given that most research in SAR classification focuses on military vehicles, the difference in scattering between military and civilian vehicles was described. Also, as most SAR research has focused on narrow-angle illumination rather than wide-angle illumination, the differences in scattering behavior between those two radar types were also compared and contrasted. Given this scattering behavior, and particularly the glint behavior, the three novel contributions of this dissertation were motivated: glint-based pose estimation, geometric optics based template generation, and radar phenomenology based angle and amplitude feature matching.

In Chapter 2, the relevant background research was presented. Previous work in SAR classification of civilian vehicles and previous work in wide angle SAR image formation were reviewed. In addition, previous work on pose estimation using SAR imagery was reviewed. The multiple ways of generating a database of training data were reviewed and compared to each other to motivate the template generation approach taken in this dissertation. Previous work in object classification using multiple approaches was categorized and reviewed. In particular, previous work on SAR target classification of military vehicles was described.

In Chapter 3, the first of the major contributions, glint-based pose estimation, was described in detail. The approach was motivated, and the specific steps of the image formation process were described to support the pose estimation algorithm.


Chapter 4 developed the second major contribution of this research: geometric optics based template generation. The specific algorithm for generating the templates was described in detail, and the generated templates were compared with synthetic aperture radar images generated by a computationally intensive physical optics, ray tracing scattering model. Both the position and the angle features generated by the template generation algorithm compared favorably with the features extracted from the SAR images, based on qualitative visual comparisons. The template generation process combined both single and double bounce phenomenology into a composite template that captured the major scattering of the target.

Chapter 5 developed the third major contribution of the research: a classification approach that incorporates both the angle estimation and template generation approaches developed in Chapters 3 and 4. This classification approach matched the angle and position features of the templates with specially processed SAR imagery that included both the amplitude and angle information at each pixel. It was demonstrated that a phenomenology-based weighting scheme between angle and amplitude information supports the classification of civilian vehicles using wide angle SAR. In addition, a sensitivity study showed that Hamming weighting provided the best classification performance, as it provided the best tradeoff between mainlobe width and sidelobe suppression.


1. Glint-based pose estimation. This pose estimation technique capitalizes on the fact that vehicles have four sides which reflect energy in four prominent directions, or cardinal headings. The SAR data is processed in a novel manner that stores both angle and amplitude, which is, in turn, processed to provide very accurate pose estimation. This method was validated by using both synthetic and measured data to test the pose estimator.

2. Geometric optics-based template generation. This template generation technique does not require ray tracing or surface integration as do previous template generation strategies, but only requires geometric operations on facets. This geometrical information is sufficient to predict both angle and position of scatterers for use as features in SAR classification. This method was validated by a qualitative visual comparison between SAR imagery and the predicted template.

3. Angle and amplitude based template matcher. This classification approach provides a match between the features that can be predicted (angle and position of scatterers) and the features that can be extracted from wide angle SAR imagery (angle and amplitude). The angle features are matched directly, and the amplitude information is combined by quantizing the amplitude and using the template position information as a filter to sum the quantized amplitude response of the SAR test image. In addition, the angle and amplitude are combined in a phenomenology motivated manner to support target classification. This approach was validated by using synthetic data to test the classification algorithm.
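The third contribution's combination of angle agreement and quantized, template-filtered amplitude can be sketched as a toy similarity measure. This is an illustrative stand-in, not the dissertation's exact matcher; the function and parameter names, the 5 degree angle gate, the quantile-based quantization thresholds, and the fixed weight are all assumptions.

```python
import numpy as np

def match_score(test_angle, test_amp, tmpl_angle, tmpl_mask, w_angle=0.5):
    """Toy angle-and-amplitude template match (illustrative only).

    test_angle : per-pixel dominant scattering angle of the SAR test image (deg)
    test_amp   : per-pixel amplitude of the test image
    tmpl_angle : predicted scattering angle at template pixels (deg)
    tmpl_mask  : boolean mask of pixels where the template predicts energy
    w_angle    : weight trading angle agreement against amplitude energy
    """
    # Angle term: fraction of template pixels whose observed angle agrees
    # with the prediction to within 5 degrees (wrapped difference).
    diff = np.abs(test_angle[tmpl_mask] - tmpl_angle[tmpl_mask]) % 360.0
    diff = np.minimum(diff, 360.0 - diff)
    angle_score = np.mean(diff < 5.0)

    # Amplitude term: quantize the image, then sum the quantized response
    # only where the template predicts energy (the template acts as a filter).
    q = np.digitize(test_amp, np.quantile(test_amp, [0.5, 0.75, 0.9]))
    amp_score = q[tmpl_mask].sum() / (3.0 * tmpl_mask.sum())

    # Weighted combination of the two kinds of evidence.
    return w_angle * angle_score + (1.0 - w_angle) * amp_score
```

A test image whose angles disagree with the template scores strictly lower than one that agrees, even when the amplitude content is identical, which is the behavior the weighting scheme is meant to exploit.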


1. Extending the algorithm to work under more conditions. There are several constraints that could be relaxed in future work. Relaxing these constraints was not considered in this effort because the focus of this work was to develop a unique approach and, hence, to establish the feasibility of the approach. One constraint that could be addressed is the level background and scene center constraint. The signature does alter with changes in the ground plane and with where the target is located in the scene. The algorithm should be adapted to handle these conditions. Also, additional sensor parameters should be considered. These include different bandwidths and different illumination geometries. The template generation and classification approaches focused on the steepest depression angles, the widest bandwidth, and 360 degree illumination of the target. All these sensor parameters should be relaxed and varied to determine their effect on system performance. This work also focused on monostatic systems, and its applicability to bistatic systems should also be considered.

2. Improving the algorithms. The template generation approach will likely have to be extended as the SAR resolution is degraded or as different geometries are considered. These parameter changes would increase the likelihood of multiple returns landing in a single resolution cell, or pixel, and hence would likely require a more sophisticated approach for combining these returns than the current rule based approach. In addition, the inclusion of three bounce phenomenology may be necessary if more complicated vehicles are to be considered. For the classification algorithm, extensions may be needed as the template generation approach is extended. In addition, there is ample room for improvement in how the features are used. The weighting of the features could be learned using a machine learning approach, and the template weighting could penalize the match for having energy in areas not predicted by the template. Currently, only the energy in the locations predicted by the template is utilized.

3. System considerations. This effort analyzed key elements of a classifier, but it would be useful to combine these algorithms with a detector and a discrimination stage to test an end-to-end ATR system. In addition, computational concerns should be addressed, and computer architectures tailored for this type of processing should be investigated. The robustness of this approach to additional target types, clutter types, and sensor degradations should also be tested and evaluated.


AED = Angular Energy Density
AER = Angle Entropy of Radon Transform
ATR = Automatic Target Recognition
AS = Axis of Symmetry
BHW = Blackman-Harris Weighting
BB = Bounding Box
CW = Chebyshev Weighting
CV = Civilian Vehicles
CFAR = Constant False Alarm Rate
CWT = Continuous Wavelet Transform
D = Dictionary
DoG = Difference-of-Gaussians
EPC = Edge Pixel Count
EO = Electro Optics
EM = Electromagnetic
FAST = Features from Accelerated Segment Test
FS = Filled in Shape
GLRT = Generalized Likelihood Ratio Test
GTRI = Georgia Tech Research Institute
HW = Hamming Weighting
HT = Hough Transform
ICA = Independent Component Analysis
LA = Lean Angle
LF = Linear Fit
MAD = Median Absolute Deviation


MSTAR = Moving and Stationary Target Acquisition and Recognition
MV = Military Vehicles
MLP = Multilayer Perceptron
NIA = Narrow Illumination Angle
NN = Neural Network
P = Perimeter
PHD = Phase History Data
PCA = Principal Component Analysis
RCS = Radar Cross Section
ROI = Region of Interest
RR = Robust Regression
SIFT = Scale Invariant Feature Transform
SURF = Speeded Up Robust Features
S = Statistics
SVR = Support Vector Regression
SAR = Synthetic Aperture Radar
TBR = Target to Background Ratio
TS = Target Segmentation
T = Training
UW = Uniform Weighting
WIA = Wide Illumination Angle


Christopher Paulson received his B.S., M.S., and Ph.D. in electrical and computer engineering from the University of Florida in May 2006, May 2009, and May 2013, respectively. Dr. Paulson was an intern at the Missouri University of Science and Technology from May 2004 to August 2004, performing research on creating an autonomous helium blimp. From August 2005 to June 2006, he was part of the Integrated Product and Process Design (IPPD) program at the University of Florida, where his project was to revolutionize the toaster oven for the Jarden company. From May 2008 to August 2008, he was an intern with Wright-Patterson Air Force Base doing research on tracking, data cleansing, and image registration. Also, from May 2009 to August 2009, he was an intern with Wright-Patterson Air Force Base doing research solely on image registration. From May 2010 to August 2010, he was an intern for Science, Engineering, and Technology (SET) doing research on long range object detection for autonomous vehicles. Christopher is a Science, Mathematics and Research for Transformation (SMART) 2010 fellowship recipient. From May 2011 to August 2011, he was an intern for Wright-Patterson Air Force Base doing research on geometric prediction of Synthetic Aperture Radar (SAR) images. Lastly, from May 2012 to August 2012, he interned with Wright-Patterson Air Force Base, performing research on a new approach to classification of SAR images using two new technologies that he developed in his research. His research interests include robotics, computer vision, synthetic aperture radar, classification, and signal processing.