
On Learning and Regularization in Super-Resolution Imaging

Permanent Link: http://ufdc.ufl.edu/UFE0044992/00001

Material Information

Title: On Learning and Regularization in Super-Resolution Imaging
Physical Description: 1 online resource (100 p.)
Language: english
Creator: Rushdi, Muhammad A
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2013

Subjects

Subjects / Keywords: learning -- magnification -- redundancy -- regularization -- superresolution
Computer and Information Science and Engineering -- Dissertations, Academic -- UF
Genre: Computer Engineering thesis, Ph.D.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, territorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

Notes

Abstract: Advances in super-resolution imaging have been made by reconstruction, interpolation and example-based algorithmic techniques drawn from the fields of signal and image processing, machine learning, biologically-inspired computer vision, and psychology. However, the performance of super-resolution algorithms has been limited by constraints of sampling frequency, sensor dimensions, sensor noise, focus and motion blurring, and alignment between low-resolution input data samples. In this dissertation, we propose several techniques to improve the performance of state-of-the-art super-resolution techniques. Firstly, a concise introduction and literature survey of super-resolution imaging research is given. Secondly, novel dictionary learning techniques for super-resolution are presented. Thirdly, non-uniform image super-resolution over deformed image domains is approached using patch-redundancy as well as resolution-independent image models. Experimental results are good in visual quality and compare well with other state-of-the-art techniques. Future work should explore the extension of the proposed methods to video and stereoscopic imaging.
General Note: In the series University of Florida Digital Collections.
General Note: Includes vita.
Bibliography: Includes bibliographical references.
Source of Description: Description based on online resource; title from PDF title page.
Source of Description: This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Statement of Responsibility: by Muhammad A Rushdi.
Thesis: Thesis (Ph.D.)--University of Florida, 2013.
Local: Adviser: Ho, Jeffrey Yih Chian.

Record Information

Source Institution: UFRGP
Rights Management: Applicable rights reserved.
Classification: lcc - LD1780 2013
System ID: UFE0044992:00001



Full Text

PAGE 1

ON LEARNING AND REGULARIZATION IN SUPER-RESOLUTION IMAGING

By

MUHAMMAD ALI MUHAMMAD RUSHDI

A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2013

PAGE 2

© 2013 Muhammad Ali Muhammad Rushdi

PAGE 3

To my parents Dr. Ali Muhammad Rushdi and Mrs. Azza Sabri Goda and my wife Doaa Talaat El Sheikh for their love and support

PAGE 4

ACKNOWLEDGMENTS

First of all, I would like to sincerely thank my adviser Dr. Jeffrey Ho for his guidance, understanding, patience, and most importantly, his friendship during my graduate studies at the University of Florida. I would also like to thank my supervisory committee members for their insightful and constructive feedback. Special thanks go to my colleagues Mohsen Ali, Yu-Tseh Chi, S. M. Shahed Nejhum, and Shaoyu Qi, whose help, support, and discussions were truly invaluable. I would like to thank my parents Dr. Ali Muhammad Rushdi and Mrs. Azza Sabri Goda and my siblings Ahmad, Mahmoud, Rufaidah, Mostafa, Muzainah, and Suhailah for their love and support. Most importantly, I would like to thank my wife Doaa Talaat El Sheikh for her loving, caring, and patient support during years of graduate study. I certainly could not have done it without her. My thanks go as well to my daughter Jumanah and my son Ali, who filled my life with joy and happiness. I would also like to thank all of my relatives, in-laws, and friends who showered me with their support and nice wishes. May Allah accept our good deeds, forgive our sins, and accept us in His paradise.

PAGE 5

TABLE OF CONTENTS

page

ACKNOWLEDGMENTS .... 4
LIST OF TABLES .... 7
LIST OF FIGURES .... 8
ABSTRACT .... 11

CHAPTER

1 RECENT ADVANCES IN SUPER-RESOLUTION IMAGING .... 12
  1.1 Super-Resolution at a Glance .... 12
  1.2 Recent Advances in Super-Resolution .... 12
    1.2.1 Single-Image Super-Resolution .... 12
    1.2.2 Registration for Super-Resolution .... 14
    1.2.3 Motion Estimation and Video Super-Resolution .... 14
    1.2.4 Neighbor Embedding Methods .... 15
    1.2.5 Super-Resolution for Face Recognition .... 16
    1.2.6 Super-Resolution in Consumer Electronics .... 16
    1.2.7 Super-Resolution in Medical Imaging .... 16
    1.2.8 Hardware Realization of Super-Resolution Algorithms .... 17

2 AUGMENTED COUPLED DICTIONARY LEARNING .... 18
  2.1 Introduction .... 18
  2.2 Coupled Dictionary Learning in a Nutshell .... 18
  2.3 Augmented Coupled Dictionary Learning .... 19
    2.3.1 Imaging Model .... 20
    2.3.2 Alternating Optimization Scheme .... 20
    2.3.3 Dictionary Training .... 21
    2.3.4 Computing the High-Resolution and Augmenting Dictionaries .... 21
    2.3.5 Synthesis .... 22
  2.4 Image Super-Resolution using Augmented Coupled Dictionary Learning .... 22
    2.4.1 Implementation Details .... 22
    2.4.2 Quantitative and Visual Comparison .... 23
    2.4.3 Effect of the High-Resolution Dictionary Size .... 23
    2.4.4 Effect of Dimensionality Reduction .... 24
  2.5 Color De-Rendering using Augmented Coupled Dictionary Learning .... 24
    2.5.1 Implementation Details .... 30
    2.5.2 Effect of Exposure and Lighting on Color De-rendering .... 31

PAGE 6

3 NON-UNIFORM SUPER-RESOLUTION USING PATCH REDUNDANCY AND RESOLUTION-INDEPENDENT IMAGE MODELS .... 35
  3.1 Single-Image Super-Resolution using Patch Redundancy: A Review .... 35
    3.1.1 Super-Resolution using In-Scale Patch Redundancy .... 37
    3.1.2 Super-Resolution using Cross-Scale Patch Redundancy .... 38
    3.1.3 Combining Classical and Example-Based Super-Resolution .... 39
  3.2 Distribution of Nearest Neighbors Across Scale .... 40
  3.3 Reducing the Search Complexity - Local Scale Nearest Neighbour Search .... 61
  3.4 Non-uniform Super-Resolution using Patch Redundancy .... 76
    3.4.1 Non-uniform Super-resolution Algorithm .... 76
    3.4.2 Experimental Results .... 78
  3.5 Non-Uniform Super-Resolution using the Resolution-Independent Image Model .... 86
    3.5.1 Resolution-Independent Image Model: A Brief Review .... 86
    3.5.2 Experimental Results .... 87

4 CONCLUSIONS AND FUTURE WORK .... 91
  4.1 Conclusions .... 91
  4.2 Future Work .... 91

REFERENCES .... 93
BIOGRAPHICAL SKETCH .... 99

PAGE 7

LIST OF TABLES

Table page

2-1 Super-resolution Quality Results .... 29
3-1 Super-resolution performance metrics for the Arch image .... 61
3-2 Super-resolution performance metrics for the Chip image .... 64
3-3 Super-resolution performance metrics for the Child image .... 66
3-4 Super-resolution performance metrics for the Dog image .... 68
3-5 Super-resolution performance metrics for the Lena image .... 70
3-6 Super-resolution performance metrics for the SunFlower image .... 72
3-7 Super-resolution performance metrics for the Zebra image .... 74

PAGE 8

LIST OF FIGURES

Figure page

2-1 Effect of dictionary size on super-resolution performance .... 25
2-2 Effect of dimensionality reduction on super-resolution performance .... 26
2-3 The trained dictionaries of the augmented scheme .... 27
2-4 Visual results for sample test images .... 28
2-5 PSNR versus exposure curves for several camera models .... 32
2-6 Color de-rendering of checker images .... 33
2-7 Color de-rendering of natural images .... 34
3-1 Patch recurrence within and across scales of a single image .... 36
3-2 Average patch recurrence within and across scales of a single image .... 37
3-3 Classical multi-image versus single-image multi-patch super-resolution .... 38
3-4 Combining classical and example-based super-resolution constraints .... 39
3-5 Magnification 2X: Lower resolution pyramid .... 42
3-6 Magnification 2X: Layer 1 .... 43
3-7 Magnification 2X: Layer 2 .... 44
3-8 Magnification 2X: Layer 3 .... 45
3-9 Magnification 2X: Layer 4 .... 46
3-10 Magnification 3X: Lower resolution pyramid .... 47
3-11 Magnification 3X: Layer 1 .... 48
3-12 Magnification 3X: Layer 2 .... 49
3-13 Magnification 3X: Layer 3 .... 50
3-14 Magnification 3X: Layer 4 .... 51
3-15 Magnification 3X: Layer 5 .... 52
3-16 Magnification 4X: Lower resolution pyramid .... 53
3-17 Magnification 4X: Layer 1 .... 54
3-18 Magnification 4X: Layer 2 .... 55

PAGE 9

3-19 Magnification 4X: Layer 3 .... 56
3-20 Magnification 4X: Layer 4 .... 57
3-21 Magnification 4X: Layer 5 .... 58
3-22 Magnification 4X: Layer 6 .... 59
3-23 Magnification 4X: Layer 7 .... 60
3-24 Plots of performance metrics for the Arch image .... 62
3-25 Local Scale Search: Arch .... 63
3-26 Plots of performance metrics for the Chip image .... 64
3-27 Local Scale Search: Chip .... 65
3-28 Plots of performance metrics for the Child image .... 66
3-29 Local Scale Search: Child .... 67
3-30 Plots of performance metrics for the Dog image .... 68
3-31 Local Scale Search: Dog .... 69
3-32 Plots of performance metrics for the Lena image .... 70
3-33 Local Scale Search: Lena .... 71
3-34 Plots of performance metrics for the SunFlower image .... 72
3-35 Local Scale Search: SunFlower .... 73
3-36 Plots of performance metrics for the Zebra image .... 74
3-37 Local Scale Search: Zebra .... 75
3-38 Non-uniform super-resolution using patch-redundancy: Chip .... 79
3-39 Super-resolution map: Chip .... 80
3-40 Non-uniform super-resolution using patch redundancy: Child .... 81
3-41 Non-uniform super-resolution using patch redundancy: Koala .... 82
3-42 Non-uniform super-resolution using patch redundancy: ThreeWorlds .... 83
3-43 Non-uniform super-resolution using patch redundancy: GlynnFractals .... 84
3-44 Non-uniform super-resolution using patch redundancy: SquareLimit .... 85
3-45 Non-uniform super-resolution using the resolution-independent model: Chip .... 89

PAGE 10

3-46 Non-uniform super-resolution using the resolution-independent model: Child .... 90

PAGE 11

Abstract of Dissertation Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy

ON LEARNING AND REGULARIZATION IN SUPER-RESOLUTION IMAGING

By

Muhammad Ali Muhammad Rushdi

May 2013

Chair: Jeffrey Ho
Major: Computer Engineering

Advances in super-resolution imaging have been made by reconstruction, interpolation and example-based algorithmic techniques drawn from the fields of signal and image processing, machine learning, biologically-inspired computer vision, and psychology. However, the performance of super-resolution algorithms has been limited by constraints of sampling frequency, sensor dimensions, sensor noise, focus and motion blurring, and alignment between low-resolution input data samples. In this dissertation, we propose several techniques to improve the performance of state-of-the-art super-resolution techniques. Firstly, a concise introduction and literature survey of super-resolution imaging research is given. Secondly, novel dictionary learning techniques for super-resolution are presented. Thirdly, non-uniform image super-resolution over deformed image domains is approached using patch-redundancy as well as resolution-independent image models. Experimental results are good in visual quality and compare well with other state-of-the-art techniques. Future work should explore the extension of the proposed methods to video and stereoscopic imaging.

PAGE 12

CHAPTER 1
RECENT ADVANCES IN SUPER-RESOLUTION IMAGING

1.1 Super-Resolution at a Glance

High-resolution images are sought in many digital imaging applications to improve the visual information for human and machine perception [1]. By combining non-redundant information contained in multiple low-resolution images, super-resolution techniques seek to construct high-resolution degradation-free images. The basic premise for increasing the spatial resolution in super-resolution techniques is the availability of multiple low-resolution images captured from the same scene [2]. These low-resolution images give different views of the scene with sub-pixel shifts. If these shifts are non-integer, then high-frequency information can be recovered to some extent. As we apply super-resolution techniques, we also need to deal with image degradation caused by warping, blurring, camera motion, sensor noise, and other factors. The combined effects of degradation and down-sampling make the super-resolution problem truly challenging and appealing to researchers in the computer vision and machine learning communities.

1.2 Recent Advances in Super-Resolution

There has been a growing interest in algorithms and applications of super-resolution imaging. Here, we briefly discuss some of the recent major trends and applications. More comprehensive surveys were given by Park et al. [2], Tian and Ma [3], Ouwerkerk [4], and Cristobal et al. [5].

1.2.1 Single-Image Super-Resolution

Single-image super-resolution is a challenging task that researchers try to tackle by exploiting patch redundancy [6, 7], Gaussian process regression [8], and sparse coding regression [9]. Sun et al. [10] searched a mapping between a pair of low-resolution and high-resolution image patches in the gradient domain by learning a generic image database and the input image itself. Given a low-resolution image,

PAGE 13

the high-resolution image was reconstructed by using sparse representations in the gradient domain and solving a Poisson equation. Fan et al. [11] learned the structural content of low-resolution pixels and correlations among the pixels with similar structure. The correlation was used to guide the output image reconstructed by iterative back-projection. Kim et al. [12] synthesized the diagonal high-frequency sub-band of the discrete wavelet transform (DWT) using the low-frequency component of the low-resolution image and the high-frequency component of the bicubically interpolated image in the high-frequency sub-band of the DWT. The reconstructed high-resolution image was obtained by applying the inverse wavelet transform to the synthesized high-frequency sub-band together with the remaining three sub-bands. Li et al. [13] proposed a fully automatic scheme that exploits knowledge of the entire visual world and queries relevant references from the Internet. Lu et al. [14] utilized a homotopy method to choose the parameter that yields a proper balance between the use of training image patches and test image patches. Kim et al. [15] introduced a novel image zooming algorithm, the Curvature Interpolation Method (CIM), which is based on partial differential equations and is easy to implement. In order to minimize artefacts arising in image interpolation such as image blur and the checkerboard effect, the CIM method first evaluates the curvature of the low-resolution image. After interpolating the curvature to the high-resolution image domain, a high-resolution image is constructed by solving a linearized curvature equation that incorporates the interpolated curvature as an explicit driving force. Zhou et al. [16] used multi-surface fitting to take full advantage of the spatial structure information. Each site of low-resolution pixels is fitted with one surface. The final estimation is made by fusing the multi-sampling values on these surfaces in a maximum-a-posteriori fashion. Maalouf and Larabi [17] obtained the geometry of the low-resolution image by computing the grouplet transform. These grouplet bases are used to define a grouplet-based structure tensor to capture the geometry and

PAGE 14

directional features of the low-resolution color image. Then, the super-resolved image is synthesized by an adaptive directional interpolation using the extracted geometric information to preserve the sharpness of edges and textures. This is accomplished by the minimization of a functional which is defined on the extracted geometric parameters of the low-resolution image and oriented by the geometric flow defined by the grouplet transform.

1.2.2 Registration for Super-Resolution

Robust estimation of image registration is still of great importance for multi-frame super-resolution. Zhou et al. [18] suggested a coarse-to-fine framework to accurately register the local regions of interest (ROIs) of images with independent perspective motions by estimating their deformation parameters. Milchevski et al. [19] showed a machine-learning-based super-resolution algorithm robust to registration errors. Vrigkas et al. [20] presented a maximum-a-posteriori scheme for image super-resolution where the image registration part is accomplished in two steps. Firstly, the low-resolution images are registered by establishing correspondences between robust SIFT features. Secondly, the estimation of the registration parameters is fine-tuned along with the estimation of the high-resolution image in an iterative procedure via maximization of the mutual-information criterion.

1.2.3 Motion Estimation and Video Super-Resolution

Su et al. [21] estimated motion between images using sparse feature point correspondences between the input images. The feature point correspondences, which are obtained by matching a set of feature points, are usually precise and much more robust than dense optical flow fields. Because the feature points represent well-selected significant locations in the image, performing matching on the feature point set is usually very accurate. In order to utilize the sparse correspondences in conventional super-resolution, the authors extracted an adaptive support region with a reliable local flow field from each corresponding feature point pair. Schoenemann et al.

PAGE 15

[22] presented an efficient parallel algorithm for extracting sharp high-resolution layers from video sequences. Sunkavalli et al. [23] described a unified framework for generating a single high-quality still image from a short video clip. Chen et al. [24] used a Generalized Gaussian Markov Random Field (GGMRF) prior to perform a maximum-a-posteriori estimation of the desired high-resolution image for video super-resolution. Katsuki et al. [25] proposed a Bayesian image super-resolution method with a causal Gaussian Markov Random Field (GMRF) prior. Dong et al. [26] proposed a new super-resolution reconstruction method combining the narrow quantization constraint set and motion estimation for H.264 compressed video.

1.2.4 Neighbor Embedding Methods

Neighbor embedding methods try to find the nearest neighbors of patches in a low-dimensional manifold and then apply the learned structure in the high-dimensional manifold to recover the high-resolution patches. Gao et al. [27] proposed a sparse neighbor selection scheme for super-resolution reconstruction. They first predetermine a larger number of neighbors as potential candidates and develop an algorithm to simultaneously find the neighbors and solve for the reconstruction weights. Recognizing that the k nearest neighbors for reconstruction should have similar local geometric structures based on clustering, the authors employed local statistical features, namely the Histograms of Oriented Gradients (HOG) of low-resolution image patches, to perform such clustering. By conveying local structural information of HOG features in the synthesis stage, the nearest neighbors of each low-resolution input patch are adaptively chosen from their associated subset. This significantly improves the speed of synthesizing the high-resolution image while preserving the quality of reconstruction. To address the lack of local isometry between high- and low-resolution feature spaces, Gao et al. [28] applied a joint learning technique to train two projection matrices simultaneously and to map the original low-resolution and high-resolution feature

PAGE 16

spaces onto a unified feature subspace. Subsequently, the k nearest neighbor selection of the input low-resolution image patches is conducted in the unified feature subspace to estimate the reconstruction weights.

1.2.5 Super-Resolution for Face Recognition

From a single image per person, Zeng et al. [29] performed face recognition by learning nonlinear regression models from specific non-frontal low-resolution images to frontal high-resolution features using radial-basis functions in a subspace built by canonical correlation analysis. Wang et al. [30] proposed a method for simulating adult facial aging effects by means of super-resolution, tensor analysis, and active appearance models.

1.2.6 Super-Resolution in Consumer Electronics

Many other super-resolution applications have been demonstrated in the recent literature. Liu et al. [31] proposed a friendly Human Computer Interface (HCI) to help senior citizens overcome the problem caused by visual deterioration. An image super-resolution technology was adapted to enlarge each single letter from a newspaper or a magazine. The images of these letters were captured by a small camera. Then, a typesetting algorithm rearranged the layout results. A hand-held projector then projected an image of the same information with complete new typesetting. A gesture recognition technology was implemented to allow the user to easily move the document in different directions. Ogawa et al. [32] presented a single-chip image processor (with a high-quality image super-resolution module) for various in-car equipment such as the car navigation system, rear-seat entertainment and rear-camera display.

1.2.7 Super-Resolution in Medical Imaging

Scherrer et al. [33] applied super-resolution reconstruction of diffusion-weighted images from distortion compensated orthogonal anisotropic acquisitions. Distortion compensation is achieved by acquisition of a dual-echo field map, providing an estimate of the field inhomogeneity. The super-resolution part is formulated as a

PAGE 17

maximum-a-posteriori problem and relies on a realistic image generation model. Wallach et al. [34] implemented a maximum-a-posteriori super-resolution algorithm and applied it to respiratory gated Positron Emission Tomography (PET) images for motion compensation. An edge-preserving Huber regularization term was used to ensure convergence, and motion fields were recovered using a B-spline-based elastic registration algorithm.

1.2.8 Hardware Realization of Super-Resolution Algorithms

Sakuta et al. [35] proposed a new fast super-resolution magnification method utilizing the Total Variation (TV) regularization for HDTV receivers. While the TV regularization approach has not been considered to be a practical technology for motion pictures because of its large computation time, the authors solved this problem by applying a combination of a high-pass filter and the simpler TV method to the TV up-sampling block in the conventional TV magnification system. The computation time has been drastically reduced by the proposed method without losing picture quality. Nana et al. [36] implemented a super-resolution video reconstruction algorithm based on lifting wavelets in DSP and field-programmable gate array (FPGA) integrated circuits.
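The multi-frame setting sketched in Section 1.1, where each low-resolution frame is a warped, blurred, decimated, and noisy copy of one high-resolution scene, can be made concrete with a minimal NumPy forward model. This is an illustrative sketch only, not the formulation of any surveyed paper: the circular-shift warp, 3x3 box blur, and decimation operators below are simplifying assumptions.

```python
import numpy as np

def lr_frame(hr, dy, dx, factor=3, noise_sigma=0.0, rng=None):
    """One low-resolution observation of a high-resolution scene:
    shift the HR grid by (dy, dx) pixels (a 1-pixel HR shift becomes a
    1/factor sub-pixel shift after decimation), apply a 3x3 box blur,
    decimate by `factor`, and add Gaussian sensor noise.
    Circular boundary handling (np.roll) is used for brevity."""
    rng = rng if rng is not None else np.random.default_rng(0)
    shifted = np.roll(np.roll(hr, dy, axis=0), dx, axis=1)
    blurred = sum(np.roll(np.roll(shifted, i, axis=0), j, axis=1)
                  for i in (-1, 0, 1) for j in (-1, 0, 1)) / 9.0
    low = blurred[::factor, ::factor]  # decimation
    return low + noise_sigma * rng.standard_normal(low.shape)

# Frames of the same scene under different sub-pixel shifts carry the
# non-redundant information that super-resolution seeks to combine.
hr = np.arange(81, dtype=float).reshape(9, 9)
frames = [lr_frame(hr, dy, dx) for dy in range(3) for dx in range(3)]
```

Each of the nine frames samples the scene on a grid offset by a different sub-pixel amount, which is exactly the non-integer-shift condition under which high-frequency content is partially recoverable.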

PAGE 18

CHAPTER 2
AUGMENTED COUPLED DICTIONARY LEARNING

2.1 Introduction

The availability of high-resolution degradation-free images is important for the success of many applications in medicine ([33], [34]), remote sensing [37], biometric identification [29], and other fields. However, high-resolution imaging devices are still expensive. As well, the storage and transmission of high-resolution images are limited by space and bandwidth considerations. On the other hand, while low-resolution images are cheap to capture, store, and transmit, they still suffer from loss of details, blurring, noise, and interference. Super-resolution techniques aim at creating high-resolution images from low-resolution ones while overcoming the inherent limitations of low-resolution imaging [38]. However, this is an ill-posed problem since one low-resolution patch can correspond to many high-resolution patches. Moreover, the low-resolution observations are blurred, noisy, and misaligned. So, interpolation, reconstruction and example-based techniques have been proposed to regularize the solution [39]. Example-based techniques learn the mapping between the low- and high-resolution patches from a training dataset and apply this mapping to low-resolution patches of test images [40], [41]. Dictionary learning schemes improve these techniques by relating the low- and high-resolution features through sparse representations with respect to coupled over-complete dictionaries [42], [43], [44], [45], [46]. In this chapter, we show a new model of dictionary learning and its application to super-resolution and color de-rendering.

2.2 Coupled Dictionary Learning in a Nutshell

Suppose we are given two coupled feature spaces, the high-resolution patch space $\mathcal{X} \subset \mathbb{R}^{n_x}$ and the low-resolution feature space $\mathcal{Y} \subset \mathbb{R}^{n_y}$, tied by a mapping function $F$ that may be non-linear and unknown. The goal of dictionary learning approaches is to learn two dictionaries $D_x \in \mathbb{R}^{n_x \times K}$ and $D_y \in \mathbb{R}^{n_y \times K}$ for $\mathcal{X}$ and $\mathcal{Y}$ such that the sparse

PAGE 19

representation of $x_i \in \mathcal{X}$ in terms of the dictionary $D_x$ should be related by a map $W$ (chosen typically as the identity map for simplicity) to the representation of $y_i \in \mathcal{Y}$ in terms of the dictionary $D_y$, where $y_i = F(x_i)$. Yang et al. [43] addressed this problem by generalizing the basic sparse coding scheme to

    \min_{D_x, D_y, \Gamma} \|X - D_x \Gamma\|_F^2 + \|Y - D_y \Gamma\|_F^2 + \lambda \|\Gamma\|_1
    subject to \|D_x(:,k)\|_2^2 \le 1 and \|D_y(:,k)\|_2^2 \le 1 for all k,   (2-1)

where $X \in \mathbb{R}^{n_x \times N}$ and $Y \in \mathbb{R}^{n_y \times N}$ are data matrices from the high-resolution patch space $\mathcal{X}$ and the low-resolution feature space $\mathcal{Y}$, respectively, $\Gamma \in \mathbb{R}^{K \times N}$ holds the data sparse codes (common to both dictionaries), and $\lambda$ is a regularization parameter.

2.3 Augmented Coupled Dictionary Learning

A simplifying assumption of the dictionary learning scheme (2-1) is the invariance of the sparse representation: the sparse codes of a low-resolution patch with respect to the low-resolution dictionary are identical to the sparse codes of the corresponding high-resolution patch with respect to the high-resolution dictionary. Nevertheless, this assumption is inaccurate and does not hold in particular for large magnification factors. Jia et al. [47] relaxed this assumption by allowing the sparse codes of low- and high-resolution patch pairs to have different values while they still share the same support. Wang et al. [48] learned a linear map between the sparse codes of the low- and high-resolution patches. Learning the linear mapping, however, increases the computation cost and should be applied locally. We propose to compensate for the inaccuracy of the invariance assumption of sparse representations by explicitly incorporating the low-resolution dictionary learning error in the high-resolution image reconstruction model. In particular, we augment the high-resolution dictionary with additional atoms that relate the residual error of the low-resolution dictionary to the high-resolution patches.
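For concreteness, the objective and constraint of Equation 2-1 can be evaluated directly. A small NumPy sketch (illustrative code, not the authors' implementation; `lam` stands for the regularization parameter):

```python
import numpy as np

def coupled_dl_objective(X, Y, Dx, Dy, G, lam):
    """Eq. 2-1: squared Frobenius reconstruction errors in both feature
    spaces, plus an l1 sparsity penalty on the shared codes Gamma (G)."""
    hr_fit = np.linalg.norm(X - Dx @ G, 'fro') ** 2
    lr_fit = np.linalg.norm(Y - Dy @ G, 'fro') ** 2
    return hr_fit + lr_fit + lam * np.abs(G).sum()

def unit_column_feasible(D):
    """Eq. 2-1 constraint: every dictionary atom has squared norm <= 1."""
    return bool(np.all(np.sum(D * D, axis=0) <= 1.0 + 1e-12))
```

With all-zero codes the objective reduces to $\|X\|_F^2 + \|Y\|_F^2$, which is a handy sanity check when debugging an implementation.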

PAGE 20

2.3.1 Imaging Model

The identity-map assumption between the sparse representations of the features in the low- and high-resolution spaces is rather restrictive and inaccurate for large super-resolution factors. We can alleviate this problem and get a more truthful dictionary model by augmenting the high-resolution space dictionary $D_x$ with an augmenting dictionary $D_a \in \mathbb{R}^{n_x \times n_y}$. The augmenting dictionary should compensate for the modelling error of the non-augmented coupled dictionary scheme. This modelling error can be defined as the low-resolution-space dictionary learning residual

    R = Y - D_y \Gamma.   (2-2)

The augmenting dictionary $D_a$ atoms may be selected to minimize this residual $R$. So, the Augmented Coupled Dictionary Learning (ACDL) objective function may be formulated as

    \min_{D_x, D_a, D_y, \Gamma} \|X - D_x \Gamma - D_a R\|_F^2 + \|Y - D_y \Gamma\|_F^2 + \lambda \|\Gamma\|_1
    subject to \|D_x(:,k)\|_2^2 \le 1 and \|D_y(:,k)\|_2^2 \le 1 for all k, and \|D_a(:,j)\|_2^2 \le 1 for all j.   (2-3)

2.3.2 Alternating Optimization Scheme

The energy minimization in Equation 2-3 is tackled by separating the objective function into three sub-problems, namely updating the sparse codes $\Gamma$ of the low-resolution-space training samples $Y$, updating the low-resolution-space dictionary $D_y$, and then jointly constructing the high-resolution-space and the augmenting dictionaries $D_x, D_a$.
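The role of the augmenting dictionary in Equations 2-2 and 2-3 can be seen in a few lines of NumPy: when the high-resolution patches are exactly explained by $D_x \Gamma$ plus a linear function $D_a$ of the low-resolution residual, the augmented data term vanishes while the non-augmented one does not. This is an illustrative sketch with synthetic random matrices, not the dissertation's code; the dimensions and noise level are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
nx, ny, K, N = 8, 5, 12, 40
Dx = rng.standard_normal((nx, K))
Dy = rng.standard_normal((ny, K))
Da = rng.standard_normal((nx, ny))   # augmenting dictionary, nx x ny
G = rng.standard_normal((K, N))      # shared sparse codes Gamma
Y = Dy @ G + 0.3 * rng.standard_normal((ny, N))  # imperfect LR fit

R = Y - Dy @ G                       # Eq. 2-2: LR dictionary residual
X = Dx @ G + Da @ R                  # HR patches built to include Da R

plain_err = np.linalg.norm(X - Dx @ G, 'fro') ** 2          # ignores R
augmented_err = np.linalg.norm(X - Dx @ G - Da @ R, 'fro') ** 2
```

Here `plain_err` is the modelling error a non-augmented coupled scheme would leave behind, while `augmented_err` is (up to rounding) zero, which is precisely the compensation the $D_a R$ term provides.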

PAGE 21

2.3.3 Dictionary Training

The first two sub-problems are solved through the K-SVD dictionary training procedure [49]. In this procedure, the sparse coding and dictionary update are alternated to solve the decoupled problem

    \min_{D_y, \Gamma} \|Y - D_y \Gamma\|_F^2 + \lambda \|\Gamma\|_1
    subject to \|D_y(:,k)\|_2^2 \le 1 for all k.   (2-4)

The outputs of the K-SVD procedure are the low-resolution-space dictionary $D_y$ and the sparse codes $\Gamma$ of the low-resolution training data $Y$ with respect to this dictionary.

2.3.4 Computing the High-Resolution and Augmenting Dictionaries

Once the low-resolution dictionary $D_y$ and the sparse codes $\Gamma$ of the training data are obtained, the high-resolution and augmenting dictionaries can be easily obtained from

    \min_{D_x, D_a} \|X - D_x \Gamma - D_a R\|_F^2.   (2-5)

Let

    \Phi = [D_x \; D_a]   (2-6)

and

    \Psi = [\Gamma; R].   (2-7)

Then the solution of Equation 2-5 is the pseudo-inverse expression

    \Phi = X \Psi^{\dagger} = X \Psi^T (\Psi \Psi^T)^{-1}.   (2-8)
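Equations 2-6 to 2-8 amount to one least-squares solve. A NumPy sketch (again illustrative, not the dissertation's code; `np.linalg.lstsq` is used in place of the explicit inverse for numerical stability, and the synthetic check data are assumptions):

```python
import numpy as np

def fit_phi(X, G, R):
    """Eq. 2-8: Phi = [Dx | Da] = X Psi^T (Psi Psi^T)^{-1}, Psi = [G; R].
    Solving Psi^T Phi^T ~ X^T by least squares is equivalent and more
    stable than forming (Psi Psi^T)^{-1} explicitly."""
    Psi = np.vstack([G, R])                        # Eq. 2-7
    Phi = np.linalg.lstsq(Psi.T, X.T, rcond=None)[0].T
    K = G.shape[0]
    return Phi[:, :K], Phi[:, K:]                  # split per Eq. 2-6

# Sanity check: if X was generated exactly as Dx G + Da R and Psi has
# full row rank, the fit recovers Dx and Da to numerical precision.
rng = np.random.default_rng(0)
Dx = rng.standard_normal((8, 6)); Da = rng.standard_normal((8, 4))
G = rng.standard_normal((6, 50)); R = rng.standard_normal((4, 50))
Dx_hat, Da_hat = fit_phi(Dx @ G + Da @ R, G, R)
```

Note that this closed-form step does not enforce the unit-norm atom constraints of Equation 2-3; it solves the unconstrained problem of Equation 2-5.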

PAGE 22

2.3.5 Synthesis

Given a test image, we extract the low-resolution features $Y_t$ from the image patches, find the sparse codes $\Gamma_t$ of the features with respect to the trained low-resolution dictionary $D_y$, compute the low-resolution residual $R_t = Y_t - D_y \Gamma_t$, then compute the high-resolution image patches as

    X_t = D_x \Gamma_t + D_a R_t.   (2-9)

The high-resolution image patches are then put back in their respective locations and averaged in overlapping areas to produce the output image.

2.4 Image Super-Resolution using Augmented Coupled Dictionary Learning

2.4.1 Implementation Details

For dictionary training, we used the natural image dataset provided by Yang et al. [42]. The dataset consists of 91 images of flowers, faces, vehicles, and other natural scenes. Since the human visual system is more sensitive to the luminance information, we convert all training images to gray-level ones and discard the color information. Regarding these images as the high-resolution examples, we blur and down-sample all images with a factor of 3. The down-sampled images are then up-scaled back to their original sizes to simplify the algorithmic details. The up-scaled images have lost high-resolution details during the down-sampling process, and this is why these images are considered to be of low resolution. Following Yang et al. ([42], [43]), four horizontal and vertical derivative filters (namely, $f_1 = [-1, 0, 1] = f_2^T$ and $f_3 = [1, 0, -2, 0, 1] = f_4^T$) are applied to the low-resolution images to extract localized high-frequency content. Patches of size 9 x 9 are extracted from the filtered images. The patches are stacked together forming 324-dimensional feature vectors. Then principal-component analysis (PCA) is applied to ignore components that contribute no more than 0.1% of the average feature energy. This reduces the low-resolution dimensionality to 30. For the high-resolution images, low frequencies are removed by subtracting each low-resolution (up-scaled)

PAGE 23

image from its high-resolution counterpart. About 130,000 pairs of low-resolution and high-resolution patches are extracted (20% of which are used for validation). We compare our method against bicubic interpolation as well as the basic coupled dictionary learning (CDL) variant of Zeyde et al. [44]. For both dictionary learning schemes, we use a total high-resolution dictionary size of 2048 atoms, 40 iterations of the K-SVD algorithm, and a maximal sparsity of 3 for the Orthogonal Matching Pursuit (OMP) coding scheme.

2.4.2 Quantitative and Visual Comparison

We tested the super-resolution performance on 14 standard test images and computed the average quality metrics. For each image, the dictionary-based synthesis was applied to the luminance channel while bicubic interpolation was applied to the chrominance channels. Table 2-1 shows the Structural Similarity Index (SSIM) and the Peak Signal-to-Noise Ratio (PSNR) measures in the first and second rows for each image, respectively. Our augmented dictionary learning scheme surpasses the baseline coupled dictionary learning on all 14 images. On average, our scheme and the baseline one show an improvement over bicubic interpolation of +1.2647 dB and +1.1560 dB, respectively. Figure 2-4 shows samples of the ground-truth and reconstructed test images. Our results look sharper than those of the bicubic interpolation and are also visually better than or similar to their counterparts in the baseline scheme. Figure 2-3 shows the trained dictionaries of the augmented scheme. The augmenting dictionary shows clear structures and improves the overall super-resolution quality.

2.4.3 Effect of the High-Resolution Dictionary Size

We explore the effect of the high-resolution dictionary size in Figure 2-1. Dictionary sizes of 128, 256, 512, 1024, and 2048 are used. The average SSIM and PSNR measures (Figure 2-1a,b) of our augmented scheme are consistently better across all dictionary size values. This comes at no cost in synthesis time. Indeed, our average synthesis time is slightly less than that of the baseline scheme (Figure 2-1c) since, for
the same total number of high-resolution dictionary atoms, our scheme allocates fewer atoms to the over-complete dictionary and hence reduces the sparse coding complexity (which is known to be the bottleneck of the K-SVD algorithm).

2.4.4 Effect of Dimensionality Reduction

From the dictionary construction in Section 2.3.4, we see that the number of augmenting dictionary (\(D_a\)) atoms depends on the dimensionality of the low-resolution feature vectors, which in turn can be controlled through the PCA-based dimensionality reduction step. Figure 2-2 shows the dependence of the average performance metrics on the amount of discarded feature energy. On one hand, for a large percentage of discarded energy, there is a clear drop in the average performance metrics. Indeed, the performance gap between the augmented and baseline schemes gets smaller since the augmenting dictionary will have very few atoms in this case. On the other hand, for small percentages of discarded energy, there is no noticeable improvement over the mid-range values, but there is a big increase in average synthesis time. So, a discarded energy percentage of 0.1% is a reasonable trade-off here between accuracy and time complexity.

2.5 Color De-Rendering using Augmented Coupled Dictionary Learning

Raw images captured by consumer digital cameras suffer from color and lighting distributions inadequate or unpleasant to the human eye. Camera manufacturers thus typically apply post-processing operations inside the camera to enhance the image output. Such operations include dynamic range compression (tone mapping), gamma correction, simulating film response, or creating visually appealing effects. Many computer vision applications, however, assume captured image data that is linearly related to the actual scene radiance. Example applications are photometric stereo, shape-from-shading reconstruction, color constancy, intrinsic image estimation, and high dynamic range imaging. So, an important preprocessing step for the aforementioned applications is radiometric calibration, which aims at computing the camera response function that maps the amount of light collected by each CCD pixel to pixel intensities
Figure 2-1. Effect of the high-resolution dictionary size on the average super-resolution performance metrics of 14 test images. A) Average PSNR. B) Average SSIM. C) Average synthesis time.
Figure 2-2. Effect of the dimensionality of the low-resolution features on the average super-resolution performance metrics of 14 test images. A) Average PSNR. B) Average SSIM. C) Average synthesis time.
Figure 2-3. The trained dictionaries of the augmented coupled dictionary learning (ACDL) scheme. A) Low-resolution dictionary. B) High-resolution dictionary. C) Augmenting high-resolution dictionary.
Figure 2-4. Visual results for sample test images. The first column shows the ground-truth images. The second one shows the bicubic interpolation results. The third and fourth columns show the results of baseline and augmented coupled dictionary learning, respectively. Quantitative metrics are shown as SSIM (PSNR). foreman: 0.8940 (29.836), 0.9209 (31.918), 0.9224 (32.158). monarch: 0.9685 (27.757), 0.9802 (29.535), 0.9809 (29.662). ppt3: 0.9398 (22.439), 0.9648 (24.107), 0.9662 (24.262). zebra: 0.9066 (25.341), 0.9379 (27.364), 0.9416 (27.524). barbara: 0.8690 (24.365), 0.8924 (24.886), 0.8963 (25.002). flowers: 0.7852 (25.912), 0.8260 (27.185), 0.8303 (27.306).
in the output image.

Table 2-1. Super-resolution quantitative results. The SSIM and PSNR measures are shown for each image in the 1st and 2nd rows, respectively.

              bicubic     CDL [44]    ACDL
  baboon      0.79635     0.84051     0.84636
              19.8177     20.1428     20.1787
  barbara     0.86902     0.89239     0.8963
              24.365      24.8863     25.0015
  bridge      0.99963     0.99781     0.9969
              31.1483     31.3302     31.4479
  coastguard  0.5815      0.62715     0.63455
              25.2275     25.8634     25.8731
  comic       0.68767     0.74936     0.75578
              21.7962     22.6925     22.783
  face        0.77061     0.79599     0.80071
              31.4217     32.1559     32.2583
  flowers     0.78519     0.82599     0.83025
              25.912      27.1846     27.306
  foreman     0.89403     0.92086     0.92238
              29.8362     31.918      32.1575
  lenna       0.94817     0.96323     0.96468
              28.9961     30.3681     30.4793
  man         0.90287     0.93051     0.93329
              25.6958     26.6488     26.7231
  monarch     0.96847     0.9802      0.9809
              27.7572     29.535      29.6624
  pepper      0.96464     0.97519     0.97586
              30.0966     31.8379     31.8992
  ppt3        0.93975     0.96478     0.96622
              22.439      24.1065     24.2616
  zebra       0.90655     0.93789     0.94159
              25.3411     27.3639     27.5241
  Average     0.85817     0.88585     0.88898
              26.4179     27.5739     27.6826

Many radiometric calibration approaches exist in the literature. Chakrabarti et al. [50] analyzed the factors contributing to the color output of a camera and proposed a 24-parameter model to explain the imaging pipeline. Lin et al. [51] presented an extensive study of calibration on a large image database and proposed a PCA-based model of the response curve in the logarithmic domain. Xiong et al. [52, 53] used Gaussian process regression to learn for each pixel's output color a
probability distribution over the scene colors that could have created it. Here, we show how baseline and coupled dictionary learning approaches can be applied to model the mapping between the camera output RGB data and the raw scene image data.

2.5.1 Implementation Details

We tested the applicability of coupled dictionary learning in recovering raw color from RGB color images. The latent and observation spaces correspond to the raw and RGB (rendered) image spaces, respectively. For a specified camera model, an observation-space dictionary is learned from patches extracted from example rendered images of the given camera model (the K-SVD problem of Section 2.3.3). Then the resulting sparse codes and training residual error are combined with corresponding patches from the raw camera images to deduce the latent-space dictionary (Section 2.3.4). For dictionary training, we used the Middlebury Color Dataset of ColorChecker images [50]. This dataset consists of registered raw and rendered (JPEG) views of two color calibration targets, X-Rite's 24-patch ColorChecker and 140-patch Digital ColorChecker SG. Each pattern was photographed:

- with 35 different camera models, 24 of which support both RAW and JPEG, while 11 support only JPEG;
- with fixed Tungsten white balance (wb1); for a subset of cameras, also with auto white balance (wb2);
- under two fixed lighting conditions: using 3200K Tungsten (i1) and 4800K Daylight (i2) photoflood light bulbs;
- with 4 different exposures: stops -2 (e1), -1 (e2), 0 (e3), and +1 (e4).

We used the images of the 140-patch target for training while those of the 24-patch target were used for testing. Patches of size 5x5 were extracted from each color channel of the rendered and raw images. The patches were then stacked together, forming 75-dimensional feature vectors. For the rendered image patches, principal-component analysis (PCA) was applied to ignore components that contribute
no more than 0.1% of the average feature energy. We compared the augmented coupled-dictionary (ACDL) scheme against the baseline coupled-dictionary (CDL) variant of Zeyde et al. [44]. For both dictionary learning schemes, we used a total latent dictionary size of 1024 atoms, 40 iterations of the K-SVD algorithm, and a maximal sparsity of 3 for the Orthogonal Matching Pursuit (OMP) coding scheme.

2.5.2 Effect of Exposure and Lighting on Color De-rendering

For eight modern cameras, we show in Figure 2-5 the dependence of the color de-rendering performance on the exposure setting and the light source used during the imaging process. Each sub-figure shows the PSNR level for de-rendered images using the augmented (ACDL) and basic coupled-dictionary learning (CDL) methods. For the Sony DSLR-A300 and Nikon D300 models, the ACDL scheme consistently outperforms the basic CDL scheme for all exposure levels and light conditions. For other models (e.g., Canon EOS 20D, Casio EX-Z55, and Pentax K10D), the de-rendering performance of the ACDL scheme exceeds that of the CDL scheme for higher exposure levels. Figure 2-6 shows the rendered, raw, and de-rendered images using the two dictionary learning schemes for the Nikon D300 camera at different exposure levels. While the basic CDL scheme produces erroneous and spotty color predictions, the ACDL scheme predicts color correctly and generates squares of highly solid coloring.
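Both the super-resolution and de-rendering pipelines share the augmented synthesis step of Section 2.3.5: code the observed features over the observation-space dictionary, then add the residual-driven correction. The sketch below is our own illustration with a toy greedy OMP routine, not the K-SVD toolbox implementation used for the experiments:

```python
import numpy as np

def omp(D, y, sparsity):
    """Toy Orthogonal Matching Pursuit: greedily pick the dictionary atom most
    correlated with the residual, refit coefficients on the chosen support."""
    residual, support = y.astype(float).copy(), []
    coef = np.zeros(0)
    for _ in range(sparsity):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    code = np.zeros(D.shape[1])
    code[support] = coef
    return code

def synthesize(Dy, Dx, Da, y_t, sparsity=3):
    """Augmented synthesis (Section 2.3.5): x_t = Dx*g + Da*r, where g is the
    sparse code of y_t over Dy and r = y_t - Dy*g is the coding residual."""
    g = omp(Dy, y_t, sparsity)
    return Dx @ g + Da @ (y_t - Dy @ g)
```

Note that when the latent dictionary equals the observation dictionary and the augmenting dictionary is the identity, the synthesis reduces to reproducing the input exactly, which is a useful sanity check.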
Figure 2-5. PSNR-versus-exposure curves for some camera models. A) Canon EOS 20D. B) Canon PowerShot G9. C) Casio EX-Z55. D) Nikon D300. E) Olympus E500. F) Pentax K10D. G) Sony DSC-F828. H) Sony DSLR-A300. For each of the CDL and ACDL models, PSNR curves are shown for two lighting settings: 3200K Tungsten (i1) and 4800K Daylight (i2).
Figure 2-6. Color de-rendering of checker images for the Nikon D300 camera at different exposure levels. The first column shows the JPEG camera output. The second one shows the ground-truth raw images. The third and fourth columns show the results of baseline and augmented coupled dictionary learning, respectively. The images show increasing exposure levels from the first to the last row. Quality measures are shown as SSIM (PSNR). Stop -2 (e1): CDL 0.7997 (31.552), ACDL 0.8171 (34.269). Stop -1 (e2): CDL 0.7879 (26.415), ACDL 0.8289 (29.881). Stop 0 (e3): CDL 0.791 (21.566), ACDL 0.8631 (27.895). Stop +1 (e4): CDL 0.8131 (20.252), ACDL 0.8714 (25.755).
Figure 2-7. Color de-rendering of natural images (rows: Canon EOS 20D, Nikon D300, Pentax K10D, and Canon PowerShot G9). The first column shows the JPEG camera output. The second one shows the ground-truth raw images. The third and fourth columns show the results of baseline and augmented coupled dictionary learning, respectively.
CHAPTER 3
NON-UNIFORM SUPER-RESOLUTION USING PATCH REDUNDANCY AND RESOLUTION-INDEPENDENT IMAGE MODELS

3.1 Single-Image Super-Resolution using Patch Redundancy: A Review

Patch repetitions within an image were previously exploited for image denoising using a Non-Local Means (NLM) approach [54], as well as a regularization prior for inverse problems. A related super-resolution approach was proposed by Protter et al. [55] for obtaining higher-resolution video frames by applying the classical super-resolution constraints to similar patches across consecutive video frames and within a small local spatial neighbourhood. Glasner et al. [6] suggested extending the search for patch repetitions across scale. Figure 3-1 demonstrates patch recurrence within and across scales of a single image. Source patches in the input image I are found in different locations and in other image scales of I (solid-marked squares). The corresponding high-resolution parent patches (dashed-marked squares) provide an indication of what the (unknown) high-resolution parents of the source patches might look like. Glasner et al. [6] statistically tested this observation on the Berkeley Segmentation Database (Figure 3-2). More specifically, they tested the hypothesis that small 5x5 patches in a single natural gray-scale image, when removing their DC (their average gray-scale), tend to recur many times within and across scales of the same image. The test was performed as follows. Each image I in the Berkeley database was first converted to a gray-scale image. Glasner et al. then generated from I a cascade of images of decreasing resolutions \(I_s\), scaled (down) by scale factors of \(1.25^s\) for \(s = 0, -1, \ldots, -6\) (\(I_0 = I\)). The size of the smallest-resolution image was \(1.25^{-6} \approx 0.26\) of the size of the source image I (in each dimension). Each 5x5 patch in the source image I was compared against the 5x5 patches in all the images \(I_s\) (without their DC), measuring how many similar patches it has in each image scale. These intra-image patch statistics were computed separately for each image. The resulting independent
Figure 3-1. Patch recurrence within and across scales of a single image (Glasner et al. [6]).

statistics were then averaged across all the images in the database (300 images), and are shown in Figure 3-2a. Note that, on average, more than 90% of the patches in an image have 9 or more other similar patches in the same image at the original image scale (within scale). Moreover, more than 80% of the input patches have 9 or more similar patches in \(0.41 = 1.25^{-4}\) of the input scale, and 70% of them have 9 or more similar patches in \(0.26 = 1.25^{-6}\) of the input scale. Figure 3-2b shows the same statistics, but this time measured only for image patches with the highest intensity variances (top 25%). These patches correspond to patches of edges, corners, and texture. Glasner et al. [6] used the recurrence of patches within the same image scale as the basis for applying the classical super-resolution constraints to information from a single image. As well, recurrence of patches across different scales gives rise to example-based super-resolution from a single image, with no prior examples.
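The within/across-scale recurrence test can be reproduced in miniature. The sketch below is our own illustration of the counting procedure, not Glasner et al.'s code: plain decimation stands in for their blur-and-subsample cascade, and the similarity threshold `tol` is a hypothetical parameter:

```python
import numpy as np

def dc_free_patches(img, p=5):
    """All p-by-p patches of img, flattened, with their mean (DC) removed."""
    H, W = img.shape
    out = np.array([img[i:i + p, j:j + p].ravel()
                    for i in range(H - p + 1) for j in range(W - p + 1)])
    return out - out.mean(axis=1, keepdims=True)

def recurrence_counts(img, p=5, scales=(0, -1, -2), base=1.25, tol=0.1):
    """For every DC-free source patch, count the similar patches found at
    each scale base**s of the (decimated) image cascade."""
    src = dc_free_patches(img, p)
    counts = np.zeros((len(src), len(scales)), dtype=int)
    H, W = img.shape
    for k, s in enumerate(scales):
        f = base ** (-s)                               # shrink factor >= 1
        ys = (np.arange(int(H / f)) * f).astype(int)   # decimation indices
        xs = (np.arange(int(W / f)) * f).astype(int)
        tgt = dc_free_patches(img[np.ix_(ys, xs)], p)
        d2 = ((src[:, None, :] - tgt[None, :, :]) ** 2).sum(-1)
        counts[:, k] = (d2 <= tol * tol).sum(axis=1)
    return counts
```

The brute-force pairwise distance makes this quadratic in the number of patches, which is exactly why the nearest-neighbour search becomes the bottleneck discussed in Section 3.2.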
Figure 3-2. Average patch recurrence within and across scales of a single image (Glasner et al. [6]).

3.1.1 Super-Resolution using In-Scale Patch Redundancy

For classical multi-image super-resolution, low-resolution pixels in multiple low-resolution images impose multiple linear constraints on the high-resolution unknowns within the support of their blur kernels (Figure 3-3a). If enough low-resolution images are available (at sub-pixel shifts), then the number of independent equations exceeds the number of unknowns. Such super-resolution schemes have been shown to provide reasonably stable super-resolution results up to a factor of 2. When there is only a single low-resolution image, recurring patches within that image can be regarded as if extracted from multiple different low-resolution images of the same high-resolution scene, thus inducing multiple linear constraints on the high-resolution unknowns. Here is a high-level description of the super-resolution algorithm using in-scale patch redundancy. First, for each pixel in the input low-resolution image L, find its k nearest patch neighbors in the same image L using exact or approximate nearest-neighbor search. Second, compute the subpixel alignment of the k nearest neighbors (at 1/s pixel shifts, where s is the scale factor). Assuming sufficient neighbors
are found, this process results in a determined set of linear equations on the unknown pixel values in the high-resolution output H. Third, globally scale each equation by its reliability (determined by its patch similarity score), and finally solve the linear set of equations to obtain H.

Figure 3-3. Classical multi-image super-resolution versus single-image multi-patch super-resolution (Glasner et al. [6]).

3.1.2 Super-Resolution using Cross-Scale Patch Redundancy

Let \(L = I_0, I_{-1}, \ldots, I_{-m}\) denote a cascade of images of decreasing resolutions (scales) obtained from the input low-resolution image L using the same blur functions \(B_l\): \(I_{-l} = (L * B_l)\downarrow s^l\) (\(l = 0, \ldots, m\)). Note that unlike the high-resolution image cascade, these low-resolution images are known (computed from L). The resulting cascade of images is also illustrated in Figure 3-4 (the blue images). Let \(P_l(p)\) denote a patch in the image \(I_l\) at pixel location p. For any pixel in the input image \(p \in L\) (\(L = I_0\)) and its surrounding patch \(P_0(p)\), we can search for similar patches within the cascade of low-resolution images \(I_{-l}\), \(l > 0\). Let \(P_{-l}(\tilde{p})\) be such a matching patch found in the low-resolution image \(I_{-l}\). Then its higher-resolution parent patch, \(Q_0(s^l \cdot \tilde{p})\), can be extracted from the input image \(I_0 = L\) (or from any intermediate resolution level between \(I_{-l}\) and L, if desired). This provides a low-resolution/high-resolution patch pair [P, Q], which provides a prior on the appearance of the high-resolution parent of the low-resolution input patch \(P_0(p)\), namely
patch \(Q_l(s^l \cdot p)\) in the high-resolution unknown image \(I_l\) (or in any intermediate resolution level between L and \(I_l\), if desired).

3.1.3 Combining Classical and Example-Based Super-Resolution

Figure 3-4 summarizes the unified approach. Patches in the input low-resolution image L (dark red and dark green patches) are searched for in the down-scaled versions of L (blue-marked images). When a similar patch is found, its parent patch (light red and light green) is copied to the appropriate location in the unknown high-resolution image (purple images) with the appropriate gap in scale. A learned (copied) high-resolution patch induces classical super-resolution linear constraints on the unknown high-resolution intensities in the target high-resolution image H. The support of the corresponding blur kernels (red and green ellipses) is determined by the residual gaps in scale between the resolution levels of the learned high-resolution patches and the target resolution level of H. For different patches found at different scale gaps, the corresponding blur kernels (red and green ellipses) will accordingly have different supports.

Figure 3-4. Combining example-based super-resolution constraints with classical super-resolution constraints (Glasner et al. [6]).
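The reliability-weighted system of constraints described in Section 3.1.1 amounts to a weighted least-squares solve. A minimal sketch of that third step, with hypothetical constraint matrix A, right-hand side b, and similarity weights w (our illustration, not the solver used in the cited work):

```python
import numpy as np

def solve_weighted_constraints(A, b, w):
    """Solve the similarity-weighted linear constraints A h ~ b: scale each
    equation (row of A, entry of b) by the square root of its weight, then
    take the least-squares solution for the high-resolution unknowns h."""
    sw = np.sqrt(np.asarray(w, dtype=float))
    h, *_ = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)
    return h
```

When the constraints are consistent, the weights do not change the solution; when they conflict, higher-similarity patches pull the solution toward their constraints.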
3.2 Distribution of Nearest Neighbors Across Scale

The main bottleneck of the aforementioned patch-redundancy super-resolution methods is the nearest-neighbour search. Several approaches can be taken to handle this problem within the super-resolution framework. One approach is to use approximate search techniques that find the nearest neighbors in less time but with a pre-set error penalty. A second approach is to reduce the search pool size by excluding patches of low intensity variance. A third approach (and the one we focus on here) is to analyse the distribution of the nearest neighbors across scale to see whether the size of the low-resolution search pyramid could be reduced. Let us look at the two examples of the Arch and Chip images. The first image has a relatively larger size of 450x370 and the second one has a relatively smaller size of 244x200. Figure 3-5 shows the low-resolution pyramid of the two images intended for doubling their respective sizes. As the resolution decreases, we can see the increased dissimilarity between the images. We can quantify this at the patch level by finding the distribution of the nearest neighbors among the low-resolution levels. Remember that the super-resolution is done in small increments of typically 1.25X. Figure 3-6 shows the first incremental output. Overwhelming percentages of the nearest neighbors were found in the top lower layer (the input low-resolution image in this case). In particular, these percentages are about 75% and 90% for the Arch and Chip images, respectively (Figure 3-6C,D). As well, nearest-neighbour distance histograms (Figure 3-6E,F) show that most of the nearest neighbors are within a distance of 0.01-0.02 from the source patches. In particular, for the higher-resolution Arch image, about 23% of the nearest neighbors are at a distance of 0.02, while for the lower-resolution Chip image, about 33% of the nearest neighbors are at a distance of 0.01. These statistics say that for higher-resolution images (e.g., the Arch image), most of the nearest neighbors tend to be found in the few top layers of the search pyramid with relatively larger distances (smaller similarity). For lower-resolution images (e.g., the Chip image), most of the nearest neighbors tend to be
found in the very few top layers of the search pyramid with relatively smaller distances (larger similarity). In other words, as we will verify later, distances between patches increase as we go lower in the resolution. As the second (Figure 3-7), third (Figure 3-8), and fourth (Figure 3-9) layers are built, nearest neighbors are found more in higher layers with smaller distances (i.e., higher similarity) to the source patches. Figure 3-10 shows the standard lower-resolution pyramid for a magnification factor of 3X, and Figures 3-11 to 3-15 show the 5 incrementally built layers to reach the required scale. Also, Figure 3-16 shows the standard lower-resolution pyramid for a magnification factor of 4X, and Figures 3-17 to 3-23 show the 7 incrementally built layers to reach the required scale. As for the 2X magnification case, similar conclusions can be made about the decaying benefits of the lower-resolution layers and the increased nearest-neighbour similarity as we approach the target scale.
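The layer histograms reported above (e.g., Figure 3-6C,D) can be computed with a brute-force scan: for each source patch, record which pyramid level holds its nearest neighbour. The sketch below is our own illustration; `src` and each entry of `levels` are assumed to hold flattened, DC-free patch vectors:

```python
import numpy as np

def nn_layer_histogram(src, levels):
    """Fraction of source patches whose nearest neighbour lies in each
    pyramid level, scanning levels by brute-force squared distance."""
    best_d = np.full(len(src), np.inf)        # best distance so far per patch
    best_l = np.zeros(len(src), dtype=int)    # level holding that neighbour
    for l, pats in enumerate(levels):
        d = ((src[:, None, :] - pats[None, :, :]) ** 2).sum(-1).min(axis=1)
        better = d < best_d
        best_d[better], best_l[better] = d[better], l
    return np.bincount(best_l, minlength=len(levels)) / len(src)
```

Tracking `best_d` alongside the winning level also yields the distance histograms of Figure 3-6E,F with no extra passes over the data.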
Figure 3-5. Magnification 2X: lower-resolution pyramid.

Figure 3-6. Magnification 2X: Layer 1. A) Arch built layer 1. B) Chip built layer 1. C) Arch layer histogram. D) Chip layer histogram. E) Arch distance histogram. F) Chip distance histogram.

Figure 3-7. Magnification 2X: Layer 2. A) Arch built layer 2. B) Chip built layer 2. C) Arch layer histogram. D) Chip layer histogram. E) Arch distance histogram. F) Chip distance histogram.

Figure 3-8. Magnification 2X: Layer 3. A) Arch built layer 3. B) Chip built layer 3. C) Arch layer histogram. D) Chip layer histogram. E) Arch distance histogram. F) Chip distance histogram.

Figure 3-9. Magnification 2X: Layer 4. A) Arch built layer 4. B) Chip built layer 4. C) Arch layer histogram. D) Chip layer histogram. E) Arch distance histogram. F) Chip distance histogram.

Figure 3-10. Magnification 3X: lower-resolution pyramid.

Figure 3-11. Magnification 3X: Layer 1. A) Arch built layer 1. B) Chip built layer 1. C) Arch layer histogram. D) Chip layer histogram. E) Arch distance histogram. F) Chip distance histogram.

Figure 3-12. Magnification 3X: Layer 2. A) Arch built layer 2. B) Chip built layer 2. C) Arch layer histogram. D) Chip layer histogram. E) Arch distance histogram. F) Chip distance histogram.

Figure 3-13. Magnification 3X: Layer 3. A) Arch built layer 3. B) Chip built layer 3. C) Arch layer histogram. D) Chip layer histogram. E) Arch distance histogram. F) Chip distance histogram.

Figure 3-14. Magnification 3X: Layer 4. A) Arch built layer 4. B) Chip built layer 4. C) Arch layer histogram. D) Chip layer histogram. E) Arch distance histogram. F) Chip distance histogram.

Figure 3-15. Magnification 3X: Layer 5. A) Arch built layer 5. B) Chip built layer 5. C) Arch layer histogram. D) Chip layer histogram. E) Arch distance histogram. F) Chip distance histogram.

Figure 3-16. Magnification 4X: lower-resolution pyramid.

Figure 3-17. Magnification 4X: Layer 1. A) Arch built layer 1. B) Chip built layer 1. C) Arch layer histogram. D) Chip layer histogram. E) Arch distance histogram. F) Chip distance histogram.

Figure 3-18. Magnification 4X: Layer 2. A) Arch built layer 2. B) Chip built layer 2. C) Arch layer histogram. D) Chip layer histogram. E) Arch distance histogram. F) Chip distance histogram.

Figure 3-19. Magnification 4X: Layer 3. A) Arch built layer 3. B) Chip built layer 3. C) Arch layer histogram. D) Chip layer histogram. E) Arch distance histogram. F) Chip distance histogram.

Figure 3-20. Magnification 4X: Layer 4. A) Arch built layer 4. B) Chip built layer 4. C) Arch layer histogram. D) Chip layer histogram. E) Arch distance histogram. F) Chip distance histogram.

Figure 3-21. Magnification 4X: Layer 5. A) Arch built layer 5. B) Chip built layer 5. C) Arch layer histogram. D) Chip layer histogram. E) Arch distance histogram. F) Chip distance histogram.

Figure 3-22. Magnification 4X: Layer 6. A) Arch built layer 6. B) Chip built layer 6. C) Arch layer histogram. D) Chip layer histogram. E) Arch distance histogram. F) Chip distance histogram.

Figure 3-23. Magnification 4X: Layer 7. A) Arch built layer 7. B) Chip built layer 7. C) Arch layer histogram. D) Chip layer histogram. E) Arch distance histogram. F) Chip distance histogram.
Table 3-1. Super-resolution performance metrics for the Arch image (Arch.png, 450x370).

  NN search layers  Metric  3X         4X         5X
  All layers        PSNR    27.9652    26.4788    24.7684
                    SSIM    0.86289    0.82317    0.77283
                    Time    1182.1396  1263.6147  1281.564
  Two layers        PSNR    27.7643    26.1792    24.6571
                    SSIM    0.86114    0.81875    0.76878
                    Time    776.539    2146.6977  2140.5332
  One layer         PSNR    27.7896    26.1999    24.643
                    SSIM    0.86137    0.8162     0.76596
                    Time    559.0558   905.775    887.403

3.3 Reducing the Search Complexity: Local-Scale Nearest-Neighbour Search

As we have seen in the previous section, higher-resolution layers of the low-resolution pyramid (constructed for the patch-redundancy algorithm) play a more dominant role as more incremental layers are built. Inspired by this observation, we propose to limit the nearest-neighbour search pool for each target layer to just one or two layers immediately below the target layer. We show on several images that such a restriction will cause a considerable reduction in computation time with very little decrease in quantitative quality metrics (PSNR and SSIM) and, more importantly, no noticeable change in visual quality. The time savings due to this restriction are more apparent on images of larger sizes. Interestingly, for smaller images, this restriction sometimes does improve the quantitative quality metrics (though not visually noticeably in the examples below). This can be ascribed to our earlier observation that patch similarity decreases as we go lower in the lower-resolution pyramid. In other words, this restriction avoids searching noisy patches.
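The restriction itself is a one-line change to the search: instead of scanning every pyramid level below the layer being synthesized, scan only the top one or two. A sketch (our illustration; here `pyramid[0]` is the coarsest level and `target_level` indexes the layer under construction):

```python
def search_pool(pyramid, target_level, depth=2):
    """Nearest-neighbour search pool restricted to the `depth` layers
    immediately below the target layer; depth=None keeps all layers."""
    if depth is None:
        return pyramid[:target_level]
    return pyramid[max(0, target_level - depth):target_level]
```

With `depth=1` or `depth=2`, the per-layer search cost no longer grows with the total pyramid height, which is where the time savings in Tables 3-1 to 3-7 come from.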
Figure 3-24. Plots of performance metrics versus the super-resolution scale for the Arch image. A) PSNR measure. B) Mean SSIM measure. C) Synthesis time. D) Input image.

Figure 3-25. Local-scale search (3X): Arch. A) Super-resolution using all layers. B) Super-resolution using two layers. C) Super-resolution using one layer. D) Ground-truth image.
Table 3-2. Super-resolution performance metrics for the Chip image (Chip.png, 244x200).

  NN search layers  Metric  3X        4X        5X
  All layers        PSNR    33.3741   31.8221   30.011
                    SSIM    0.93953   0.90375   0.86011
                    Time    161.7583  275.4599  264.3727
  Two layers        PSNR    33.3316   31.8537   30.0805
                    SSIM    0.93916   0.90454   0.86161
                    Time    139.6949  207.9144  202.4265
  One layer         PSNR    33.2906   31.8763   30.0065
                    SSIM    0.93856   0.90499   0.86041
                    Time    112.1093  168.5425  171.6889

Figure 3-26. Plots of performance metrics versus the super-resolution scale for the Chip image. A) PSNR measure. B) Mean SSIM measure. C) Synthesis time. D) Input image.

Figure 3-27. Local-scale search (3X): Chip. A) Super-resolution using all layers. B) Super-resolution using two layers. C) Super-resolution using one layer. D) Ground-truth image.
Table 3-3. Super-resolution performance metrics for the Child image (Child.png, 128x128).

  NN search layers  Metric  3X       4X       5X
  All layers        PSNR    31.8381  30.9985  29.4395
                    SSIM    0.82452  0.77239  0.72191
                    Time    49.3216  76.0363  77.0536
  Two layers        PSNR    31.8419  30.9799  29.4733
                    SSIM    0.82553  0.77266  0.72403
                    Time    51.4209  72.3187  74.9522
  One layer         PSNR    31.8211  31.0619  29.4196
                    SSIM    0.82608  0.77599  0.72178
                    Time    47.8861  68.6721  73.3557

Figure 3-28. Plots of performance metrics versus the super-resolution scale for the Child image. A) PSNR measure. B) Mean SSIM measure. C) Synthesis time. D) Input image.

Figure 3-29. Local-scale search (3X): Child. A) Super-resolution using all layers. B) Super-resolution using two layers. C) Super-resolution using one layer. D) Ground-truth image.
Table 3-4. Super-resolution performance metrics for the Dog image (Dog.png, 240x160).

  NN search layers  Metric  3X        4X        5X
  All layers        PSNR    32.4532   31.4364   30.5772
                    SSIM    0.80314   0.73501   0.6804
                    Time    116.5572  187.1007  190.7929
  Two layers        PSNR    32.4207   31.4432   30.5867
                    SSIM    0.80266   0.73535   0.68109
                    Time    101.5607  151.7452  154.1503
  One layer         PSNR    32.3815   31.4596   30.5683
                    SSIM    0.80124   0.73646   0.68167
                    Time    86.6679   129.1416  129.5952

Figure 3-30. Plots of performance metrics versus the super-resolution scale for the Dog image. A) PSNR measure. B) Mean SSIM measure. C) Synthesis time. D) Input image.

Figure 3-31. Local-scale search (3X): Dog. A) Super-resolution using all layers. B) Super-resolution using two layers. C) Super-resolution using one layer. D) Ground-truth image.
Table 3-5. Super-resolution performance metrics for the Lena image (Lena.png, 128x128).

  NN search layers  Metric  3X       4X       5X
  All layers        PSNR    30.6958  29.1676  28.1879
                    SSIM    0.82899  0.76676  0.70821
                    Time    48.7734  74.7626  79.1499
  Two layers        PSNR    30.7121  29.1884  28.2062
                    SSIM    0.83106  0.7648   0.70647
                    Time    51.1011  72.6909  78.3407
  One layer         PSNR    30.6874  29.2284  28.0671
                    SSIM    0.83328  0.7657   0.69992
                    Time    46.945   66.5011  68.1954

Figure 3-32. Plots of performance metrics versus the super-resolution scale for the Lena image. A) PSNR measure. B) Mean SSIM measure. C) Synthesis time. D) Input image.

Figure 3-33. Local-scale search (3X): Lena. A) Super-resolution using all layers. B) Super-resolution using two layers. C) Super-resolution using one layer. D) Ground-truth image.
Table 3-6. Super-resolution performance metrics for the SunFlower image (SunFlower.png, 320x208).

  NN search layers  Metric  3X        4X        5X
  All layers        PSNR    30.1893   28.6672   27.3365
                    SSIM    0.79992   0.72117   0.64708
                    Time    241.1744  416.5091  407.396
  Two layers        PSNR    30.1197   28.6342   27.3164
                    SSIM    0.79794   0.71806   0.64602
                    Time    190.7357  293.7465  288.3112
  One layer         PSNR    30.1374   28.6374   27.3446
                    SSIM    0.79773   0.71854   0.64817
                    Time    149.3594  229.3083  227.7202

Figure 3-34. Plots of performance metrics versus the super-resolution scale for the SunFlower image. A) PSNR measure. B) Mean SSIM measure. C) Synthesis time. D) Input image.

Figure 3-35. Local-scale search (3X): SunFlower. A) Super-resolution using all layers. B) Super-resolution using two layers. C) Super-resolution using one layer. D) Ground-truth image.
Table 3-7. Super-resolution performance metrics for the Zebra image (Zebra.png, 241x161).

  NN search layers  Metric  3X        4X        5X
  All layers        PSNR    26.5049   25.2993   24.7466
                    SSIM    0.69668   0.59089   0.52642
                    Time    116.7284  187.6776  192.1518
  Two layers        PSNR    26.4896   25.2917   24.7434
                    SSIM    0.69599   0.58953   0.52618
                    Time    102.4879  151.0681  154.6033
  One layer         PSNR    26.4937   25.3138   24.7455
                    SSIM    0.69541   0.59244   0.52761
                    Time    87.2966   128.0024  134.9201

Figure 3-36. Plots of performance metrics versus the super-resolution scale for the Zebra image. A) PSNR measure. B) Mean SSIM measure. C) Synthesis time. D) Input image.

Figure 3-37. Local-scale search (3X): Zebra. A) Super-resolution using all layers. B) Super-resolution using two layers. C) Super-resolution using one layer. D) Ground-truth image.
3.4 Non-uniform Super-Resolution using Patch Redundancy

Current super-resolution approaches typically assume that the scaling factor is uniform over all of the image domain. However, many real-world imaging scenarios involve or require a non-linear deformation of the image domain. Here, given a non-linear deformation of the image domain, we propose a new approach to super-resolve and deform input images simultaneously.

3.4.1 Non-uniform Super-Resolution Algorithm

Let D denote the image domain of an input image I, and let \(f(x,y): D \subseteq \mathbb{R}^2 \rightarrow \mathbb{R}^2\) denote a given (non-rigid) space-varying magnifying transform of the image domain. The problem is to generate the push-forward image f(I). We approach this problem as follows:

1- At each pair of spatial image coordinates, compute the Jacobian of the derivatives of f.

2- Find the QR decomposition of the Jacobian matrix.

3- Estimate the local scale to be the largest value of the diagonal of the R matrix: s = max(diagonal(R)). The local scale may be estimated as well as the determinant of the Jacobian matrix. With the local scale computed at each grid point, we obtain a scale map for the whole image. We call this map the super-resolution map. For uniform image scaling, this map is a constant map.

4- Limit the number of distinct scales in the super-resolution map to a pre-specified maximum number of scales. This is an optional step to reduce the complexity of the super-resolution process. We call the resultant map the reduced super-resolution map.

5- Find the unique scales in the (reduced) super-resolution map and order them in ascending order.
6- For the smallest scale in the (reduced) super-resolution map, apply the patch-redundancy algorithm to the whole input image \(I = I_0\). Denote the output of this stage by \(I_1\).

7- For each higher scale in the (reduced) super-resolution map, apply the patch-redundancy algorithm only to the pixels associated with a scale larger than or equal to the current scale in the predecessor output \(I_{k-1}\). Super-resolve other pixels using a fast interpolation scheme (such as bicubic or Lanczos interpolation).

8- Repeat the process in Step 7 till the maximum scale in the (reduced) super-resolution map is applied. Note that this super-resolution output image does not reflect the domain deformation. Also, the output image size may not match the target high-resolution image size because of the limits of the incremental magnification of the patch-redundancy algorithm. We call this output image the overshoot image.

9- Apply the deformation function to the target high-resolution grid. Interpolation may be applied to find the function value at all grid points.

10- For each deformed high-resolution grid point, find its nearest neighbor among the undeformed overshoot image grid. Assign the pixel value of the nearest neighbor to the high-resolution grid point.

The incremental nature of the patch-redundancy algorithm is crucial for producing a high-quality output. However, the nearest-neighbor search is a bottleneck of the patch-redundancy algorithm. As we have shown in previous sections, the computational complexity of the algorithm can be reduced by limiting the search pool to only one or two levels below the sought level. This does not cause any visually discernible degradation of image quality. Here, we employ another trick to avoid the high cost of the patch-redundancy algorithm. Since our goal here is to apply super-resolution non-uniformly, we can sacrifice some accuracy at pixel locations where the required super-resolution factor is low. As explained in Step 7, we do this by applying the patch-redundancy algorithm to the pixels associated with a scale larger than or equal
tothecurrentscale.Otherpixelswillbesuper-resolvedwithbicubicinterpolation.Thisproducesnovisualdegradationsincethese'left-behind'pixelshavebeenconsistentlysuper-resolvedwiththepatchredundancyalgorithmtilltheirtargetscaleisreached. 3.4.2ExperimentalResultsForthefollowingsuper-resolutionresults,wedeneadistancefunctiononthehigh-resolutiongridanddeducetheassociatedsuper-resolutionmap.LetXi,Yibetheimageco-ordinatesattheinputresolution.LetHiandWibetheheightandwidthoftheinputimage,respectively.Foreachgridpoint,letRibethedistancetoanarbitrarybutxedpointofinterestontheimagegrid.Thispointmaybepre-speciedorpickedinteractivelybytheuser.LetXt,Yt,Ht,Wt,Rtbethecorrespondingquantitiesforthetargetresolutiongrid.Letg:(Xt,Yt)!Rbethefollowingdistancefunctionfromthepointofinterestonthetarget-resolutiongrid: g(x,y)=max(Ri)(Rt(x,y)=max(Rt(x,y)))p,(3)wherep1isanattenuationparameter.Thehigh-resolutiongridisdeformedbyg.Wecomputethelocalscale(Step3)forthedeformedhigh-resolutiongridthenestimatethesuper-resolutionmapthroughnearest-neighborsearch.Thatis,foreachdeformedhigh-resolutiongridpoint,wendtheclosestlow-resolutiongridpoint.Then,thescaleatthelow-resolutiongridpointisincreasedbytheinverseofthefunctionvalueatthehigh-resolutiongridpoints.Scalecontributionsfromallhigh-resolutiongridpointsarethenaveragedbytheirnumber.Fromhere,wefollowthealgorithmicstepsabovetoproducethedeformedsuper-resolvedoutput.Foreachoftheexamplesbelow,aninterestpointwasarbitrarilypickedthenasuper-resolutionmapwasdeducedasexplainedabove.Forallexamples,patch-redundancyoutputsshowmorecleardetailsespeciallyatthemostmagniedspotsoftheimages.Thetargetsuper-resolutionfactorisshowninthecaptionforeachimage. 78
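The scale-map computation in Steps 1-4 can be sketched with NumPy as follows. This is a minimal illustration, not the dissertation's implementation: the deformation f is assumed to be supplied as two sampled coordinate arrays, and the names (`fx`, `fy`, `n_scales`) are ours.

```python
import numpy as np

def scale_map(fx, fy, grid_x, grid_y, n_scales=None):
    """Estimate the per-pixel super-resolution map for a deformation
    f(x, y) = (fx(x, y), fy(x, y)) sampled on a regular grid.

    fx, fy         : 2-D arrays of the two output coordinates of f.
    grid_x, grid_y : 1-D coordinate vectors of the sampling grid.
    n_scales       : optional cap on the number of distinct scales (Step 4).
    """
    # Step 1: finite-difference Jacobian entries of f at every grid point.
    dfx_dy, dfx_dx = np.gradient(fx, grid_y, grid_x)
    dfy_dy, dfy_dx = np.gradient(fy, grid_y, grid_x)

    h, w = fx.shape
    s = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            J = np.array([[dfx_dx[i, j], dfx_dy[i, j]],
                          [dfy_dx[i, j], dfy_dy[i, j]]])
            # Steps 2-3: QR decomposition; the local scale is the largest
            # magnitude on the diagonal of R (abs() guards against the
            # sign ambiguity of the QR factorization).
            _, R = np.linalg.qr(J)
            s[i, j] = np.abs(np.diag(R)).max()

    # Step 4 (optional): quantize the map to a small set of distinct
    # scales, producing the reduced super-resolution map.
    if n_scales is not None:
        levels = np.linspace(s.min(), s.max(), n_scales)
        s = levels[np.abs(s[..., None] - levels).argmin(axis=-1)]
    return s
```

For the uniform warp f(x, y) = (2x, 2y), the Jacobian is 2I everywhere, so the map is the constant 2, matching the remark in Step 3.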


Figure 3-38. Non-uniform super-resolution using patch-redundancy: Chip (2X). A) Low-resolution input. B) Deformed nearest-neighbor output. C) Deformed bicubic output. D) Deformed patch-redundancy output.
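The deformed outputs shown in Figure 3-38 are assembled by the nearest-neighbor lookup of Steps 9-10. On a regular pixel grid, that lookup reduces to rounding and clipping; a minimal sketch with hypothetical array names:

```python
import numpy as np

def deform_by_nearest_neighbor(overshoot, warp_x, warp_y):
    """Steps 9-10: for each deformed high-resolution grid point, take the
    value of its nearest neighbor on the undeformed overshoot image grid.

    overshoot      : 2-D overshoot image (super-resolved, undeformed).
    warp_x, warp_y : deformed coordinates of the target high-resolution
                     grid, expressed in overshoot pixel units.
    """
    h, w = overshoot.shape
    # On a regular grid, the nearest neighbor of a continuous coordinate
    # is found by rounding it and clipping to the image bounds.
    cols = np.clip(np.rint(warp_x).astype(int), 0, w - 1)
    rows = np.clip(np.rint(warp_y).astype(int), 0, h - 1)
    return overshoot[rows, cols]
```

An identity warp (or any sub-half-pixel perturbation of it) reproduces the overshoot image exactly, which is a convenient sanity check.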


Figure 3-39. Super-resolution map: Chip. A) Example distance function. B) Super-resolution map. C) Reduced super-resolution map.


Figure 3-40. Non-uniform super-resolution using patch redundancy: Child (3X). A) Low-resolution input. B) Deformed nearest-neighbor output. C) Deformed bicubic output. D) Deformed patch-redundancy output.


Figure 3-41. Non-uniform super-resolution using patch redundancy: Koala (3X). A) Low-resolution input. B) Deformed nearest-neighbor output. C) Deformed bicubic output. D) Deformed patch-redundancy output.


Figure 3-42. Non-uniform super-resolution using patch redundancy: Three Worlds (3X). A) Low-resolution input. B) Deformed nearest-neighbor output. C) Deformed bicubic output. D) Deformed patch-redundancy output.


Figure 3-43. Non-uniform super-resolution using patch redundancy: Glynn Fractals (2X). A) Low-resolution input. B) Deformed nearest-neighbor output. C) Deformed bicubic output. D) Deformed patch-redundancy output.


Figure 3-44. Non-uniform super-resolution using patch redundancy: Square Limit (2X). A) Low-resolution input. B) Deformed nearest-neighbor output. C) Deformed bicubic output. D) Deformed patch-redundancy output.


3.5 Non-Uniform Super-Resolution using the Resolution-Independent Image Model

3.5.1 Resolution-Independent Image Model: A Brief Review

We review here the resolution-independent image model proposed by Viola et al. [56]. This model is a new formulation of image reconstruction problems in which image data are modelled as spatially discrete measurements of a spatially continuous world viewed through continuous-domain point-spread functions (PSFs). Let I be a digital image of n pixels,

I = {(I_i, x_i)}_{i=1}^{n}.

The latent image is represented by a function u(x) defined on a continuous image domain Ω ⊂ R², and is found by minimizing the functional

E_cd(u) = ∫∫ ‖I(x) − u(x)‖ dx + R(u),

where R is a regularizer functional, and where the undertilde notation refers to a function as an object. Using a finite-element approach, the latent image may be modelled as a linear combination of M basis functions,

u(x) = Σ_{m=1}^{M} u_m φ_m(x).

Each noise-free image sample is obtained through the action of a point-spread function ρ_i on the latent image u(x),

I_i = ∫∫ ρ_i(x) u(x) dx,


and the noisy pixel is Ĩ_i = I_i + ε_i, where ε_i is a random variable representing imaging noise. The energy functional is then

E(u) = Σ_{i=1}^{n} ‖Ĩ_i − ∫∫ ρ_i(x) u(x) dx‖ + R(u),

where the regularizer R is a Mumford–Shah functional,

R(u) = λ_1 ∫∫_{Ω∖J(u)} ‖∇u(x)‖^p dx + λ_2 ∫_{J(u)} |u_l(x) − u_r(x)| dx + ⋯

The image domain can be modelled by a triangle mesh defined by mesh vertices V. The energy functional, with the dependencies on V made explicit, may then be rewritten as

E(U, V) = ‖I − F(V)U‖ + λ_1 Σ_t #T_t(V) ‖(u_{t2}, u_{t3})ᵀ‖^p + λ_2 R_disc(U, V).

Computing derivatives with respect to U and V is straightforward except for the first term, whose derivatives with respect to V require polygon clipping followed by moment computations; refer to Viola et al. [56] for more details. To minimize the energy, alternating optimization over U and V is used with a quasi-Newton approach.

3.5.2 Experimental Results

The resolution-independent model can be used for performing non-uniform super-resolution with simple modifications to the algorithm we proposed in Subsection 3.4.1. In particular, super-resolution can be done in one step (rather than incrementally) as long as the image domain is triangulated with a sufficiently small triangle size to obtain the required sub-pixel accuracy. We need, however, to find the optimal point-spread function (PSF) width and regularization parameters for the resolution-independent model. This can be done by sweeping these parameters or by including them in the optimization procedure. Below, we show a couple of examples of non-uniform super-resolution using the resolution-independent model. The results are indeed encouraging and are competitive with the results obtained earlier with the patch-redundancy model.
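The parameter sweep mentioned above can be organized as a simple grid search. The sketch below assumes a held-out reference image is available for scoring, and `reconstruct` is a hypothetical callable standing in for the resolution-independent solver; both names are ours, not from the text.

```python
import numpy as np
from itertools import product

def sweep_parameters(reconstruct, reference, psf_widths, reg_weights):
    """Exhaustively sweep the PSF width and regularization weight,
    keeping the pair whose reconstruction best matches the reference.

    reconstruct(width, weight) -> trial reconstruction (2-D array).
    reference                  -> held-out ground-truth image.
    """
    best_params, best_err = None, np.inf
    for width, weight in product(psf_widths, reg_weights):
        trial = reconstruct(width, weight)
        err = float(np.mean((trial - reference) ** 2))  # MSE score
        if err < best_err:
            best_params, best_err = (width, weight), err
    return best_params, best_err
```

Including the parameters in the optimization itself, as the text also suggests, would replace this outer loop with extra unknowns in the energy minimization; the sweep is simply the cheaper, embarrassingly parallel alternative.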


Figure 3-45. Non-uniform super-resolution using the resolution-independent model: Chip (2X). A) Low-resolution input. B) Deformed nearest-neighbor output. C) Deformed bicubic output. D) Deformed resolution-independent model output.


Figure 3-46. Non-uniform super-resolution using the resolution-independent model: Child (3X). A) Low-resolution input. B) Deformed nearest-neighbor output. C) Deformed bicubic output. D) Deformed resolution-independent model output.


CHAPTER 4
CONCLUSIONS AND FUTURE WORK

4.1 Conclusions

In this work, we have introduced a novel coupled dictionary learning scheme for image super-resolution and color de-rendering. Our scheme models the relation between observed and state variables more accurately by incorporating an augmenting dictionary that links the observation-dictionary reconstruction error to the state-dictionary model. We have shown that the new model produces improved super-resolution and color de-rendering results in comparison with the standard coupled dictionary learning scheme.

Moreover, we have explored approaches for non-uniform image super-resolution. One approach is to use patch redundancy in the image to build a super-resolved image incrementally, with differentiated service for image pixels based on their respective target magnifications. Another approach is to use the resolution-independent image model. This model can super-resolve the image without the need for incremental steps as long as the image domain is triangulated with sufficiently small triangles to obtain the sought sub-pixel accuracy.

4.2 Future Work

Dictionary learning schemes have seen considerable growth, including applications in image super-resolution [57–59]. Building on our experience with dictionary learning [60–62], we plan to investigate novel dictionary-based approaches to super-resolution. For example, we plan to put reasonable structural constraints on the augmenting dictionary model to better capture relevant features in the residual data and to bound the time and space complexity in the case of large dictionary sizes. Moreover, we would like to explore more generalized coupled dictionary models for image super-resolution and other coupling applications. For example, a non-parametric Bayesian dictionary model may be assumed for coupled dictionary models. As well, we will study and experiment


with realistic and data-driven deformations to combine with super-resolution. Finally, we will work on the integration and interleaving of object recognition and super-resolution.


REFERENCES

[1] J. Yang and T. Huang, "Image super-resolution: Historical overview and future challenges," in Super-Resolution Imaging, ser. Digital Imaging and Computer Vision, P. Milanfar, Ed. Taylor & Francis/CRC Press, 2010, ch. 1, pp. 1.

[2] S. C. Park, M. K. Park, and M. G. Kang, "Super-resolution image reconstruction: a technical overview," Signal Processing Magazine, IEEE, vol. 20, no. 3, pp. 21–36, May 2003.

[3] J. Tian and K.-K. Ma, "A survey on super-resolution imaging," Signal, Image and Video Processing, vol. 5, no. 3, pp. 329, 2011.

[4] J. van Ouwerkerk, "Image super-resolution survey," Image and Vision Computing, vol. 24, no. 10, pp. 1039–1052, 2006.

[5] G. Cristobal, E. Gil, F. Sroubek, J. Flusser, C. Miravet, and F. B. Rodríguez, "Superresolution imaging: a survey of current techniques," in Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series, vol. 7074, Aug. 2008.

[6] D. Glasner, S. Bagon, and M. Irani, "Super-resolution from a single image," in Computer Vision, 2009 IEEE 12th International Conference on, Sept. 29–Oct. 2, 2009, pp. 349.

[7] G. Freedman and R. Fattal, "Image and video upscaling from local self-examples," ACM Trans. Graph., vol. 28, no. 3, pp. 1, 2010.

[8] H. He and W.-C. Siu, "Single image super-resolution using Gaussian process regression," in Computer Vision and Pattern Recognition (CVPR), 2011 IEEE Conference on, June 2011, pp. 449.

[9] Y. Tang, Y. Yuan, P. Yan, and X. Li, "Single-image super-resolution via sparse coding regression," in Image and Graphics (ICIG), 2011 Sixth International Conference on, Aug. 2011, pp. 267.

[10] G. Sun and C. Qin, "Single image super-resolution via sparse representation in gradient domain," in Multimedia Information Networking and Security (MINES), 2011 Third International Conference on, Nov. 2011, pp. 24.

[11] Y. Fan, Z. Gan, Y. Qiu, and X. Zhu, "Single image super resolution method based on edge preservation," in Image and Graphics (ICIG), 2011 Sixth International Conference on, Aug. 2011, pp. 394.


[12] S. Kim, D. Kim, T.-C. Kim, M. Hayes, and J. Paik, "Selective frequency decomposition in the wavelet domain for single image-based super-resolution," in Consumer Electronics (ICCE), 2012 IEEE International Conference on, Jan. 2012, pp. 124.

[13] J. Li and D. Tao, "Wisdom of crowds: Single image super-resolution from the web," in Data Mining Workshops (ICDMW), 2011 IEEE 11th International Conference on, Dec. 2011, pp. 812.

[14] X. Lu, P. Yan, Y. Yuan, X. Li, and H. Yuan, "Utilizing homotopy for single image superresolution," in Pattern Recognition (ACPR), 2011 First Asian Conference on, Nov. 2011, pp. 316.

[15] H. Kim, Y. Cha, and S. Kim, "Curvature interpolation method for image zooming," Image Processing, IEEE Transactions on, vol. 20, no. 7, pp. 1895, July 2011.

[16] F. Zhou, W. Yang, and Q. Liao, "Interpolation-based image super-resolution using multi-surface fitting," Image Processing, IEEE Transactions on, vol. PP, no. 99, p. 1, 2012.

[17] A. Maalouf and M.-C. Larabi, "Colour image super-resolution using geometric grouplets," Image Processing, IET, vol. 6, no. 2, pp. 168, March 2012.

[18] F. Zhou, W. Yang, and Q. Liao, "A coarse-to-fine subpixel registration method to recover local perspective deformation in the application of image super-resolution," Image Processing, IEEE Transactions on, vol. 21, no. 1, pp. 53, Jan. 2012.

[19] A. Milchevski, Z. Ivanovski, and B. Mustafa, "Machine learning based super-resolution algorithm robust to registration errors," in Digital Signal Processing Workshop and IEEE Signal Processing Education Workshop (DSP/SPE), 2011 IEEE, Jan. 2011, pp. 326.

[20] M. Vrigkas, C. Nikou, and L. Kondi, "On the improvement of image registration for high accuracy super-resolution," in Acoustics, Speech and Signal Processing (ICASSP), 2011 IEEE International Conference on, May 2011, pp. 981.

[21] H. Su, Y. Wu, and J. Zhou, "Super-resolution without dense flow," Image Processing, IEEE Transactions on, vol. PP, no. 99, p. 1, 2011.

[22] T. Schoenemann and D. Cremers, "A coding-cost framework for super-resolution motion layer decomposition," Image Processing, IEEE Transactions on, vol. 21, no. 3, pp. 1097, March 2012.
[23] K. Sunkavalli, N. Joshi, S. Kang, M. Cohen, and H. Pfister, "Video snapshots: Creating high-quality images from video clips," Visualization and Computer Graphics, IEEE Transactions on, vol. PP, no. 99, p. 1, 2012.


[24] J. Chen, J. Nunez-Yanez, and A. Achim, "Video super-resolution using generalized Gaussian Markov random fields," Signal Processing Letters, IEEE, vol. 19, no. 2, pp. 63, Feb. 2012.

[25] T. Katsuki, A. Torii, and M. Inoue, "Posterior mean super-resolution with a causal Gaussian Markov random field prior," Image Processing, IEEE Transactions on, vol. PP, no. 99, p. 1, 2012.

[26] H. Dong, Z. Yingxue, and X. Peng, "A new super-resolution reconstruction method combining narrow quantization constraint set and motion estimation for H.264 compressed video," in Image and Graphics (ICIG), 2011 Sixth International Conference on, Aug. 2011, pp. 258.

[27] X. Gao, K. Zhang, D. Tao, and X. Li, "Image super-resolution with sparse neighbor embedding," Image Processing, IEEE Transactions on, vol. PP, no. 99, p. 1, 2012.

[28] ——, "Joint learning for single-image super-resolution via a coupled constraint," Image Processing, IEEE Transactions on, vol. 21, no. 2, pp. 469, Feb. 2012.

[29] X. Zeng and H. Huang, "Super-resolution method for multiview face recognition from a single image per person using nonlinear mappings on coherent features," Signal Processing Letters, IEEE, vol. 19, no. 4, pp. 195, April 2012.

[30] Y. Wang, Z. Zhang, W. Li, and F. Jiang, "Combining tensor space analysis and active appearance models for aging effect simulation on face images," Systems, Man, and Cybernetics, Part B: Cybernetics, IEEE Transactions on, vol. PP, no. 99, pp. 1, 2012.

[31] J.-J. Liu, C.-C. Chang, L.-T. Wang, C.-K. Yang, and S.-Y. Huang, "A new reading interface design for senior citizens," in Instrumentation, Measurement, Computer, Communication and Control, 2011 First International Conference on, Oct. 2011, pp. 349.

[32] Y. Ogawa, Y. Kato, T. Shinohara, T. Fujita, and N. Minegishi, "A single chip image processor for various in-car display equipments," in Consumer Electronics (ICCE), 2012 IEEE International Conference on, Jan. 2012, pp. 303.

[33] B. Scherrer, A. Gholipour, and S. K. Warfield, "Super-resolution reconstruction of diffusion-weighted images from distortion compensated orthogonal anisotropic acquisitions," in Mathematical Methods in Biomedical Image Analysis (MMBIA), 2012 IEEE Workshop on, Jan. 2012, pp. 249.
[34] D. Wallach, F. Lamare, G. Kontaxakis, and D. Visvikis, "Super-resolution in respiratory synchronized positron emission tomography," Medical Imaging, IEEE Transactions on, vol. 31, no. 2, pp. 438, Feb. 2012.


[35] Y. Sakuta, A. Tsutsui, T. Goto, M. Sakurai, and R. Sakai, "Super-resolution utilizing total variation regularization on cell processor," in Consumer Electronics (ICCE), 2012 IEEE International Conference on, Jan. 2012, pp. 723.

[36] L. Nana, Z. Weixing, M. Shiying, and H. Wen, "Super resolution video reconstruction in DSP+FPGA based on lifting wavelet," in Electronic Measurement & Instruments (ICEMI), 2011 10th International Conference on, vol. 3, Aug. 2011, pp. 101.

[37] F. Li, X. Jia, D. Fraser, and A. Lambert, "Super resolution for remote sensing images based on a universal hidden Markov tree model," Geoscience and Remote Sensing, IEEE Transactions on, vol. 48, no. 3, pp. 1270, March 2010.

[38] S. C. Park, M. K. Park, and M. G. Kang, "Super-resolution image reconstruction: a technical overview," Signal Processing Magazine, IEEE, vol. 20, no. 3, pp. 21–36, May 2003.

[39] G. Cristobal, E. Gil, F. Sroubek, J. Flusser, C. Miravet, and F. B. Rodríguez, "Superresolution imaging: a survey of current techniques," in Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series, vol. 7074, Aug. 2008.

[40] W. Freeman, T. Jones, and E. Pasztor, "Example-based super-resolution," Computer Graphics and Applications, IEEE, vol. 22, no. 2, pp. 56, Mar./Apr. 2002.

[41] D. Glasner, S. Bagon, and M. Irani, "Super-resolution from a single image," in ICCV, 2009.

[42] J. Yang, J. Wright, T. Huang, and Y. Ma, "Image super-resolution as sparse representation of raw image patches," in Computer Vision and Pattern Recognition, 2008. CVPR 2008. IEEE Conference on, June 2008, pp. 1.

[43] ——, "Image super-resolution via sparse representation," Image Processing, IEEE Transactions on, vol. 19, no. 11, pp. 2861, Nov. 2010.

[44] R. Zeyde, M. Elad, and M. Protter, "On single image scale-up using sparse-representations," in Curves and Surfaces, 2010, pp. 711.

[45] J. Yang, Z. Wang, Z. Lin, S. Cohen, and T. Huang, "Couple dictionary training for image super-resolution," Image Processing, IEEE Transactions on, vol. PP, no. 99, p. 1, 2012.

[46] D. Lin and X. Tang, "Coupled space learning of image style transformation," in Computer Vision, 2005. ICCV 2005. Tenth IEEE International Conference on, vol. 2, Oct. 2005, pp. 1699 Vol. 2.


[47] K. Jia, X. Tang, and X. Wang, "Image transformation based on learning dictionaries across image spaces," Pattern Analysis and Machine Intelligence, IEEE Transactions on, vol. PP, no. 99, p. 1, 2012.

[48] S. Wang, L. Zhang, Y. Liang, and Q. Pan, "Semi-coupled dictionary learning with applications to image super-resolution and photo-sketch image synthesis," in Proc. IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), 2012.

[49] M. Aharon, M. Elad, and A. Bruckstein, "K-SVD: An algorithm for designing overcomplete dictionaries for sparse representation," Signal Processing, IEEE Transactions on, vol. 54, no. 11, pp. 4311, Nov. 2006.

[50] A. Chakrabarti, D. Scharstein, and T. Zickler, "An empirical camera model for internet color vision," London, UK, Sept. 2009.

[51] H. T. Lin, S. J. Kim, S. Susstrunk, and M. S. Brown, "Revisiting radiometric calibration for color computer vision," ICCV, 2011.

[52] Y. Xiong, K. Saenko, T. Darrell, and T. Zickler, "From pixels to physics: Probabilistic color de-rendering," in Computer Vision and Pattern Recognition (CVPR), 2012 IEEE Conference on, June 2012, pp. 358.

[53] S. J. Kim, H. T. Lin, Z. Lu, S. Susstrunk, S. Lin, and M. S. Brown, "A new in-camera imaging model for color computer vision and its application," in IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2012.

[54] A. Buades, B. Coll, and J. M. Morel, "A review of image denoising algorithms, with a new one," Multiscale Model. Simul., vol. 4, pp. 490, 2005.

[55] M. Protter, M. Elad, H. Takeda, and P. Milanfar, "Generalizing the nonlocal-means to super-resolution reconstruction," Image Processing, IEEE Transactions on, vol. 18, no. 1, pp. 36, Jan. 2009.

[56] F. Viola, A. Fitzgibbon, and R. Cipolla, "A unifying resolution-independent formulation for early vision," in Computer Vision and Pattern Recognition (CVPR), 2012 IEEE Conference on, 2012, pp. 494.

[57] J. Yang, J. Wright, T. Huang, and Y. Ma, "Image super-resolution via sparse representation," Image Processing, IEEE Transactions on, vol. 19, no. 11, pp. 2861, Nov. 2010.

[58] J. Yang, Z. Wang, Z. Lin, S. Cohen, and T. Huang, "Couple dictionary training for image super-resolution," Image Processing, IEEE Transactions on, vol. PP, no. 99, p. 1, 2012.
[59] G. Mu, X. Gao, K. Zhang, X. Li, and D. Tao, "Single image super resolution with high resolution dictionary," in Image Processing (ICIP), 2011 18th IEEE International Conference on, Sept. 2011, pp. 1141.


[60] M. Rushdi and J. Ho, "Augmented coupled dictionary learning for image super-resolution," in Machine Learning and Applications (ICMLA), 2012 11th International Conference on, vol. 1, 2012, pp. 262.

[61] ——, "Texture classification using sparse k-SVD texton dictionaries," in VISAPP, 2011, pp. 187.

[62] ——, "Large-scale-invariant texture recognition," in VISAPP, 2011, pp. 442.


BIOGRAPHICAL SKETCH

Muhammad Rushdi was born in Urbana, Illinois, in 1979. He received bachelor's and master's degrees in biomedical and systems engineering from Cairo University, Egypt, in 2001 and 2005, respectively. In his thesis research, Muhammad developed a novel algorithm for noise reduction in medical ultrasound. A paper about this work was presented at the 28th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC 2006). Muhammad also earned a second bachelor's degree, in mathematics, from Cairo University in 2003. From January 2005 to May 2006, Muhammad attended the University of Miami as a graduate research and teaching assistant in the Department of Electrical and Computer Engineering, where he took several graduate-level courses in engineering and mathematics. In the summer of 2006, Muhammad was an intern at Hewlett-Packard Development Company in Vancouver, WA. In August 2006, Muhammad moved with his family to Gainesville to pursue a doctoral degree at the Department of Computer and Information Science and Engineering, University of Florida. He worked under the guidance of Dr. Jeffrey Ho beginning in 2007. In the summer of 2007, Muhammad interned for the second time at Hewlett-Packard Development Company, in Boise, ID. From January 2008 to June 2011, Muhammad worked as a software developer at the Sage Software Healthcare Division in Alachua, FL. In the summer of 2012, Muhammad earned a second master's degree, in computer science. In April 2013, he received his Ph.D. degree in computer engineering from the University of Florida. Muhammad's doctoral work involved machine learning techniques for super-resolution imaging. Muhammad then moved back to Egypt to work as an assistant professor at Cairo University.


Muhammad's general research interests are in the areas of image processing, computer vision, machine learning, and applied mathematics. His non-academic interests include activities related to promoting human rights, civil liberties, interfaith dialogue, and social justice. In addition, Muhammad loves reading books, hiking, and taking family trips with his wife Doaa, his daughter Jumanah, and his son Ali.