
The Correntropy MACE Filter for Image Recognition

Permanent Link: http://ufdc.ufl.edu/UFE0021400/00001

Material Information

Title: The Correntropy MACE Filter for Image Recognition
Physical Description: 1 online resource (123 p.)
Language: english
Creator: Jeong, Kyu-Hwa
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2007

Subjects

Subjects / Keywords: correlation, correntropy, dimensionality, fastgausstransform, mace, pattern, random, rkhs
Electrical and Computer Engineering -- Dissertations, Academic -- UF
Genre: Electrical and Computer Engineering thesis, Ph.D.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, territorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

Notes

Abstract: The major goal of my research was to develop nonlinear extensions of the family of distortion-invariant filters, specifically the minimum average correlation energy (MACE) filter, a well-known correlation filter for pattern recognition. My research investigated a closed-form solution of a nonlinear version of the MACE filter using the recently introduced correntropy function. Correntropy is a positive definite function that generalizes the concept of correlation by utilizing higher-order moments of the signal statistics. Because of its positive definite nature, correntropy induces a new reproducing kernel Hilbert space (RKHS). Taking advantage of the linear structure of this RKHS, it is possible to formulate and solve the MACE filter equations in the RKHS induced by correntropy. Due to the nonlinear relation between the feature space and the input space, the correntropy MACE (CMACE) can potentially improve upon the MACE performance while preserving the shift-invariant property (although additional computation is required for all shifts in the CMACE). To alleviate the computational complexity of the solution, my research also presents a fast CMACE using the Fast Gauss Transform (FGT). Both the MACE and the CMACE are essentially memory-based algorithms, and because of the high dimensionality of image data, the computational cost of the CMACE filter is one of the critical issues in practical applications. Therefore, my research also employed dimensionality reduction based on random projections (RP), which has emerged as a powerful method in machine learning. We applied the CMACE filter to face recognition using facial expression data and to the MSTAR public release Synthetic Aperture Radar (SAR) data set, and experimental results show that the proposed CMACE filter indeed outperforms the traditional linear MACE and the kernelized MACE in both generalization and rejection abilities.
In addition, simulation results in face recognition show that the CMACE filter with random projections (CMACE-RP) still outperforms the traditional linear MACE, with only a small degradation in performance relative to the full CMACE but great savings in storage and computational complexity.
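The correntropy function described in the abstract can be illustrated with a short sketch. Below is a minimal sample estimator of cross-correntropy between two signals using a Gaussian kernel; the bandwidth parameter `sigma` and the function name are illustrative assumptions, not notation taken from the dissertation itself. With a Gaussian kernel, the estimator implicitly involves all even-order moments of the difference between the signals, which is the sense in which correntropy generalizes ordinary correlation.

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample estimator of cross-correntropy between two equal-length signals.

    Illustrative sketch only: averages a Gaussian kernel evaluated on the
    pointwise differences x[i] - y[i]. `sigma` is the kernel bandwidth,
    a free parameter (an assumption here, not specified in the abstract).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    diff = x - y
    # Gaussian kernel on each difference, averaged over samples.
    return np.mean(np.exp(-diff**2 / (2.0 * sigma**2)))

# Identical signals attain the kernel's maximum value of 1.0;
# mismatched signals yield strictly smaller values.
print(correntropy([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # → 1.0
```

Because the kernel evaluation is positive definite, the resulting function induces a reproducing kernel Hilbert space, which is the structure the CMACE formulation exploits.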
General Note: In the series University of Florida Digital Collections.
General Note: Includes vita.
Bibliography: Includes bibliographical references.
Source of Description: Description based on online resource; title from PDF title page.
Source of Description: This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Statement of Responsibility: by Kyu-Hwa Jeong.
Thesis: Thesis (Ph.D.)--University of Florida, 2007.
Local: Adviser: Principe, Jose C.

Record Information

Source Institution: UFRGP
Rights Management: Applicable rights reserved.
Classification: lcc - LD1780 2007
System ID: UFE0021400:00001



This item has the following downloads:


Full Text
4cf5fca3ebba6a68bb6f7d6976f4778c
93c78d4df00529c54e4379a474e456f18a4b392e
1016605 F20101206_AABKPL jeong_k_Page_043.jp2
3dfe65157b553ea2dfa37dd9df5a7c54
2d987e2158f2e1123b6d86e4df5848274459dffc
38269 F20101206_AABKOX jeong_k_Page_113.jpg
f9cfd894512abfa1b3e2d5d408cbc420
eaea2be213acaed5aae03293eadd8347ca9ab7a9
1051969 F20101206_AABKQA jeong_k_Page_080.jp2
0e587a1e8bfa10a71ac196e64cbdefcb
58e8a5ffc80e7866691c573229eafd5117f4a625
876575 F20101206_AABKPM jeong_k_Page_045.jp2
f7225c9c77ef62d2e2121dbf1fe74a2c
d091c535f1c29bf5a6af4bba9e6c6bf756a78ea6
28558 F20101206_AABKOY jeong_k_Page_115.jpg
735a8c235dfce8608e74f4270564aabf
2a08a24fe436a3b146c99910e09e693d9d87ae40
906798 F20101206_AABKQB jeong_k_Page_085.jp2
f09d77dbeed15ea88d4114021488dbf3
88a2e08a119a492a94471c19e0963cbf21de9a1f
787688 F20101206_AABKPN jeong_k_Page_046.jp2
1d7f1715b72497a30b51833d1fc17cfe
6b3cfa59011128e51ce2072522ddc65892a9bac2
89101 F20101206_AABKOZ jeong_k_Page_118.jpg
2f2062e15ac09a999ae6df7e765061bf
4b40c2b2744fbaa69a1cc7413eabd8efa48368a1
929028 F20101206_AABKQC jeong_k_Page_089.jp2
b6d22e108d1ee8c080e9b3d2cf3f3c1f
9076ec8265b97b86590effc9d803fba366816a5f
1051960 F20101206_AABKPO jeong_k_Page_051.jp2
2656f406de289e819052046a7c3940c4
b262223809b0b85816d0b678b084c872aa581679
F20101206_AABKQD jeong_k_Page_092.jp2
1827b7b151a10e9478975ff56045ede1
89347d1446b066a742b1cc897224d13628c95e69
1045467 F20101206_AABKPP jeong_k_Page_054.jp2
3e022901d738163fbe595ec157b57473
dc4f0c213f49f8e7efeab93bfc25bd57db008538
792721 F20101206_AABKQE jeong_k_Page_098.jp2
1b3e0e14eef269cc105334e83fc80fc2
922bb99f00671fca66f86a630720daa29dbc4f14
1051986 F20101206_AABKPQ jeong_k_Page_057.jp2
8ef2ef6db587f3fd9e0325f826e146bb
190f5f71aae1d9402c3870c22c615ff2001d9241
141261 F20101206_AABKQF jeong_k_Page_099.jp2
33f38c3386e84adf95fbd268a2131495
81d0c5c975a96203ab4ff9cd531f584cd57af7de
53657 F20101206_AABKQG jeong_k_Page_101.jp2
8dfbea29310e9b5963225a21d1fff5f3
a35189fa12ab470faba8405952ff4fcb238b8558
F20101206_AABKPR jeong_k_Page_060.jp2
955926a6e2559221aeb727d6f452b25c
34db87fd36d279c7c74d4d963db54ce65be1df15
99685 F20101206_AABKQH jeong_k_Page_102.jp2
b354cd9cae9c7c8d27ed9ebff9f37da7
3955320c231d9863c320e70fee963737b68fc3e6
1051950 F20101206_AABKPS jeong_k_Page_061.jp2
28d0f5ed18f4abff29d25227ca50ae5c
cef358368a6fdc1ccb2c29e086c8e0dbc5de05c6
1051977 F20101206_AABKQI jeong_k_Page_103.jp2
53dc4ff799781c55b8f5c5b83f687a24
50ee1bb6af3493c334c29b0b5b03016eb8eb3570
953230 F20101206_AABKPT jeong_k_Page_062.jp2
0a62a93a0fc38c95735ecf8c270632d7
12a2976d3bbfc02f0c0cef13f8f4494365d9c955
914437 F20101206_AABKQJ jeong_k_Page_106.jp2
2831858109fad616cd8e3d82cc2badb6
3101214f41749ea3b0a7d13eff290725be66ad90
1015974 F20101206_AABKPU jeong_k_Page_064.jp2
4239ecbfad8f813962594bebac87af5a
96eda95e294955de2ab104db3bffadc927a43b1f
62523 F20101206_AABKQK jeong_k_Page_108.jp2
7e6946933015077e2978326ee4ca0a0e
9e5da8098af0e162872d05758356f21d8f65c0cb
F20101206_AABKPV jeong_k_Page_065.jp2
3c43d10e741c2dd86661c2a6727a7ff4
f9de7fc120e42191d34e48ee4d69de9493d781c9
1051976 F20101206_AABKQL jeong_k_Page_110.jp2
f89b065af2c4e40728f256aa2d54c594
f4f68822e6ab57496672eeb32e2dca70506eccb9
1051929 F20101206_AABKPW jeong_k_Page_066.jp2
2be3f992aa711eaed613b5678aa7d12f
9ab8c0bf56884cc992d32098f2b7e1cfc24dd2ae
F20101206_AABKRA jeong_k_Page_022.tif
0d04467ddb7ddd7ade627802d2a23e3a
8b6c554055d4c4766c1f7f6d09f8a6cc9024898d
124182 F20101206_AABKQM jeong_k_Page_111.jp2
ec8933818b939b4920c2c86e215e6bf5
37424e0bb01df5db1097c196bbedeea9a9a0a17d
784631 F20101206_AABKPX jeong_k_Page_076.jp2
ba907c742161147a6321902dc874d63a
5396067b67b99784a8c2d05f219c2947afa4dec6
F20101206_AABKRB jeong_k_Page_023.tif
32435d592ebe7adeb6a701b234306d2d
c1c3b0e0a97b714b9de45dcb7ba0d33e931678b8
37059 F20101206_AABKQN jeong_k_Page_115.jp2
30bb249d8c8a4ec7aad31fa21041f4d5
6e1ab1f0560880308fb365eaaad1a8a39b0e37d5
37750 F20101206_AABKPY jeong_k_Page_078.jp2
057552665709562428e7566a475176b2
f7517bcab4b92bf15c63218fdbf3287727d31ff5
F20101206_AABKRC jeong_k_Page_026.tif
4b111790b72118074ab5f293454b6e99
53f33f113137a44cee236c828c72dedc0cf17b3b
137531 F20101206_AABKQO jeong_k_Page_118.jp2
7bd29daf8f46e298ee5e3a0481f5b847
514574f84520685ade0bea063794756c6426e8b0
1051968 F20101206_AABKPZ jeong_k_Page_079.jp2
74dd80a973231eb9ac2e1704b0845cdc
3f68a8f96f0b264f045492e62913e89c13e0ea74
F20101206_AABKRD jeong_k_Page_028.tif
7971c97b463fce2fb89285e207418f40
04de1028455ce7d0e9f59b7106a951a0ad07ae8b
129170 F20101206_AABKQP jeong_k_Page_119.jp2
2d22027416b3c32633ca57cff7de8ad2
a75faad35073a45b068c3e38e51f454a1baea335
F20101206_AABKRE jeong_k_Page_033.tif
99effcfd8a1a3a13612d0d5ea003cc42
d65202e2c37d1823a49071ec0f126496cfcf18dd
46823 F20101206_AABKQQ jeong_k_Page_123.jp2
c052eebc18d41ad87ac4c4ddf5bc3f40
5208d412ce5b8bf9e0d6d8133537dae8633ee666
F20101206_AABKRF jeong_k_Page_034.tif
6b5b84b5fe46c4b323b1e25c4904d992
6f3c133c6380dace8d582b9e3213235eda93cf67
F20101206_AABKQR jeong_k_Page_001.tif
4153eafbb7cc27a88751e6a9abf95612
79d50af3ac9bc09dcf646b5cb6cc9be95f792e84
F20101206_AABKRG jeong_k_Page_037.tif
7072099d7d135c3044df1180eaaccdf8
0cc1d82bcedb09ad8f03dfacab43818814a49f78
F20101206_AABKRH jeong_k_Page_039.tif
b0e33587e190232ffdfb9be54164301c
fd3be40ba96aa12cee58b3ca109c6d426494691d
F20101206_AABKQS jeong_k_Page_002.tif
ef0c23916c35a009944912bb97f44905
0d99dfda664bdeb0d3cdf327a87bbd91462c3c6b
F20101206_AABKRI jeong_k_Page_047.tif
27d330b42669cff4007be0924d5e0959
10ae6fa7b181f8246efad5407236392705b9dfce
F20101206_AABKQT jeong_k_Page_003.tif
fa0be16903c2c6f7d20b96e17c9149db
d732a6e09646728d44af33b5e8f07e66f7380150
F20101206_AABKRJ jeong_k_Page_048.tif
89e3e27e5939815b9a206278bd85ce73
0d86813ad1dabec5bbf34a9c244a991563d81125
F20101206_AABKQU jeong_k_Page_005.tif
348ca7d83174bfb78824e6551cfad2a1
85be91c5ac3435318b6d9dbf3489c93a116d33ad
F20101206_AABKRK jeong_k_Page_051.tif
0c7148b85629954ccf8aa07100ea9a50
328afea86e1c5180ec74be60fedaa83b70edcdde
F20101206_AABKQV jeong_k_Page_006.tif
8389296f628536d78f4facb21ec70c15
80f2902fe4e52e50ef4fa11b0671661fac77173a
F20101206_AABKRL jeong_k_Page_060.tif
d7714c9cc499bf56c22891af6dd7dc72
5ed3aff9163a4b9a3546604bb2ca27e886bd2a4b
F20101206_AABKQW jeong_k_Page_010.tif
6cfe572465206f56f3b7271cbfcbfc9b
e7dc21e4a1b8be5fcac1e7f7a942e70d501fabe0
F20101206_AABKRM jeong_k_Page_061.tif
efa937d0e6c209f116cf7a890d10071c
919371d0bc95a6455ab698663ceced4f72ff382a
F20101206_AABKQX jeong_k_Page_013.tif
70190df4e14333f44e711de17091f6ac
00de9cef450c0a4d12c06b3d07352a8f3d63961b
F20101206_AABKSA jeong_k_Page_100.tif
10c10c69701649f49f2d26d7461a7a3d
cdfbb2979ef66cd48c5a3b96f0db9e7c68731bfa
F20101206_AABKRN jeong_k_Page_067.tif
bcb11f689a67a3b6f525f65541212fcb
80a24a5502b1912158a2009556d6b579817aba96
F20101206_AABKQY jeong_k_Page_014.tif
8a0096155e6fa64ad93a433edc702b43
86887249642d6fa21ba7af6f2292e4196a1fe96c
F20101206_AABKSB jeong_k_Page_110.tif
dfa9067e9e4256355108617b37e99217
94a681e7dca05102e82c94fea1317b87a1251e5f
F20101206_AABKRO jeong_k_Page_070.tif
d74cb53bdbf1d0a1cb25d07d22c74fdd
fd101ad86f494869ecbd0d78029c731983637faf
F20101206_AABKQZ jeong_k_Page_017.tif
ad5f901b4bcb214b962a1061b40b11e6
b0f4e84df6f7e5863e42070c55843d90fe080754
F20101206_AABKSC jeong_k_Page_112.tif
fe9fe4e5c1be00443bc8b4c5a1f09e19
242e21cef67b9209b7433e2c573a354a1c96c300
F20101206_AABKSD jeong_k_Page_114.tif
2eca551579bf1f6dcba17e6267641f49
143cbfc699aac3e193bb6932eeee3710a661c8ab
F20101206_AABKRP jeong_k_Page_073.tif
2670f76075e0d3b388fac4a0a3be5472
7d920b46a57622de522c0de24e74364b1306942e
F20101206_AABKSE jeong_k_Page_117.tif
a29e71a3b4bcd2cc91c2ffe30b09eadb
e1a76cabf0b330004c3d1e622385bdbd742f3cb4
F20101206_AABKRQ jeong_k_Page_079.tif
463504e647a3cbc8357f36eedc08c794
d6e4f59fe5cf5db22c3094e4e7ad4d552859ebd3
7394 F20101206_AABKSF jeong_k_Page_001.pro
42acfc679345221c1c8e524decae5c2d
c4ec649071e7163816d0a444a5a34854877dd108
F20101206_AABKRR jeong_k_Page_081.tif
62d7646a5ca523dacb3fddef97227401
c0da9f673a961aa0b358a6a6da3db2ea969e5467
29439 F20101206_AABKSG jeong_k_Page_004.pro
23c4711b2f5641b54372900c4522a970
9bb8a2b96162a9e3a3fa26d1cde8e40c66efcc99
F20101206_AABKRS jeong_k_Page_083.tif
95630d1b8ef316e09477f7e79337570b
d6183af77ac8fe6c5c82ffbe1476c11f2b29558a
54659 F20101206_AABKSH jeong_k_Page_005.pro
cb47f8f21df47b216b4b5c92110d8fc9
b075d0e5929c65b7d30c227c4cddb6e0ffea03b7
66364 F20101206_AABKSI jeong_k_Page_006.pro
7406dc4ac393a038262ee9322c590579
ba9cf7b645baee91e3a946a85bd247a8a79fc497
F20101206_AABKRT jeong_k_Page_085.tif
80befdea780e43a666128544c9856ff5
da2fab1c639de1eab67cb791931066de09baaa03
21278 F20101206_AABKSJ jeong_k_Page_007.pro
3a567f835096387cb4aa79c6ad0ffa52
08af2a21824dc27fbea585e61882d955a9f10f9a
F20101206_AABKRU jeong_k_Page_086.tif
dd52f1096d126ae4d608cdcc3e31b39a
882c4bf0f2162e840611027f2aab423ccf662313
49917 F20101206_AABKSK jeong_k_Page_011.pro
2171f2542fb1201d1197828bdbba53f2
d862d46a2223b6c966d773d387098abd03608204
F20101206_AABKRV jeong_k_Page_088.tif
1df2a9264162c7e55e8188aa9d6e1fab
5d748627e1220eb11e73ad86f32de98006e216f5
54173 F20101206_AABKSL jeong_k_Page_014.pro
83e7f2b24868405374c8bcac3a122d3f
aa22f2333060d0261c1aaced87bee4e75a70095b
F20101206_AABKRW jeong_k_Page_091.tif
0389e96fa660044347b860acfd9d3da0
9cc615218faff0e7aec55c21a183b5244567da16
38663 F20101206_AABKTA jeong_k_Page_087.pro
423aa0c5f9a468857f16f867463cffc5
6aa24af1c29adc142554167400571c1494bf9fb0
58907 F20101206_AABKSM jeong_k_Page_016.pro
9da9ea7e40035ffe02255079e9874a30
108dbc5f1ab2807dc090b61b85b5427544c7c9ea
F20101206_AABKRX jeong_k_Page_094.tif
e752f0d862defb58724e40efc74223b6
6f7ca6a3f6bbc422c38076547ba21bd6149f719f
62182 F20101206_AABKTB jeong_k_Page_088.pro
e4d43987debd8cedc91b309ef42975a0
a712208e1c398fc5f8bdc2a1c31d9086ccde3ef7
24980 F20101206_AABKSN jeong_k_Page_024.pro
cb04fe8abdfdec2ed02e3ca6a60fd175
528ff40188314cef15917a65270991400558fb45
F20101206_AABKRY jeong_k_Page_096.tif
3854b492d9aa3dc6c34e168fbd2954eb
c4be7e09a907e70d876b55269fc657396c0b9740
38153 F20101206_AABKTC jeong_k_Page_089.pro
5b2f2bd472eddfe4b72eab4ca9e3040f
c17a1e2618146db94f8c444fb73df1bab116c3ae
38540 F20101206_AABKSO jeong_k_Page_029.pro
1febc47aed7fd2d65f68cf681a09e291
23ec4173f9ee383ac7a0b7336529ba5e80f49305
F20101206_AABKRZ jeong_k_Page_098.tif
9eac88f1a0a83d92d67dcea47c1592d2
2f5340b6d43028bf4f4a1aefe7a164b30612bd1e
35222 F20101206_AABKTD jeong_k_Page_090.pro
d53449a73e0847cfab09e25ab44cef59
d2158f78e01b5c2fce35a29f2db8a199a3ae7dde
33363 F20101206_AABKSP jeong_k_Page_038.pro
8845f3286a415710553e87cd377b013f
4c5141fb0a99e9f0ea0c8eba7c728e8a14a0d4f0
49645 F20101206_AABKTE jeong_k_Page_091.pro
9d169f2d123249adf6e6aaac2744b1b8
4170c2ad5c5a2e231f3b8dce925e09113cee0f5f
25495 F20101206_AABKSQ jeong_k_Page_040.pro
a08a9a0b556957a85ad1d3c5e1fd5695
f7e997858b3798f45dec0d64637671b8c2142be4
60956 F20101206_AABKTF jeong_k_Page_094.pro
7f2ebaf7a21ef3d1d369608345400459
649f18331518bba34541774ee871c4f37fdd27cd
42696 F20101206_AABKSR jeong_k_Page_045.pro
aec70c7b60d65d0f753f411575f37e05
0adb423163a0e3ddf480fa95c906af9fdacaad8d
57249 F20101206_AABKTG jeong_k_Page_095.pro
ea277a60852a596e0e289bf62834b6e3
adc19b148dae7ea5367832f6caefe83bdd4401d0
27369 F20101206_AABKSS jeong_k_Page_048.pro
f64578b1d8a4c303c5b603043c725e15
99865a5b91e1f6674899de72498dd47248a5dd95
56118 F20101206_AABKTH jeong_k_Page_096.pro
f171e300a631e05dc646cbce19822c64
80162ebdff68022beddbcda53eff8311136a3417
32666 F20101206_AABKST jeong_k_Page_053.pro
f9cc2a3df9296ea9430dbf4d4fd28ee7
44d1305f2c771c47cc2428e55ff3e22a98d6d41c
3099 F20101206_AABKTI jeong_k_Page_097.pro
817a5417aee7d10dbce9fb202d58432b
b9fe34035df480c56a7480b52eff52609ce29bc8
55555 F20101206_AABKTJ jeong_k_Page_103.pro
6a52f18fa41da16975e59ef8196e8387
8b9aca2b851da4e83cf91665bc0747b9e2257243
19959 F20101206_AABKSU jeong_k_Page_061.pro
55befa346e9ab82a2a2e110aaea86c7d
db9ee105465c13b643f396db0161bc22a6ad155e
49603 F20101206_AABKTK jeong_k_Page_104.pro
cdc7c96a488e0c89a1fe5b471027aebb
940acf8537c4e40d3ec12eb1b07907566b7123e5
42000 F20101206_AABKSV jeong_k_Page_062.pro
486ac9eee0f7bef5c9eac6dfadf37371
ee1d43918340f7c920fe4624e5a844513793071b
44464 F20101206_AABKTL jeong_k_Page_105.pro
1f6e1a9be8364098ea52803d23edbdc2
b7b33dc3c5a58a57ede128d28ac15a5987e7df25
60369 F20101206_AABKSW jeong_k_Page_065.pro
46ecfef1d6ace8a6df2bfd2b7e89bf5a
1a0db5f68dff290ccf722b04d3fab9cd8364224a
55309 F20101206_AABKTM jeong_k_Page_110.pro
e32e5fe4d3ac0f8735da6b72cbf7dd75
9a2df1ae8c3fb04bf3cfcc7c856123bd3a3b0adf
20040 F20101206_AABKSX jeong_k_Page_067.pro
1e29a635860d1ec7be77778a17131cc9
8d5ebe6f66b8ffeb078594dbeceb1d4c1134a90b
1239 F20101206_AABKUA jeong_k_Page_024.txt
adadc36918704d8c92c144570d0f66ed
1712802daca711b7454294ef404ca9e779121b70
60014 F20101206_AABKTN jeong_k_Page_111.pro
7f85b5458f870dae01a8a9e7c315f467
9794c82a8535b4221c378a5b4e0dd7497b97f3ce
54611 F20101206_AABKSY jeong_k_Page_075.pro
c47d94feec59ae5cc981affb20fdb3af
5586a7d3c718046c53a37bf9a28757bd2f9032d9
224 F20101206_AABKUB jeong_k_Page_026.txt
0b43d66895ac84b372d06be12ff62919
67efd98b8735e6fb267bc134231052e05df9d249
5425 F20101206_AABKTO jeong_k_Page_114.pro
d159ef0a0a5d41e1f9bec51607a2c5d8
bec31b3bf2ef347303f5281fa4cd0b50a1d6b05b
55470 F20101206_AABKSZ jeong_k_Page_079.pro
9ddaff8176a0a1bcfda7757f2a3de7a3
c49a3e7591193fd623ce4dfee69849476144fcdb
90405 F20101206_AABJRA jeong_k_Page_071.jpg
b665529c24a62a40b94f9139651ff9b0
d5cd4d7c1463a399090ffb4494db16b22b5ef7ab
1912 F20101206_AABKUC jeong_k_Page_033.txt
7d32dc8b1a00aad0b0a6db22c2221f8a
a0847d710b1990ad1fafb00618bb2b03ea1703c4
8566 F20101206_AABKTP jeong_k_Page_115.pro
7d4ddf2e07f7530a3c0c06c161816d74
569aad0d9d2227e978ba5494b53bc334a5d8359f
F20101206_AABJRB jeong_k_Page_077.tif
eae881a4c8ce0587b1a65837db83666a
3a5965112a3430037a2a36af070593c7792c5698
1983 F20101206_AABKUD jeong_k_Page_034.txt
d48f0548d1be60597be0d14b157e75f8
b76400b2cad4b6b12b7afc92168ea87a812c7d12
60328 F20101206_AABKTQ jeong_k_Page_117.pro
798246232a289e4653b5af17aa308123
522f0b901042e728f1a5c986402c31276955d9b6
651 F20101206_AABJRC jeong_k_Page_078.txt
a1d3265ae6366db872cc27d2a7264506
2a17cae95458275fbf056e77667af7daf7287037
1136 F20101206_AABKUE jeong_k_Page_035.txt
a053ca430b83e655b0d7e56c46b78325
1d850d13d5338ec6c795f48833664c9a09108559
33406 F20101206_AABKTR jeong_k_Page_122.pro
45ddb1085b994f1c7aa1a766034d6cbf
e0132a28a08c0eb2dbe9efb5167d96f965707776
6061 F20101206_AABJRD jeong_k_Page_054thm.jpg
145f5320a87077a73e81e0237916a954
0cffee929fa3508233a1083b70d76ede800e171d
2011 F20101206_AABKUF jeong_k_Page_043.txt
540135b82065b2c529b309816024b1fb
0da2618f696dd9ced99da74192f82e8a0083c117
2331 F20101206_AABKTS jeong_k_Page_005.txt
2e22c12a5c1a55646c2b1e45eda99bfb
223a280e68a4f890e5b393c65bad31969fec5f81
66176 F20101206_AABJRE jeong_k_Page_045.jpg
470a0cdc61ab66c04eae2b339ba23f13
1a189c226445a02c972dedaee5d53ec353e61d4c
1081 F20101206_AABKUG jeong_k_Page_044.txt
831e63f649cd22d5715d540c65e0d25f
28a6264941d7f268f2ffe3ee29a44083f5aae65d
889 F20101206_AABKTT jeong_k_Page_007.txt
01fb7a13c011aa97c8766301b62c2141
3ab297beb28bd2d4c2517fed216ca420da5eaff2
78277 F20101206_AABJRF jeong_k_Page_050.jpg
a98847565f36cca1594fb8a6ff89e59d
d6be5f0e9beafda95d7e9fca9f5d1b1940661486
2224 F20101206_AABKUH jeong_k_Page_056.txt
784d2c7262d401be66b88f2a2463ebcf
5a9567a6820c2ee758be0be527b20f28c21a0db9
2444 F20101206_AABKTU jeong_k_Page_009.txt
5f09aa7b0102a70fa7187814485710c9
eef67cc5f7367e157e7350e837bb91f504d26511
3839 F20101206_AABJRG jeong_k_Page_113thm.jpg
7ca7131b3ce4a9a200c1875e83f8e431
1f0c772ca9685bc16c569511b12c31b33d405dfc
2262 F20101206_AABKUI jeong_k_Page_057.txt
707d288ac1d1b5e8629274f3eeed621b
b84fc7aecc68d45041c0abfc0ef92ef1cf95db79
87086 F20101206_AABJRH jeong_k_Page_110.jpg
d81f53c96aa69180688605505bdeb3e3
d51cbd1330b1fddc7ec316bab48c62cdee619fbe
1153 F20101206_AABKUJ jeong_k_Page_061.txt
c58f2350d15504dd2f931df3932f8af1
5aa5f47e930bcc642caa6d94ecc2f76a36be604d
2009 F20101206_AABKTV jeong_k_Page_010.txt
42a7e1a56ab9227bd7adf33437c42384
624ec85776e52853efbd7f910700e5c2dd71e195
75726 F20101206_AABJRI jeong_k_Page_005.jpg
a97fa835c8c0607bd128d63d889603b6
a79f8b80720e187e805ca5de7e97d6f2aa7c12ef
2078 F20101206_AABKUK jeong_k_Page_062.txt
c1f1bf335827bc2e633b365e78536123
206e713692e989b71a2ce8998166794bd3e56278
2197 F20101206_AABKTW jeong_k_Page_013.txt
5306f2d8dcc9632fa4703f496f586b11
d24c5c13d22246a9c7ab6a951ed80f05474cc28e
11991 F20101206_AABJRJ jeong_k_Page_113.QC.jpg
fd6b48f2609646c937184feae924670b
45525b3100281b64b6370de54afe0d119a55b129
2394 F20101206_AABKUL jeong_k_Page_071.txt
bc771151860519b98ede9538dcfaf8ba
cc1ad5a38f245344ff777a548f2446df5a128ffb
2022 F20101206_AABKTX jeong_k_Page_015.txt
c6a61e4facbc42b603cba1504b31d08a
5350d439ee6f2bf40183d96865c534d51f86f521
2449 F20101206_AABKVA jeong_k_Page_121.txt
0fe0d2e96b30b4eb8fc23a8db6dd28cb
fb8d633660270f7f4252b4feec3414c811ece683
708913 F20101206_AABJRK jeong_k_Page_038.jp2
764709189609b7afabea974f96a1bb70
55687d2fa6734c4ae673334b61221da0da49ba7d
2339 F20101206_AABKUM jeong_k_Page_075.txt
9941c3ba88112788876104eec6bd5811
63b54d0f06f763fec4d2c4c6baaffb81b9454065
12583 F20101206_AABJQW jeong_k_Page_112.pro
f09704a48ddbad0fdfe3ddef02c7b1b1
ae6e19e6064ad9dceeadfd0a4e6d2a318cbb3840
2207 F20101206_AABKTY jeong_k_Page_018.txt
17b88ece93b2dafbb548ca90210ae02d
6fd5f386caf9652d05c21a09007ed78dba0b5faf
844 F20101206_AABKVB jeong_k_Page_123.txt
be2ee4d55bd9ab5b10a140f69a544b91
a205d97e419f7e82f1d5d934b6356d6b1c75f72f
F20101206_AABJRL jeong_k_Page_049.tif
57ccda98cac0e27f8f029c0dc725a2a3
e61aa5a8878b2c79c200e9ceb6b176f828d223e9
2260 F20101206_AABKUN jeong_k_Page_079.txt
632f692df15d41c38026bd783795f21f
604690b8fc0242041c924366966f2bfd2a97c3cc
2116 F20101206_AABKVC jeong_k_Page_001thm.jpg
da900a7a552ebf63676d770bc4ff9f7b
1029ca9e25463084bf74d9b967053751cdb092ec
5607 F20101206_AABJQX jeong_k_Page_083thm.jpg
5533d0a5e03022f44f19832dc95563b1
cb079820d28778e8fa1d9c2eb8f01406cd516a18
1821 F20101206_AABKTZ jeong_k_Page_020.txt
2236a6045895c83e8bb0323c5b78ef3a
dccee6fd16340207cd5972140e997c1b6a6a8ee8
42878 F20101206_AABJSA jeong_k_Page_023.pro
ab7ffca3642597a139a7b8093dee6166
181847d89bb62795b2228971c611a93169464e63
F20101206_AABJRM jeong_k_Page_046.tif
5d2569fc99556c58ac5305bd12d618f4
c4a7a57734d4f184ef12b37671fe8eec0cfebbeb
1936 F20101206_AABKUO jeong_k_Page_086.txt
723162e18b06e86885ceeec85a3700a2
996e4dfeef9db6e0f1a147574327220d7434d32a
2919040 F20101206_AABKVD jeong_k.pdf
a6ab20b42cf9ee32153afa06a946ddf0
46b7ce1f8d6a5da20c12d2c5a307b631e1cbe476
957296 F20101206_AABJQY jeong_k_Page_024.jp2
01e92d653dc4aa0aa01370bcb1277392
943a739a427f1deaa2007ba4ba2277322e95bf72
59464 F20101206_AABJSB jeong_k_Page_084.jpg
88a7882aaaafb24161c6f71d66f9093b
c13303c7ad10f12523600cd098389d8e1635e1e3
995358 F20101206_AABJRN jeong_k_Page_109.jp2
827ff4cc1e2b9a2ec8b952bf4dcec20d
6fba827e4ec38d4745d63b4bdebf1762d7d08cd2
2223 F20101206_AABKUP jeong_k_Page_096.txt
e4804519382cbc18f103eaeaf1006216
1361c690e3af59c5144bdb6689b0a3126a4e13ae
6783 F20101206_AABKVE jeong_k_Page_001.QC.jpg
25fadb4b4a1f50914476d43d578fdfd1
d909fb71495fc90b2afdb0bf901fc4e2f966ff20
60932 F20101206_AABJQZ jeong_k_Page_071.pro
9095eb1d356be06b6651b414f7c0b495
e08bc96924f5ad69644d4be6ecd06c6fef981eb3
20364 F20101206_AABJSC jeong_k_Page_123.pro
68c31527b44d3bee42342fec9d046565
93e47cf41a53642386f24601c085722d47250aaf
F20101206_AABJRO jeong_k_Page_103.txt
a76eb5b1375608e3f608ca7a76ee6f8f
aa6b50c6fb8da7cadf90a0ce5e1b5338b4c6dce7
1877 F20101206_AABKUQ jeong_k_Page_102.txt
299bcbc55357952aa35f93cf36eb3ebd
4e2f0ff2443db49f5e058ae69b0c8e5d9395384c
14618 F20101206_AABKVF jeong_k_Page_004.QC.jpg
6eec3a671a741d828635d92081c8025c
c702510808ad2cb6bf10d97a93e1fd05cfede7b6
67069 F20101206_AABJSD jeong_k_Page_062.jpg
5d2fc99c328cd328eb804d2612894747
e2a71f72da21d28612e6bc77c823d89981c827e4
16929 F20101206_AABJRP jeong_k_Page_022.QC.jpg
aafc3798ef973209a23fcbcfe1922e96
51ef33e3c7d3300fb6bccb609a2555b1c09c00c3
2039 F20101206_AABKUR jeong_k_Page_104.txt
002fa4b4170ebae2c5824cdc37c35fa5
de895450e7076b1aef4bf04b9b6ff0f34bd55af4
3934 F20101206_AABKVG jeong_k_Page_004thm.jpg
c0102c93f1a09d4b7029bd51e6086b39
fb9746c58741af6209051b6034c0faa1fbff1150
F20101206_AABJSE jeong_k_Page_076.tif
fe16aea6f94de8d7cd06b8817b7f43d3
ee8412bac3ecc16d71f7c6db4dce367655463758
60599 F20101206_AABJRQ jeong_k_Page_073.jpg
e98c09f72a6b1396d968209558333bdd
781deb4e5f2567792f7667bd3c718fc0e74915e3
2046 F20101206_AABKUS jeong_k_Page_107.txt
8fd66419f49be14909f2d94bc2fedce7
812b62c6c02675f3c8dd7678c6eb314f9ca7c90e
20561 F20101206_AABKVH jeong_k_Page_005.QC.jpg
43f00260120ae158900736cefe88cec3
1ba75d8488d5538dde6543ff8788d06e6c473ebc
12817 F20101206_AABJSF jeong_k_Page_008.QC.jpg
aaadebb5d0a682e9eb7cd4651ddb2a30
31ba2a85350ec4256a21ef4a8fee9333d2c9ab86
35141 F20101206_AABJRR jeong_k_Page_030.pro
430870613306928f1320a8d86b92ac28
116e79baa6a877f87cb51845fd49b220bd1c45ff
1425 F20101206_AABKUT jeong_k_Page_108.txt
f239079cd7fff90479ce23a7a3e32bf3
b5aab68f2694799d52a12a3efa5377838f2dab7e
2965 F20101206_AABKVI jeong_k_Page_007thm.jpg
704b0835bbe8a8ed798be29dca38b808
98d71f289987575231e2ad3a655f43566b79e8cc
2002 F20101206_AABJSG jeong_k_Page_063.txt
422c5ea0fab28b24a8326821b02550b3
39958b02947c65236e7747cefff4d5b89c1fe613
19391 F20101206_AABJRS jeong_k_Page_029.QC.jpg
1be4601995a12e6559164bcfcefb0b9a
a4df86300849e71dc5243f699c6a766028f689d4
502 F20101206_AABKUU jeong_k_Page_112.txt
082bf7c1e9e11560a1c4612c2f0a702f
79ea5c08ae54b49b2a43d6954a036516684e065a
3009 F20101206_AABKVJ jeong_k_Page_012thm.jpg
4d79fed744f5de961a60a418079a1be4
d2a502681b15e1acc9d0e8ac9d5c48a5422f89ad
808 F20101206_AABJSH jeong_k_Page_002.pro
e956857f467af243c096e2b8aad32644
1c23c0294c44b99333217f4a55d820830206c28e
F20101206_AABJRT jeong_k_Page_064.tif
c60ccc67ae97dc19562a1c13c119346e
318ca154b387916687d2ae3ab3a1d91d96d2eac5
385 F20101206_AABKUV jeong_k_Page_113.txt
d481ae60c66a665281571af8d5bd03b2
d9e127794e23b2d668c02866c70f48348efec790
24253 F20101206_AABKVK jeong_k_Page_015.QC.jpg
8bab798515ddfaa346c398bcad032a00
b03be51f4e8894ea016c9e69700c6a3cd0cc123a
450036 F20101206_AABJSI jeong_k_Page_100.jp2
bc9933ce6018c9b202e84657c5756a4a
dfa728e690cbe41afd6807250cbd3430b996a24c
27720 F20101206_AABKVL jeong_k_Page_017.QC.jpg
9cddab1c12ab3a44a21ab6c31df3e0b2
50639b11a1f4e9dbfd067f3e7115d40006d319f3
77022 F20101206_AABJSJ jeong_k_Page_120.jpg
5b94a7a7e63524b2795db261da860911
1de1dd8f9721e1817b18c5a501229b1247c0286c
4695 F20101206_AABJRU jeong_k_Page_067thm.jpg
3d896b58f025e68e29f34de3a50b7005
89dad27da317283afa5ec26d3e4988f878c8fd49
328 F20101206_AABKUW jeong_k_Page_114.txt
6e985a86ef086352ddec0026390553e8
2abedbe77681c4a3391e230983780cc7a1d97785
6184 F20101206_AABKWA jeong_k_Page_042thm.jpg
22530c2d08c7dff5a87bc03125b7064f
de5442e12deadfb1f59a5dd62da1c42b9c60de1c
5688 F20101206_AABKVM jeong_k_Page_020thm.jpg
f1faa5610739516acd037d8ea05efeb9
fbcc276048c7697f507b14b77a4c8df9d4dc4542
68400 F20101206_AABJSK jeong_k_Page_058.jpg
036742f49fb1094edca56db91f1c7cbd
a019a4342626cf401fb2f1843a31dbe0749e93dd
1051911 F20101206_AABJRV jeong_k_Page_059.jp2
54307c15ee3c29c71560c920fa3b2476
2471b5e289a1c04d0b5ec5d8118d620ffc6b7856
2415 F20101206_AABKUX jeong_k_Page_117.txt
d8e7da417c906cc3df6bf4c634cf82f3
babadf68b1f6601b1374c134e2fe8322474b3bbd
17080 F20101206_AABKWB jeong_k_Page_044.QC.jpg
63f34ff724ebc60adf7529c22d23fd81
1d3675ba90a0409f99c6c394ea79376fc906ae53
4864 F20101206_AABKVN jeong_k_Page_022thm.jpg
44b0a804ab2949cd00ecd04a67e57032
4ee44951be91d1f4ffb4a8dfbf93f91fd40427ef
904013 F20101206_AABJSL jeong_k_Page_052.jp2
f16a461903bdc49566e5d017f2e10d57
ce29e6d5b115009ff0704c11c4421a91afaaa26a
39721 F20101206_AABJRW jeong_k_Page_007.jpg
e0c17ea8dda1ba4be390d5704a2deff6
dd98f2ba2c5ba9172360725ec978c63cad157562
2674 F20101206_AABKUY jeong_k_Page_118.txt
ce86dd53e9de43cbb00f75c97c88d35e
38168d4c44c283cd49c42a775d063e3cfa84717f
4807 F20101206_AABKWC jeong_k_Page_044thm.jpg
75dc07e982381c7b034caf876c9b8edf
9676e05de893d7a5e1fa70735bbd726ae05bb24d
5198 F20101206_AABKVO jeong_k_Page_023thm.jpg
be8e249bb81cb5291489e00f1e9b80c4
e9187660b1b4f1ec340c22b80b597af7075479dd
5134 F20101206_AABJTA jeong_k_Page_064thm.jpg
0c02b00cb560f062250b930ebcd2db97
d957efb83d90ab48b145cf70abcd8da8f4e10159
39888 F20101206_AABJSM jeong_k_Page_086.pro
6990882929e13d7773055981d28238b1
ba1be7d9b82b75ab5292313c926871719e3dcac6
F20101206_AABJRX jeong_k_Page_057.tif
cf087bd892a956ab169fee0156794256
d75a162d58237e7f6e86769336c2006c9c7a55d2
2309 F20101206_AABKUZ jeong_k_Page_120.txt
42cefd52ab80243d9bdb17273d0a9f77
8535f0ad3fa298c7143c16e47e3d7c8de1499d1a
17846 F20101206_AABKWD jeong_k_Page_046.QC.jpg
1bb2d60767c082914c0862eb39b848e3
f323a2eff306140d708c03e61181ceb4e443a591
5043 F20101206_AABKVP jeong_k_Page_024thm.jpg
8165af1273eef8ea1210f301b469f2dc
a010a34514cfe7519b0662ce3e173720dda42771
477222 F20101206_AABJSN jeong_k_Page_077.jp2
715c696a525d1deb8fb3121d13b5ac8d
8de997c9cea5695b2f899ca99920b59db6e2afa7
950168 F20101206_AABJRY jeong_k_Page_041.jp2
17fff7f916d88d81b47fad273570fbef
65ab7d958939444d8306552e957849e4493c7c40
2016 F20101206_AABJTB jeong_k_Page_047.txt
1df9ecf3bc0d44c0048657587485cc16
25043a7d01e86a9ccb6fb3db0abc37945ab96b94
5213 F20101206_AABKWE jeong_k_Page_047thm.jpg
01b62cd4f6c1912d388fb89c234e5025
6319f56c788137c734c3054dab572466ecae13f5
22416 F20101206_AABKVQ jeong_k_Page_028.QC.jpg
e881019e82905da14d8409dc7a288178
ec6c624df23c5d570b59221850dcba6ab34a4d20
1092 F20101206_AABJSO jeong_k_Page_022.txt
95b9fbed2113311fd5effaa21595d108
b654fc7e9f580ee97b1b0076de02508e6a9136da
F20101206_AABJRZ jeong_k_Page_082.tif
bd0b645687f065f9c7e7a75fd8205b3b
6033e0a44c1f5c896d3dde650a642c46be554ee1
56060 F20101206_AABJTC jeong_k_Page_035.jpg
74886f4e95306fd471b1697813b700ed
927964db11282449329d0e7abc6eeee11a2f120a
14872 F20101206_AABKWF jeong_k_Page_048.QC.jpg
4f0a849f918c8ad4342bfa49a6515760
8359e2f1c1c73488ce79691543246a75f79f34f1
F20101206_AABKVR jeong_k_Page_033thm.jpg
9f14f94bea8a4ba8a2690c028802d3c4
134eaeeeefed4b6d6c87f8f06e89a1d713a5910a
4520 F20101206_AABJSP jeong_k_Page_108thm.jpg
db625f1ba92a8b4bbffe7ccdde2c903e
3c57f43de9334b1e96e9d0ac78bf47ef2fe9f93c
F20101206_AABJTD jeong_k_Page_062.tif
fb511d294ed8ae484801bbbff64118bc
890559daaf256fc46e91a3a771652eb404cffbcd
24096 F20101206_AABKWG jeong_k_Page_050.QC.jpg
9eea6228a3f927d0d1a31a32da8296f6
41fa0f469d5520227b2a2bc9ba555d21ffa6aa33
17438 F20101206_AABKVS jeong_k_Page_034.QC.jpg
dea6f4e58accfd3b3d9c6d7a77bbe151
794ddcf87e11353555d9654dd9f34e2a331e8a07
11280 F20101206_AABJSQ jeong_k_Page_007.QC.jpg
1058ae1e4358a9552b3bfffd00ab6af9
18bbad6f38877963c1ea24fa38cf86c34eb32e35
F20101206_AABJTE jeong_k_Page_071.jp2
260c7a4ded595044ed453a19e3239c07
1dc1246a843e9e0a20d709254f01332147ca9b49
6334 F20101206_AABKWH jeong_k_Page_050thm.jpg
7eb7374728d00b7d7f137d0498b51318
a69afdd87aca61f1748832a6465a7cc320e68dcd
4961 F20101206_AABKVT jeong_k_Page_034thm.jpg
a7d2120ec20d6b75a3c86e3274347120
2bad71b816c0f37fe05559896b51cb06dfde7b01
61171 F20101206_AABJSR jeong_k_Page_121.pro
d88a26e682eb1d88e1cf5c2e023c1a21
ede561a8b7270bf87893a2024fff9191fb2e6f5b
F20101206_AABJTF jeong_k_Page_109.tif
219e26a3480dcce41268b5ebe3830c1f
8d6f293860e74221d9c3c0198c494928de5eabe4
21085 F20101206_AABKWI jeong_k_Page_052.QC.jpg
a4512eee792522f713702d3f12970086
a44e5d778470560c284e47dff2a3a6822044e90c
16137 F20101206_AABKVU jeong_k_Page_035.QC.jpg
60912fda855d8092ebde2b317a8a82ef
879b9f456eb29440cb7a1a37ce39bf306136b502
3591 F20101206_AABJSS jeong_k_Page_040thm.jpg
80cdc55f1515daa4c8909a1d469e5a3f
9129c97b5d621850b64b88bf63f918638a14207d
59309 F20101206_AABJTG jeong_k_Page_022.jpg
f27aeca5fed426e3bc24a724d3988743
ba4dacb34ecc00bd6cf29d0484cd898438a682d4
5773 F20101206_AABKWJ jeong_k_Page_052thm.jpg
d2591ed0d7d6ab6187b377efcea7e64a
a2d1703e1787d9da788e36f7da9f86c9a6279211
4770 F20101206_AABKVV jeong_k_Page_035thm.jpg
ab95f028bda16efce6bdef3a17327ad4
26a91a7d16c286dfc7e940581e5318240e21acf0
69996 F20101206_AABJST jeong_k_Page_109.jpg
a529d2a7a43e2cb6eb82407c22141373
b0a34fc64afaf897d1140978ef6bc0e9d78678f3
6046 F20101206_AABJTH jeong_k_Page_037thm.jpg
096ee607abaffc94cbc42e274f4dd304
619ffb4a4a3bcf46656cbf9dc052757456124937
23523 F20101206_AABKWK jeong_k_Page_054.QC.jpg
f5416c534ed05ab76d5a4714794816ef
e94c0b332dee49af92d78e352b1b9c62de888bfb
11753 F20101206_AABKDV jeong_k_Page_097.jpg
1f93706817ec7be8fca09bee8e1af368
0f9343edd30e9f4e9f376e964fd7ddb9ba08d5ca
1888 F20101206_AABJYP jeong_k_Page_082.txt
b2b9069e77343578904e17e648e0feb8
e5605919008a1404c5577faec84785829927fa83
19781 F20101206_AABKEK jeong_k_Page_033.QC.jpg
8a09f28b1cdb9db3ab66cebfb0048832
238cfd25cc7cfd276a0abe85705d9eee200e09bd
9992 F20101206_AABJZE jeong_k_Page_003.jpg
2308487c9e9395e6ab2af7989723a38a
5f39bdce3d631d08a63e18b1f550ec8c2b5877b8
129622 F20101206_AABKDW jeong_k_Page_121.jp2
2c4ca553337a4d8757ef0a53e474d39c
4ee22fc359cf2a69d9ca6e9e49ff4b5e05ef03c4
29440 F20101206_AABJYQ jeong_k_Page_088.QC.jpg
f2ef1caa1543b79337fae38eee2de6c9
a47d501bdbea14ee3999db467b468063b0d7ede1
1051916 F20101206_AABKEL jeong_k_Page_091.jp2
3865ce5ddc661426de4d7fcbbe53828d
022db6abb57bac13491bf9a60c40ee0dd4206468
6805 F20101206_AABJZF jeong_k_Page_017thm.jpg
81318dab4628df9ab0d04739a5952cda
9e5b98389282374b6c5249d3877e2d449c398657
17979 F20101206_AABKDX jeong_k_Page_047.QC.jpg
504bb7da5e6a16d816d9eeb22b24ebef
c2e5ccc4a0ab1478ab35fd7a47f38e6d7272f8e7
42018 F20101206_AABJYR jeong_k_Page_058.pro
3e741fe8f0b0ff0ea3733ad3e50a0c37
c9b751de5d773be8cb5f03692c5ea201e9444c41
F20101206_AABKFA jeong_k_Page_068.tif
f5842122dd8349dc1fd9526970c52421
3de64400a2e1d6c446b9f5884b609246272fadfb
6646 F20101206_AABKEM jeong_k_Page_110thm.jpg
b1cf88c9583da7549c367fbb3a1177db
5a109746bd5d8d1d5d5918cfbd8ab909a617783b
2442 F20101206_AABJZG jeong_k_Page_119.txt
3d2bbfcc91e22922e8ac190bd6b89483
11a5a923917752fe3bbf613f08c1fbc22494eae5
5738 F20101206_AABKDY jeong_k_Page_032thm.jpg
05f1523fd5877c7e5f8c657b10e7966a
0d7fc29a1b4e28f5355e78fcb4fa49cbbc4d521e
5354 F20101206_AABJYS jeong_k_Page_045thm.jpg
63642d0f3edd35ceab668c190afaa924
8244feff0326de349a269d71db02f55e2e88805b
2282 F20101206_AABKFB jeong_k_Page_093.txt
510211e4fbf563e5d35b0eb9425d403e
d4389a1ceb3ffb6b17cd2dea72a3fbd179d88c2e
1114 F20101206_AABKEN jeong_k_Page_101.txt
5c7b566b6e874d7d03eedd61f876cf8c
d6314a1e4f31be24b272d53a0de82b024c5a9864
6893 F20101206_AABJZH jeong_k_Page_056thm.jpg
afb09b71980d10641c8f9e007f0cecb6
72802f03350ce3af10a50ba91ee7418019a97f77
F20101206_AABKDZ jeong_k_Page_019.tif
59b4fc4dd33ac3e8f80e810c15b37e1e
57ad7b3d8378c6ce12b88db7f1b434bac76b094e
37925 F20101206_AABJYT jeong_k_Page_070.jpg
6f26dc743d62ce8006e179c3378270bc
120ed54f5b961f3feb2c6fd82e00f8321e29a15f
F20101206_AABKFC jeong_k_Page_007.tif
b2c9844b7cdd8db2080ba85487a07df0
1d9448761152f239f36ece806735a93f55805dd0
16336 F20101206_AABKEO jeong_k_Page_078.pro
652bc549804aae070fb371d6958ba61e
4d2133770b4f4b5c3353af432bb614924c2e755e
6610 F20101206_AABJZI jeong_k_Page_031thm.jpg
13bb76102af3c25fdec64817ac7983c5
a0ef6501280c6a72ce4c5d3ce25d47871a7dcdd0
40048 F20101206_AABJYU jeong_k_Page_083.pro
df7bf7c4938add6cb17c3ae95f230a85
842047bcb699e93217d1816ef670a1e1d3a57efa
1958 F20101206_AABKFD jeong_k_Page_052.txt
c4044399f2bb4a97ec3a42a454fe5e2f
453bbaa08d2daa7b5bb2ec63f75365fd1744d03e
57652 F20101206_AABKEP jeong_k_Page_017.pro
9678ff069c4ce44ffe524a541b5b8bac
d5e1da97cd47d9ae7884d14ff836480fff351f74
F20101206_AABJZJ jeong_k_Page_036thm.jpg
455bd61d8260f58a2098acdfc3b0e496
e5e00e12169e932e1868ff188190abe6727319ef
F20101206_AABJYV jeong_k_Page_032.tif
a1bc4f3989374023b2ba3c5c947f566f
60cefb00e3c9e3000625f24aa51559c73a5ee178
55637 F20101206_AABKFE jeong_k_Page_093.pro
b7a91b799389c53123a00a98553c9419
4f94c0120d06da3fbb3a963cc8b0891450c5dd9a
F20101206_AABKEQ jeong_k_Page_027.tif
de2908faa5367169adae812967c54565
9b436fb8c1cf6666f1de96be68b224003b7092ea
2434 F20101206_AABJZK jeong_k_Page_088.txt
ecc834975e9d802fac2d631cb2b899d1
b9b57c1fe9147a6d2811527fbea574f9f7231d25
5348 F20101206_AABJYW jeong_k_Page_049thm.jpg
58b787f0a25433ebd9456521d95edf36
551f7f18076acc0d2604bf47f3895b15d6d50c94
82037 F20101206_AABKFF jeong_k_Page_121.jpg
e7e713e01c576de25c993a0d1ec79c40
ad98b97de210a6c0d4d2f9efcc1e8b481073b63e
18036 F20101206_AABKER jeong_k_Page_073.QC.jpg
2bdae5e8a3ca6c5098a899038be960a3
a0f087ceb03807a048a0fd55502159ab8057feed
5289 F20101206_AABJZL jeong_k_Page_105thm.jpg
77e58d85767a4118a3b3702e0fbdff70
83f67aae6ee5e280528c26df42f63e2d1283b2fc
31162 F20101206_AABJYX jeong_k_Page_064.pro
55416a7b795ac83b87b422a63fd7ac17
34f1f3aa5a4d9cde3d2410e156b78e0cae487f10
2233 F20101206_AABKFG jeong_k_Page_060.txt
3ded3ac46507bf0609d337fdbe658c39
a15710f52951c3a9aba1564635f14fb47e13c131
1765 F20101206_AABKES jeong_k_Page_048.txt
3d9afae50d139e30c28a3f2e45039e16
d3c0af1b3ca5786e2fff24df1cefd899db951d5e
F20101206_AABJZM jeong_k_Page_101.tif
a29a88a95daa9f7b4cafc81cd39363aa
34beb597ba16edf479271f0a4390a66db2979eff
32274 F20101206_AABJYY jeong_k_Page_012.jpg
832e7d9bf732e8214f54088bb9335924
58c711b5fef177b6116c5d5c5498cba0b7244d54
266 F20101206_AABKET jeong_k_Page_077.txt
8484ddb09488e416f1661c9250d4752a
9fe802f2717e2c1c63a1234720ae754d90976c65
23522 F20101206_AABJZN jeong_k_Page_043.QC.jpg
d4e68b400f7be5e952b01b4d4aea5f94
2a7b7bbca2ece266e16bf6e00600d38dd49c993a
23149 F20101206_AABJYZ jeong_k_Page_035.pro
62e4136b0100e7cd09f3e9788319b1ee
e1d082046846ed5da50b9275a0052fdcdb8dc406
98034 F20101206_AABKFH jeong_k_Page_081.jp2
241cdc1a57c8b30c0ed60180feca85c7
41974f8899e68da4124dd773c423f9afde053527
F20101206_AABKEU jeong_k_Page_004.tif
ea2f2e4dda892f3e930c7c6260714371
0dbd738251b0f2e67e1d0c7baf1a28d1ce012fa8
F20101206_AABJZO jeong_k_Page_017.jp2
1592a2cd397ec36cbb29a936a9d2d2c0
c3893e4afd024dd2736199d8c35845c9738af347
24075 F20101206_AABKFI jeong_k_Page_121.QC.jpg
907f138ee50e1a5cfe95ec3a0f1ff8a3
87c95e97988f451989426074e56ac305eef54e2e
2288 F20101206_AABKEV jeong_k_Page_095.txt
5fcbf36b9be3023e23122d6bda2dd687
b806ef3610829c0ae087242837f3bd5b08468b02
F20101206_AABJZP jeong_k_Page_053.tif
619155154d2eeb74706a70a0af38dcd7
6fea164c43366c6f77b486106dbdcdacdf518b41
5343 F20101206_AABKFJ jeong_k_Page_010thm.jpg
48faf91d22df2199fa36a189a0ca9f64
04eb04c2390b0d9ffec6a446b2011c060a009772
38271 F20101206_AABKEW jeong_k_Page_107.pro
1a8265b665ebcc95aa8712acade85113
46b93ddc90db720d1126701d7136ef987cc8f4fc
2313 F20101206_AABJZQ jeong_k_Page_016.txt
195817ae36bcdbeac7ea3b1a6faa6078
9386e5899272a571596dd02efcc48784cce63add
953080 F20101206_AABKFK jeong_k_Page_082.jp2
8f2e520a1848285a2a140c11a90f94c6
93cc4e3c9b738d51f2d20c92eaf055aabea2f0ef
1207 F20101206_AABKEX jeong_k_Page_069.txt
b0e97de9cb71374c7525abbfd041527e
0423be3c4341589bfe77cd06898ad12125945bc0
17297 F20101206_AABJZR jeong_k_Page_053.QC.jpg
9b497033196a9945465d1dc7107bcd4a
dfabbe56b1e204dfb8c15ff896835aeb431e50c8
77162 F20101206_AABKGA jeong_k_Page_057.jpg
3dcdc25d07cb7066fefae6b73d4cf1cc
080769847d049655dbb9b1eda9f3de5905adf4b6
418 F20101206_AABKFL jeong_k_Page_001.txt
9373c4ba4d0e1219b51621234b06f181
7c2684c66db5eb19561318923f3d88fdc004e77e
F20101206_AABKEY jeong_k_Page_040.jp2
87095ffa999f1cdbaadfba7bcbc3eeff
9723dac92a57472a9137e53cf95cfd5ee1821c25
21492 F20101206_AABJZS jeong_k_Page_032.QC.jpg
b69c8364af30d4ef4ed96f7ce6ad0eb3
9ab8507f162f08fbfa2afc77d2082cbe3e5ce52c
F20101206_AABKGB jeong_k_Page_122.tif
171833738413b1adf8803dc02e7b64c2
372487dd90c72c4833ec65fb9a7b7f9c12522918
47669 F20101206_AABKFM jeong_k_Page_015.pro
6f70a57dc8108ea34912fa2aaae1964c
c51c1a1786b02794aedd7f0e5beff28b9561b4f5
F20101206_AABKEZ jeong_k_Page_029.tif
f0b8b17ac3ed47a1b0d77b92c88fa7e3
f5e89a3a4adc0b38a7e8c5b7bc4ad9bd489b12e3
27718 F20101206_AABJZT jeong_k_Page_095.QC.jpg
05193d6c58c5d32e24e2153d22e7a911
fc2bfb99a8ba5abaf1c9e671988e62cae570e438
F20101206_AABKGC jeong_k_Page_098.txt
9c16ecefa7406986b068f71b5de89367
6ad221c3f82ebcea2b0b0fe371f1d351158fe251
1242 F20101206_AABKFN jeong_k_Page_067.txt
83783c2bb76345abc9786567de9297a5
ea2982098c25e5da1da1021931e0e2a5fc6b1377
20249 F20101206_AABJZU jeong_k_Page_083.QC.jpg
5ac6b0056991666115a30dd7cf782943
5d34570bdec34314dc0a804e7c0876616b99e9f4
6497 F20101206_AABKGD jeong_k_Page_015thm.jpg
7d179ccd0d7c22bd797f59dbfa1a5b1f
92b2590a154b6b2aec21bc0dae59111af2c4ad3e
5612 F20101206_AABKFO jeong_k_Page_011thm.jpg
e47047b660933e30da658015dd1c523a
e2e9b6ddbc34f2a1f8dea5a72bd8cf84de0202dc
15152 F20101206_AABJZV jeong_k_Page_067.QC.jpg
d24678b16b4f9aa34d49edbb821ca604
7f1a28f87091143ef4e45d0b72b6dccdeea1b413
F20101206_AABKGE jeong_k_Page_111.tif
e8b52ab0fe1f3987465faf3296cbe34a
1bfb973d378dc17fa264a430c31baf2e569ddca5
16344 F20101206_AABKFP jeong_k_Page_025.QC.jpg
a4a3a6d32d4dc91a0816053e095274de
1fda29024574105eb221348a960f47df099cd9b4
18866 F20101206_AABJZW jeong_k_Page_012.pro
8cf5b07afc26aa11b3e1b369303b6b25
b5622120193a0d14ce259edb05f11385f6a416f6
2024 F20101206_AABKGF jeong_k_Page_084.txt
4e622727954fa4e85d4f537c9beaf4b2
3c06741c04e7b328aa3765618c3fc465a042daf3
1428 F20101206_AABKFQ jeong_k_Page_003thm.jpg
3c0872627c7230da36b260da7aad3298
35e09ce19ead1a0c0fab101a342863b668e978d6
6070 F20101206_AABJZX jeong_k_Page_099.pro
139d049df56b02dad05b378c3c3ec160
1f13a15e19a3becdbbbb475409a747ba5de9a297
1051971 F20101206_AABKGG jeong_k_Page_014.jp2
afd8cb34a77420464e4f9d824c190254
af0ea436bdb6efb94ff694543d15df8f205cdb96
F20101206_AABKFR jeong_k_Page_038.tif
e61128453244f21f28e7f4138cf56ac6
e528245890e1155e7ae005828dcb1fbf15d0c9ad
61716 F20101206_AABJZY jeong_k_Page_019.pro
25e18c5a94b3280821e7bce5ac3c68bd
d5482aa12d4844bea69211c865bd5adf17f461aa
3871 F20101206_AABKGH jeong_k_Page_122thm.jpg
04abcb255160cc9693fc182b82115f53
815681023239a298495fa83d3207b1a7198d13fe
88281 F20101206_AABKFS jeong_k_Page_056.jpg
82e809eba0479e44a729bc5f21ac5085
3e14033e2db58adb92a9c75e084c504be06ce69f
6843 F20101206_AABJZZ jeong_k_Page_095thm.jpg
ebba2eeeedd1e7e3134a2f5add84c7dd
fed9d979a9dff26d0ce69f39a1a5081e9e79a8ed
888013 F20101206_AABKFT jeong_k_Page_044.jp2
bbf626dda404e81144b92833217edbf8
420bf1f7e155edea3a5b6cd68dad05c5fba1c661
124632 F20101206_AABKGI jeong_k_Page_117.jp2
5ffc46be5f72891e5d2953c8e962ad9f
536d7c3b83117fc6f57bb2e951aa9db171878877
5191 F20101206_AABKFU jeong_k_Page_021thm.jpg
313da9dc2e3f48d502959003d3a3f2ed
6e4df99332b3bc8e306ca77b91351d60a8e6174e
F20101206_AABKGJ jeong_k_Page_015.jp2
80be25e96a9525ed018a8f48f4985c7b
0407aba3c866ab2d3360776f5d9f920acdee5ed9
852923 F20101206_AABKFV jeong_k_Page_049.jp2
cd2d225cbc66f2c8a77ed0413160db05
6d27850cac3cbd704bc9ee12c236dcbe0bf51454
2355 F20101206_AABKGK jeong_k_Page_068.txt
827415a76c8afbb296f653cbc9c553d2
4c6f6ef0eb0e416d2e637a56c63acaae0f88b3a4
1157 F20101206_AABKFW jeong_k_Page_040.txt
3c390b2179eea80689a3adaa1afb0896
5e7c9729f83384956f41069b91af345ef974fdff
2219 F20101206_AABKHA jeong_k_Page_051.txt
72b0d4b08315e03f8bf56b19b252512f
4c74f633ec518db090f93049dcfd9435449709bb
26800 F20101206_AABKGL jeong_k_Page_093.QC.jpg
f227b172489b93997ec91fba06c4afc7
4807860815f5ff420a782654d44e0bd22b5d0006
F20101206_AABKFX jeong_k_Page_089.tif
935be683f3fbdfd79cc46007299f26fc
ec3e56e5cb739f05cb8e6e7713dd9efeb72d7f95
65198 F20101206_AABKHB jeong_k_Page_004.jp2
6c50914c3addefb9612974f13331f957
882a88484fafda671b724cabc30e7200a00a9e00
6158 F20101206_AABKGM jeong_k_Page_009thm.jpg
8bac2dbcfbcc1caaca2c657af8d585a0
07457f12aeeb2c7fe530e3ad97306cfe66fa2e0b
24345 F20101206_AABKFY jeong_k_Page_044.pro
fc640f898feb91ce345c00f2efa7fc2d
1654b39c99fc311d3eba1033b917344c3baffd33
1002 F20101206_AABKHC jeong_k_Page_092.txt
83d46b6227f62108a7e90dd4c1f6d755
c6f5a153359cb1fcc6dde34f3756943700a5ae6e
6000 F20101206_AABKGN jeong_k_Page_028thm.jpg
69560aa73dae089546b478346456d3c6
9dbb57ea83dd101b33ab78d97160ff032ec3e795
F20101206_AABKFZ jeong_k_Page_045.tif
ed606ef2532dc0a96d5a84531651bfaf
81ee3b630233585caf71b5676a3f60af428ab789
969495 F20101206_AABKHD jeong_k_Page_090.jp2
d251e582f98f34b92954028696b06f18
1c7808a3572216eaaaac4077eff81620d32c8c59
56049 F20101206_AABKGO jeong_k_Page_053.jpg
35b91d53b50769b1da90546b13728fe6
4864a718bbd3259ba06876ff32d777a2cb34d622
38526 F20101206_AABKHE jeong_k_Page_049.pro
e4039db7c73e1ea1a15064f80c65e4d5
2f345d3f099b742edd3e7ec775e7bc9471e5879a
51764 F20101206_AABKGP jeong_k_Page_059.pro
5dfb542548b01e6c66f097259da7eb1e
2b3c2478457f0f53adfa6d0463dcfa9146a698ff







THE CORRENTROPY MACE FILTER FOR IMAGE RECOGNITION


By

KYU-HWA JEONG



















A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2007


































2007 Kyu-Hwa Jeong


































I dedicate this to my parents and family









ACKNOWLEDGMENTS

First of all, I thank my advisor, Dr. Jose C. Principe, for his great inspiration, support, and guidance throughout my graduate studies. I was impressed by Dr. Principe's active thinking and greatly appreciated his supervision, which gave me many opportunities to explore my research.

I am also grateful to the members of my supervisory committee, Dr. John G. Harris, Dr. K. Clint Slatton, and Dr. Murali Rao, for their valuable time and interest in serving on the committee, as well as for their comments, which helped improve the quality of this dissertation.

I also express my appreciation to all my CNEL colleagues, especially the ITL group members Jianwu Xu, Puskal Pokharel, Seungju Han, Sudhir Rao, Antonio Paiva, and Weifeng Liu, for their help, collaboration, and valuable discussions during my Ph.D. study.

Finally, I express my great love for my wife, Inyoung, and our two lovely sons, Luke and Seungyeon (Justin). I thank Inyoung for her love, sacrifice, and patience, which made this study possible. Also, I am grateful to my parents for their great support throughout my life.









TABLE OF CONTENTS
page

ACKNOWLEDGMENTS ................................................... 4

LIST OF TABLES .................................................... 8

LIST OF FIGURES ................................................... 9

ABSTRACT .......................................................... 11

CHAPTER

1 INTRODUCTION .................................................... 13

    1.1 Background ................................................ 13
    1.2 Motivation ................................................ 18

2 FUNDAMENTAL DISTORTION INVARIANT LINEAR
  CORRELATION FILTERS ............................................ 21

    2.1 Introduction .............................................. 21
    2.2 Synthetic Discriminant Function (SDF) ..................... 21
    2.3 Minimum Average Correlation Energy (MACE) Filter .......... 23
    2.4 Optimal Trade-off Synthetic Discriminant Function (OTSDF) . 25

3 KERNEL-BASED CORRELATION FILTERS ............................... 27

    3.1 Brief Review of the Kernel Method ......................... 27
        3.1.1 Introduction ........................................ 27
        3.1.2 Kernel Method ....................................... 27
    3.2 Kernel Synthetic Discriminant Function .................... 30
    3.3 Application of the Kernel SDF to Face Recognition ......... 31
        3.3.1 Problem Description ................................. 31
        3.3.2 Simulation Results .................................. 32

4 A RKHS PERSPECTIVE OF THE MACE FILTER .......................... 36

    4.1 Introduction .............................................. 36
    4.2 Reproducing Kernel Hilbert Space (RKHS) ................... 36
    4.3 Interpretation of the MACE Filter in the RKHS ............. 39

5 NONLINEAR VERSION OF THE MACE IN A NEW RKHS:
  THE CORRENTROPY MACE (CMACE) FILTER ............................ 41

    5.1 Correntropy Function ...................................... 41
        5.1.1 Definition .......................................... 41
        5.1.2 Some Properties ..................................... 42
    5.2 The Correntropy MACE Filter ............................... 46
    5.3 Implications of the CMACE Filter in the VRKHS ............. 50
        5.3.1 Implication of Nonlinearity ......................... 50
        5.3.2 Finite Dimensional Feature Space .................... 51
        5.3.3 The Kernel Correlation Filter vs. the CMACE Filter:
              Prewhitening in Feature Space ....................... 51

6 THE CORRENTROPY MACE IMPLEMENTATION ............................ 53

    6.1 The Output of the CMACE Filter ............................ 53
    6.2 Centering of the CMACE in Feature Space ................... 54
    6.3 The Fast CMACE Filter ..................................... 56
        6.3.1 The Fast Gauss Transform ............................ 57
        6.3.2 The Fast Correntropy MACE Filter .................... 57

7 APPLICATIONS OF THE CMACE TO IMAGE RECOGNITION ................. 60

    7.1 Face Recognition .......................................... 60
        7.1.1 Problem Description ................................. 60
        7.1.2 Simulation Results .................................. 60
    7.2 Synthetic Aperture Radar (SAR) Image Recognition .......... 64
        7.2.1 Problem Description ................................. 64
        7.2.2 Aspect Angle Distortion Case ........................ 65
        7.2.3 Depression Angle Distortion Case .................... 71
        7.2.4 The Fast Correntropy MACE Results ................... 75
        7.2.5 The Effect of Additive Noise ........................ 75

8 DIMENSIONALITY REDUCTION WITH RANDOM PROJECTIONS ............... 79

    8.1 Introduction .............................................. 79
    8.2 Motivation ................................................ 80
    8.3 PCA and SVD ............................................... 81
    8.4 Random Projections ........................................ 81
        8.4.1 Random Matrices ..................................... 83
        8.4.2 Orthogonality and Similarity Properties ............. 83
    8.5 Simulations ............................................... 85

9 CONCLUSIONS AND FUTURE WORK .................................... 93

    9.1 Conclusions ............................................... 93
    9.2 Future Work ............................................... 95

APPENDIX

A CONSTRAINED OPTIMIZATION WITH LAGRANGE MULTIPLIERS ............. 98

B THE PROOF OF PROPERTY 5 OF CORRENTROPY ......................... 100

C THE PROOF OF A SHIFT-INVARIANT PROPERTY OF THE CMACE ........... 101

D COMPUTATIONAL COMPLEXITY OF THE MACE AND CMACE ................. 102

E THE CORRENTROPY-BASED ROBUST NONLINEAR BEAMFORMER .............. 103

    E.1 Introduction .............................................. 103
    E.2 Standard Beamforming Problem .............................. 104
        E.2.1 Problem ............................................. 104
        E.2.2 Minimum Variance Beamforming ........................ 105
        E.2.3 Kernel-Based Beamforming ............................ 106
    E.3 Nonlinear Beamformer Using Correntropy .................... 107
    E.4 Simulations ............................................... 109
    E.5 Conclusions ............................................... 111

LIST OF REFERENCES ................................................ 117

BIOGRAPHICAL SKETCH ............................................... 123









LIST OF TABLES


Table page

6-1 Estimated computational complexity for training with N images and testing
    with one image. Matrix inversion and multiplication are considered ........ 59

7-1 Comparison of standard deviations of all the Monte Carlo simulation outputs  63

7-2 Comparison of ROC areas with different kernel sizes ........................ 64

7-3 Case A: Comparison of ROC areas with different kernel sizes ................ 72

7-4 Comparison of computation time and error for one test image between the
    direct method (CMACE) and the FGT method (fast CMACE) with p = 4 and
    kc = 4 .................................................................... 75

7-5 Comparison of computation time and error for one test image in the FGT
    method with a different number of orders and clusters ..................... 76

8-1 Comparison of the memory and computation time between the original CMACE
    and CMACE-RP .............................................................. 92









LIST OF FIGURES


Figure page

1-1  Block diagram for pattern recognition ..................................... 14

1-2  Block diagram for the image recognition process using a correlation filter  15

2-1  Example of the correlation output plane of the SDF ........................ 22

2-2  Example of the correlation output plane of the MACE ....................... 24

2-3  Example of the correlation output plane of the OTSDF ...................... 25

3-1  An example of the kernel method ........................................... 28

3-2  Sample images ............................................................. 32

3-3  The output peak values when only 3 images are used for training ........... 33

3-4  The comparison of ROC curves with different numbers of training images .... 34

3-5  The output values of noisy test input images with additive Gaussian noise
     when 25 images are used for training ...................................... 35

3-6  The ROC curves of noisy test input images with different SNRs when 10
     images are used for training .............................................. 35

5-1  Contours of CIM(X, 0) in 2D sample space .................................. 44

7-1  The averaged test output peak values ...................................... 61

7-2  The test output peak values with additive Gaussian noise .................. 61

7-3  The comparison of ROC curves with different SNRs .......................... 62

7-4  The comparison of standard deviations of 100 Monte Carlo simulation
     outputs for each noisy false-class test image ............................. 64

7-5  Case A: Sample SAR images (64x64 pixels) of two vehicle types for a
     target chip (BTR60) and a confuser (T62) .................................. 66

7-6  Case A: Peak output responses of testing images for a target chip and a
     confuser .................................................................. 67

7-7  Case A: ROC curves with different numbers of training images .............. 67

7-8  Case A: The MACE output plane vs. the CMACE output plane .................. 69

7-9  Sample images of BTR60 of size (64 x 64) pixels ........................... 70

7-10 Case A: Output planes with shifted true-class input image ................. 70

7-11 The ROC comparison with different kernel sizes ............................ 72

7-12 Case B: Sample SAR images (64x64 pixels) of two vehicle types for a
     target chip (2S1) and a confuser (T62) .................................... 73

7-13 Case B: Peak output responses of testing images for a target chip and a
     confuser .................................................................. 74

7-14 Case B: ROC curves ........................................................ 74

7-15 Comparison of ROC curves between the direct and the FGT method in case A . 76

7-16 Sample SAR images (64x64 pixels) of BTR60 ................................. 77

7-17 ROC comparisons with noisy test images (SNR 7dB) in case A ................ 77

8-1  The comparison of ROC areas with different RP dimensionality .............. 85

8-2  The comparison of ROC areas with different RP dimensionality .............. 86

8-3  ROC comparison with different dimensionality reduction methods for MACE
     and CMACE ................................................................. 87

8-4  ROC comparison with PCA for MACE and CMACE ................................ 89

8-5  Sample images of size 16 x 16 after RP .................................... 90

8-6  The cross-correlation vs. cross-correntropy ............................... 91

8-7  Correlation output planes vs. correntropy output planes after dimension
     reduction with random projections ......................................... 92

E-1  Comparisons of the beampattern for three beamformers in Gaussian noise
     with 10dB of SNR .......................................................... 113

E-2  Comparisons of BER for three beamformers in Gaussian noise with
     different SNRs ............................................................ 114

E-3  Comparisons of BER for three beamformers with different characteristic
     exponent α levels ......................................................... 115

E-4  Comparisons of the beampattern for three beamformers in non-Gaussian
     noise ..................................................................... 116









Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy

THE CORRENTROPY MACE FILTER FOR IMAGE RECOGNITION

By

Kyu-Hwa Jeong

August 2007

Chair: José C. Principe
Major: Electrical and Computer Engineering

The major goal of my research was to develop nonlinear methods of the family of distortion invariant filters, specifically the minimum average correlation energy (MACE) filter. The MACE filter is a well known correlation filter for pattern recognition. My research investigated a closed-form solution of the nonlinear version of the MACE filter using the recently introduced correntropy function.

Correntropy is a positive definite function that generalizes the concept of correlation by utilizing higher order moments of the signal statistics. Because of its positive definite nature, correntropy induces a new reproducing kernel Hilbert space (RKHS). Taking advantage of the linear structure of the RKHS, it is possible to formulate and solve the MACE filter equations in the RKHS induced by correntropy. Due to the nonlinear relation between the feature space and the input space, the correntropy MACE (CMACE) can potentially improve upon the MACE performance while preserving the shift-invariant property (additional computation for all shifts will be required in the CMACE).

To alleviate the computational complexity of the solution, my research also presents the fast CMACE using the Fast Gauss Transform (FGT). Both the MACE and CMACE are basically memory-based algorithms, and due to the high dimensionality of the image data, the computational cost of the CMACE filter is one of the critical issues in practical applications. Therefore, my research also used a dimensionality reduction method based on random projections (RP), which has emerged as a powerful method for dimensionality reduction in machine learning.

We applied the CMACE filter to face recognition using facial expression data and the MSTAR public release Synthetic Aperture Radar (SAR) data set, and experimental results show that the proposed CMACE filter indeed outperforms the traditional linear MACE and the kernelized MACE in both generalization and rejection abilities. In addition, simulation results in face recognition show that the CMACE filter with random projections (CMACE-RP) also outperforms the traditional linear MACE, with only a small degradation in performance relative to the full CMACE but great savings in storage and computational complexity.









CHAPTER 1
INTRODUCTION

1.1 Background

The goal of pattern recognition is to detect an observation and assign it to one of multiple classes to be recognized. The observation can be a speech signal, an image, or a higher-dimensional object. In general there are two broad classes of classification problems: open set and closed set. Most classification problems deal with the closed-set case, which means that we have prior information about all of the classes and classify among those given classes. In an open-set classification problem, we have prior information about only one specific class and no prior information about the out-of-class samples, which can be anything in the universe. The object recognition that we present in this research is an open-set problem; therefore, the method that we use is based on one class versus the universe.

There are many applications of object recognition. In automatic target recognition, the goal is to quickly and automatically detect and classify objects that may be present within large amounts of data (typically imagery) with a minimum of human intervention: for example, vehicle vs. non-vehicle, tank vs. truck, or one type of tank vs. another.

Another emerging application of pattern recognition is biometrics, such as face, iris, and fingerprint recognition for human identification and verification. Biometrics technology is rapidly being adopted in a wide variety of security applications such as computer and physical access control, electronic commerce, homeland security, and defense.

Figure 1-1 shows the block diagram for the common pattern recognition process. In the preprocessing block, denoising, normalization, edge detection, pose estimation, etc. are conducted, depending on the application.

Feature extraction involves simplifying the amount of resources required to describe a large set of data accurately. When performing analysis of complex data, one of the major problems stems from the number of variables involved. Analysis with a large number of variables generally requires a large amount of memory and computation power, or a classification algorithm which overfits the training sample and generalizes poorly to new samples. Feature extraction is a general term for methods of constructing combinations of the variables to get around these problems while still describing the data with sufficient accuracy.

The goal of classification is to assign the features derived from the input pattern to one of the classes. There are a variety of classifiers, including statistical classifiers, artificial neural networks, support vector machines (SVMs), and so on.

Another important pattern recognition methodology is to use the training data directly, instead of extracting features and performing classification based on those features. While feature extraction works well in many pattern recognition applications, it is not always easy for humans to identify what the good features may be.

Correlation filters have been applied successfully to automatic target detection and recognition (ATR) [1] for SAR images [2],[3],[4] and to biometric identification such as face, iris, and fingerprint recognition [5],[6], by virtue of their shift-invariant property [7], which means that if the test image contains the reference object at a shifted location, the correlation output is also shifted by exactly the same amount. Due to this property, there is no need for an additional step of centering the input image prior to recognizing it. Also, in some ATR applications, it is desirable not only to recognize various targets but also to locate them with some degree of accuracy, and the location can easily be found by searching for the peak of the correlation output. Another advantage of correlation filters is that they are linear, and therefore the solution can be computed analytically.



[Figure: input pattern → preprocessing → feature extraction → classification → class identification]

Figure 1-1. Block diagram for pattern recognition.









Figure 1-2 depicts a simple block diagram of the image recognition process using correlation filters. Object recognition can be performed by cross-correlating an input image with a synthesized template (filter); the correlation output is then searched for its peak, which is used to determine whether the object of interest is present or not.
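This correlate-then-search-for-the-peak process, and the shift-invariance property described above, can be sketched with a toy example. The FFT-based circular correlation, the random 8x8 template, and the synthetic 64x64 scene below are illustrative assumptions only, not the filters designed later in this work:

```python
import numpy as np

def correlate2d_fft(image, template):
    """Circular cross-correlation of an image with a template via the FFT.
    Correlation in the space domain corresponds to conjugate multiplication
    in the frequency domain, which is why correlation filters are usually
    synthesized and applied with FFTs."""
    F = np.fft.fft2(image)
    H = np.fft.fft2(template, s=image.shape)  # zero-pad template to scene size
    return np.real(np.fft.ifft2(F * np.conj(H)))

rng = np.random.default_rng(0)
template = rng.standard_normal((8, 8))      # stand-in for a synthesized filter
scene = np.zeros((64, 64))
scene[20:28, 30:38] = template              # embed the object at (20, 30)

plane = correlate2d_fft(scene, template)
peak = tuple(int(i) for i in np.unravel_index(np.argmax(plane), plane.shape))
print(peak)                                 # (20, 30): the peak locates the object

# Shift-invariance: shifting the input shifts the peak by exactly the same amount.
shifted = np.roll(scene, (5, -7), axis=(0, 1))
plane2 = correlate2d_fft(shifted, template)
peak2 = tuple(int(i) for i in np.unravel_index(np.argmax(plane2), plane2.shape))
print(peak2)                                # (25, 23)
```

In practice the peak value would also be compared against a threshold to decide whether the object class is present at all, not just where it is.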

It is well known that matched filters are the optimal linear filters for signal detection under linear-channel and white-noise conditions [8][9]. Matched spatial filters (MSF) are optimal in the sense that they provide the maximum output signal-to-noise ratio (SNR) for the detection of a known image in the presence of white noise, under the reasonable assumption of Gaussian statistics [10]. However, the performance of the MSF is very sensitive to even small changes in the reference image, and the MSF cannot be used for multiclass pattern recognition since it is only optimum for a single image. Therefore, distortion invariant composite filters have been proposed in various papers [1].

Distortion invariant composite filters are a generalization of matched spatial filtering

for the detection of a single object to the detection of a class of objects, usually in the

image domain. Typically the object class is represented by a set of training exemplars.

The exemplar images represent the image class through the entire range of distortions of

a single object. The goal is to design a single filter which will recognize an object class

through the entire range of distortion. Under the design criterion the filter is equally


[Figure: testing (on-line) path: test image → correlation → output plane → detection/recognition; training (off-line) path: training images → filter design]

Figure 1-2. Block diagram for image recognition process using correlation filter.









matched to the entire range of distortion as opposed to a single viewpoint as in a matched

filter.

The most well known of such composite correlation filters belong to the synthetic

discriminant function (SDF) class [11] and its variations. One of the appeals of the SDF

class is that it can be computed analytically and effectively using frequency domain

techniques. In the conventional SDF approach, the filter is matched to a composite

template that is a linear combination of the training image vectors such that the cross

correlation output at the origin has the same value with all training images. The hope

is that this composite template will correlate equally well not only with the training

images but also with distorted versions of the training images, as well as with test images

in the same class. One of the problems with the original SDF filters is that because

only the origin of the correlation plane is constrained, it is quite possible that some

other value in the correlation plane is higher than this value at the origin even when the

input is centered at the origin. Since the processing of resulting correlation outputs is

based on detecting peaks, we can expect a high probability of false peaks and resulting

misclassifications in such situations.

The minimum variance SDF (MVSDF) filter has been proposed in [12], taking into

consideration additive input noise. The MVSDF minimizes the output variance due to

zero-mean input noise while satisfying the same linear constraints of the SDF. One of the

major difficulties of the MVSDF is that the noise covariance is not known exactly; even when

known, an inversion is required and it may be computationally demanding.

Another correlation filter that is widely used is the minimum average correlation

energy (MACE) filter [13]. The MACE minimizes the average correlation energy of the

output over the training images to produce a sharp correlation peak subject to the same

linear constraints as the MVSDF and SDF filters. In practice, the MACE filter performs

better than the MVSDF with respect to out-of-class rejection. The MACE filter however,

has been shown to have poor generalization properties, that is, images in the recognition









class but not in the training exemplar set are not recognized well. The MACE filter is

generally known to be sensitive to distortions but readily able to suppress clutter. In

general, it was observed that filters that produce broader correlation peaks (such as the

early SDFs) offer better distortion tolerance. However, they may also provide poorer

discrimination between classes since these filters tend to correlate broadly with low

frequency information in which the classes may be difficult to separate.

By minimizing the average energy in the correlation plane, we hope to keep the side

lobes low while maintaining the origin values at prespecified levels. This is an indirect method

to reduce the false peak or side lobe problem. However, in their attempt to produce

delta-function type correlation outputs, MACE filters emphasize high frequencies and yield

low correlation outputs with images not in the training set.

Therefore, some advanced MACE approaches such as the Gaussian MACE (G-MACE)

[14], the minimum noise and correlation energy (MINACE) [15] and optimal trade-off

filters [16] have been proposed to combine the properties of various SDF's. In the

G-MACE, the correlation outputs are made to approximate Gaussian shaped functions.

This represents a direct method to control the shape of the correlation outputs. The

MINACE and G-MACE variations have improved generalization properties with a slight

degradation in the average output plane variance and sharpness of the central peak

respectively.

In most of the previous research on SDF-type filters, linear constraints are

imposed on the training images to yield a known value at specific locations in the

correlation plane. However, placing such constraints satisfies conditions only at isolated

points in the image space but does not explicitly control the filter's ability to generalize

over the entire domain of the training images.

A new correlation filter design based on relaxed constraints, called the

maximum average correlation height (MACH) filter, has been proposed in [17]. The MACH

filter adopts a statistical approach: it does not treat training images as deterministic









representations of the object but as samples of a class whose characteristic parameters

should be used in encoding the filter.

The concept of relaxing the correlation constraints and utilizing the entire correlation

output for multi-class recognition was first explicitly addressed by the distance classifier

correlation filter (DCCF) [18].

1.2 Motivation

Most of the members of the distortion invariant filter family are linear filters, which

are optimal only when the underlying statistics are Gaussian. For the non-Gaussian

statistics case, we need to extract information beyond second-order statistics. This is the

fundamental motivation of this research.

A nonlinear version of correlation filters called the polynomial distance classifier

correlation filter (PDCCF) has been proposed in [19]. A nonlinear extension to the

MACE filter using neural network topology has also been proposed in [20]. Since the

MACE filter is equivalent to a cascade of a linear pre-processor followed by a linear

associative memory (LAM) [21], the LAM portion of the MACE can be replaced with a

nonlinear associative memory structure, specifically a feed-forward multi-li-r perception

( \!. P). It is well known that non-linear associative memory structures can outperform

their linear counterparts on the basis of generalization and dynamic range. However,

in general, they are more difficult to design as their parameters cannot be computed in

closed form. Results have also shown that it is not enough to simply train a MLP using

backpropagation. Careful analysis of the final solution is necessary to confirm reasonable

results. Experimental results in [22] showed better generalization and classification

performance than the linear MACE on the MSTAR ATR data set (at a fixed probability of

false alarm, the error rate dropped from about 4% with the MACE to 2.45% with the

nonlinear MACE).

Recently, kernel based learning algorithms have been applied to classification and

pattern recognition due to the fact that they easily produce nonlinear extensions to linear









classifiers and boost performance [23]. By transforming the data into a high-dimensional

reproducing kernel Hilbert space (RKHS) and constructing optimal linear algorithms in

that space, the kernel-based learning algorithms effectively perform optimal nonlinear

pattern recognition in the input space to achieve better separation, estimation and

regression. The nonlinear versions of a number of signal processing techniques such as principal

component analysis (PCA) [24], Fisher discriminant analysis [25] and linear classifiers [25]

have already been defined in a kernel space. Also the kernel matched spatial filter (KMSF)

has been proposed for hyperspectral target detection in [26] and the kernel SDF has

been proposed for face recognition [27]. The kernel correlation filter (KCF) which is the

kernelized MACE filter after prewhitening has been proposed in [28] for face recognition.

Similar to Fisher's idea in [20], in the KCF, prewhitening is performed in the input space

with linear methods, which may affect the overall performance. We will later present the

difference between the KCF and the proposed method, in which all the computation, including

prewhitening, is conducted in the feature space.

More recently, a new generalized correlation function, called correntropy has been

introduced by our group [29]. Correntropy is a positive definite function, which measures

a generalized similarity between random variables (or stochastic processes) and it involves

high-order statistics of input signals, therefore it could be a promising candidate for

machine learning and signal processing. Correntropy defines a new reproducing kernel

Hilbert space (RKHS), which has the same dimensionality as the one defined by the

covariance matrix in the input space and it simplifies the formulation of analytic solutions

in this finite dimensional RKHS. Applications to the matched filter [30] and to chaotic time

series prediction have been presented in the literature.

Based on the promising properties of correntropy and the MACE filter, the main goal

of this research is to exploit the generalized nonlinear MACE filter for image recognition.

As the first step, we applied the kernel method to the SDF and obtained the kernel SDF.

Application of the kernel SDF to face recognition has been presented in this research. As









the main part of the research, this dissertation establishes the mathematical foundations of

the correntropy MACE filter (called here the CMACE filter) and evaluates its performance

in face recognition and synthetic aperture radar (SAR) ATR applications.

The formulation exploits the linear structure of the RKHS induced by correntropy

to formulate the correntropy MACE filter in the same manner as the original MACE,

and solves the problem with virtually the same equations (e.g., without regularization)

in the RKHS. Due to the nonlinear relation between the input space and this feature space,

the CMACE corresponds to a nonlinear filter in the input space. In addition, the

CMACE preserves the shift-invariant property of the linear MACE; however, it requires an

additional computation for each input image shift. In order to reduce the computational

complexity of the CMACE, the fast CMACE filter based on the Fast Gauss Transform

(FGT) is also proposed.

We also introduce a dimensionality reduction method based on random projections

(RP) and apply RP to the CMACE in order to decrease the storage requirements and fit within

more readily available computational resources; we show that the RP method works well with the

CMACE filter for image recognition.

The CMACE formulation for image recognition can be viewed as one application of

the general class of energy minimization problems. We can also apply the same

methodology, which minimizes the correntropy energy of the output, to the beamforming

problem, whose conventional linear solution is obtained by minimizing the output power.

Appendix E presents the new application of the correntropy to the beamforming problem

in wireless communications with some preliminary results.









CHAPTER 2
FUNDAMENTAL DISTORTION INVARIANT LINEAR
CORRELATION FILTERS

2.1 Introduction

Distortion invariant composite filters are a generalization of matched spatial filtering

for the detection of a single object to the detection of a class of objects, and they are

widely used for image recognition. There are many variations of correlation filters.

The SDF and the MACE filter are fundamental correlation-based distortion invariant

filters for object recognition. Most of the correlation filters are based on them. In this

research, we present the nonlinear extensions to the SDF and MACE. The formulations

of the SDF and MACE filter are briefly introduced in this chapter. We consider a

2-dimensional image as a d x 1 column vector by lexicographically reordering the image,

where d is the number of pixels.

2.2 Synthetic Discriminant Function (SDF)

The SDF filter is matched to a composite image h, where h is a linear combination of

the training image vectors x_i,

h = Σ_{i=1}^{N} a_i x_i,   (2-1)
where N is the number of training images and the coefficients ai are chosen to satisfy the

following constraints

h^T x_j = u_j,   j = 1, 2, ..., N,   (2-2)

where T denotes the transpose and uj is a desired cross correlation output peak value. In

vector form, we define the training image data matrix X as


X = [x_1, x_2, ..., x_N],   (2-3)


where the size of matrix X is d x N. Then the SDF is the solution to the following

optimization problem

min h^T h, subject to X^T h = u.   (2-4)



















[Figure: correlation output plane]

Figure 2-1. Example of the correlation output plane of the SDF [31].


It is assumed that N < d and so the problem formulation is a quadratic optimization

subject to an under-determined system of linear constraints. The optimal solution is


h = X(X^T X)^{-1} u.   (2-5)


Once h is determined, we apply an appropriate threshold to the output of the cross

correlation, which is the inner product of the test input image and the filter h and decide

on the class of the test image.
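As a concrete illustration of (2-1)-(2-5), the closed-form SDF solution can be computed in a few lines of NumPy. This is a sketch under stated assumptions: the random matrix below is a stand-in for real lexicographically reordered training images, and the function name is illustrative.

```python
import numpy as np

def sdf_filter(X, u):
    """Closed-form SDF solution h = X (X^T X)^{-1} u  (Eq. 2-5).

    X : d x N matrix whose columns are reordered training images.
    u : length-N vector of desired correlation outputs at the origin.
    """
    return X @ np.linalg.solve(X.T @ X, u)

# Toy example: d = 100 pixels, N = 5 training images (random stand-ins).
rng = np.random.default_rng(1)
d, N = 100, 5
X = rng.standard_normal((d, N))
u = np.ones(N)                    # same output constraint for every training image
h = sdf_filter(X, u)

# The constraints X^T h = u hold exactly on the training set.
print(np.allclose(X.T @ h, u))    # True
```

Note that only the N constraint values are controlled; nothing in the design restricts the rest of the correlation plane, which is exactly the false-peak weakness discussed above.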

Figure 2-1 shows the general shape of the correlation output plane of the SDF using

inverse synthetic aperture radar (ISAR) imagery [31]. As stated earlier, the SDF has

a broad output plane response, which means that the SDF has good generalization

performance with the true class images, but poorer discrimination between true class

and out of class images.









2.3 Minimum Average Correlation Energy (MACE) Filter

Let x_i denote the ith image vector after reordering. The conventional MACE

filter is better formulated in the frequency domain. The discrete Fourier transform (DFT)

of the column vector xi is denoted by Xi and we define the training image data matrix X

as

X = [X_1, X_2, ..., X_N],   (2-6)

where the size of X is d x N and N is the number of training images. Let the vector h be

the filter in the space domain and denote by H its Fourier transform vector. We are

interested in the correlation between the input image and the filter. The correlation of the

ith image sequence xi(n) with filter sequence h(n) can be written as


g_i(n) = x_i(n) ⊗ h(n).   (2-7)

By Parseval's theorem, the correlation energy of the ith image can be written as a

quadratic form

E_i = H^H D_i H,   (2-8)

where Di is a diagonal matrix of size d x d whose diagonal elements are the magnitude

squared of the associated element of Xi, that is, the power spectrum of xi(n) and the

superscript H denotes the Hermitian transpose. The objective of the MACE filter is

to minimize the average correlation energy over the image class while simultaneously

satisfying an intensity constraint at the origin for each image. The value of the correlation

at the origin can be written as

g_i(0) = X_i^H H = c_i,   (2-9)

for i = 1, 2, ..., N training images, where c_i is the user specified output correlation

value at the origin for the ith image. Then the average energy over all training images is

expressed as

E_avg = H^H D H,   (2-10)















[Figure: correlation output plane]

Figure 2-2. Example of the correlation output plane of the MACE [31].


where

D = (1/N) Σ_{i=1}^{N} D_i.   (2-11)

The MACE design problem is to minimize E_avg while satisfying the constraint X^H H =

c, where c = [c_1, c_2, ..., c_N]^T is an N dimensional vector. This optimization problem can

be solved using Lagrange multipliers, and the solution is

H = D^{-1} X (X^H D^{-1} X)^{-1} c.   (2-12)


It is clear that the spatial filter h can be obtained from H by an inverse DFT. Once h is

determined, we apply an appropriate threshold to the output correlation plane and decide

whether the test image belongs to the class of the template or not.
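The frequency-domain recipe of (2-6)-(2-12) can likewise be sketched in NumPy. This is an illustrative implementation under stated assumptions (random arrays stand in for real training exemplars, and D is stored as a vector since it is diagonal):

```python
import numpy as np
from numpy.fft import fft2

def mace_filter(images, c):
    """MACE solution H = D^{-1} X (X^H D^{-1} X)^{-1} c  (Eq. 2-12).

    images : list of N equally sized 2-D training images.
    c      : length-N vector of desired correlation values at the origin.
    """
    X = np.stack([fft2(im).ravel() for im in images], axis=1)  # d x N spectra
    D = np.mean(np.abs(X) ** 2, axis=1)       # average power spectrum, diag of D
    DinvX = X / D[:, None]                    # D^{-1} X without forming D
    A = X.conj().T @ DinvX                    # X^H D^{-1} X  (N x N)
    H = DinvX @ np.linalg.solve(A, c)
    return H.reshape(images[0].shape)

# Toy example with random stand-ins for training images.
rng = np.random.default_rng(2)
images = [rng.standard_normal((16, 16)) for _ in range(3)]
H = mace_filter(images, c=np.ones(3))

# Each training image meets its origin constraint X_i^H H = c_i  (Eq. 2-9).
origin_vals = [np.vdot(fft2(im).ravel(), H.ravel()) for im in images]
```

Because D is diagonal, its inversion is an elementwise division, so the only true matrix inversion is the small N x N system, which is what makes the frequency-domain formulation practical.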

Figure 2-2 shows the general shape of the correlation output plane of the MACE [31].

It shows a sharp peak at the origin, and as a result the abilities to locate

the target and to discriminate between true class and out of class images have been













[Figure: correlation output plane]

Figure 2-3. Example of the correlation output plane of the OTSDF [31].


improved. However, a sharp output plane results in worse distortion tolerance and poor

generalization.

2.4 Optimal Trade-off Synthetic Discriminant (OTSDF) Function

The optimal trade-off filter (OTSDF) is a well known correlation filter designed to overcome the

poor generalization of the MACE when noisy input is presented. The OTSDF trades off

the MACE filter criterion against the MVSDF filter criterion.

The OTSDF filter in the frequency domain is given by


H = T^{-1} X (X^H T^{-1} X)^{-1} c,   (2-13)


where T = αD + √(1 − α²) C with 0 ≤ α ≤ 1, D is the diagonal matrix in the MACE

and C is the diagonal matrix containing the input noise power spectral density as its

diagonal entries.
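Since the OTSDF differs from the MACE only in replacing D by the blended matrix T, the implementation change is a one-liner. A sketch with random stand-ins for the spectra and a white-noise PSD (all names illustrative):

```python
import numpy as np

def otsdf_filter(X, D, C, c, alpha):
    """OTSDF solution H = T^{-1} X (X^H T^{-1} X)^{-1} c  (Eq. 2-13),
    with T = alpha * D + sqrt(1 - alpha**2) * C.

    X     : d x N matrix of training-image spectra (columns).
    D, C  : length-d diagonals of the average power spectrum and noise PSD.
    alpha : trade-off parameter; alpha = 1 recovers the MACE criterion.
    """
    T = alpha * D + np.sqrt(1.0 - alpha ** 2) * C
    TinvX = X / T[:, None]
    return TinvX @ np.linalg.solve(X.conj().T @ TinvX, c)

# Sanity check: the origin constraints X^H H = c hold for any alpha.
rng = np.random.default_rng(3)
d, N = 64, 4
X = rng.standard_normal((d, N)) + 1j * rng.standard_normal((d, N))
D = np.mean(np.abs(X) ** 2, axis=1)
C = np.ones(d)                    # white-noise power spectral density
H = otsdf_filter(X, D, C, c=np.ones(N), alpha=0.7)
```

Sweeping alpha between 0 and 1 moves the filter continuously between the MVSDF-like (noise-tolerant, broad peak) and MACE-like (sharp peak) behaviors described in the text.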









The correlation output response of the OTSDF is shown in Figure 2-3 [31]. As

compared to the MACE filter response, the output peak is not nearly as sharp, but still

more localized than the SDF case.









CHAPTER 3
KERNEL-BASED CORRELATION FILTERS

3.1 Brief review on Kernel Method

3.1.1 Introduction

Kernel-based algorithms have been recently developed in the machine learning

community, where they were first used to solve binary classification problems, the so-called

Support Vector Machine (SVM) [32]. There is now an extensive literature on SVMs

[33],[34] and the family of kernel-based algorithms [23].
A kernel-based algorithm is a nonlinear version of a linear algorithm where the data

has been previously (and most often nonlinearly) transformed to a higher dimensional

space in which we only need to be able to compute inner products (via a kernel function).

It is clear that many problems arising in signal processing are of statistical nature

and require automatic data analysis methods. Furthermore, the algorithms used in

signal processing are usually linear and their transformation for nonlinear processing is

sometimes unclear. Signal processing practitioners can benefit from a deeper understanding

of kernel methods, because they provide a different way of taking into account nonlinearities

without losing the original properties of the linear method. Another aspect is dealing

with the amount of available data in a space of a given dimensionality: one needs methods

that can use little data and avoid the curse of dimensionality.

Aronszajn [35] and Parzen [36] were some of the first to employ positive definite

kernels in statistics. Later, based on statistical learning theory, support vector machines

[70] and other kernel-based learning algorithms [63] such as kernel principal component

analysis [64], kernel Fisher discriminant analysis [58] and kernel independent component

analysis [4] have been introduced.

3.1.2 Kernel Method

Many algorithms for data analysis are based on the assumption that the data can

be represented as vectors in a finite dimensional vector space. These algorithms, such











[Figure: two point sets that are nonlinearly separable in the input space become linearly separable in the feature space]

Figure 3-1. The example of kernel method (left: Input space, right: feature space).


as linear discrimination, principal component analysis, or least squares regression, make

extensive use of the linear structure. Roughly speaking, kernels allow one to naturally

derive nonlinear versions of linear algorithms through the implicit nonlinear mapping. The

general idea is the following. Given a linear algorithm (i.e. an algorithm which works in

a vector space), one first maps the data living in a space X (the input space) to a vector

space H (the feature space) via a nonlinear mapping Φ(·) : X → H, and then runs the

algorithm on the vector representation Φ(x) of the data. In other words, one performs

nonlinear analysis of the data using a linear method. The purpose of the map Φ(·) is to

translate nonlinear structures of the data into linear ones in H.

Consider the following discrimination problem (see Figure 3-1) where the goal is

to separate two sets of points. In the input space, the problem is nonlinear, but after

applying the transformation Φ(x_1, x_2) = (x_1², √2 x_1 x_2, x_2²), which maps each vector to the

three monomials of degree 2 formed by its coordinates, the separation boundary becomes

linear. We have just transformed the data and we hope that in the new representation,

linear structures will emerge.
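The monomial map above can be checked numerically: the feature-space inner product ⟨Φ(x), Φ(y)⟩ equals (x^T y)², the degree-2 polynomial kernel, so the separation in Figure 3-1 never requires computing Φ explicitly. The numbers below are arbitrary test values:

```python
import numpy as np

def phi(v):
    """Explicit degree-2 monomial map (x1, x2) -> (x1^2, sqrt(2) x1 x2, x2^2)."""
    x1, x2 = v
    return np.array([x1 ** 2, np.sqrt(2.0) * x1 * x2, x2 ** 2])

x = np.array([1.5, -0.5])
y = np.array([0.25, 2.0])

explicit = phi(x) @ phi(y)    # inner product computed in the feature space
kernel = (x @ y) ** 2         # same value from the kernel in the input space
print(np.isclose(explicit, kernel))   # True
```

This identity is the "kernel trick" in miniature: the 3-dimensional map is cheap here, but for high-degree monomials or the Gaussian kernel the feature space is enormous or infinite, while the kernel evaluation stays the same cost.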

Working directly with the data in the feature space may be difficult because the space

can be infinite dimensional, or the transformation implicitly defined. The basic idea of

kernel algorithm is to transform the data x from the input space to a high dimensional

feature space of vectors Φ(x), where the inner products can be computed using a positive








definite kernel function satisfying Mercer's condition [23],


K(x, y) = ⟨Φ(x), Φ(y)⟩.   (3-1)

Mercer's Theorem: Suppose K(t, s) is a continuous symmetric non-negative definite
function on a closed finite interval T × T. Denote by {λ_k, k = 1, 2, ...} the sequence
of non-negative eigenvalues of K(t, s) and by {φ_k(t), k = 1, 2, ...} the sequence of
corresponding normalized eigenfunctions; in other words, for all s, t ∈ T,

∫_T K(t, s) φ_k(t) dt = λ_k φ_k(s),   (3-2)

∫_T φ_k(t) φ_j(t) dt = δ_kj,   (3-3)

where δ_kj is the Kronecker delta, equal to 1 if k = j and 0 if k ≠ j. Then

K(t, s) = Σ_{k=1}^{∞} λ_k φ_k(t) φ_k(s),   (3-4)

where the series above converges absolutely and uniformly on T × T [37].
This simple and elegant idea allows us to obtain nonlinear versions of any linear
algorithm expressed in terms of inner products, without even knowing the exact mapping
function Φ. A particularly interesting characteristic of the feature space is that it is a
reproducing kernel Hilbert space (RKHS), i.e., the span of functions {K(·, x) : x ∈ X}
defines a unique functional Hilbert space [35]. The crucial property of these spaces is the
reproducing property of the kernel

f(x) = ⟨K(·, x), f⟩,   ∀f ∈ H.   (3-5)

In particular, we can define our nonlinear mapping from the input space to the RKHS as
Φ(x) = K(·, x); then we have

⟨Φ(x), Φ(y)⟩ = ⟨K(·, x), K(·, y)⟩ = K(x, y),   (3-6)









and thus Φ(x) = K(·, x) defines the Hilbert space associated with the kernel.

In this research, we use the Gaussian kernel, which is the most widely used Mercer

kernel,

κ(x, y) = (1/(√(2π) σ)) exp(−‖x − y‖²/(2σ²)).   (3-7)

3.2 Kernel Synthetic Discriminant Function

Based on the kernel methodology, the previous optimization problem for the SDF can

be solved in an infinite dimensional kernel feature space by transforming each element of

the matrix of exemplars X to Φ(x_ij) and h to Φ(h) with a sample-by-sample mapping, thus

forming a higher dimensional matrix Φ(X) whose (i, j)th element is Φ(x_ij). Let the

N training images matrix X be


X = [x_1, x_2, ..., x_N],   (3-8)


where x_i is the ith training image vector given by


x_i = [x_i(1), x_i(2), ..., x_i(d)]^T,   (3-9)

Then we can extend the SDF optimization problem to the nonlinear feature space by


min Φ^T(h) Φ(h), subject to Φ^T(X) Φ(h) = u,   (3-10)


where the dimensions of the transformed Φ(X) and Φ(h) are ∞ × N and ∞ × 1,

respectively for the Gaussian kernel. Then the solution in kernel space becomes


Φ(h) = Φ(X) (Φ^T(X) Φ(X))^{-1} u.   (3-11)


We denote K_xx = Φ^T(X) Φ(X), which is an N × N full rank matrix whose (i, j)th element

is given by

(K_xx)_ij = Σ_{k=1}^{d} κ(x_i(k), x_j(k)),   i, j = 1, 2, ..., N.   (3-12)









Although Φ(h) is an infinite dimensional vector, the output of this filter is an

N × 1 vector, which can be easily computed using these kernels.

Let Z be the matrix of vectorized test images and let L be the number of test images.

We denote K_zx = Φ^T(Z) Φ(X), which is an L × N matrix whose (i, j)th element is given by

(K_zx)_ij = Σ_{k=1}^{d} κ(z_i(k), x_j(k)),   i = 1, 2, ..., L,  j = 1, 2, ..., N.   (3-13)

Then the L x 1 output vector of the kernel SDF is given by


y = Φ^T(Z) Φ(h) = K_zx K_xx^{-1} u.   (3-14)


We can compute K_xx off-line with the given training data. Then K_zx and y can be

computed on-line with a given test image. Given N training images and one test image,

the computational complexities are O(dN²) and O(dN) + O(N³) during off-line and on-line,

respectively. In general, N << d, so the dominant part of the computational complexity, the

matrix inversion in (3-14) at O(N³), is not a critical computational issue in the KSDF. Also,

the required memory for K_xx, which is O(N²), is much less than that of methods whose

storage depends on the number of image pixels, d.

By applying an appropriate threshold to the output in (3-14), we can detect and

recognize the testing data without generating the composite filter in a feature space. In

object recognition and classification senses, the proposed kernel SDF is simpler than the

kernel matched filter.
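Putting (3-7) and (3-12)-(3-14) together, the whole kernel SDF test stage fits in a short sketch. The Gaussian-kernel normalization follows (3-7) applied pixel by pixel, and the random matrices below are stand-ins for real image data (all names illustrative):

```python
import numpy as np

def gaussian_kernel_matrix(a, b, sigma):
    """Pixelwise Gaussian kernel summed over the d pixels (Eqs. 3-12, 3-13).

    a : m x d image matrix (rows), b : n x d image matrix (rows).
    Returns the m x n matrix with (i, j) entry sum_k kappa(a_i(k), b_j(k)).
    """
    diff = a[:, None, :] - b[None, :, :]                         # m x n x d
    k = np.exp(-diff ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)
    return k.sum(axis=2)

def kernel_sdf_output(X, Z, u, sigma):
    """Kernel SDF output y = K_zx K_xx^{-1} u  (Eq. 3-14).

    X : N x d training images, Z : L x d test images, u : length-N constraints.
    """
    Kxx = gaussian_kernel_matrix(X, X, sigma)   # N x N, computable off-line
    Kzx = gaussian_kernel_matrix(Z, X, sigma)   # L x N, computed per test image
    return Kzx @ np.linalg.solve(Kxx, u)

# Toy example: a test image identical to a training image hits its constraint
# value exactly, since that row of K_zx is then a row of K_xx.
rng = np.random.default_rng(4)
X = rng.standard_normal((5, 100))     # N = 5 training "images", d = 100 pixels
u = np.ones(5)
y = kernel_sdf_output(X, X[:2], u, sigma=1.0)
```

Note that, as in the text, no composite filter is ever formed in the feature space: only the N × N and L × N kernel matrices are needed.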

3.3 Application of the Kernel SDF to Face Recognition

3.3.1 Problem Description

In this section, we show the performance of the proposed kernel based SDF filter for

face image recognition. In the simulations, we used the facial expression database collected

at the Advanced Multimedia Processing Lab at the Electrical and Computer Engineering

Department of Carnegie Mellon University [38]. The database consists of 13 subjects,

whose facial images were captured with 75 varying expressions. The size of each image is








64x64. Sample images are depicted in Figure 3-2. In this research, we tested the proposed
kernel SDF method with the original database images as well as with noisy images.
Sample images with additive Gaussian noise with a 10 dB SNR are shown in Figure 3-2(c).
In order to evaluate the performance of the SDF and kernel SDF filter in this data set,
we examined 975 (13 × 75) correlation outputs. From these results and the ones reported
in [5] we picked and report the results of the two most difficult cases, which produced the
worst performance with the conventional SDF method. We test with all the images of
each person's data set, resulting in 75 outputs for each class. The simulation results have
been obtained by averaging (a Monte-Carlo approach) over 100 different training sets
(each training set has been chosen randomly) to minimize the problem of performance
differences due to splitting the relatively small database into training and testing sets. In
this data set, it has been observed that a kernel size around 30-50% of the standard
deviation of the input data would be appropriate.
3.3.2 Simulation Results








[Figure: sample face images]

Figure 3-2. Sample images: (a) Person A, (b) Person B, (c) Person A with additive
Gaussian noise (SNR = 10dB).














[Figure: output peak values (true class vs. false class) against the number of testing images]

Figure 3-3. The output peak values when only 3 images are used for training (N = 3).
(Top): SDF, (Bottom): Kernel SDF.


Figure 3-3 shows the average output peak values for image recognition when we use

only N = 3 images for training. The desired output peak value should be close to one

when the test image belongs to the training image class. Figure 3-3 (Top) shows that the

correlation output peak values of the conventional SDF in both true and false classes not

only overlap but are also close to one. As a result the system will have great difficulty

differentiating these two individuals because they can be interpreted as belonging to the

same class. Figure 3-3 (Bottom) shows the output values of kernel SDF and we can see

that the two images can be recognized well even with a small number of training images.

Figure 3-4 shows the ROC curves with different numbers of training images (N). In the

kernel SDF with N = 3, the probability of detection with zero false alarm rate is 1.

However, the conventional SDF needs at least 25 images for training in order to have the

same detection performance as the kernel SDF.












[Figure: ROC curves; legend: KSDF (N=3) and SDF (N=25); SDF (N=3); SDF (N=20); SDF (N=23)]

Figure 3-4. The comparison of ROC curves with different numbers of training images.


One of the major problems of the conventional SDF is that the performance can

be easily degraded by additive noise in the test image since SDF does not have any

special mechanism to consider input noise. Therefore, it has a poor rejecting ability for

a false class image. Figure 3-5 (Top) shows the noise effect on the conventional SDF.

When the class images are seriously distorted by additive Gaussian noise with a very

low SNR (-2dB), the correlation output peaks of some test images become great than

1, hence wrong recognition happens. The results in Figure 3-5 (Bottom) are obtained

by the kernel SDF. The kernel SDF shows a much better performance even in a very

low SNR environment. The comparison of ROC curves between the kernel SDF and the

conventional SDF in the case of noisy test input with different SNRs is shown in Figure

3-6. We can see that the kernel SDF outperforms the SDF and achieves a robust pattern

recognition performance even in a very noisy environment.

















[Figure: output values against the number of testing images]

Figure 3-5. The output values of noisy test input images with additive Gaussian noise
when 25 images are used for training (N = 25). (Top): SDF, circle: true class
with SNR = 10dB, cross: false class with SNR = -2dB, diamond: false class
with no noise. (Bottom): Kernel SDF, circle: true class with SNR = 10dB,
cross: false class with SNR = -2dB.


[Figure: ROC curves, probability of detection vs. probability of false alarm]

Figure 3-6. The ROC curves of noisy test input images with different SNRs when 10
images are used for training (N = 10).






CHAPTER 4
A RKHS PERSPECTIVE OF THE MACE FILTER

4.1 Introduction

This section presents the interpretation of the MACE filter in the RKHS. The original

linear MACE filter was formulated in the frequency domain, however, the MACE filter

can also be understood by the theory of Hilbert space representations of random functions

proposed by Parzen [39]. Parzen analyzed the connection between RKHS and second-order

random (or stochastic) processes by using the isometric isomorphism 1 that exists

between the Hilbert space spanned by the random variables of a stochastic process and the

RKHS determined by its covariance function. Here, we present first the basic theory of the

RKHS, and then show the interpretation of the MACE filter formulation in the RKHS.

4.2 Reproducing Kernel Hilbert Space (RKHS)

A reproducing kernel Hilbert space (RKHS) is a special Hilbert space associated

with a kernel that reproduces (via an inner product) each function in the space,

or, equivalently, in which every point evaluation functional is bounded. Let H be a Hilbert space

of functions on some set E, define an inner product ⟨·, ·⟩_H in H, and consider a complex-valued



1
Consider two Hilbert spaces H_1 and H_2 with inner products denoted as ⟨f_1, f_2⟩_1 and
⟨g_1, g_2⟩_2, respectively. H_1 and H_2 are said to be isomorphic if there exists a one-to-one
and surjective mapping ψ from H_1 to H_2 satisfying the following properties

ψ(f_1 + f_2) = ψ(f_1) + ψ(f_2) and ψ(af) = aψ(f)   (4-1)

for all functionals in H_1 and any real number a. The mapping ψ is called an isomorphism
between H_1 and H_2. The Hilbert spaces H_1 and H_2 are said to be isometric if there exists
a mapping ψ that preserves inner products,

⟨f_1, f_2⟩_1 = ⟨ψ(f_1), ψ(f_2)⟩_2,   (4-2)

for all functions in H_1. A mapping ψ satisfying both properties (4-1) and (4-2) is
said to be an isometric isomorphism or congruence. The congruence maps both
linear combinations of functionals and limit points from H_1 into corresponding linear
combinations of functionals and limit points in H_2.









bivariate function K(x, y) on E x E. Then the function K(x, y) is said to be positive
definite if for any finite point set {x_1, x_2, ..., x_n} in E and for any corresponding
complex numbers {a_1, a_2, ..., a_n} in C, not all zero,

    sum_{i=1}^{n} sum_{j=1}^{n} a_i a_j* K(x_i, x_j) >= 0.    (4-3)

Any positive definite bivariate function K(x, y) is a reproducing kernel because of the
following fundamental theorem.

Moore-Aronszajn Theorem: Given any positive definite function K(x, y), there
exists a uniquely determined (possibly finite dimensional) Hilbert space H consisting of
functions on E such that

    (i) for every x in E, K(x, .) belongs to H, and    (4-4)

    (ii) for every x in E and f in H, f(x) = (f, K(x, .))_H.    (4-5)

Then H := H(K) is said to be a reproducing kernel Hilbert space with reproducing kernel
K. The properties (i) and (ii) are called the reproducing property of K(x, y) in H(K).
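Since the Moore-Aronszajn theorem hinges on positive definiteness, a quick numerical
sanity check can make the definition (4-3) concrete. The sketch below is our own
illustration (the Gaussian kernel and all names are assumptions, not taken from the
text): it builds a Gram matrix K(x_i, x_j) for an arbitrary finite point set and verifies
that the quadratic form is non-negative by inspecting the eigenvalues of the symmetric
Gram matrix.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian kernel K(x, y); positive definite for any sigma > 0."""
    return np.exp(-(x - y) ** 2 / (2.0 * sigma ** 2))

# Build the Gram matrix K(x_i, x_j) for an arbitrary finite point set.
rng = np.random.default_rng(0)
x = rng.normal(size=20)
K = gaussian_kernel(x[:, None], x[None, :])

# Positive definiteness (4-3): a^T K a >= 0 for every coefficient vector a,
# equivalently all eigenvalues of the symmetric Gram matrix are non-negative.
eigvals = np.linalg.eigvalsh(K)
print(eigvals.min() > -1e-8)  # True (up to numerical round-off)
```

By the theorem, this kernel therefore determines a unique RKHS of functions on the
point set.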

Parzen [39] analyzed the connection between RKHSs and orthonormal expansions for

second-order stochastic processes obtaining a general expression for the reproducing kernel

inner product in terms of the eigenvalues and eigenfunctions of a certain operator defined

on an appropriate Hilbert space. In addition, Parzen showed that there exists an isometric

isomorphism between the Hilbert space spanned by the random variables of a stochastic

process and the RKHS determined by its covariance function.

Given a zero mean second-order random vector {x_i : i in I} with I being an index set,
the covariance function is defined as

    R(i, j) = E[x_i x_j].    (4-6)

It is well known that the covariance function R is non-negative definite; therefore it
determines a unique RKHS, H(R), according to the Moore-Aronszajn Theorem. By
Mercer's theorem [35],

    R(i, j) = sum_{k=0}^{infinity} lambda_k phi_k(i) phi_k(j),    (4-7)

where {lambda_k, k = 0, 1, 2, ...} and {phi_k(i), k = 0, 1, 2, ...} are the sequences of
non-negative eigenvalues and corresponding normalized eigenfunctions of R(i, j),
respectively.

H(R) has two important properties which make it a reproducing kernel Hilbert space.
First, let R(i, .) be the function on I with value at j in I equal to R(i, j); then by
Mercer's theorem, the eigen-expansion for the covariance function (4-7), we have

    R(i, j) = sum_{k=0}^{infinity} lambda_k a_k phi_k(j),  with  a_k = phi_k(i).    (4-8)

Therefore, R(i, .) belongs to H(R) for each i in I. Second, for every function f(.) in
H(R) of the form f(i) = sum_{k=0}^{infinity} lambda_k a_k phi_k(i) and every i in I,

    (f, R(i, .)) = sum_{k=0}^{infinity} lambda_k a_k phi_k(i) = f(i).    (4-9)

By the Moore-Aronszajn Theorem, H(R) is a reproducing kernel Hilbert space with
R(i, j) as the reproducing kernel. It follows that

    (R(i, .), R(j, .)) = sum_{k=0}^{infinity} lambda_k phi_k(i) phi_k(j) = R(i, j).    (4-10)

Thus H(R) is a representation of the random vector {x_i : i in I} with covariance
function R(i, j).
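For a finite index set, the expansion (4-7) is simply the eigendecomposition of the
covariance matrix. The following minimal numerical illustration is ours, not from the
text; all names are assumptions.

```python
import numpy as np

# For a finite index set I = {0, ..., d-1}, the covariance "function" R(i, j) is a
# symmetric non-negative definite matrix, and Mercer's expansion (4-7) reduces to
# its eigendecomposition R = sum_k lambda_k phi_k phi_k^T.
rng = np.random.default_rng(1)
A = rng.normal(size=(200, 5))
R = A.T @ A / 200.0                      # sample covariance of a zero-mean vector

lam, phi = np.linalg.eigh(R)             # eigenvalues lambda_k, eigenvectors phi_k
R_rebuilt = sum(lam[k] * np.outer(phi[:, k], phi[:, k]) for k in range(len(lam)))

print(np.allclose(R, R_rebuilt))  # True: the expansion reproduces R exactly
```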

One may define a congruence G from H(R) onto the linear space L2(x_i, i in I) such
that

    G(R(i, .)) = x_i.    (4-11)

The congruence G can be explicitly represented as

    G(f) = sum_{k=0}^{infinity} a_k xi_k,    (4-12)

where {xi_k} is a set of orthogonal random variables belonging to L2(x_i, i in I) and f
is any element of H(R) of the form f(i) = sum_{k=0}^{infinity} lambda_k a_k phi_k(i)
for every i in I.

Summary: Let {x_i : i in I} be a continuous random function defined on a closed finite
interval I. Then the following conclusions hold:

The covariance kernel R possesses the expansion (4-7).

There exists a Hilbert space L2 of sequences which is a representation of the random
function.

There exists a reproducing kernel Hilbert space H(R) of functions on I, which is a
representation of the random function.

4.3 Interpretation of the MACE filter in the RKHS

The original MACE filter was derived in the frequency domain for simplicity;
however, it can also be derived in the space domain [40], and this helps us understand
the RKHS perspective of the MACE. Let us consider the case of one training image
x = [x(1) x(2) ... x(d)]^T and construct the following matrix

        | x(d)    0      ...    0      |
        | x(d-1)  x(d)   0      ...    |
        | ...     ...    ...    ...    |
    U = | x(1)    x(2)   ...    x(d)   |    (4-13)
        | 0       x(1)   x(2)   ...    |
        | ...     ...    ...    ...    |
        | 0       0      ...    x(1)   |

where the dimension of the matrix U is (2d - 1) x d. Here we denote the ith column of
the matrix U as U_i. Then the column space of U is

    L2(U) = { sum_{i=1}^{d} a_i U_i,  a_i in R },    (4-14)









which is congruent to the RKHS induced by the correlation kernel

    R(i, j) = < U_i, U_j > = U_i^T U_j,   i, j = 1, ..., d,    (4-15)

where < . , . > represents the inner product operation. If all the columns in U are
linearly independent, R(i, j) is positive definite and the dimensionality of L2(U) is d.
If U is singular, the dimensionality is smaller than d. However, in either case, all the
vectors in this space can be expressed as a linear combination of the column vectors.
The optimization problem of the MACE is to find a vector g_o = sum_{i=1}^{d} h_i U_i
in the L2(U) space with coordinates h = [h_1 h_2 ... h_d]^T such that g_o^T g_o is
minimized subject to the constraint that the dth component of g_o (which is the
correlation at zero lag) is some constant. Formulating the MACE filter from this RKHS
viewpoint only provides a new perspective but no additional advantage. However, as
explained next, it will help us derive a nonlinear extension to the MACE with a new
similarity measure.
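As a concrete illustration of this space-domain formulation, the following sketch (our
own, under the notation of (4-13)-(4-15); the variable names are ours) builds U from a
random one-dimensional "image", forms R = U^T U, and solves the constrained
minimization with a Lagrange multiplier. Since row d of U is [x(1) ... x(d)], the
zero-lag constraint is x^T h = c, which gives the standard closed form
h = R^{-1} x c / (x^T R^{-1} x).

```python
import numpy as np

d = 16
rng = np.random.default_rng(2)
x = rng.normal(size=d)

# Build the (2d-1) x d matrix U of (4-13): the ith column is the reversed image
# shifted down by i positions.
U = np.zeros((2 * d - 1, d))
for i in range(d):
    U[i:i + d, i] = x[::-1]

R = U.T @ U                     # correlation kernel R(i, j) = U_i^T U_j, eq. (4-15)

# Minimize h^T R h subject to (U h)_d = x^T h = c (correlation at zero lag).
c = 1.0
Rinv_x = np.linalg.solve(R, x)
h = Rinv_x * c / (x @ Rinv_x)

g = U @ h                       # full correlation output plane
print(np.isclose(g[d - 1], c))  # True: the zero-lag constraint is met
```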









CHAPTER 5
NONLINEAR VERSION OF THE MACE IN A NEW RKHS:
THE CORRENTROPY MACE (CMACE) FILTER

5.1 Correntropy Function

5.1.1 Definition

Correlation is one of the fundamental operations of statistics, machine learning and

signal processing because it quantifies similarity. However, correlation only exploits second

order statistics of the random variables or random processes, which limits its optimality to

Gaussian distributed data. Correntropy was introduced in [29] as a generalized measure of

similarity. Its name stresses the connection to correlation, but also indicates the fact that

its mean value across time or dimensions is associated with entropy, more precisely to the

argument of the log in Renyi's quadratic entropy estimated with Parzen windows, which

is called the information potential. The information potential (IP) is the argument of
Renyi's quadratic entropy of a random variable X with PDF f_X(x),

    H_2(X) = -log int f_X^2(x) dx,    (5-1)

where IP(X) = int f_X^2(x) dx.

A nonparametric estimator of the information potential using the Parzen window from
N data samples is

    IP_hat(X) = (1/N^2) sum_{i=1}^{N} sum_{j=1}^{N} kappa_sigma(x_j - x_i),    (5-2)

where kappa_sigma is the Gaussian kernel in (5-4) [41].

This relation to entropy shows that the correntropy contains information beyond

second order moments, and can therefore generalize correlation without requiring moment

expansions.

Definition: Cross correntropy, or simply correntropy, is a generalized similarity
measure between two arbitrary vector random variables X and Y, defined as

    V(X, Y) = E[kappa_sigma(X - Y)],    (5-3)









where E is the mathematical expectation and kappa_sigma is the Gaussian kernel given by

    kappa_sigma(x - y) = (1 / (sqrt(2 pi) sigma)) exp( -(x - y)^2 / (2 sigma^2) ),    (5-4)

where sigma is the kernel size or bandwidth.

In practice, given a finite number of data samples {(x_i, y_i)}_{i=1}^{d}, the cross
correntropy is estimated by

    V_hat(X, Y) = (1/d) sum_{i=1}^{d} kappa_sigma(x_i - y_i).    (5-5)
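The estimator (5-5) is a one-line computation. The following minimal sketch (our own
names; we assume paired samples and the Gaussian kernel (5-4)) may help fix the
notation: for identical signals the estimate equals kappa_sigma(0).

```python
import numpy as np

def gaussian_kernel(u, sigma):
    """Gaussian kernel of (5-4), normalized by 1/(sqrt(2*pi)*sigma)."""
    return np.exp(-u ** 2 / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)

def correntropy(x, y, sigma=1.0):
    """Sample correntropy estimator (5-5): V(X, Y) = E[kappa_sigma(X - Y)]."""
    return np.mean(gaussian_kernel(x - y, sigma))

rng = np.random.default_rng(3)
x = rng.normal(size=1000)
print(correntropy(x, x))          # identical signals: V equals kappa_sigma(0)
print(gaussian_kernel(0.0, 1.0))  # = 1/sqrt(2*pi), approximately 0.3989
```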

5.1.2 Some Properties

Correntropy has very nice properties that make it useful for machine learning and
nonlinear signal processing. First and foremost, it is a positive definite function, and it
therefore defines an RKHS; but unlike the RKHS defined by the covariance function of
the random variable (process), it contains higher order statistical information. This new
function quantifies the average angular separation in the kernel feature space between
the dimensions of the random variable (or between temporal lags of the random
process). Therefore, correntropy can serve as the metric for similarity measurements in
feature space. Several properties of correntropy and their proofs are presented in [29]
[42] [43]. Here we present, without proofs, only the properties that are relevant to this
dissertation.

Property 1: Correntropy is a similarity measure between X and Y incorporating
higher order moments of the random variable X - Y [29].

Applying the Taylor series expansion to the Gaussian kernel, we can rewrite the
correntropy function in (5-3) as

    V(X, Y) = (1 / (sqrt(2 pi) sigma)) sum_{k=0}^{infinity} ((-1)^k / (2^k k!)) E[ (X - Y)^{2k} / sigma^{2k} ],    (5-6)

which contains all the even-order moments of the random variable X - Y. The kernel
size controls the emphasis of the higher order moments with respect to the second,
since the higher order terms of the expansion decay faster for larger sigma. As sigma
increases, the high-order moments decay and the second order moment tends to
dominate. In fact, for a kernel size larger than 10 times the one chosen from density
estimation considerations (e.g., Silverman's rule [44]), correntropy starts to approach
correlation. The kernel size has to be chosen according to the application, but here this
issue will not be further addressed and Silverman's rule will be used by default.
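This convergence toward correlation can be checked numerically. The scaling below is
our own derivation from truncating (5-6) at the first term (not a formula stated in the
text): for large sigma, sqrt(2 pi) sigma * 2 sigma^2 * (kappa_sigma(0) - V(X, Y))
approaches the second-order moment E[(X - Y)^2].

```python
import numpy as np

def correntropy(x, y, sigma):
    u = x - y
    return np.mean(np.exp(-u**2 / (2*sigma**2))) / (np.sqrt(2*np.pi) * sigma)

rng = np.random.default_rng(4)
x = rng.normal(size=5000)
y = rng.normal(size=5000)
second_moment = np.mean((x - y) ** 2)

# As sigma grows, the rescaled correntropy deficit approaches E[(X - Y)^2].
for sigma in (1.0, 10.0, 100.0):
    k0 = 1.0 / (np.sqrt(2*np.pi) * sigma)
    approx = np.sqrt(2*np.pi) * sigma * 2 * sigma**2 * (k0 - correntropy(x, y, sigma))
    print(sigma, approx, second_moment)
```

At sigma = 100 the two quantities agree to a few parts in ten thousand, consistent
with the O(1/sigma^2) truncation error of the expansion.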

Property 2: Let {x_i, i in T} be a random vector (process) with T being an index set.
The auto-correntropy function of the random vector (process), V(i, j) =
E[kappa_sigma(x_i - x_j)], is a symmetric and positive definite function; therefore it
defines a new RKHS, called the VRKHS [29].

Since kappa_sigma(x_i - x_j) is symmetric, it is obvious that V(i, j) is also symmetric.
Also, since kappa_sigma is positive definite, for any set of n points {x_1, ..., x_n} and
real numbers {a_1, ..., a_n}, not all zero, we have

    sum_{i=1}^{n} sum_{j=1}^{n} a_i a_j kappa_sigma(x_i - x_j) >= 0.    (5-7)

It is true that for any non-negative function g(., .) of two random variables x and y,
E[g(x, y)] >= 0. Thus we have

    sum_{i=1}^{n} sum_{j=1}^{n} a_i a_j E[kappa_sigma(x_i - x_j)] = sum_{i=1}^{n} sum_{j=1}^{n} a_i a_j V(i, j) >= 0.    (5-8)

Thus V(i, j) is both symmetric and positive definite. Now, the Moore-Aronszajn
theorem [35] proves that for every real symmetric positive definite function k, there
exists a unique RKHS with k as its reproducing kernel. Hence V(i, j) is a reproducing
kernel.

As shown in property 1, VRKHS contains higher order statistical information, unlike

the RKHS defined by the covariance function of random processes.

Property 3: Assume the samples {(x_i, y_i)} are drawn from the joint PDF
f_{XY}(x, y), and f_hat_{XY,sigma}(x, y) is its Parzen estimator with kernel size sigma.
The correntropy


























Figure 5-1. Contours of CIM(X, 0) in 2D sample space (kernel size is set to 1)

estimator with kernel size sigma' = sqrt(2) sigma is the integral of the Parzen estimate
f_hat_{XY,sigma'}(x, y) along the line x = y [42],

    V_hat_sigma(X, Y) = int_{-infinity}^{+infinity} f_hat_{XY, sqrt(2) sigma}(u, u) du.    (5-9)

Property 4: Correntropy, as a sample estimator, induces a metric in the sample
space. Given two vectors X = [x_1, x_2, ..., x_N]^T and Y = [y_1, y_2, ..., y_N]^T in the
sample space, the function CIM(X, Y) = (kappa_sigma(0) - V(X, Y))^{1/2}, where
kappa_sigma is the Gaussian kernel in (5-4) with kappa_sigma(0) = 1/(sqrt(2 pi) sigma),
defines a metric in the sample space and is named the Correntropy Induced Metric
(CIM) [42]. Therefore, correntropy can be the metric for similarity measurement in
feature space.

Figure 5-1 shows the contours of distance from X to the origin in a two dimensional

space. The interesting observation from the figure is as follows: when X is close to zero,









CIM behaves like an L2 norm (1), which is clear from the Taylor expansion in (5-6);
further out, CIM behaves like an L1 norm; eventually, as X departs from the origin,
the metric saturates and becomes insensitive to distance (approaching an L0 norm (2)).
Larger differences saturate, so the metric is less sensitive to large deviations, which
makes it more robust. This property inspired us to investigate the inherent robustness
of CIM to outliers. The kernel size controls this very interesting behavior of the metric
across neighborhoods. A small kernel size leads to a tight linear (Euclidean) region and
to a large L0 region, while a larger kernel size will enlarge the linear region. In this
dissertation, we mathematically prove that 1) when the kernel size goes to infinity, the
CIM norm is equivalent to the L2 norm, and 2) when the kernel size goes to zero (from
the positive side), the CIM is equivalent to the L0 norm.

Let us define E = X - Y = [e_1, e_2, ..., e_N]^T; then

    CIM(E) = [ kappa_sigma(0) - (1/N) sum_{i=1}^{N} kappa_sigma(e_i) ]^{1/2}
           = { (1 / (sqrt(2 pi) sigma N)) sum_{i=1}^{N} (1 - exp(-e_i^2 / (2 sigma^2))) }^{1/2}.    (5-10)

First, let us take a look at the following limit:

    lim_{sigma -> infinity} 2 sigma^2 (1 - exp(-e_i^2 / (2 sigma^2)))
      = lim_{t -> 0} (1 - exp(-t e_i^2)) / t        (t := 1 / (2 sigma^2))
      = lim_{t -> 0} e_i^2 exp(-t e_i^2)            (L'Hopital)
      = e_i^2.    (5-11)

1 Given a vector X, the Lp norm of X is defined by ||X||_p = (sum_{i=1}^{N} |x_i|^p)^{1/p},
where p is a real number with p >= 1.

2 The L0 norm of X is defined as lim_{p -> 0} ||X||_p^p; that is, the zero norm of X is
simply the number of non-zero elements of X. Despite its name, the zero norm is not a
true norm; in particular, it is not positively homogeneous.









Therefore,

    lim_{sigma -> infinity} (2 sqrt(2 pi) N sigma^3)^{1/2} CIM(E) = ||E||_2.    (5-12)

Second, look at the following limit:

    lim_{sigma -> 0+} (1 - exp(-e_i^2 / (2 sigma^2))) = { 0 if e_i = 0;  1 if e_i != 0 }.    (5-13)

Therefore,

    lim_{sigma -> 0+} sqrt(2 pi) N sigma [CIM(E)]^2 = ||E||_0.    (5-14)
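The two limiting regimes (5-12) and (5-14) can be observed numerically. The sketch
below is our own illustration (names are ours): for a vector with ||e||_2 = 5 and
||e||_0 = 2, a very large kernel recovers the L2 norm and a very small kernel recovers
the L0 norm under the stated scalings.

```python
import numpy as np

def cim(e, sigma):
    """Correntropy Induced Metric (5-10) for an error vector e."""
    k0 = 1.0 / (np.sqrt(2.0 * np.pi) * sigma)
    v = np.mean(np.exp(-e**2 / (2.0 * sigma**2))) * k0
    return np.sqrt(k0 - v)

e = np.array([3.0, -4.0, 0.0, 0.0])          # ||e||_2 = 5, ||e||_0 = 2
N = len(e)

sigma_large = 1e3                             # large kernel: scaled CIM ~ L2 norm
l2_approx = (2*np.sqrt(2*np.pi) * N * sigma_large**3) ** 0.5 * cim(e, sigma_large)
print(l2_approx)                              # ~ 5.0

sigma_small = 1e-3                            # small kernel: scaled CIM^2 ~ L0 norm
l0_approx = np.sqrt(2*np.pi) * N * sigma_small * cim(e, sigma_small) ** 2
print(l0_approx)                              # ~ 2.0
```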

Property 5: Given data samples {x_i}_{i=1}^{d}, the correntropy kernel creates another
data set {f(x_i)}_{i=1}^{d} preserving the similarity measure as

    V(i, j) = E[kappa_sigma(x_i - x_j)] = E[f(x_i) f(x_j)].    (5-15)

The proof of property 5 is in Appendix B.

According to property 5, there exists a scalar nonlinear mapping f which makes the
correntropy of x_i the correlation of f(x_i). Equation (5-15) allows the computation of
the correlation in feature space by the correntropy function in the input space [45] [46].

5.2 The Correntropy MACE Filter

According to the RKHS perspective of the MACE filter in chapter 4, we can extend it
immediately to the VRKHS (3). Applying the correntropy concept to the MACE
formulation of chapter 4, the definition of the correlation in (4-15) shall be substituted
by

    V(i, j) = (1 / (2d - 1)) sum_{n=1}^{2d-1} kappa_sigma(U_{in} - U_{jn}),   i, j = 1, ..., d,    (5-16)

where U_{in} denotes the nth element of the ith column of the matrix in (4-13). This
function is positive definite and thus induces the VRKHS. According to Mercer's
theorem [35], there is a basis {eta_i, i = 1, ..., d} in this VRKHS such that

    < eta_i, eta_j > = V(i, j),   i, j = 1, ..., d.    (5-17)

Since it is a d dimensional Hilbert space, it is isomorphic to any d dimensional real
vector space equipped with the standard inner product structure. After an appropriate
choice of this isomorphism {eta_i, i = 1, ..., d}, which is nonlinearly related to the input
space, a nonlinear extension of the MACE filter can be readily constructed on this
VRKHS, namely, finding a vector v_o = sum_{i=1}^{d} fh_i eta_i with coordinates
f_h = [fh_1 ... fh_d]^T such that v_o^T v_o is minimized subject to the constraint that
the dth component of v_o is some pre-specified constant.

3 In this dissertation, we call the RKHS induced by correntropy the VRKHS.

Let the ith image vector be x_i = [x_i(1) x_i(2) ... x_i(d)]^T and the filter be h =
[h(1) h(2) ... h(d)]^T, where T denotes transpose. From property 5, the CMACE filter
can be formulated in feature space by applying a nonlinear mapping function f onto the
data as well as the filter. We denote the transformed training image matrix and filter
vector, whose sizes are d x N and d x 1, respectively, by

    F_X = [f_{x_1}, f_{x_2}, ..., f_{x_N}],    (5-18)

    f_h = [f(h(1)) f(h(2)) ... f(h(d))]^T,    (5-19)

where f_{x_i} = [f(x_i(1)) f(x_i(2)) ... f(x_i(d))]^T for i = 1, ..., N. Given data
samples, the cross correntropy between the ith training image vector and the filter can
be estimated as

    v_i[m] = (1/d) sum_{n=1}^{d} f(h(n)) f(x_i(n - m)),    (5-20)

for all the lags m = -d + 1, ..., d - 1. Then the cross correntropy vector v_oi can be
formed including all the lags of v_i[m], denoted by

    v_oi = S_i f_h,    (5-21)

where S_i is the matrix of size (2d - 1) x d given by

          | f(x_i(d))    0            ...          0           |
          | f(x_i(d-1))  f(x_i(d))    0            ...         |
          | ...          ...          ...          ...         |
    S_i = | f(x_i(1))    f(x_i(2))    ...          f(x_i(d))   |    (5-22)
          | 0            f(x_i(1))    f(x_i(2))    ...         |
          | ...          ...          ...          ...         |
          | 0            0            ...          f(x_i(1))   |

Since the scale factor 1/d has no influence on the solution, it will be ignored throughout
the dissertation. The correntropy energy of the ith image is given by

    E_i = v_oi^T v_oi = f_h^T S_i^T S_i f_h.    (5-23)

Denoting V_i = S_i^T S_i and using the definition of correntropy in (5-15), the d x d
correntropy matrix V_i is

          | v(0)    v(1)   ...   v(d-1) |
    V_i = | v(1)    v(0)   ...   v(d-2) |    (5-24)
          | ...     ...    ...   ...    |
          | v(d-1)  ...    v(1)  v(0)   |

where each element of the matrix is computed without explicit knowledge of the
mapping function f by

    v(l) = sum_{n=1}^{d} kappa_sigma(x_i(n) - x_i(n + l)),    (5-25)

for l = 0, ..., d - 1. The average correntropy energy over all the training data can be
written as

    E_av = sum_{i=1}^{N} E_i = f_h^T V_X f_h,    (5-26)

where

    V_X = sum_{i=1}^{N} V_i.    (5-27)
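To make (5-24)-(5-25) concrete, here is a small numpy sketch (our own illustration;
the names are ours). It builds the Toeplitz correntropy matrix for one image vector
directly from the input samples, with each lag summed over the valid overlap, so the
mapping f never appears explicitly.

```python
import numpy as np

def gaussian_kernel(u, sigma):
    return np.exp(-u**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def correntropy_matrix(x, sigma=1.0):
    """Toeplitz correntropy matrix V_i of (5-24), elements v(l) from (5-25).

    The lag sum runs over the valid overlap n = 1, ..., d - l."""
    d = len(x)
    v = np.array([gaussian_kernel(x[:d - l] - x[l:], sigma).sum() for l in range(d)])
    idx = np.abs(np.arange(d)[:, None] - np.arange(d)[None, :])
    return v[idx]                       # entry (i, j) equals v(|i - j|)

rng = np.random.default_rng(5)
x = rng.normal(size=32)
V = correntropy_matrix(x)
print(V.shape, np.allclose(V, V.T))     # (32, 32) True
```

Summing such matrices over the N training images gives V_X of (5-27).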
Since our objective is to minimize the average correntropy energy in feature space, the
optimization problem is formulated as

    min_{f_h} f_h^T V_X f_h   subject to   F_X^T f_h = c,    (5-28)

where c is the desired vector for all the training images. The constraint in (5-28) means
that we specify the correntropy values between the training inputs and the filter as the
desired constants. Since the correntropy matrix V_X is positive definite, there exists an
analytic solution to the optimization problem using the method of Lagrange multipliers
in the new finite dimensional VRKHS. The CMACE filter in feature space then becomes

    f_h = V_X^{-1} F_X (F_X^T V_X^{-1} F_X)^{-1} c.    (5-29)

Unlike the KSDF in (3-11), which has an infinite dimensionality in the RKHS induced
by the conventional kernel method, the CMACE filter is defined in the finite
dimensional VRKHS, which has the same dimensionality as the input space, with size
d x 1. In general, the kernel method creates an infinite dimensional feature space, so
the solution often needs regularization to remain bounded. Therefore, the KSDF may
need additional regularization terms for better performance. Regarding the
computational complexity of the CMACE compared to the KSDF, an additional O(d^3)
operation and O(d^2) storage for V_X^{-1} are needed. This motivates a fast version of
the CMACE as well as a dimensionality reduction method for practical applications.









5.3 Implications of the CMACE Filter in the VRKHS

5.3.1 Implication of Nonlinearity

From (5-17) and (5-22), we can see that the RKHS induced by correntropy (VRKHS)
is a Hilbert space spanned by the basis {eta_i}_{i=1}^{d} of size (2d - 1) x 1 given by

    eta_i = [0, ..., 0, f(x(d)), ..., f(x(1)), 0, ..., 0]^T,    (5-30)

where f(.) is a nonlinear scalar function and f(x(d)) is located in the ith element. It
is obvious that, unlike the RKHS induced by correlation, the VRKHS for the CMACE
filter is nonlinearly related to the original input space. This statement can be simply
exemplified by the CIM metric. Consider the vector x = [x_1, 0, ..., 0]^T and the origin
of the input space, y = [0, ..., 0]^T. Then the Euclidean distance in the input space is
given by ED(x, y) = |x_1|, and the distance in the VRKHS becomes CIM(x, y) =
((kappa_sigma(0) - kappa_sigma(x_1)) / N)^{1/2}, where N is the dimension of the
vectors. Now, scaling both vectors by alpha, we obtain the new vectors x' = alpha x
and y' = alpha y. The Euclidean distance between x' and y' becomes ED(x', y') =
alpha ED(x, y), whereas CIM(x', y') = ((kappa_sigma(0) - kappa_sigma(alpha x_1)) /
N)^{1/2} != alpha CIM(x, y). Also, as alpha goes to infinity, the Euclidean distance in
the input space increases linearly. However, the CIM distance saturates. This shows
that distances in the VRKHS are not linearly related to distances in the input space.
This argument can also be observed directly from the CIM contour in Fig. 5-1.

In addition, since correntropy is different from correlation in the sense that it involves

high-order statistics of input signals, inner products in the RKHS induced by correntropy

are no longer equivalent to statistical inference on Gaussian processes. The transformation

from the input space to VRKHS is nonlinear and the inner product structure of VRKHS

provides the possibility of obtaining closed form optimal nonlinear filter solutions by

utilizing second and high-order statistics.









5.3.2 Finite Dimensional Feature Space

Another important difference compared with existing machine learning methods based
on the conventional kernel method, which normally yields an infinite dimensional
feature space, is that the VRKHS has the same dimension as the input space. In the
conventional MACE, the template h has d degrees of freedom and all the image data lie
in the d dimensional Euclidean space. As derived above, all the transformed images
belong to a different d dimensional vector space equipped with the inner product
structure defined by correntropy. The goal of this new algorithm is to find a template
f_h in this VRKHS such that the cost function is minimized subject to the constraint.
Therefore, the number of degrees of freedom of this optimization problem is still d, so
regularization, which would be needed in traditional kernel methods, is not necessary
here. Further work needs to be

done regarding this point, but we hypothesize that in our methodology, regularization

is automatically achieved by the kernel through the expected value operator (which

corresponds to a density matching step utilized to evaluate correntropy). The fixed

dimensionality also carries disadvantages because the user has no control of the VRKHS

dimensionality. Therefore, the quality of the nonlinear solution depends solely on the

nonlinear transformation between the input space and VRKHS. The theoretical advantage

of using this feature space is justified by the CIM metric, which is very suitable to

quantify similarity in feature spaces and should improve the robustness to outliers of the

conventional MACE.

5.3.3 The kernel correlation filter vs. The CMACE filter: Prewhitening in
Feature Space

One of the kernel methods applied to correlation filters is the kernel class-dependent
feature analysis (KCFA) [28]. The KCFA is the kernelized version of the linear MACE
filter using the kernel trick after a prewhitening preprocess. The correlation output of
the MACE filter h and an input image vector z can be expressed as

    y_mace = Z'^T X' (X'^T X')^{-1} c,    (5-31)

where Z' = D^{-1/2} Z and X' = D^{-1/2} X indicate the prewhitened versions of Z and
X in the frequency domain. Then (5-31) is equivalent to the linear SDF with
prewhitened data, and applying the kernel trick yields the KCFA as follows [27]:

    y_KCFA = K_zx K_xx^{-1} c,    (5-32)

where the (i, j)th elements of the matrices K_xx and K_zx are computed by

    (K_xx)_{ij} = sum_{k=1}^{d} kappa(x'_{ki} - x'_{kj}),   i, j = 1, 2, ..., N,    (5-33)

    (K_zx)_{ij} = sum_{k=1}^{d} kappa(z'_{ki} - x'_{kj}),   i = 1, 2, ..., L,  j = 1, 2, ..., N,    (5-34)

where N is the number of training images and L is the number of test input images.

In the CMACE, if we denote F'_X = V_X^{-1/2} F_X, we can decompose f_h as

    f_h = V_X^{-1/2} V_X^{-1/2} F_X (F_X^T V_X^{-1/2} V_X^{-1/2} F_X)^{-1} c
        = V_X^{-1/2} F'_X (F'_X^T F'_X)^{-1} c.    (5-35)

The main difference between the CMACE and the KCFA is the prewhitening process.
In the KCFA, prewhitening is conducted in the input space using D; on the other hand,
in the CMACE, (5-35) implies that the image is implicitly whitened in the feature
space by the correntropy matrix V_X. In the space domain MACE filter, the
autocorrelation matrix can be used as a preprocessor for prewhitening. Since the
CMACE filter uses the same formulation in the feature space, we can also expect that
the correntropy matrix can be used for prewhitening. However, in practice, we cannot
obtain the whitened data explicitly, since the mapping function is not explicitly known.
In addition, the solution of the KCFA is defined in an infinite dimensional feature
space like the KSDF; therefore an additional regularization term may be needed for
better performance.









CHAPTER 6
THE CORRENTROPY MACE IMPLEMENTATION

6.1 The Output of the CMACE Filter

Since the nonlinear mapping function f is not explicitly known, it is impossible to
directly use the CMACE filter f_h in the feature space. However, the correntropy
output can be obtained by the inner product between the transformed input image and
the CMACE filter in the VRKHS. In order to test this filter, let Z be the matrix of L
vector testing images and F_Z be the transformed matrix of Z; then the L x 1 output
vector is given by

    y = F_Z^T V_X^{-1} F_X (F_X^T V_X^{-1} F_X)^{-1} c.    (6-1)

Here, we denote T_zx = F_Z^T V_X^{-1} F_X and T_xx = F_X^T V_X^{-1} F_X. Then
the output becomes

    y = T_zx (T_xx)^{-1} c,    (6-2)

where T_xx is an N x N symmetric matrix and T_zx is an L x N matrix, whose (i, j)th
elements are expressed by

    (T_xx)_{ij} = sum_{l=1}^{d} sum_{k=1}^{d} w_{lk} f(x_i(k)) f(x_j(l))
                ~= sum_{l=1}^{d} sum_{k=1}^{d} w_{lk} kappa_sigma(x_i(k) - x_j(l)),   i, j = 1, ..., N,    (6-3)

    (T_zx)_{ij} = sum_{l=1}^{d} sum_{k=1}^{d} w_{lk} f(z_i(k)) f(x_j(l))
                ~= sum_{l=1}^{d} sum_{k=1}^{d} w_{lk} kappa_sigma(z_i(k) - x_j(l)),   i = 1, ..., L,  j = 1, ..., N,    (6-4)

where w_{lk} is the (l, k)th element of V_X^{-1}.

The final output expressions in (6-3) and (6-4) are obtained by approximating
f(x_i(k)) f(x_j(l)) and f(z_i(k)) f(x_j(l)) by kappa_sigma(x_i(k) - x_j(l)) and
kappa_sigma(z_i(k) - x_j(l)), respectively, which is similar to the kernel trick and holds
on average because of property 5. Unfortunately, (6-3) and (6-4) involve weighted
versions of the functionals; therefore the error in the approximation requires further
theoretical investigation.

The CMACE is formulated in the linear VRKHS but has a nonlinear behavior, since
the VRKHS is nonlinearly related to the input space. However, the CMACE preserves
the shift-invariant property of the linear MACE. The proof of the shift-invariant
property is given in Appendix C. Although the output of the CMACE gives us only one
value, it is possible to construct the whole output plane by shifting the test input image
and, as a result, the shift invariance property of the correlation filters can be utilized at
the expense of more computation. Applying an appropriate threshold to the output of
(6-1), one can detect and recognize the testing data without generating the composite
filter in feature space. As will be shown in the simulation results section, even with this
approximation, the CMACE outperforms the conventional MACE.
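An end-to-end sketch of the test output (6-1)-(6-4) on tiny synthetic one-dimensional
"images" may clarify the computation. This is our own illustration (all names and the
synthetic data are assumptions): V_X is built per (5-24)-(5-27), the T matrices use the
kernel substitution of property 5, and reusing training images as test inputs must
reproduce the constrained values in c exactly.

```python
import numpy as np

def kappa(u, sigma=1.0):
    return np.exp(-u**2 / (2*sigma**2)) / (np.sqrt(2*np.pi) * sigma)

def correntropy_matrix(x, sigma=1.0):
    """Toeplitz correntropy matrix of (5-24)-(5-25), valid-overlap lag sums."""
    d = len(x)
    v = np.array([kappa(x[:d - l] - x[l:], sigma).sum() for l in range(d)])
    return v[np.abs(np.arange(d)[:, None] - np.arange(d)[None, :])]

rng = np.random.default_rng(6)
d, N, L = 16, 4, 2
X = rng.normal(size=(d, N))      # training images as columns
Z = X[:, :L]                     # reuse two training images as test inputs

Vx = sum(correntropy_matrix(X[:, i]) for i in range(N))
W = np.linalg.inv(Vx)            # weights w_{lk} of (6-3)-(6-4)

def T(A, B):
    """Matrix with (i, j)th element sum_{l,k} w_{lk} kappa(a_i(k) - b_j(l))."""
    out = np.zeros((A.shape[1], B.shape[1]))
    for i in range(A.shape[1]):
        for j in range(B.shape[1]):
            out[i, j] = np.sum(W * kappa(A[:, i][None, :] - B[:, j][:, None]))
    return out

c = np.ones(N)
y = T(Z, X) @ np.linalg.solve(T(X, X), c)   # test outputs, eq. (6-2)
print(np.allclose(y, 1.0))                  # True: training inputs meet the constraint
```

Replacing Z with shifted or distorted versions of the training images probes the
generalization and rejection behavior discussed in the text.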

6.2 Centering of the CMACE in Feature Space

With the Gaussian kernel, the correntropy value is always positive, which brings the
need to subtract the mean of the transformed data in feature space in order to suppress
the effect of the output DC bias. This centering of the correntropy should not be
confused with the spatial centering of the input images.

Given d data samples {x(i)}_{i=1}^{d}, let us denote the mean of the transformed data
in feature space as E[f(x(i))] = m_f; then the centered correntropy, which can properly
be called the generalized covariance function, is given by

    V_c(i, j) = E[{f(x(i)) - m_f}{f(x(j)) - m_f}]
              = E[f(x(i)) f(x(j))] - m_f^2
              = V(i, j) - m_f^2.    (6-5)

The square of the mean of the transformed data f(.) coincides with the estimate of the
information potential of the original data, that is,

    m_f^2 = (1/d^2) sum_{i=1}^{d} sum_{j=1}^{d} kappa_sigma(x(i) - x(j)).    (6-6)

In order to show the validity of (6-6), let us consider the sample estimation of
correntropy (ignoring the scale factor 1/d):

    sum_{i=1}^{d} f(x(i)) f(x(i + t)) ~= sum_{i=1}^{d} kappa_sigma(x(i) - x(i + t)).    (6-7)

We arrange the double summation (6-6) as an array and sum along the diagonal
direction, which yields exactly the autocorrelation function of the transformed data at
different lags; thus, in terms of the correntropy function of the input data at different
lags, we can write

    sum_{i=1}^{d} sum_{j=1}^{d} f(x(i)) f(x(j))
        = sum_{t=0}^{d-1} sum_{i=1}^{d-t} f(x(i)) f(x(i + t)) + sum_{t=1}^{d-1} sum_{i=1+t}^{d} f(x(i)) f(x(i - t))
        ~= sum_{t=0}^{d-1} sum_{i=1}^{d-t} kappa_sigma(x(i) - x(i + t)) + sum_{t=1}^{d-1} sum_{i=1+t}^{d} kappa_sigma(x(i) - x(i - t))
        = sum_{i=1}^{d} sum_{j=1}^{d} kappa_sigma(x(i) - x(j)),    (6-8)

so that, after dividing by d^2, (6-6) follows. As we see in (6-8), when the summation is
far from the main diagonal, smaller and smaller data sizes are involved, which leads to
a poor approximation. Notice that this is exactly the same problem as when the
autocorrelation function is estimated from windowed data. However, when d is large,
the approximation improves. Therefore, in the CMACE output equation (6-1), we can
use the centered correntropy matrix V_Xc, obtained by subtracting the information
potential from the correntropy matrix V_X, as

    V_Xc = V_X - m^2_av 1_{d x d},    (6-9)

where m^2_av is the average estimated information potential over the N training
images and 1_{d x d} is a d x d matrix with all entries equal to 1. Using the centered
correntropy matrix V_Xc, a better rejection ability for out-of-class images is achieved,
since the offset of the output can be removed except for the center value in the case of
training images.
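The centering step is a rank-one correction. The sketch below is our own illustration
of (6-6) and (6-9) (the identity matrix stand-in for V_X and all names are
assumptions): the information potential is estimated per training image, averaged, and
subtracted as a constant offset.

```python
import numpy as np

def kappa(u, sigma=1.0):
    return np.exp(-u**2 / (2*sigma**2)) / (np.sqrt(2*np.pi) * sigma)

def information_potential(x, sigma=1.0):
    """IP estimate (6-6): the mean of the full d x d kernel matrix."""
    return kappa(x[:, None] - x[None, :], sigma).mean()

def center(Vx, X, sigma=1.0):
    """Centered correntropy matrix (6-9), averaging the IP over training images."""
    d, N = X.shape
    m2_av = np.mean([information_potential(X[:, i], sigma) for i in range(N)])
    return Vx - m2_av * np.ones((d, d))

rng = np.random.default_rng(7)
X = rng.normal(size=(16, 3))
Vx = np.eye(16)                  # stand-in for the correntropy matrix of (5-27)
Vxc = center(Vx, X)
print(information_potential(X[:, 0]) > 0)   # True: the IP is always positive
```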

6.3 The Fast CMACE Filter

In practice, the drawback of the proposed CMACE filter is its computational
complexity. In the MACE, the correlation output can be obtained by multiplication in
the frequency domain, and the computation time can be drastically reduced by the
FFT. In the CMACE, however, the output of the CMACE filter is obtained by
computing the product of the two matrices in (6-3) and (6-4), which depends on the
image size and the number of training images. Each element involves a double
summation of weighted kernel functions. Therefore, each element of the matrix
requires O(d^2) computations, where d is the number of image pixels. When the
number of training images is N, the total computational complexity for one test output
is O(Nd^2 + N^2). A similar argument shows that the computation needed for training
is O(d^2(N^2 + 1) + N^2). On the other hand, the MACE only requires
O(4(d(2N^2 + N + 2) + N^2) + Nd log2(d)) for training and O(4d + d log2(d)) for
testing one input image. Table 6-1 shows the computational complexity of the MACE
and CMACE. More details about the required computation costs are given in
Appendix D. Constructing the whole output plane would significantly increase the
computational complexity of the CMACE. This quickly becomes too demanding in
practical settings. Therefore, a method to simplify the computation is necessary for
practical implementations.

Here the Fast Gauss Transform (FGT) [47] is proposed to reduce the computation
time with a very small approximation error. The FGT belongs to a family of very
interesting and important fast evaluation algorithms that have been developed over the
past decades to enable the rapid calculation of weighted sums of Gaussian functions
with arbitrary accuracy. In nonparametric probability density estimation with the
Gaussian kernel, the FGT can reduce the complexity of O(dM) to O(d + M) for M
evaluations with d sources.









6.3.1 The Fast Gauss Transform

In many problems in mathematics and engineering, the function of interest can be
decomposed into sums of pairwise interactions among a set of sources. In particular,
this type of problem is found in nonparametric probability density estimation as

    G(z) = sum_{j=1}^{d} q_j kappa(z - x(j)),    (6-10)

where kappa is a kernel function centered at the source points x(j) and the q_j are
scalar weighting coefficients. With the Gaussian kernel, (6-10) can be interpreted as a
"Gaussian potential field" due to sources of strengths q_j at the points x(j), evaluated
at the target point z. Suppose that we have M evaluation target points; then the
computation of (6-10) requires O(dM) calculations, which constrains the computation
bandwidth for large data sets d and M in real world applications. The Fast Gauss
Transform reduces this complexity to O(d + M). Its basic idea is to cluster the sources
and target points using appropriate data structures and the Hermite expansion, and
then reduce the number of summations for a given level of precision.

6.3.2 The Fast Correntropy MACE Filter

The major part of the computation burden in the correntropy MACE filter is given by

d d
T -((i) -(j))2/>2 (611)
i=1 j 1

This is very similar to the density estimation problem that evaluates at d targets z(i) with

given d source samples x(j). However, the weighting factor ,, in (6-11) are dependent

on both target and source, which is different from the original FGT applications, where

the weight vector is alv--,v- the same at every evaluation target points. In our case, the

weight vector wi = [i ,, iT is varying on every evaluation point z(i). We can -,









that (6-11) is a more general expression than the original FGT formulation and it can be

written as
d
T = GC(z), (6-12)
i=1
where
d
G(z) (z)x)). (6-13)
j=1
This means that clustering and the Hermite expansion should be performed at every

target z(i) with a different weight vector wi, which causes an extra computation for

clustering. However, since the sources are clustered in the FGT, if one expresses the

clustered sources about their centers into the Hermite expansion, then there is no need to do

clustering and the Hermite expansion at every evaluation. The only thing that is necessary

is to use different weight vectors at every evaluation point. This process does not require

additional complexity compared to the original FGT formulation except that more storage

is required to keep the weight vectors. By using the Hermite expansion around the target

s, the Gaussian centered at x(j) evaluated at z(i) can be obtained by


\exp\{-(z(i) - x(j))^2 / 2\sigma^2\} = \sum_{n=0}^{p-1} \frac{1}{n!} \left(\frac{x(j) - s}{\sqrt{2}\sigma}\right)^n h_n\!\left(\frac{z(i) - s}{\sqrt{2}\sigma}\right) + \epsilon(p),   (6-14)

where the Hermite function h_n(x) is defined by

h_n(x) = (-1)^n \frac{d^n}{dx^n} \exp(-x^2).   (6-15)
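The expansion (6-14) can be checked numerically. The sketch below builds h_n(x) from the standard recurrence h_{n+1}(x) = 2x h_n(x) - 2n h_{n-1}(x) and truncates the series at order p; it is an illustration of the identity, not the dissertation's implementation.

```python
import numpy as np
from math import factorial

def hermite_h(n, x):
    """h_n(x) = (-1)^n d^n/dx^n exp(-x^2), computed via the recurrence
    h_{n+1}(x) = 2x h_n(x) - 2n h_{n-1}(x)."""
    h = [np.exp(-x**2), 2.0 * x * np.exp(-x**2)]
    for k in range(1, n):
        h.append(2.0 * x * h[k] - 2.0 * k * h[k - 1])
    return h[n]

def gaussian_via_hermite(z, x, s, sigma, p):
    """Order-p truncation of (6-14): expansion of exp(-(z-x)^2/(2 sigma^2))
    about the expansion center s."""
    a = (x - s) / (np.sqrt(2.0) * sigma)
    b = (z - s) / (np.sqrt(2.0) * sigma)
    return sum(a**n / factorial(n) * hermite_h(n, b) for n in range(p))

exact = np.exp(-(1.3 - 1.0) ** 2 / (2 * 0.5**2))
approx = gaussian_via_hermite(1.3, 1.0, 1.1, 0.5, p=10)  # center s = 1.1
```

When the source lies close to the expansion center, a modest order p already reproduces the Gaussian to high accuracy, which is what makes the truncation in (6-14) practical.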


Also, in this research, we use a simple greedy algorithm for clustering [48], which computes

a data partition with a maximum radius at most twice the optimum. This clustering

method and the Hermite expansion with order p require O(pd). In the case of (6-3) and

(6-4), since the number of sources and targets are the same, they can be interchanged,

that is, the test image can be the source so that the clustering and Hermite expansion can









Table 6-1. Estimated computational complexity for training with N images and testing
with one image. Matrix inversion and multiplication are considered
(in this simulation, d = 4096, N = 60, p = 4, k_c = 4).
             Training (Off-line)                             Testing (On-line)
MACE         O(4(d(2N^2 + N + 2) + N^2) + Nd log(d))         O(4d + d log2(d))
CMACE        O(d^2(N^2 + 1) + N^2)                           O(d^2 N + N^2)
Fast CMACE   O(N^2 pd(k_c + 1) + d^2 + N^2)                  O(pd(k_c + 1)N + N^2)


be done only one time per test. Thus T in (6-11) can be approximated by
T \approx \sum_{i=1}^{d} \sum_{B} \sum_{n=0}^{p-1} \frac{1}{n!} h_n\!\left(\frac{x(i) - s_B}{\sqrt{2}\sigma}\right) C_n(B),   (6-16)

where B represents a cluster with a center s_B and C_n(B) is given by

C_n(B) = \sum_{z(j), w_j \in B} w_j \left(\frac{z(j) - s_B}{\sqrt{2}\sigma}\right)^n.   (6-17)

From (6-16), we can see that evaluation at kc expansions at all the evaluation points

costs O(pkcd), so the total number of operations is O(pd(kc + 1)) per computation of

each element in (6-3) and (6-4). The final aim is to obtain the output of the CMACE

filter with N training images and L test images. In order to compute the output of one

test image, the original direct method requires O(d2N(N + 1)) operations to obtain Txx

and Tzx, and we can reduce the operation count to O(pd(kc + 1)N(N + 1)) by

applying this enhanced FGT. Typically p and kc are around 4 while d and N are 4,096

and around 100 respectively in our application, which results in a computational savings

of roughly 100 times. Additionally, clustering with the test image is performed only once

per test which reduces the computation time even more. However, from Table 6-1,

we see that the computational complexity of the CMACE for testing still depends on

the number of training images, resulting in more computations than the MACE. More

work is necessary to reduce even further the computation time of the CMACE and its

memory storage requirements, but the proposed approach enables practical applications

with present day computers.
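The quoted savings can be checked with a back-of-envelope count using the experimental values (d = 4096, N = 60, p = 4, k_c = 4); the asymptotic ratio d/(p(k_c+1)) is about 200, while the measured wall-clock speedup (roughly 100 times) is of the same order once constant factors are included.

```python
# Back-of-envelope comparison of the two operation counts quoted above.
d, N, p, kc = 4096, 60, 4, 4

direct_ops = d**2 * N * (N + 1)            # O(d^2 N(N+1)): direct CMACE output
fgt_ops = p * d * (kc + 1) * N * (N + 1)   # O(p d (k_c + 1) N(N+1)): with the FGT

speedup = direct_ops / fgt_ops             # reduces to d / (p (k_c + 1))
```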









CHAPTER 7
APPLICATIONS OF THE CMACE TO
IMAGE RECOGNITION

7.1 Face Recognition

7.1.1 Problem Description

In this section, we show the performance of the proposed correntropy MACE filter for

face image recognition. In the simulations, we used the same facial expression database

used in Chapter 3. We used only 5 images to compose the template (filter) per person

(the MACE filter shows reasonable recognition results with a small number of training

images in this database [5]). We picked and report the results of the two most difficult

cases, which produced the worst performance with the conventional MACE method. We

test with all the images of each person's data set resulting in 75 outputs for each class.

The simulation results have been obtained by averaging (Monte-Carlo approach) over 100

different training sets (each training set consists of randomly chosen 5 images) to minimize

the problem of performance differences due to splitting the relatively small database

in training and testing sets. The kernel size, σ, is chosen to be 10 for the correntropy

matrix during training and 30 for the test output. In this data set, it has been observed that

a kernel size around 30-50% of the standard deviation of the input data would be

appropriate. Moreover, we can control the performance by choosing a different kernel size

during training for prewhitening.
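The Monte-Carlo protocol above (100 random 5-image training sets drawn from each 75-image class) can be sketched as follows; the sampling code is illustrative, not the original experiment scripts.

```python
import numpy as np

rng = np.random.default_rng(0)
n_images, n_train, n_trials = 75, 5, 100

# 100 random training sets of 5 distinct image indices each, as in the
# Monte-Carlo averaging described above.
training_sets = [rng.choice(n_images, size=n_train, replace=False)
                 for _ in range(n_trials)]
```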

7.1.2 Simulation Results

Figure 7-1 shows the average test output peak values for image recognition. The

desired output peak value should be close to one when the test image belongs to the

training image class (true class) and otherwise it should be close to zero. Figure 7-1 (Top)

shows that the correlation output peak values of the conventional MACE in false classes

are close to zero, which means that the MACE has a good rejection ability for the false class.

However, some outputs in the test image set, even in the true class, are not recognized as

the true class. Figure 7-1 (Bottom) shows the output values of the proposed correntropy























Figure 7-1. The averaged test output peak values (100 Monte-Carlo simulations with
N=5). (Top): MACE, (Bottom): CMACE.











Figure 7-2. The test output peak values with additive Gaussian noise (N=5). (Top):
MACE, circle: true class with SNR 10dB, cross: false class with SNR 2dB;
(Bottom): CMACE, circle: true class with SNR 10dB, cross: false class with
SNR 2dB.


[ROC curves: MACE (no noise), MACE (SNR:2dB), MACE (SNR:0dB), and the proposed method (no noise, SNR:2dB, 0dB); probability of detection vs. probability of false alarm]

Figure 7-3. The comparison of ROC curves with different SNRs.


MACE, and we can see that the generalization and rejection performance are improved.

As a result, the two images can be recognized well even with a small number of training

images. One of the problems of the conventional MACE is that the performance can be easily

degraded by additive noise in the test image since the MACE does not have any special

mechanism to consider input noise. Therefore, it has a poor rejecting ability for a false

class image when noise is added into a false class. Figure 7-2 (Top) shows the noise effect

on the conventional MACE. When the class images are seriously distorted by additive

Gaussian noise (SNR = 2dB), the correlation output peaks of some test images from the false

class become greater than those of the true class, and hence wrong recognition happens. The

results in Figure 7-2 (Bottom) are obtained by the proposed method. The correntropy

MACE shows a much better performance especially for rejecting even in a very low SNR

environment. Figure 7-3 shows the comparison of ROC curves with different SNRs. In

the conventional MACE, we can see that the false alarm rate is increased as additive

noise power is increased. However, in the proposed method, the probability of detection









Table 7-1. Comparison of standard deviations of all the Monte-Carlo simulation outputs
(100x75 outputs)
        True         False        True         False
        (No noise)   (No noise)   (SNR:0dB)    (SNR:0dB)
MACE    0.0498       0.0086       0.0527       0.0245
CMACE   0.0 !'-      0.0051       0.0 1.       0.0038


with zero false alarm rate is 1. The correntropy MACE shows much better recognition

performance than the conventional MACE.
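The ROC curves reported here are obtained by sweeping a decision threshold over the output peak values. A minimal sketch with hypothetical, perfectly separated peaks (so the detection probability reaches 1 at zero false alarm rate):

```python
import numpy as np

def roc_points(true_peaks, false_peaks):
    """Sweep a threshold over all observed peaks and return the
    (P_fa, P_d) operating points of the resulting detector."""
    thresholds = np.sort(np.concatenate([true_peaks, false_peaks]))[::-1]
    pd = np.array([(true_peaks >= t).mean() for t in thresholds])
    pfa = np.array([(false_peaks >= t).mean() for t in thresholds])
    return pfa, pd

true_peaks = np.array([0.9, 0.8, 0.7, 0.6])    # hypothetical true-class peaks
false_peaks = np.array([0.3, 0.2, 0.1, 0.05])  # hypothetical false-class peaks
pfa, pd = roc_points(true_peaks, false_peaks)
```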

One advantage of the proposed method is that it is more robust than the

conventional MACE. That is, the variation of the test output peak value due to a different

training set is smaller than that of the MACE. Figure 7-4 shows standard deviations

of 100 Monte-Carlo outputs per test input when the test inputs are noisy false class

images. Table 7-1 shows the comparison of the standard deviation of 750 outputs (100

Monte-Carlo outputs for 75 inputs) for each class. From Table 7-1, we can see that the

variations of the correntropy MACE outputs due to different training sets are much smaller than

those of the conventional MACE, which tells us that our proposed nonlinear version of

the MACE outperforms the conventional MACE and achieves a robust performance for

distortion-tolerant pattern recognition.

Table 7-2 shows the area under the ROC for different kernel sizes in the case of

no additive noise. In this simulation, kernel sizes in the range between 0.1 and 15

provide perfect ROC performance. The kernel size obtained by Silverman's rule of

thumb [44], which is given by σ_i = 1.06 s_i d^{-1/5}, where s_i is the standard deviation of the

ith training data and d is the number of samples, is 9.63 and it also results in the best

performance. As expected from the property of correntropy, it is noticed that correntropy

approaches correlation with large kernel size (ROC area of the MACE is about 0.96).
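Silverman's rule quoted above is easy to evaluate; the sketch below applies it to a hypothetical vectorized image with roughly unit-variance pixels (illustrative data, not the facial expression database).

```python
import numpy as np

def silverman_bandwidth(x):
    """Silverman's rule of thumb: sigma = 1.06 * std(x) * d**(-1/5),
    where d is the number of samples in the vectorized image x."""
    return 1.06 * x.std() * x.size ** (-1 / 5)

rng = np.random.default_rng(1)
sigma = silverman_bandwidth(rng.standard_normal(4096))  # 64x64 = 4096 pixels
```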



















Figure 7-4. The comparison of standard deviations of 100 Monte-Carlo simulation outputs
for each noisy false class test image.

Table 7-2. Comparison of ROC areas with different kernel sizes
Kernel size ROC area Kernel size ROC area
0.1 1 20 0.9901
0.5 1 50 0.9804
1.0 1 100 0.9810
9.6 1 200 0.9820
10.0 1 500 0.9796
15.0 1 1000 0.9518


7.2 Synthetic Aperture Radar (SAR) Image Recognition

7.2.1 Problem Description

In this section, we show the performance of the proposed correntropy based nonlinear

MACE filter for the SAR image recognition problem in the MSTAR/IU public release

data set [49]. The MSTAR (Moving and Stationary Target Acquisition and Recognition)

data is a standard dataset in the SAR ATR community, allowing researchers to test and

compare their ATR algorithms. The database consists of X-band SAR images with 1 foot

by 1 foot resolution at 15, 17, 30 and 45 degree depression angles. The data was collected

by Sandia National Laboratory (SNL) using the STARLOS sensor. The original dataset









consists of different military vehicles, where the poses (aspect angles) of the vehicles

lie between 0 and 359 degrees, and the target image sizes are 128x128 pixels or more.

Since the MACE and the CMACE have a constraint at the origin of the output plane,

we centered all images and cropped the centered images to the size of 64x64 pixels (in

practice, with uncentered images, one needs to compute the whole output plane and search

for the peak). The selected area contains the target, its shadow and background clutter.

In this simulation we use the images which lie in the aspect angles of 0 to 179 degrees. The

original SAR image is composed of magnitude and phase information, but here only the

magnitude data is used in this simulation.
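The centering/cropping step described above (128x128 chips cropped to the central 64x64 window) can be sketched as:

```python
import numpy as np

def center_crop(img, size=64):
    """Crop a size-by-size window from the center of img, as done for the
    MSTAR chips before applying the filters."""
    r0 = (img.shape[0] - size) // 2
    c0 = (img.shape[1] - size) // 2
    return img[r0:r0 + size, c0:c0 + size]

chip = center_crop(np.zeros((128, 128)))  # stand-in for a 128x128 SAR chip
```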

This dissertation compares the recognition performance of the proposed CMACE filter

against the conventional MACE considering two distortion factors. The first distortion

case is due to a different aspect angle between training and testing, and the second case

is a different depression angle between test and training data. In the simulations, the

performance is measured by observing the test output peak value and creating the ROC

(Receiver Operating Characteristic) curve. The kernel size, σ, is chosen to be 0.1 for the

estimation of correntropy in the training images and 0.5 for test output in (6-3) and (6-4).

The value of 0.1 for the kernel size corresponds to the standard deviation of the training

data which is consistent with the Silverman's rule. Experimentally it was verified that a

larger kernel size for testing provided better results.

7.2.2 Aspect Angle Distortion Case

In the first simulation, we selected the BTR60 (Armored personnel carrier) as a target

(true class) and the T62 (Tank) as a confuser (false class). Both of them are taken at 17

degree depression angles. The goal is to design a filter which will recognize the BTR60

with minimal confusion from the T62. Figure 7-5 (a) shows the training images, which are

used to compose the MACE and the CMACE filters. In order to evaluate the effect of

the aspect angle distortion, training images were selected at every 3 index numbers from a

total of 120 exemplar images for each vehicle (most index numbers have a 2 degree difference
















(a) Training images (BTR 60) of aspect angle 0, 35, 124, 159 degrees


(b) Test images from BTR 60 of aspect angle 3, 53, 104, 137 degrees


(c) Test images from confuser (T62) of aspect angle 2, 41, 103, 137
degrees

Figure 7-5. Case A: Sample SAR images (64x64 pixels) of two vehicle types for a target
chip (BTR60) and a confuser (T62).


and some have a 1 degree difference in aspect angle). That means that the total number of

training images used to construct a filter is 40 (N = 40). Figure 7-5 (b) shows test images

for the recognition class and (c) represents confusion vehicle images. Testing is conducted

with all of 120 exemplar images for each vehicle. We are interested only in the center of

the output plane, since the images are already centered. The peak output responses over

all exemplars in the test set are shown in Figure 7-6. In the simulation, the constraint

value for the MACE as well as the CMACE filter is one for the training, therefore the

desired output peak value should be close to one when the test image belongs to the target

class and should be close to zero otherwise. Figure 7-6 (Top) shows the correlation output

peak value of the MACE and Figure 7-6 (Bottom) shows the output peak values of the

CMACE filter for both a target and a confuser.

Figure 7-6 illustrates that results are perfect for both the MACE and the CMACE

within the training images. However, in the MACE filter, most of the peak output values

on test images are less than 0.5. This shows that the MACE output generalizes poorly



























Figure 7-6. Case A: Peak output responses of testing images for a target chip (circle) and
a confuser (cross): (Top) MACE, (Bottom) CMACE.









[ROC curves for the MACE, the correntropy MACE, and the KCFA, each with N=40 and N=60 training images]


Figure 7-7. Case A: ROC curves with different numbers of training images.









for the images of the same class not used in training, which is one of the known drawbacks of

the conventional MACE. For the confuser test images, most of the output values are near

zero but some are higher than those of target images, creating false alarms. On the other

hand, for the CMACE, most of the peak output values of test images are above 0.5, which

means that the CMACE generalizes better than the MACE. Also, the rejection performance

for a confuser is better than the MACE. As a result, recognition performance between

the two vehicles is improved by the CMACE, as best quantified in the ROC curves of Figure

7-7. From the ROC curves we can see that the detection ability of the proposed method

is much better than both the MACE and the KCFA. For the KCFA, prewhitened images

are obtained by multiplying by D^{-1/2} in the frequency domain and applying the kernel trick

to the prewhitened images to compute the output in (5-31). A Gaussian kernel with kernel

size of 5 is used for the KCFA. From the ROC curves in Figure 7-7, we can also see that the

CMACE outperforms the nonlinear kernel correlation filter, in particular for high detection

probability.

Figure 7-8 (a) shows the MACE filter output plane and (b) shows the CMACE filter

output plane, for a test image in the target class not present in the training set. Figure

7-8 (c) and (d) show the case of a confuser (false class) test input. In Figure 7-8 (a)

and (b), we can see that both the MACE and the CMACE produce a sharp peak in the

output plane. However, the peak value at the origin of the CMACE is higher (closer to

the desired value) than that of the MACE. Moreover, the CMACE has fewer sidelobes and

the values of the sidelobes around the origin are lower than those of the MACE. These points

tell us that the detection ability of the proposed method is better than that of the MACE. On

the other hand, for the confuser test input in Figure 7-8 (c) and (d), the output values

around the origin of the CMACE are lower than those of the MACE, which means that

the CMACE has better rejection ability than the MACE.

In order to demonstrate the shift-invariant property of the CMACE, we apply the

images of Figure 7-9. The test image was cropped for the object to be shifted 13 pixels


















(a) True class in the MACE  (b) True class in the CMACE
(c) False class in the MACE  (d) False class in the CMACE

Figure 7-8. Case A: The MACE output plane vs. the CMACE output plane.



in both x and y pixel positions. Figure 7-10 shows the output planes of the MACE and


CMACE when the shifted image is used as the test input while all the training images


are centered. In Figure 7-10, the maximum peak value should happen at the position


of (77,77) in the output plane since the object is shifted by 13 pixels in both x and y


directions. In the CMACE output plane, the maximum peak happens at (77,77) and the


value is 0.9585. However, in the MACE, the maximum peak happens at (74,93) with


0.9442 and the value at the position of (77,77) is 0.93. In this test, the CMACE shows


better shift invariance property than the MACE.
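The shift-invariance check above amounts to locating the maximum of the correlation output plane and comparing it with the applied shift; a minimal sketch with a toy plane whose peak is planted at (77,77):

```python
import numpy as np

def peak_location(plane):
    """Row and column of the maximum of an output plane; for a
    shift-invariant filter, shifting the input moves this peak by
    the same amount."""
    return np.unravel_index(np.argmax(plane), plane.shape)

plane = np.zeros((128, 128))
plane[77, 77] = 0.9585          # toy stand-in for the CMACE output plane
loc = peak_location(plane)
```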


































(b) shifted to (x-13, y-13)

Figure 7-9. Sample images of BTR60 of size (64x64) pixels. (a) The cropped image of size
(64x64) pixels at the center of the original of size (128x128) pixels. (b) The
cropped image of size (64x64) pixels shifted by (x-13, y-13) from the original
of size (128x128) pixels.











(a) The MACE output plane  (b) The CMACE output plane

Figure 7-10. Case A: Output planes with shifted true class input image.









The CMACE performance sensitivity to the kernel size is studied next. In order to

find the appropriate kernel size for the CMACE, the easiest step is to apply Silverman's

rule of thumb developed for the kernel density estimation problem, which is given by

σ_i = 1.06 s_i d^{-1/5}, where s_i is the standard deviation of the ith training data and d is the

number of samples [44]. A more principled alternative is to apply cross validation to find

the best kernel size. For cross validation, we use one image of the training set which is not

included in the filter design. Since we are considering images as 1-dimensional vectors, we have

N different training data sets. Therefore, we obtain one proper kernel size, σ, by averaging

N different kernel sizes with σ = (1/N) \sum_{i=1}^{N} σ_i. In this simulation, when N = 60, the value

of the kernel size given by Silverman's rule is 0.0185 and the best one from cross validation

is 0.1. Figure 7-11 shows the ROC curves for the kernel size obtained by Silverman's rule

and the one obtained by cross validation. We see that the ROC performance from

Silverman's rule is very close to that of the optimal kernel size by cross validation. Also

when we increase the kernel size to 10, its performance is similar to that of the MACE.

correlation with large kernel size.

Table 7-3 shows the area under the ROC for different kernel sizes, and we conclude

that kernel sizes between 0.01 and 1 provide little change in detectability. This may be

surprising when contrasted with the problem of finding the optimal kernel size in density

estimation, but in correntropy the kernel size enters in the argument of an expected value

and plays a different role in the final solution, namely it controls the balance between the

effect of second order moments versus the higher order moments (see Property 1).

7.2.3 Depression Angle Distortion Case

In the second simulation, we selected the vehicle 2S1 (Rocket Launcher) as a

target and the T62 as a confuser. These two kinds of images look very similar in shape

therefore they represent a difficult object recognition case, useful to test the performance

improvement of the proposed method. In order to show the effect of the depression angle
















[ROC curves for the CMACE (σ=0.0185, Silverman's rule), the CMACE (σ=0.1, the best from cross validation), the MACE, and the CMACE (σ=10)]


Figure 7-11. The ROC comparison with different kernel sizes

Table 7-3. Case A: Comparison of ROC areas with different kernel sizes
Kernel size ROC area Kernel size ROC area
0.01 0 '".2 ; 0.6 0.9806
0.0185 (1 '"IN 0.7 0.9771
0.05 0.9631 0.8 0.9754
0.1 0.9847 0.9 0.9749
0.2 0.9865 1.0 0.9602
0.3 0.9797 2.0 0.9397
0.4 0.9797 5.0 0.9256
0.5 0.9808 10.0 0.9033


distortion, training data are selected from target images which were collected at 30 degree

depression angle and the MACE and CMACE are tested with data taken at 17 degree

depression angle.

Figure 7-12 depicts some sample images. As we can see in Figure 7-12 (a) and (b),

due to the big change in depression angle (13 degrees of depression is considered a huge

distortion), test images have more shadows and the image sizes of the vehicles also change,

making detection more difficult. In this simulation, we use all the images (120 images


















(a) Training images (2S1) of aspect 0,35,124,159 degrees


(b) Test images (2S1) of aspect 3,53,104,137 degrees


(c) Test images from confuser (T62) of aspect 2,41,103,137
degrees

Figure 7-12. Case B: Sample SAR images (64x64 pixels) of two vehicle types for a target
chip (2S1) and a confuser (T62).


covering 180 degrees of pose) at 30 degree depression angle for training and also test with

all of 120 exemplar images at 17 degree depression angle.

Figure 7-13 (Top) shows the correlation output peak value of the MACE and

(Bottom) shows the output peak values of the CMACE filter with a target and a confuser

test data. We see that the conventional MACE is very poor in this case, either under or

overshooting the peak value of 1 for the target class, but the CMACE can improve the

recognition performance because of its better generalization. Figure 7-14 depicts the ROC

curve and summarizes the CMACE advantage over the MACE in this large depression

angle distortion case. More interestingly, the KCFA performance is closer to the linear

MACE, due to the same input space whitening which is unable to cope with the large

distortion.



























Figure 7-13. Case B: Peak output responses of testing images for a target chip (circle) and
a confuser (cross): (Top) MACE, (Bottom) CMACE.










[ROC curves for the MACE and the correntropy MACE in Case B]


Figure 7-14. Case B: ROC curves.









Table 7-4. Comparison of computation time and error for one test image between the
direct method (CMACE) and the FGT method (Fast CMACE) with p = 4 and
kc = 4
                      Direct (sec)   FGT (sec)   Error
Train: Kxx            7622.8         68.31       '1... -06
       Kzx            122.8          1.15        8.7575e-06
Test (true) output                               2 22,. -03
       Kzx            128.6          1.18        3.8844e-05
Test (false) output                              8.4377e-03


7.2.4 The Fast Correntropy MACE Results

This section shows both the computation speed improvement and the effect on

accuracy of the fast CMACE filter in the aspect angle distortion case with N = 60

training images. Computation time was clocked with MATLAB version 7.0 on a 2.8GHz

Pentium 4 processor with 2GByte of RAM running Windows XP.

Table 7-4 shows the comparison of computation time for (6-3) and (6-4) between

the direct implementation of the CMACE filter and the fast method with a Hermite

approximation order of p = 4 and kc = 4 clusters. The computation time and absolute

errors for one test image were obtained by averaging over 120 test images. This simulation

shows that the FGT method is about 100 times faster than the direct method with a

reasonable error precision. Figure 7-15 presents the comparison in terms of ROC curves

of the MACE, the CMACE and the fast CMACE. From the ROC curves we can observe

that the approximation with p = 4 and kc = 4 is very close to the original ROC. Table

7-5 shows the effect of different orders (p) and clusters (kc) on the computation time and

accuracy for the fast CMACE filter. We conclude that the computation time increases

roughly proportional to p and kc, while the absolute error linearly decreases.

7.2.5 The Effect of Additive Noise

This section presents the effect of additive noise on the recognition performance of

both the MACE and CMACE. For this simulation, we design the template with training

data which are selected at every 3 index numbers from a total of 120 exemplar images

for the BTR60 without noise and test with all 120 images distorted by additive noise for






























0.2 0.4


0.6 0.8


Figure 7-15. Comparison of ROC curves between the direct and the FGT methods in
case A.


Table 7-5. Comparison of computation time and error for one test image in the FGT
method with a different number of orders and clusters
Order   Time (sec)   Error      Clusters   Time (sec)   Error
2       0.8116       1.48e-02   2          0.7181       5.61e-02
6       1.5140       8.23e-04   6          1.6693       3.87e-04
10      2.2119       8.58e-06   10         2.5595       4.71e-05
14      2.8533       4.16e-07   14         3.5660       6.93e-06
20      3.8097       1.25e-09   20         5.3067       1.14e-06


each vehicle. Figure 7-16 shows sample images of the original and noisy image with signal

to noise ratio (SNR) of 7dB. Also in this simulation, we compare the CMACE with the

optimal trade-off filter (OTSDF), which is a well known correlation filter to overcome the

poor generalization of the MACE when noise input is presented. The OTSDF filter is

given by


H = T^{-1} X (X^H T^{-1} X)^{-1} c,   (7-1)

where T = \alpha D + \sqrt{1 - \alpha^2}\, C, and 0 < \alpha < 1, D is the diagonal matrix in the

MACE, and C is the diagonal matrix containing the input noise power spectral density as

its diagonal entries. Figure 7-17 shows the comparison of ROC curves of the MACE,



























(a) Original (b) Noisy with SNR=7dB


Figure 7-16. Sample SAR images (64x64 pixels) of BTR60.
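Generating a noisy test image at a prescribed SNR, as in Figure 7-16 (b), can be sketched as follows; the image here is random stand-in data, not an MSTAR chip.

```python
import numpy as np

def add_noise_snr(img, snr_db, rng):
    """Add white Gaussian noise so that the ratio of signal power to
    noise power equals the requested SNR in dB."""
    p_signal = np.mean(img ** 2)
    p_noise = p_signal / 10 ** (snr_db / 10)
    return img + rng.normal(0.0, np.sqrt(p_noise), img.shape)

rng = np.random.default_rng(2)
clean = rng.random((64, 64))            # stand-in for a 64x64 SAR chip
noisy = add_noise_snr(clean, 7.0, rng)  # 7 dB, as in the experiment above
```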


Figure 7-17. ROC comparisons with noisy test images (SNR = 7dB) in case A (N = 40).









OTSDF and CMACE when white Gaussian noise with a signal to noise ratio (SNR) of

7dB is added to all test images. We see that the MACE performance is degraded

due to the additive input noise and the OTSDF with α = 0.7 shows almost the same

performance as the MACE without noise. However, the performance of the CMACE with

noisy test data is almost the same as in the no noise case. Although the CMACE does not

explicitly take the noise into consideration like the OTSDF, the CMACE is robust to the input

noise. In practice, the additive noise information is unknown; therefore, the OTSDF is

impractical.
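Since T in (7-1) is diagonal in the frequency domain, the OTSDF solution reduces to elementwise divisions and one small N x N solve. The sketch below uses random stand-in data and only verifies the constraint X^H h = c; it is not the experiment code.

```python
import numpy as np

def otsdf(X, D_diag, C_diag, alpha, c):
    """OTSDF filter (7-1): h = T^-1 X (X^H T^-1 X)^-1 c with
    diagonal T = alpha*D + sqrt(1 - alpha^2)*C."""
    t = alpha * D_diag + np.sqrt(1.0 - alpha**2) * C_diag
    TinvX = X / t[:, None]        # T^-1 X, cheap because T is diagonal
    inner = X.conj().T @ TinvX    # X^H T^-1 X, an N x N matrix
    return TinvX @ np.linalg.solve(inner, c)

rng = np.random.default_rng(3)
d, N = 32, 3
X = rng.standard_normal((d, N)) + 1j * rng.standard_normal((d, N))
h = otsdf(X, rng.random(d) + 0.1, rng.random(d) + 0.1, 0.7, np.ones(N))
```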









CHAPTER 8
DIMENSIONALITY REDUCTION WITH RANDOM PROJECTIONS

8.1 Introduction

In many pattern recognition and image processing applications, the high dimensionality

of the observed data makes many efficient statistical algorithms impractical.

Therefore, a variety of data compression and dimensionality reduction methods have been

proposed to overcome the curse of dimensionality [50]. Dimensionality reduction provides

compression and coding necessary to avoid excessive memory usage and computation.

Principal Component Analysis (PCA) is the most widely known way of reducing

dimension and it is optimal in the mean square error sense. PCA determines the basis

vectors by finding the directions of maximum variance in the data and it minimizes

the error between the original data and the one reconstructed from its low dimensional

representation. PCA has been very popular in face recognition [51] and many pattern

recognition applications [52]. Finding the principal components is a well established

numerical procedure through eigen decomposition of the data covariance matrix, although

it is still expensive to compute. There are other less expensive methods [51] based

on recursive algorithms [53] for finding only a few eigenvectors and eigenvalues of a

large matrix, but the computational complexity is still a burden. Moreover, subspace

projections by PCA do not preserve discrimination [54], so there may be a loss of

performance. Variants of the Singular Value Decomposition (SVD) have been considered for image

compression, utilizing the Karhunen-Loeve Transformation (KLT). Like PCA, the SVD method

is also expensive to compute.

The Discrete Cosine Transform (DCT) [50] is a widely used method for image compression,

and it can also be used for dimensionality reduction of image data. The DCT is computationally

less burdensome than PCA and its performance approaches that of PCA. The DCT is optimal

for the human eye: the distortions introduced occur only at the highest frequencies, and the

human eye tends to neglect these as noise. The image is transformed to the DCT domain









and dimensionality reduction is done in the inverse transform by discarding the transform

coefficients corresponding to the highest frequencies.

8.2 Motivation

Both the conventional MACE and the CMACE are memory-based algorithms;

therefore, in practice, the drawbacks of this class of algorithms are both the storage

requirements and the high computational demand. The output of the CMACE filter is

obtained by computing the product of two matrices defined by the image size and the

number of training images, and each element of the matrix requires O(d^2) computations,

where d is the number of image pixels. This quickly becomes too complex in practical

settings even for relatively small images. The fast correntropy MACE filter using the fast

Gauss transform (FGT) has been presented to increase the computational speed of the

CMACE filter, but the storage is still high. When the number of training images is N, the

total computational complexity of one test output of the CMACE is O(d^2 N(N+1)) and this

can be reduced to O(pcdN(N + 1)), where p is the order of the Hermite approximation and

c is the number of clusters utilized in the FGT (p, c < d). In general, images have high

dimensionality, and applications using large images need a large memory capacity. The

main goal of this chapter is to find a simple but powerful dimensionality reduction method

for image recognition with the CMACE filter.

Recently, random projection (RP) has emerged as an alternative dimensionality

reduction method in machine learning and image compression [55],[56],[57],[58],[59],

[60] due to its low complexity and good performance. Many experiments in the

literature show that RP is computationally simple while preserving similarity to a high

degree. In random projection, the original high dimensional data is projected onto a lower

dimensional subspace using a random matrix with only a small distortion of the distances

between the points while preserving similarity information. Even though the projected

data by random selection includes key information of the original data, we need to extract

the information properly. Since correntropy has the ability to extract higher order moments of









the data and in that sense, correntropy can be the promising tool for random projection

applications.
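To make this concrete, the sample estimator of cross-correntropy can be sketched in a few lines; this is an illustrative sketch with an assumed Gaussian kernel size, not the exact implementation used in this work.

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample estimate of cross-correntropy V(x, y) = E[k_sigma(x - y)]
    with a Gaussian kernel of size sigma, averaged over the sample pairs."""
    d = np.asarray(x, float) - np.asarray(y, float)
    return float(np.mean(np.exp(-d**2 / (2.0 * sigma**2))))

rng = np.random.default_rng(0)
u = rng.standard_normal(256)
print(correntropy(u, u))              # identical vectors attain the kernel maximum, 1.0
print(correntropy(u, u + 5.0) < 0.1)  # distant vectors give a small value: True
```

Because the Gaussian kernel involves all even moments of the difference, this estimator is sensitive to higher order statistics that the ordinary sample correlation ignores.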

In this chapter we present a dimensionality reduction pre-processor based on random projections (RP) to decrease the storage requirements, better match readily available computational resources, and show that the RP method works well with the CMACE filter for image recognition.

8.3 PCA and SVD

Principal Component Analysis (PCA) is the best linear dimensionality reduction technique in the mean-square error sense. Being based on the covariance matrix of the random variables, it is a second-order method. In various fields, it is also known as the singular value decomposition (SVD), the Karhunen-Loeve Transformation (KLT), the empirical orthogonal function (EOF) method, and so on.

PCA seeks to reduce the dimensionality of the data by finding a few orthogonal linear combinations of the original variables with the largest variance.

Let us suppose that we are given a data matrix X whose size is d x N, where N is the number of vectors in d-dimensional space. The goal is to find a k-dimensional subspace (k < d) such that the projection of X onto that subspace minimizes the expected squared error.

Then the projection of the original data onto a lower k-dimensional subspace can be obtained by

X_p = P_pca X,   (8-1)

where P_pca is k x d and contains the k eigenvectors corresponding to the k largest eigenvalues.
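As a brief sketch of (8-1) on synthetic data (variable names are illustrative), the PCA projection matrix can be obtained from the eigenvectors of the data covariance, here computed via the SVD of the centered data matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
d, N, k = 64, 200, 5

X = rng.standard_normal((d, N))                 # d x N data matrix: N vectors in R^d
Xc = X - X.mean(axis=1, keepdims=True)          # center each variable

# The left singular vectors of the centered data are the covariance eigenvectors,
# ordered by decreasing eigenvalue (singular values come sorted from the SVD).
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
P_pca = U[:, :k].T                              # k x d projection matrix of (8-1)

X_p = P_pca @ X                                 # projected data, k x N
print(X_p.shape)                                # (5, 200)
```

The rows of P_pca are orthonormal, so PCA preserves distances within the chosen subspace while discarding the directions of smallest variance.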

8.4 Random Projections

Random projection (RP) is a simple yet powerful dimensionality reduction technique that uses random projection matrices to project data into a low dimensional subspace. In RP, the original high dimensional space is projected onto a low dimensional subspace using a random matrix whose columns have unit length. In contrast to methods such as PCA that use data-driven optimization criteria, RP uses no such criteria; therefore, RP is data independent. Moreover, RP is computationally simple and preserves the structure of the data without introducing significant distortion. RP theory is far from complete, so it has to be used with caution. The following lemma from Johnson and Lindenstrauss (JL) provides theoretical support for RP.

JL lemma: For any 0 < ε < 1 and any integer N, let k be a positive integer such that

k ≥ 4(ε^2/2 − ε^3/3)^{-1} ln N,   (8-2)

then for any set V of N points in R^d, there is a map f : R^d → R^k such that for all u, v ∈ V,

(1 − ε) ||u − v||^2 ≤ ||f(u) − f(v)||^2 ≤ (1 + ε) ||u − v||^2.   (8-3)

Furthermore, this map can be found in polynomial time.

The JL lemma states that any N-point set in d-dimensional Euclidean space can be mapped down onto a k ≥ O(log N / ε^2) dimensional subspace without distorting the distance between any pair of points by more than a factor of (1 ± ε) for any 0 < ε < 1, with failure probability O(1/N^2). A proof of this lemma as well as tighter bounds on ε and k are given in [61].
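The bound (8-2) is easy to check numerically; the following sketch (with arbitrary illustrative values for N, d and ε) computes k from the bound and measures the worst pairwise distance distortion of a scaled Gaussian random map, in the sense of (8-3):

```python
import numpy as np

rng = np.random.default_rng(1)
d, N, eps = 1000, 20, 0.5

# k from the JL bound (8-2): k >= 4 ln(N) / (eps^2/2 - eps^3/3)
k = int(np.ceil(4 * np.log(N) / (eps**2 / 2 - eps**3 / 3)))

V = rng.standard_normal((N, d))                # N points in R^d
P = rng.standard_normal((k, d)) / np.sqrt(k)   # scaled Gaussian random map
W = V @ P.T                                    # mapped points in R^k

# largest relative distortion of squared distances over all pairs
worst = 0.0
for i in range(N):
    for j in range(i + 1, N):
        orig = np.sum((V[i] - V[j])**2)
        proj = np.sum((W[i] - W[j])**2)
        worst = max(worst, abs(proj / orig - 1.0))
print(k, worst)   # here k = 144; the worst distortion is typically well below eps
```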

Let us suppose that we are given a data matrix X whose size is d x N, where N is the number of vectors in d-dimensional space. Then the projection of the original data onto a lower k-dimensional subspace can be obtained by

X_r = P X,   (8-4)

where P is k x d and is called the random projection matrix.

The computational cost of RP is very low compared to other dimensionality reduction methods. RP needs only O(kdN) operations to project a d x N data matrix into k dimensions. The computational complexity of constructing the random matrix, O(kd), is negligible when compared with PCA, O(Nd^2) + O(d^3) [62].

8.4.1 Random Matrices

The choice of the random matrix P is one of the issues in RP. There are some simple methods satisfying the JL lemma in the literature [58],[63]. Here we present three such methods.

The Gaussian ensemble: the entries p_ij of the k x d random matrix P are identically and independently sampled from a normal distribution with zero mean and unit variance:

p_ij := (1/√k) r_ij, where r_ij is i.i.d. N(0, 1).

The binary ensemble: the entries p_ij of the k x d random matrix P are identically and independently sampled from a symmetric Bernoulli distribution:

p_ij := (1/√k) r_ij, where r_ij is i.i.d. with P(r_ij = ±1) = 1/2.

The related ensemble: p_ij = r_ij, where

r_ij := +1 with probability 1/6, 0 with probability 2/3, −1 with probability 1/6.

In most applications, the Gaussian ensemble satisfies the JL lemma well. The other two methods yield significant computational savings [64].
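The three ensembles can be generated in a few lines; this is a sketch with assumed dimensions (note the 1/√k scaling discussed in Section 8.4.2, which the sparse third ensemble would also need, up to a constant, if distances are to be preserved):

```python
import numpy as np

rng = np.random.default_rng(2)
k, d = 8, 100

# Gaussian ensemble: i.i.d. N(0, 1) entries scaled by 1/sqrt(k)
P_gauss = rng.standard_normal((k, d)) / np.sqrt(k)

# Binary ensemble: i.i.d. symmetric Bernoulli (+1/-1 each w.p. 1/2), scaled
P_bin = rng.choice([-1.0, 1.0], size=(k, d)) / np.sqrt(k)

# Sparse ensemble: +1 w.p. 1/6, 0 w.p. 2/3, -1 w.p. 1/6;
# two thirds of the entries are zero, so the projection is roughly 3x cheaper
P_sparse = rng.choice([1.0, 0.0, -1.0], size=(k, d), p=[1/6, 2/3, 1/6])

x = rng.standard_normal(d)
for P in (P_gauss, P_bin, P_sparse):
    print((P @ x).shape)   # each maps R^d to R^k: (8,)
```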

8.4.2 Orthogonality and Similarity Properties

In fact, to preserve the similarity between the original data and the transformed data, a projection matrix should be orthogonal. However, in a high enough dimensional space, it is possible to use a non-orthogonal random projection matrix due to the fact that data is sparse and, in a high dimensional space, there exists a much larger number of almost orthogonal directions [65]. Thus, vectors with random directions in a high dimensional space are linearly independent and may be sufficiently close to orthogonal to provide an approximation of a basis.

The inner product of two vectors x and y that have been obtained by random projection of the vectors u and v with the random matrix R can be expressed as

x^T y = u^T R^T R v.   (8-5)

The matrix R^T R can be decomposed into two terms,

R^T R = I + ε,   (8-6)

where ε_ij = r_i^T r_j for i ≠ j and ε_ii = 0 for all i.

If all the entries of ε were equal to zero, i.e., if the vectors r_i and r_j were orthogonal, the matrix R^T R would be equal to I and the similarity between the original data and the projected data would be preserved exactly in the random mapping. In practice the entries of ε will be small but not equal to zero.

Now let us consider the case where the entries of the random matrix are identically and independently sampled from a normal distribution with zero mean and unit variance, and the length of each r_i is then normalized. It is evident that ε_ij is an estimate of the correlation coefficient between two i.i.d. normally distributed random variables, and if the dimensionality k of the reduced space is large, ε_ij is approximately normally distributed with zero mean and its variance σ^2 can be approximated by

σ^2 ≈ 1/k.   (8-7)

That is, the distortion of the inner product produced by the random projection is zero on average and its variance is at most the inverse of the dimensionality of the reduced space. This result motivates the scaling factor 1/√k in the random projection matrix examples above, chosen to preserve distances. (In applications where distances are not of concern, the projection matrix need not be scaled by 1/√k.) Moreover, the error











becomes much smaller when the data is sparse, and this result indicates the relevance of random projection in compressive sampling for sparse signal recovery [66]. However, the methodology for building the random matrix may affect the subsequent algorithms used for processing, and this is an area that is much less studied.
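The approximation (8-7) can be verified empirically; in the sketch below (assumed dimensions and trial count), ε_ij is sampled as the inner product of two independent, unit-normalized Gaussian vectors in R^k, and its variance is compared with 1/k:

```python
import numpy as np

rng = np.random.default_rng(3)
k, trials = 100, 2000

# eps_ij = r_i^T r_j for unit-normalized Gaussian columns r_i, r_j in R^k
samples = np.empty(trials)
for t in range(trials):
    r_i = rng.standard_normal(k); r_i /= np.linalg.norm(r_i)
    r_j = rng.standard_normal(k); r_j /= np.linalg.norm(r_j)
    samples[t] = r_i @ r_j

print(samples.mean())          # close to 0: the distortion is zero on average
print(samples.var(), 1.0 / k)  # empirical variance is close to 1/k, as in (8-7)
```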

8.5 Simulations

In this section, we show the performance of the CMACE filter with RP dimensionality reduction on the face recognition example of Chapter 7. We project the original data into a lower dimensional space with random projection and apply the CMACE filter to the reduced dimensional data. In this simulation, we use the Gaussian ensemble method to generate the random projection matrix. In order to compare the performance of the CMACE after preprocessing with random projection, we compute the area under the ROC curve.

Figure 8-1 shows the ROC area values with different reduced dimensions by RP.



[Plot: ROC area vs. dimensionality k after random projection; maximum, minimum and average curves, with average 0.9684 at k = 144.]


Figure 8-1. The comparison of ROC areas with different RP dimensionality (50 trials with
different training images and RP matrices).














[Plot: ROC area vs. dimensionality k after random projection; average, maximum and minimum curves, with average 0.9502 at k = 256.]


Figure 8-2. The comparison of ROC areas with different RP dimensionality (50 trials with
different RP matrices, but fixed training images).


Since RP is in fact a random mapping, we present results in terms of the mean, maximum, and minimum performance obtained in 50 Monte Carlo simulations. In every trial we use randomly chosen training images (N = 5) and different random projection matrices. When the MACE is applied to the original data, the average ROC area is 0.96 (the best case is 0.99 and the worst case is 0.8868). In Figure 8-1 the performance of the CMACE with reduced dimensionality k ≥ 144 is always better than that of the MACE filter on the original data. The range of performance between the best and worst cases is due to the combined effect of different training images and different RP matrices.

In order to isolate the effect of RP, we fixed the training images and ran 50 Monte Carlo simulations with different RP matrices. The resulting ROC areas are shown in Figure 8-2. We can see that the variation in performance due to different RP matrices is substantially smaller, and the CMACE obtains consistent performance with RP when the image size is above 16 x 16 (dimensionality k = 256).





















[Plot: ROC curves for CMACE and MACE, each with averaging, RP, bilinear interpolation and subsampling.]


Figure 8-3. ROC comparison with different dimensionality reduction methods for MACE and CMACE (reduced image size is 16 x 16).


The comparison among the four dimensionality reduction methods (subsampling, pixel-averaging, bilinear interpolation and Gaussian (RP)) for images of size 16 x 16 (reduced from 64 x 64) is shown in Figure 8-3.

For the CMACE, the Gaussian (RP) and pixel-averaging methods work very well, with subsampling the worst, although still robust. Subsampling is the simplest technique, but it can also lose important detail. In the MACE case, the Gaussian method is the worst, with the pixel-averaging method still performing some discrimination, but at a much reduced rate (compare with Figure 7-3). It is surprising that local pixel averaging, the simplest method of dimensionality reduction, provides such robust performance in this application for both the MACE and CMACE. It indicates that coarse features are sufficient for discrimination up to a certain level of performance. However, notice that pixel averaging loses with respect to CMACE-RP when the operating point in the ROC is close to 100%, as can be expected (finer detail is needed to discriminate between classes).









We have also applied PCA to the MACE and CMACE. There are different ways to apply PCA to this task. One method for dimensionality reduction with PCA uses training images from the entire data set and then projects all the images onto the subspace spanned by the principal components. With this method, when we choose 10 images (5 from the true class and 5 from the false class) and project all true and false class images onto this subspace, the performance of both the MACE and CMACE is perfect. However, the training data must be sufficient to find principal directions that cover the whole test data, and a large amount of computation is required. In practice, it is impossible to use out-of-class images as a training set for a MACE filter, which is designed only with data from one class. In this more realistic case, the test image class does not belong to the training set for PCA, and the discrimination performance will be very poor. Figure 8-4 shows the ROC curves when only true class images are used for PCA. Even in this case, we had to use all the true class images (75) to find 75 principal directions, project all the images, and then choose the 5 projected images to compose the MACE and CMACE filters. For testing, we also project the test image onto the subspace obtained from the training set. Since the false class test images are not used to determine the PCA subspace, the projected data of the false class is not guaranteed to preserve the information of the original images; therefore, the rejection performance becomes very poor. The ROC area values for the MACE and CMACE are 0.4015 and 0.7283, respectively.

We could not obtain reasonable results for the MACE with the RP method, as shown in Figure 8-3. We explain this MACE behavior in terms of the Gaussian dimensionality reduction procedure, although the explanation partially applies to the other methods as well. Although RP preserves similarity in the reduced projections, it changes the statistics of the original data classes. After random projection with the Gaussian ensemble method, all the projected images display statistics very close to white noise with similar variance. This result is shown in Figure 8-5, where sample images of size 16 x 16 from the two classes after applying RP to all images are depicted. The first row shows the training image set, while the second

















[Plot: ROC curves; probability of detection vs. probability of false alarm.]

Figure 8-4. ROC comparison with PCA for MACE and CMACE (reduced dimensionality is k = 75).


row displays the in-class test set and the third row the out-of-class test images. We see that the projected images in the true class and the false class, although slightly different in detail, seem to have very similar statistics.

The MACE, which extracts only second order information, is unable to distinguish between the projected image sets; however, the CMACE succeeds in this task. In order to explain the effectiveness of the correntropy function, we compare correlation and correntropy in the projected space. This result is shown in Figure 8-6. We consider the 2D images as long 1D vectors. In Figure 8-6, (a) shows the autocorrelation of one original image vector in the true class; (b) depicts the autocorrelation of one of the training images after RP, which leads us to conclude that the projected image has been whitened (the only peak occurs at zero lag); and (c) shows that the cross correlation between the reduced training image vector and a test image vector in the false class after RP is practically the same as the autocorrelation of the reduced training image vector after RP. Therefore the covariance information of the images after RP is totally destroyed. Since the conventional





























Figure 8-5. Sample images of size 16 x 16 after RP. (a) Training images. (b) True class
images. (c) False class images.


MACE filter utilizes only second order information, it is unable to discriminate between in-class and out-of-class images. However, in (e) and (f), we can see that the cross correntropy between in-class and out-of-class images is still preserved after RP, due to the fact that correntropy has the ability to extract higher order information from the reduced dimensional data. Therefore, the CMACE filter seems very well posed to work with reduced dimensional images obtained by random projection, for this and other applications.

We can also see the overall detection and recognition performance of the CMACE-RP through a further analysis of the output plane. Figure 8-7 shows correlation output planes for the MACE and correntropy output planes (CMACE) after dimensionality reduction to k = 64 with random projections. Figure 8-7 (a) shows the desirable correlation output plane of the MACE filter given a true class test image; however, (b) shows the poor rejection ability for a false class test image. On the other hand, for the CMACE filter, the true and false class output planes in Figure 8-7 (c) and (d) show the expected responses even with such low dimensional images.











[Figure panels (a)-(f): correlation and correntropy values plotted against lag.]

Figure 8-6. Cross correlation vs. cross correntropy. (a) Autocorrelation of one of the original training image vectors. (b) Autocorrelation of one of the reduced training image vectors after RP. (c) Cross correlation between one of the reduced training image vectors and a test image vector in the false class after RP. (d) Autocorrentropy of one of the original training image vectors. (e) Autocorrentropy of one of the reduced training image vectors after RP. (f) Cross correntropy between one of the reduced training image vectors and a test image vector in the false class after RP.


The initial idea of using a preprocessor based on random projections was to alleviate the storage and computational complexity of the CMACE. Table 8-1 presents comparisons between the original CMACE and the CMACE with RP. The dominant component of the storage is the correntropy matrix (V_x). In single precision (32 bit), 64 Mbytes are needed to store V_x for 64 x 64 pixel images, but only 256 Kbytes for 16 x 16 pixel images after RP. We need an additional 4 Mbytes to perform random projection with the Gaussian ensemble method; in the binary ensemble case, no additional storage for RP is needed. The table also presents the computational complexity of (6-1) for one test image, given N = 5 training images, clocked with MATLAB version 7.0 on a 2.8 GHz Pentium 4 processor with 2 Gbytes of RAM.




















(a) With a true class test image in the MACE (b) With a false class test image in the MACE


(c) With a true class test image in the CMACE


(d) With a false class test image in the CMACE


Figure 8-7. Correlation output planes vs. correntropy output planes after dimension
reduction with random projection (reduced image size is 8 x 8).


Table 8-1. Comparison of the memory and computation time between the original CMACE (image size 64 x 64) and CMACE-RP (16 x 16, Gaussian ensemble method) for one test image with N = 5.

                                   CMACE (d = 4096)          CMACE-RP (k = 256)
Memory (bytes, single precision)   O(4d^2) = 64 MB + α       O(4k(k + d)) = 4.2 MB + β
Complexity                         O(d^2(N^2 + N + 1))       O(k^2(N^2 + N + 1) + kdN)
                                   ≈ O(5.2 x 10^8)           ≈ O(7.3 x 10^6)
Time (sec)                         58.584                    0.4297









CHAPTER 9
CONCLUSIONS AND FUTURE WORK

9.1 Conclusions

In my research, we have evaluated the correntropy based nonlinear MACE filter for image recognition. We presented experimental results for face recognition using CMU's facial expression data and SAR image recognition using the MSTAR public release data.

Correntropy induces a new RKHS that has the same dimensionality as the input space but is nonlinearly related to it. Therefore, it is different from the conventional kernel methods in both scope and detail. Here we illustrated that the optimal MACE filter formulation can be directly solved in the VRKHS. The CMACE overcomes the main shortcoming of the MACE, which is poor generalization. We believe this is due to the utilization of higher order statistical information in matching the target class. The CMACE also shows good rejection performance as well as robust results with additive noise. This is due to the prewhitening effect in feature space and the new metric created by correntropy, which reduces the effect of outliers. Simulation results show that the detection and recognition performance of the CMACE exhibits better distortion tolerance than the MACE for some kinds of distortion (in face recognition, different facial expressions; in SAR, aspect angle as well as depression angle). The CMACE also outperforms the nonlinear kernel correlation filter, which is the kernelized SDF with prewhitened data in the input space, especially for the large distortion case. Moreover, the CMACE preserves the shift-invariant property well.

The sensitivity of the CMACE performance to the kernel size is experimentally demonstrated to be small, but a full understanding of this parameter requires further investigation. In addition, there is still an approximation in (6-3) and (6-4) to compute the products of the projected data functionals by a kernel evaluation, which is guaranteed only on average. For large images, this approximation seems to be good, but its error needs to be understood and quantified to obtain the best performance of the CMACE filter.









In practice, the drawbacks of the proposed CMACE filter are the required storage and its computational complexity. Since one does not have direct access to the filter weights in the VRKHS, the computations on the test set must take into consideration the training set data, so the total computational complexity of one test output is O(d^2 N(N + 1)) and the storage depends on the image dimension, O(d^2). The MACE easily produces the entire correlation output plane by FFT processing. The CMACE can also construct the whole output plane by shifting the test input image and, as a result, there is no need to center all images provided that the input image is appropriately shifted. However, computing the whole output plane is a big burden for the CMACE. For this reason, this research also proposes the fast CMACE to save computation time by using the Fast Gauss Transform (FGT) algorithm, which results in a computational savings of about 100 fold for 64 x 64 pixel images. With the fast Gauss transform, we were able to reduce the computation to O(pcdN(N + 1)), where p, c << d.

However, this still requires substantial storage and is not very competitive with other methods for object recognition. The random projection (RP) method may make the CMACE useful for practical applications using standard computing hardware. RP is a preprocessor that extracts features of the data, but unlike PCA it is very easy to compute, requiring only O(kd) operations. Reducing the data into features has the double effect of addressing both the storage and computation requirements. For instance, instead of 64 Mbytes for 64 x 64 pixel images, the storage for images reduced by RP to 16 x 16 pixels is 4.2 Mbytes (256 Kbytes in the binary ensemble case). Computational speed improves by more than 100 times. The method of random projections and its impact on subsequent pattern recognition algorithms is still poorly understood. Here we verified that the MACE is incompatible with the Gaussian method of random projections, since it destroys the second order statistics that make the MACE work. The pixel-averaging method seems to preserve second order statistics to a certain degree. However, the CMACE combined with RP is a better alternative, and it is less sensitive to the method of data reduction. This can be understood if we remember that the CMACE preserves higher order statistics of the data, unlike the MACE filter. The performance of the CMACE-RP at 16 x 16 is better than that of the MACE at 64 x 64. But further work is necessary to quantify extensively the performance of the CMACE-RP versus other algorithms.

These tests with the CMACE and data reduction clearly showed a new application domain for correntropy in signal processing. The conventional data reduction methods average data locally or globally and tend to destroy mean and variance, but apparently they preserve some of the higher order information contained in the data that can still be utilized by correntropy. Therefore, in applications where data reduction at the front-end is a necessity, correntropy may still provide usable algorithms in cases where second order methods fail. This argument is also very relevant in compressive sampling (CS), where convex optimization needs to be utilized to minimize the l1 norm, since the l2 norm creates a lot of artifacts in reconstruction. We think that the correntropy induced metric (called CIM in [42]) can be a candidate to simplify the reconstruction in CS. We have, however, to fully understand why correntropy is still able to distinguish between images or signals that have been heavily distorted, because we can perhaps even propose new data reduction procedures that preserve the discriminability of correntropy.

9.2 Future Work

The correntropy MACE filter was obtained by solving the constrained optimization problem in the RKHS induced by correntropy, where the dimension of the RKHS is the same as the input dimension. The data points in this new RKHS are nonlinearly related to the original data; therefore, we can still find a closed form solution to the nonlinear MACE filter that outperforms the linear MACE filter. However, several issues remain to be investigated.

First, the proposed correntropy MACE filter imposes hard constraints on the center of the output plane. As with the traditional SDF-type filters, linear constraints are imposed on the training images to yield a known value at specific locations in the output plane. However, placing such constraints satisfies conditions only at isolated points in the image space and does not explicitly control the filter's ability to generalize over the entire domain of the training images. Unlike the general classification problem, the goal of this research is to find an appropriate template for a specific object that we want to identify, without any information about out-of-class images. We have to suppress the response to all images except the true target image. Therefore, we think that constraining only one location is not the best solution. Finding new constraints that give good generalization as well as rejection ability is one avenue of future work. One idea is to use randomly mixed images of the true target as out-of-class images. These generated out-of-class images are totally different from the true class images but have the same statistical information as the true class images. Therefore, we can expect that this idea may help improve performance.

Second, the computation of the correntropy MACE output requires an approximation. Unfortunately, (6-3) and (6-4) involve weighted versions of the functionals; therefore, the error in the approximation should be addressed, and further investigation is required to obtain a good approximation.

Finally, my research presented simulation results on applications to face recognition and SAR image recognition. In addition to face recognition, the proposed algorithm can be applied to biometric verification such as iris and fingerprint recognition. Also, in the SAR application, there is a three-class (BMP2, BTR70 and T72) object classification task in the MSTAR/IU public release data set [67]. Most of the literature applies algorithms to this three-class problem to compare performance. Therefore, in order to convince other researchers, we need to evaluate our algorithm on the three-class problem as well.

Summarizing the future work:

* Find new constraints that provide better generalization as well as rejection ability.

* Analyze the approximation errors due to the weighted values and find a good approximation.









* Apply the method to biometric verification and perform a more detailed comparison on the three-class SAR classification.









APPENDIX A
CONSTRAINED OPTIMIZATION WITH LAGRANGE MULTIPLIERS

If y is a weighted sum of variables, y = a^T x, then dy/dx = a. The general quadratic form for y in matrix notation is y = x^T A x, where A = {a_ij} is an N x N matrix of weights. Here, we assume that x is a real vector; then the vector of partial derivatives of y with respect to each variable is dy/dx = (A + A^T)x. If A is symmetric, then dy/dx = 2Ax.

The method of Lagrange multipliers is useful for minimizing a quadratic function subject to a set of linear constraints. Suppose that B = [b_1 b_2 ... b_M] is an N x M matrix with vectors b_i of length N as its columns, and c = [c_1 c_2 ... c_M]^T is a vector of M constants. We want to find the vector x which minimizes the quadratic term y = x^T A x while satisfying the linear equations B^T x = c. If A is positive semi-definite, then y is convex and there is at least one solution. We form the cost function

J = x^T A x − 2λ_1(b_1^T x − c_1) − 2λ_2(b_2^T x − c_2) − ... − 2λ_M(b_M^T x − c_M),   (A-1)

where the scalar parameters λ_1, λ_2, ..., λ_M are known as the Lagrange multipliers.

Setting the gradient of J with respect to x to zero yields

2Ax − 2(λ_1 b_1 + λ_2 b_2 + ... + λ_M b_M) = 0.   (A-2)

Defining m = [λ_1 λ_2 ... λ_M]^T, (A-2) can be expressed as

Ax − Bm = 0,   (A-3)

or

x = A^{-1} B m.   (A-4)

Substituting (A-4) for x into the constraint B^T x = c yields

B^T A^{-1} B m = c.   (A-5)









The Lagrange multiplier vector m can be obtained as

m = (B^T A^{-1} B)^{-1} c.   (A-6)

Using (A-4) and (A-6) we obtain the following solution to the constrained optimization problem:

x = A^{-1} B (B^T A^{-1} B)^{-1} c.   (A-7)
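A quick numerical check of (A-6) and (A-7) (a sketch with randomly generated A, B and c; A is made symmetric positive definite so the quadratic form is convex):

```python
import numpy as np

rng = np.random.default_rng(4)
N, M = 6, 2

Q = rng.standard_normal((N, N))
A = Q @ Q.T + N * np.eye(N)        # symmetric positive definite weight matrix
B = rng.standard_normal((N, M))    # constraint matrix, B^T x = c
c = rng.standard_normal(M)

Ainv = np.linalg.inv(A)
m = np.linalg.solve(B.T @ Ainv @ B, c)   # Lagrange multipliers, as in (A-6)
x = Ainv @ B @ m                          # constrained minimizer, as in (A-7)

print(np.allclose(B.T @ x, c))            # True: the constraints are satisfied
```

Note that by construction A x = B m, i.e., the gradient 2Ax lies in the span of the constraint vectors, which is exactly the stationarity condition (A-2).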









APPENDIX B
THE PROOF OF PROPERTY 5 OF CORRENTROPY

Let p_ij(x, y) be the joint PDF of (x_i, x_j) such that

p_ij(x, y) = Σ_i a_i φ_i(x) φ_i(y),   (B-1)

where φ_i(·) and a_i are the eigenfunctions and eigenvalues of p_ij(x, y), respectively. Here,

E[f(x)f(y)] = ∫∫ p_ij(x, y) f(x) f(y) dx dy
            = Σ_i a_i ( ∫ φ_i(x) f(x) dx ) ( ∫ φ_i(y) f(y) dy )
            = Σ_i a_i β_i^2,   (B-2)

where β_i = ∫ φ_i(x) f(x) dx.

Now, let ψ_j(·) and λ_j be the eigenfunctions and eigenvalues of the kernel k; then

E[k(x, y)] = ∫∫ p_ij(x, y) k(x, y) dx dy
           = ∫∫ Σ_i a_i φ_i(x) φ_i(y) Σ_j λ_j ψ_j(x) ψ_j(y) dx dy
           = Σ_i a_i Σ_j λ_j ( ∫ φ_i(x) ψ_j(x) dx )^2.   (B-3)

Observing (5-15), (B-2) and (B-3), we can construct f such that β_i^2 = Σ_j λ_j ( ∫ φ_i(x) ψ_j(x) dx )^2. Then there exists f(x) = Σ_i β_i φ_i(x) satisfying (5-15).









APPENDIX C
THE PROOF OF A SHIFT-INVARIANT PROPERTY OF THE CMACE

A shift-invariant system is one for which a shift or delay of the input sequence causes a corresponding shift in the output sequence. All the components of the CMACE output are determined by the kernel function, and by proving that the Gaussian kernel is shift-invariant, we can easily show that the CMACE is shift-invariant.

Let the output of the Gaussian kernel be

y(n) = K_σ(x_1(n) − x_2(n)) = exp( −(x_1(n) − x_2(n))^2 / (2σ^2) ).   (C-1)

Start with a shift of the inputs, x_1s(n) = x_1(n − s_0) and x_2s(n) = x_2(n − s_0); then the response y_1(n) to the shifted inputs is

y_1(n) = K_σ(x_1s(n) − x_2s(n)) = exp( −(x_1(n − s_0) − x_2(n − s_0))^2 / (2σ^2) ).   (C-2)

Now the shift of the output, defined as y(n − s_0), becomes

y(n − s_0) = exp( −(x_1(n − s_0) − x_2(n − s_0))^2 / (2σ^2) ).   (C-3)

Then clearly y_1(n) = y(n − s_0); therefore, the Gaussian kernel is shift-invariant.
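This property is easy to verify numerically; the sketch below uses a circular shift (np.roll) for illustration, with an assumed kernel size:

```python
import numpy as np

sigma = 1.0

def kernel_output(x1, x2):
    # y(n) = K_sigma(x1(n) - x2(n)) = exp(-(x1(n) - x2(n))^2 / (2 sigma^2)), as in (C-1)
    return np.exp(-(x1 - x2)**2 / (2.0 * sigma**2))

rng = np.random.default_rng(5)
x1 = rng.standard_normal(32)
x2 = rng.standard_normal(32)
s0 = 7

y = kernel_output(x1, x2)
y1 = kernel_output(np.roll(x1, s0), np.roll(x2, s0))  # response to shifted inputs, (C-2)
print(np.allclose(y1, np.roll(y, s0)))                # True: y1(n) = y(n - s0), (C-3)
```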









APPENDIX D
COMPUTATIONAL COMPLEXITY OF THE MACE AND C \ ACE

Here, only the computational complexity of the matrix inversions and multiplications in both the MACE and the CMACE is considered. Let us assume that all the elements of the matrices are real.

In order to construct the MACE template in the frequency domain with a given training image set, O(d) multiplications are needed for the inversion of the diagonal matrix D of size d x d and O(N^2) for the inversion of the Toeplitz matrix (X^H D^{-1} X) of size N x N. The number of multiplications is O(N^2) for D^{-1}X, O(dN^2 + d) for (X^H D^{-1} X), and O(dN^2) for D^{-1}X(X^H D^{-1} X)^{-1}. In addition, the FFT needs O((N + 1)d log_2(d)) multiplications with N training images. In reality the elements of the matrices are complex valued; therefore, the MACE requires a total of O(4(d(2N^2 + N + 2) + N^2) + Nd log_2(d)) multiplications to compose the template for the true class in the frequency domain. For the test of one input image after building a template, the MACE requires only O(4d + d log_2(d)) multiplications.

The CMACE needs O(d^2) and O(N^2) multiplications for the inversions of the Toeplitz matrix V_x of size d x d and of T_xx of size N x N, respectively, and O((Nd)^2) to compute T_xx; therefore, the total number of multiplications in off-line mode with the given training image set is O(d^2(N^2 + 1) + N^2). For the testing of one image, O(N^2) multiplications for the output and O(Nd^2) operations for obtaining T_zx are needed; therefore, the total computational complexity of the CMACE for one test image is O(d^2 N + N^2) multiplications.

The fast CMACE with the FGT reduces the computational complexity to O(N^2 pd(k_c + 1) + d^2 + N^2) for the training set and to O(pd(k_c + 1)N + N^2) for one testing image.
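The dominant-term counts above can be evaluated for representative sizes (an illustrative sketch; the image size 64x64 and N = 100 are assumptions, not values from the text):

```python
import numpy as np

# Dominant multiplication counts from the expressions above
# (d = pixels per image, N = number of training images).
def mace_train(d, N):
    return 4 * (d * (2 * N**2 + N + 2) + N**2) + N * d * np.log2(d)

def mace_test(d, N):
    return 4 * d + d * np.log2(d)

def cmace_train(d, N):
    return d**2 * (N**2 + 1) + N**2

def cmace_test(d, N):
    return d**2 * N + N**2

d, N = 64 * 64, 100   # e.g. 64x64-pixel images, 100 training exemplars
print(f"train: MACE {mace_train(d, N):.2e}, CMACE {cmace_train(d, N):.2e}")
print(f"test:  MACE {mace_test(d, N):.2e}, CMACE {cmace_test(d, N):.2e}")
```

The comparison makes concrete why dimensionality reduction and the FGT matter: the CMACE counts grow with d^2 rather than d.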









APPENDIX E
THE CORRENTROPY-BASED ROBUST NONLINEAR BEAMFORMER

E.1 Introduction

Beamforming is often used with an array of radar antennas in order to transmit or receive signals in different directions without having to mechanically steer the array [68],[69], and has found numerous applications in radar, sonar, seismology, radio astronomy, medical imaging, speech processing, and wireless communications. The classical approach to beamforming is a natural extension of Fourier-based spectral analysis to spatio-temporally sampled data, which is called the conventional Bartlett beamformer [70]. This algorithm maximizes the energy of the beamforming output for a given input signal. Because it is independent of the signal characteristics and only depends on a certain direction, its major difficulties are low spatial resolution and high sidelobes. In an attempt to alleviate the limitations of the conventional beamformer, the Capon beamformer was introduced [71],[72].

A Capon beamformer attempts to minimize the output energy contributed by interference coming from directions other than the "look direction". Moreover, it

maintains a fixed constant gain in the "look direction" (normalized to one) in order not to

risk the loss of the signal containing the information. This Capon beamformer is sensitive

to the mismatch between the assumed and actual array steering vector, which occurs

often in practice. Recently a robust beamformer was proposed by extending the Capon

beamformer to the case of uncertain array steering vectors [73],[74].

From a statistical point of view, most of these techniques are based on linear models, which make use of only the first- and second-order moment information (e.g., the mean and the variance) of the data. Therefore, they are not an appropriate choice for non-Gaussian distributed data, such as impulsive noise scenarios. In order to deal with more realistic situations, further research into signal modeling has led to the realization that many natural phenomena can be better represented by distributions of a more impulsive nature.









One type of distribution that exhibits heavier tails than the Gaussian is the class of stable

distributions introduced by Nikias and Shao [75]. Alpha-stable distributions have been

used to model diverse phenomena such as random fluctuations of gravitational fields,

economic market indexes [76], and radar clutter [77].

To overcome the limitation of the linear model in the non-Gaussian statistics case, a nonlinear beamformer has been proposed in [78], but most nonlinear beamforming methods require complicated weight vector computations. Recently, kernel based learning

algorithms have been heavily researched due to the fact that linear algorithms can be

easily extended to nonlinear versions through kernel methods [23]. Some kernel based

methods have been presented in [79],[80],[26] for beamforming and target detection

problems.

The correntropy MACE (CMACE) filter [46],[81], which is the nonlinear version of the correlation filter, has been shown to possess good generalization and rejection performance for image recognition applications.

In this appendix, we apply correntropy to the beamforming problem and exploit the linear structure of the RKHS induced by correntropy to formulate the correntropy beamformer. Because it involves higher-order statistics through the nonlinear relation between the input space and the feature space, the correntropy beamformer shows better performance than the Capon and kernel methods and is robust in impulsive noise scenarios.

E.2 Standard Beamforming Problem

E.2.1 Problem

Consider the standard beamforming model. Let a uniformly spaced linear array of M sensors receive signals $\mathbf{x}_k$ generated by a narrow-band source $s_k$ arriving from direction $\theta$. Using a complex envelope representation, the M x 1 vector of received signals at the kth snapshot can be expressed as

$\mathbf{x}_k = \mathbf{a}(\theta) s_k + \mathbf{n}_k$, (E-1)

where $\mathbf{a}(\theta) \in \mathbb{C}^{M \times 1}$ is the steering vector of the array toward direction $\theta$,

$\mathbf{a}(\theta) = \left[1, \; e^{j(2\pi/\lambda) d \cos\theta}, \; \ldots, \; e^{j(2\pi(M-1)/\lambda) d \cos\theta}\right]^T$, (E-2)

and $\mathbf{n}_k$ is the M x 1 vector of additive white noise. Also, the beamformer output is given by

$y_k = \mathbf{w}^H \mathbf{x}_k = \mathbf{w}^H \mathbf{a}(\theta) s_k + \mathbf{w}^H \mathbf{n}_k$, (E-3)

where $\mathbf{w} \in \mathbb{C}^{M \times 1}$ is a vector of weights and H denotes the conjugate transpose. The goal is to satisfy $\mathbf{w}^H \mathbf{a}(\theta) = 1$ and minimize the effect of the noise ($\mathbf{w}^H \mathbf{n}_k$), in which case $y_k$ recovers $s_k$.
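The signal model of equations (E-1)-(E-3) can be sketched as follows (illustrative NumPy code; the half-wavelength spacing d/lambda = 0.5, the 45-degree look direction, and the roughly 10 dB noise scale are assumptions chosen to match the simulations of Section E.4):

```python
import numpy as np

def steering_vector(theta, M, d_over_lambda=0.5):
    """Steering vector a(theta) of a uniform linear array (E-2),
    theta in radians, M sensors, element spacing d over wavelength lambda."""
    m = np.arange(M)
    return np.exp(1j * 2 * np.pi * m * d_over_lambda * np.cos(theta))

M, N = 25, 1000                       # sensors and snapshots, as in Section E.4
theta = np.deg2rad(45.0)              # look direction
a = steering_vector(theta, M)

rng = np.random.default_rng(0)
s = rng.choice([-1.0, 1.0], size=N)   # unit-power BPSK symbols
noise = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)
X = np.outer(a, s) + 0.316 * noise    # x_k = a(theta) s_k + n_k, SNR ~ 10 dB

assert X.shape == (M, N)
```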

Besides, we also assume that each element of $\mathbf{n}_k$ follows a symmetric $\alpha$-stable (S$\alpha$S) distribution described by the following characteristic function

$\varphi(\omega) = \exp\left(j\delta\omega - \gamma|\omega|^\alpha\right)$, (E-4)

where $\alpha$ is the characteristic exponent restricted to the values $0 < \alpha \leq 2$, $\delta$ ($-\infty < \delta < \infty$) is the location parameter, and $\gamma$ ($\gamma > 0$) is the dispersion of the distribution. The value of $\alpha$ is related to the degree of impulsiveness of the distribution. Smaller values of $\alpha$ correspond to heavier-tailed distributions and hence to more impulsive behavior, while as $\alpha$ increases, the tails are lighter and the behavior is less impulsive. The special case of $\alpha = 2$ corresponds to the Gaussian distribution ($N(\delta, 2\gamma)$), while $\alpha = 1$ corresponds to the Cauchy distribution.
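S$\alpha$S noise with this characteristic function can be sampled with the Chambers-Mallows-Stuck method (a hedged sketch; `sas_samples` is an illustrative helper, not code from this work, and uses the symmetric case delta = 0):

```python
import numpy as np

def sas_samples(alpha, gamma, size, rng):
    """Symmetric alpha-stable samples via the Chambers-Mallows-Stuck method
    (location delta = 0; gamma is the dispersion, so the scale is gamma**(1/alpha))."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)
    w = rng.exponential(1.0, size)
    if np.isclose(alpha, 1.0):           # Cauchy case
        x = np.tan(u)
    else:
        x = (np.sin(alpha * u) / np.cos(u) ** (1 / alpha)
             * (np.cos(u - alpha * u) / w) ** ((1 - alpha) / alpha))
    return gamma ** (1 / alpha) * x

rng = np.random.default_rng(1)
g = sas_samples(2.0, 0.5, 100_000, rng)  # alpha = 2: Gaussian with variance 2*gamma = 1
c = sas_samples(1.0, 1.0, 100_000, rng)  # alpha = 1: standard Cauchy (heavy tails)

assert abs(np.var(g) - 1.0) < 0.05       # matches N(0, 2*gamma)
```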

E.2.2 Minimum Variance Beamforming

Since the look-direction frequency response is fixed by the constraint, minimization of the non-look-direction noise energy is the same as minimization of the total output energy. The energy of the beamformer output $y_k$ is minimized subject to the constraint of a distortionless response in the direction of the desired signal, as given by

$\min_{\mathbf{w}} E\left[|y_k|^2\right] \quad \text{subject to} \quad \mathbf{w}^H \mathbf{a}(\theta) = 1$. (E-5)

The constraint $\mathbf{w}^H \mathbf{a}(\theta) = 1$ prevents the gain in the direction of the signal from being reduced. This is commonly referred to as Capon's method [71],[72]. Equation (E-5) has an analytical solution given by

$\mathbf{w}_{capon} = \frac{\mathbf{R}_x^{-1} \mathbf{a}(\theta)}{\mathbf{a}(\theta)^H \mathbf{R}_x^{-1} \mathbf{a}(\theta)}$, (E-6)

where $\mathbf{R}_x$ denotes the covariance matrix of the array output vector. In practical applications, $\mathbf{R}_x$ is replaced by the sample covariance matrix $\hat{\mathbf{R}}_x$, where

$\hat{\mathbf{R}}_x = \frac{1}{N}\sum_{k=1}^{N} \mathbf{x}_k \mathbf{x}_k^H$, (E-7)

with N denoting the number of snapshots. Substituting $\mathbf{w}_{capon}$ into equation (E-3), the constrained least-squares estimate of the look-direction output is

$y_{capon,k} = \mathbf{w}_{capon}^H \mathbf{x}_k = \frac{\mathbf{a}(\theta)^H \hat{\mathbf{R}}_x^{-1} \mathbf{x}_k}{\mathbf{a}(\theta)^H \hat{\mathbf{R}}_x^{-1} \mathbf{a}(\theta)}$. (E-8)
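A minimal sketch of the Capon solution (E-6)-(E-8) in NumPy (variable names and the toy scenario are illustrative):

```python
import numpy as np

def capon_weights(X, a):
    """Capon weights (E-6) using the sample covariance (E-7):
    w = R^{-1} a(theta) / (a(theta)^H R^{-1} a(theta))."""
    N = X.shape[1]
    R = X @ X.conj().T / N
    Ra = np.linalg.solve(R, a)          # R^{-1} a without forming the inverse
    return Ra / (a.conj() @ Ra)

# Toy check: one unit-power BPSK source from the look direction plus white noise.
rng = np.random.default_rng(0)
M, N = 25, 1000
m = np.arange(M)
a = np.exp(1j * np.pi * m * np.cos(np.deg2rad(45.0)))  # half-wavelength spacing
s = rng.choice([-1.0, 1.0], size=N)
X = np.outer(a, s) + 0.3 * (rng.standard_normal((M, N))
                            + 1j * rng.standard_normal((M, N)))
w = capon_weights(X, a)

# The distortionless constraint w^H a(theta) = 1 holds by construction.
assert np.isclose(w.conj() @ a, 1.0)
```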

E.2.3 Kernel-Based Beamforming

The basic idea of kernel algorithms is to transform the data $\mathbf{x}_i$ from the input space to a high-dimensional feature space of vectors $\Phi(\mathbf{x}_i)$, where the inner products can be computed using a positive definite kernel function satisfying Mercer's conditions [23]: $K(\mathbf{x}_i, \mathbf{x}_j) = \langle \Phi(\mathbf{x}_i), \Phi(\mathbf{x}_j) \rangle$. This simple and elegant idea allows us to obtain nonlinear versions of any linear algorithm expressed in terms of inner products, without even knowing the exact mapping $\Phi$.

Using the constrained least-squares approach explained in the previous section, it can easily be shown that the equivalent solution $\mathbf{w}_{kernel}$ in the feature space is given by

$\mathbf{w}_{kernel} = \frac{\mathbf{R}_{\Phi(x)}^{-1} \Phi(\mathbf{a}(\theta))}{\Phi(\mathbf{a}(\theta))^H \mathbf{R}_{\Phi(x)}^{-1} \Phi(\mathbf{a}(\theta))}$, (E-9)

where $\mathbf{R}_{\Phi(x)}$ is the correlation matrix in the feature space. The estimated correlation matrix is given by

$\hat{\mathbf{R}}_{\Phi(x)} = \frac{1}{N} \mathbf{X}_\Phi \mathbf{X}_\Phi^H$, (E-10)

assuming the sample mean has already been removed from each sample (centering), where $\mathbf{X}_\Phi = [\Phi(\mathbf{x}_1), \Phi(\mathbf{x}_2), \ldots, \Phi(\mathbf{x}_N)]$ is a full-rank matrix whose columns are the mapped input reference data in the feature space. The output is given by

$y_{kernel,k} = \mathbf{w}_{kernel}^H \Phi(\mathbf{x}_k) = \frac{\Phi(\mathbf{a}(\theta))^H \hat{\mathbf{R}}_{\Phi(x)}^{-1} \Phi(\mathbf{x}_k)}{\Phi(\mathbf{a}(\theta))^H \hat{\mathbf{R}}_{\Phi(x)}^{-1} \Phi(\mathbf{a}(\theta))}$. (E-11)

Due to the high dimensionality of the feature space, equation (E-11) cannot be directly implemented in the feature space. It needs to be converted in terms of the kernel functions by the eigenvector decomposition procedure of kernel PCA [26]. The kernelized version of the beamformer output is given by

$y_{kernel,k} = \frac{\mathbf{K}_{a(\theta)}^H \mathbf{K}^{-1} \mathbf{K}_{x_k}}{\mathbf{K}_{a(\theta)}^H \mathbf{K}^{-1} \mathbf{K}_{a(\theta)}}$, (E-12)

where

$\mathbf{K}_{a(\theta)} = \Phi(\mathbf{a}(\theta))^T \mathbf{X}_\Phi = \left[K(\mathbf{a}(\theta), \mathbf{x}_1), K(\mathbf{a}(\theta), \mathbf{x}_2), \ldots, K(\mathbf{a}(\theta), \mathbf{x}_N)\right]^T$, (E-13)

$\mathbf{K}_{x_k} = \Phi(\mathbf{x}_k)^T \mathbf{X}_\Phi = \left[K(\mathbf{x}_k, \mathbf{x}_1), K(\mathbf{x}_k, \mathbf{x}_2), \ldots, K(\mathbf{x}_k, \mathbf{x}_N)\right]^T$, (E-14)

and $\mathbf{K} = \mathbf{X}_\Phi^H \mathbf{X}_\Phi$ is an N x N Gram matrix whose entries are the dot products $K(\mathbf{x}_i, \mathbf{x}_j)$.
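The kernelized output (E-12) can be sketched directly from Gram matrices (illustrative code with a Gaussian kernel; the small ridge term `reg` is an assumption of this sketch, anticipating the regularization issue discussed later in this appendix):

```python
import numpy as np

def gauss_gram(A, B, sigma):
    """Gram matrix of the Gaussian kernel between the columns of A and B."""
    d2 = np.sum(np.abs(A[:, :, None] - B[:, None, :]) ** 2, axis=0)
    return np.exp(-d2 / (2 * sigma**2))

def kernel_beamformer_output(X, a, x_test, sigma=1.0, reg=1e-6):
    """Kernelized output (E-12): y = K_a^H K^{-1} K_x / (K_a^H K^{-1} K_a)."""
    N = X.shape[1]
    K = gauss_gram(X, X, sigma) + reg * np.eye(N)       # N x N Gram matrix
    Ka = gauss_gram(X, a[:, None], sigma)[:, 0]         # K(a(theta), x_i)
    Kx = gauss_gram(X, x_test[:, None], sigma)[:, 0]    # K(x_test, x_i)
    KinvKa = np.linalg.solve(K, Ka)
    return (KinvKa @ Kx) / (KinvKa @ Ka)

rng = np.random.default_rng(0)
M, N = 8, 50
X = rng.standard_normal((M, N))       # real-valued snapshots for illustration
a = rng.standard_normal(M)
# Testing on the steering vector itself gives K_x = K_a, so the output is 1.
assert np.isclose(kernel_beamformer_output(X, a, a), 1.0)
```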

E.3 Nonlinear Beamformer using Correntropy

The correntropy beamformer is formulated in the RKHS induced by correntropy, and the solution is obtained by solving the constrained optimization problem of minimizing the average correntropy output energy. We denote the transformed received data matrix and filter vector, whose sizes are M x N and M x 1 respectively, by

$\mathbf{F}_x = [\mathbf{f}_{x_1}, \mathbf{f}_{x_2}, \ldots, \mathbf{f}_{x_N}]$, (E-15)

$\mathbf{f}_w = [f(w(1)), f(w(2)), \ldots, f(w(M))]^H$, (E-16)

where

$\mathbf{f}_{x_k} = [f(x_k(1)), f(x_k(2)), \ldots, f(x_k(M))]^H$ (E-17)

for k = 1, 2, ..., N. Given data samples, the cross-correntropy between the received signal at the kth snapshot and the filter can be estimated as

$V_{wk}[m] = \frac{1}{M} \sum_{n} f(w(n)) f(x_k(n - m))$, (E-18)

for all the lags m = -M + 1, ..., M - 1.

The correntropy energy of the kth received signal output is given by

$E_k = \mathbf{V}_{wk}^T \mathbf{V}_{wk} = \mathbf{f}_w^H \mathbf{V}_{x_k} \mathbf{f}_w$, (E-19)

and the M x M correntropy matrix $\mathbf{V}_{x_k}$ is

$\mathbf{V}_{x_k} = \begin{bmatrix} v_k(0) & v_k(1) & \cdots & v_k(M-1) \\ v_k(1) & v_k(0) & \cdots & v_k(M-2) \\ \vdots & \vdots & \ddots & \vdots \\ v_k(M-1) & v_k(M-2) & \cdots & v_k(0) \end{bmatrix}$, (E-20)

where each element of the matrix is computed without explicit knowledge of the mapping function f by

$v_k(l) = \frac{1}{M} \sum_{n=1}^{M-l} \kappa\left(x_k(n) - x_k(n + l)\right)$, (E-21)

for l = 0, ..., M - 1.

The average correntropy energy over all the received data can be written as

$E_{avg} = \frac{1}{N} \sum_{k=1}^{N} E_k = \mathbf{f}_w^H \mathbf{V}_x \mathbf{f}_w$, (E-22)

where $\mathbf{V}_x = \frac{1}{N} \sum_{k=1}^{N} \mathbf{V}_{x_k}$. Since our objective is to minimize the average correntropy energy in the linear feature space, we can formulate the optimization problem as

$\min_{\mathbf{f}_w} \mathbf{f}_w^H \mathbf{V}_x \mathbf{f}_w \quad \text{subject to} \quad \mathbf{f}_w^H \mathbf{f}_{a(\theta)} = 1$, (E-23)

where $\mathbf{f}_{a(\theta)}$ of size M x 1 is the transformed vector of the steering vector. Then the solution in the feature space becomes

$\mathbf{f}_w = \frac{\mathbf{V}_x^{-1} \mathbf{f}_{a(\theta)}}{\mathbf{f}_{a(\theta)}^H \mathbf{V}_x^{-1} \mathbf{f}_{a(\theta)}}$. (E-24)

Then the output is given by

$y_{correntropy,k} = \frac{\mathbf{f}_{a(\theta)}^H \mathbf{V}_x^{-1} \mathbf{f}_{x_k}}{\mathbf{f}_{a(\theta)}^H \mathbf{V}_x^{-1} \mathbf{f}_{a(\theta)}} = \frac{T_{ax_k}}{T_a}$, (E-25)

where

$T_a = \sum_{i=1}^{M} \sum_{j=1}^{M} v^{ij} f(a(j)) f(a(i)) \approx \sum_{i=1}^{M} \sum_{j=1}^{M} v^{ij} \kappa\left(a(j) - a(i)\right)$, (E-26)

$T_{ax_k} = \sum_{i=1}^{M} \sum_{j=1}^{M} v^{ij} f(x_k(j)) f(a(i)) \approx \sum_{i=1}^{M} \sum_{j=1}^{M} v^{ij} \kappa\left(x_k(j) - a(i)\right)$, (E-27)

where $v^{ij}$ is the (i, j)th element of $\mathbf{V}_x^{-1}$, $x_k(i)$ is the ith element of the received signal at the kth snapshot, and $a(i)$ is the ith element of the steering vector.

The final output expressions in (E-26) and (E-27) are obtained by approximating $f(a(j))f(a(i))$ and $f(x_k(j))f(a(i))$ by $\kappa(a(j) - a(i))$ and $\kappa(x_k(j) - a(i))$, respectively, which is similar to the kernel trick and holds on average because of Property 5.
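The construction of $\mathbf{V}_x$ and the output $T_{ax_k}/T_a$ can be sketched as follows (illustrative NumPy code assuming a Gaussian kernel $\kappa_\sigma$ and real-valued data; the helper names and the toy scenario are not from this work):

```python
import numpy as np

def correntropy_matrix(X, sigma):
    """Average M x M correntropy matrix V_x of (E-20)-(E-22)."""
    M, N = X.shape
    kappa = lambda u: np.exp(-u**2 / (2 * sigma**2))
    idx = np.abs(np.arange(M)[:, None] - np.arange(M)[None, :])  # Toeplitz indices
    V = np.zeros((M, M))
    for k in range(N):
        x = X[:, k]
        # v_k(l), l = 0..M-1, estimated over the available lags (E-21)
        v = np.array([kappa(x[:M - l] - x[l:]).sum() / M for l in range(M)])
        V += v[idx]
    return V / N

def correntropy_output(X, a, x_test, sigma=1.0):
    """Output (E-25): y = T_ax / T_a, with v^{ij} the entries of V_x^{-1}."""
    kappa = lambda u: np.exp(-u**2 / (2 * sigma**2))
    Vinv = np.linalg.inv(correntropy_matrix(X, sigma))
    T_a = np.sum(Vinv * kappa(a[None, :] - a[:, None]))        # (E-26)
    T_ax = np.sum(Vinv * kappa(x_test[None, :] - a[:, None]))  # (E-27)
    return T_ax / T_a

rng = np.random.default_rng(0)
M, N = 16, 200
a = np.cos(np.pi * np.arange(M) * np.cos(np.deg2rad(45.0)))  # real "steering" pattern
X = a[:, None] * rng.choice([-1.0, 1.0], size=N) + 0.3 * rng.standard_normal((M, N))
# Testing on the steering vector itself gives T_ax = T_a, so the output is 1.
assert np.isclose(correntropy_output(X, a, a), 1.0)
```

Note that only the M x M matrix $\mathbf{V}_x$ is inverted, in contrast with the N x N Gram matrix of the kernel method; this is the computational advantage discussed in Section E.5.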

E.4 Simulations

In this simulation, we present comparison results among the Capon, kernel, and correntropy beamformers in wireless communications with multiple receiving antennas. In all of the experiments, we assume a uniform linear array with M = 25 sensor elements and half-wavelength array spacing. Note that as the number of elements increases, the side lobes become smaller. Also, as the total width of the array increases, the central beam becomes narrower. For the source scenario, we assume that narrow-band signals arrive from the far field and the target of interest is located at angle $\theta = 45^\circ$. We use BPSK (binary phase shift keying) signaling, which has unity power and is uncorrelated. In order to make the result independent of the input and noise, we perform Monte Carlo simulations with 100 different inputs and noise realizations.









In the first experiment, we investigate the effect of the number of snapshots (N) in a spatially white Gaussian noise case. Figure E-1 shows the beampatterns of the Capon, kernel, and correntropy beamformers with N = 100 and 1000 for the case where the signal-to-noise ratio (SNR) is 10 dB. The Capon beamformer shows poor performance, i.e., higher side lobes, for small N, while the kernel and correntropy beamformers show a good beampattern even with a small N. It is well known that one of the problems of the standard Capon beamformer is its poor performance with a small amount of training data. In Figure E-2, we show the BER performance with N = 100 and 1000 for SNRs between 5 and 15 dB. Figure E-2(a) shows that for N = 100 the Capon beamformer exhibits a high BER floor, but the proposed beamformer has much better BER performance than the Capon and kernel beamformers. For N = 1000 in Figure E-2(b), when the SNR is below 9 dB the Capon beamformer shows better BER performance than the other two methods, but as the SNR increases, the BER of the correntropy beamformer becomes the best.

Next, we test the robustness of the Capon, kernel, and correntropy beamformers to impulsive noise with N = 1000. We select $\gamma$ such that the SNR is 10 dB for $\alpha$-stable noise with $\alpha = 2$ and $\delta = 0$ (Gaussian noise). Figure E-3 shows the BER performance of the three beamformers at different $\alpha$ levels. The correntropy beamformer displays superior performance for decreasing $\alpha$, that is, for increasing strength of impulsiveness. From this result, we can say that the proposed method is robust in terms of BER in impulsive noise environments for wireless communications.

Figures E-4(a) and (b) show the beampatterns of the three beamformers at $\alpha = 1.5$ and $\alpha = 1.0$, respectively. When $\alpha = 1.5$ in Figure E-4(a), the beampattern of the Capon beamformer is similar to that of the kernel method, and the gain of its side lobe is higher than that of the correntropy beamformer by 2 dB. As $\alpha$ decreases, the gap in side-lobe gain between the Capon and correntropy beamformers increases, as shown in Figure E-4(b).









One interesting result for the kernel method is that its BER performance is poor in both the Gaussian and impulsive noise cases even though it shows a nice beampattern. The output values of the kernel method are far from the original transmitted signals; therefore, it results in poor BER performance. The kernel method shown in this dissertation uses a constraint; however, the solution of the optimization problem exists in an infinite-dimensional feature space, so additional regularization may be needed for the output to be close to the original signal. One important difference compared with existing conventional kernel methods, which normally yield an infinite-dimensional feature space, is that the RKHS induced by correntropy (we call it VRKHS) has the same dimension as the input space. In the beamforming problem, the weight vector $\mathbf{w}$ has M degrees of freedom and all the received data are in the M-dimensional Euclidean space. As derived above, all the transformed data belong to a different M-dimensional vector space equipped with the inner product structure defined by correntropy. The goal of the proposed beamformer is to find a template $\mathbf{f}_w$ in this VRKHS such that the cost function is minimized subject to the constraint. Therefore, the number of degrees of freedom of this optimization problem is still M, so regularization, which would be needed in traditional kernel methods, is not necessary here. Further work needs to be done regarding this point.

E.5 Conclusions

In this research, we have presented a correntropy-based nonlinear beamformer and compared it with the Capon beamformer, which is a widely used linear technique, and the kernel-based beamformer, which is a nonlinear beamformer. Simulation results in BPSK wireless communications show that the correntropy beamformer significantly outperforms the Capon and kernel beamformers in terms of side-lobe suppression in beam shaping and a reduced bit error rate. Also, the correntropy beamformer has a clear advantage over the Capon beamformer in those cases where small data sets are available for training and where non-Gaussian noise is present. Compared to the kernel beamformer, the correntropy beamformer is computationally much simpler. In terms of computational complexity, the kernel method needs to compute the inverse of an N x N Gram matrix, whereas the correntropy beamformer needs the inverse of an M x M correntropy matrix, where M < N (in this simulation, M = 25 and N = 1000). In addition, we hypothesize that in our methodology, regularization is automatically achieved by the kernel through the expected value operator (which corresponds to a density matching step utilized to evaluate correntropy).

































Figure E-1. Comparisons of the beampattern for three beamformers in Gaussian noise with 10 dB of SNR. (a) N = 100. (b) N = 1000.



















Figure E-2. Comparisons of BER for three beamformers in Gaussian noise with different SNRs. (a) N = 100. (b) N = 1000.


































Figure E-3. Comparisons of BER for three beamformers with different characteristic exponent $\alpha$ levels.























Figure E-4. Comparisons of the beampattern for three beamformers in non-Gaussian noise. (a) $\alpha = 1.5$. (b) $\alpha = 1.0$.









LIST OF REFERENCES


[1] B. V. Kumar, "Tutorial survey of composite filter designs for optical correlators," Appl. Opt., vol. 31, pp. 4773-4801, 1992.

[2] A. Mahalanobis, A. Forman, M. Bower, R. Cherry, and N. Day, "Multi-class SAR ATR using shift invariant correlation filters," special issue of Pattern Recognition on correlation filters and neural networks, vol. 27, pp. 619-626, 1994.

[3] A. Mahalanobis, B. Vijaya Kumar, D. W. Carlson, and S. Sims, "Performance evaluation of distance classifier correlation filters," in Proc. SPIE, 1994, vol. 2238, pp. 2-13.

[4] R. Shenoy and D. Casasent, "Correlation filters that generalize well," in Proc. SPIE, March 1998, pp. 100-110.

[5] M. Savvides, B. V. Kumar, and P. Khosla, "Face verification using correlation filters," in Proc. Third IEEE Automatic Identification Advanced Technologies, Tarrytown, NY, 2002, pp. 56-61.

[6] B. V. Kumar, M. Savvides, C. Xie, and K. Venkataramani, "Biometric verification with correlation filters," Applied Optics, vol. 43, no. 2, pp. 391-402, Jan 2004.

[7] B. V. K. V. Kumar, A. Mahalanobis, and R. Juday, Correlation Pattern Recognition, Cambridge University Press, 2005.

[8] G. Turin, "An introduction to matched filters," IEEE Trans. Information Theory, vol. 6, pp. 311-329, 1960.

[9] S. M. Kay, Fundamentals of Statistical Signal Processing, Volume II: Detection Theory, Prentice-Hall, 1998.

[10] A. VanderLugt, "Signal detection by complex spatial filtering," IEEE Trans. Information Theory, vol. 10, pp. 139-145, 1964.

[11] C. Hester and D. Casasent, "Multivariant technique for multiclass pattern recognition," Appl. Opt., vol. 19, pp. 1758-1761, 1980.

[12] B. V. Kumar, "Minimum variance synthetic discriminant functions," J. Opt. Soc. Am. A, vol. 3, no. 10, pp. 1579-1584, 1986.

[13] A. Mahalanobis, B. V. Kumar, and D. Casasent, "Minimum average correlation energy filters," Appl. Opt., vol. 26, no. 17, pp. 3633-3640, 1987.

[14] D. Casasent and G. Ravichandran, "Advanced distortion-invariant minimum average correlation energy (MACE) filters," Appl. Opt., vol. 31, no. 8, pp. 1109-1116, 1992.

[15] G. Ravichandran and D. Casasent, "Minimum noise and correlation energy filters," Appl. Opt., vol. 31, no. 11, pp. 1823-1833, 1992.









[16] P. Refregier and J. Figue, "Optimal trade-off filters for pattern recognition and their comparison with the Wiener approach," Opt. Computer Process., vol. 1, pp. 3-10, 1991.

[17] A. Mahalanobis, B. V. Kumar, S. Song, S. Sims, and J. Epperson, "Unconstrained correlation filters," Appl. Opt., vol. 33, pp. 3751-3759, 1994.

[18] A. Mahalanobis, B. V. Kumar, and S. Sims, "Distance-classifier correlation filters for multiclass target recognition," Appl. Opt., vol. 35, no. 17, pp. 3127-3133, June 1996.

[19] M. Alkanhal and B. V. Kumar, "Polynomial distance classifier correlation filter for pattern recognition," Appl. Opt., vol. 42, no. 23, pp. 4688-4708, Aug. 2003.

[20] J. Fisher and J. Principe, "A nonlinear extension of the MACE filter," Neural
Networks, vol. 8, pp. 1131-1141, 1995.

[21] J. Fisher and J. Principe, "Formulation of the mace filter as a linear associative
memory," in Proc. Int. Conf. on Neural Networks, 1994, vol. 5.

[22] J. Fisher and J. Principe, "Recent advances to nonlinear MACE filters," Optical
Engineering, vol. 36, no. 10, pp. 2697-2709, Oct. 1998.

[23] B. Scholkopf and A. J. Smola, Learning with Kernels, The MIT Press, 2002.

[24] B. Scholkopf, A. J. Smola, and K. Muller, "Kernel principal component analysis,"
Neural Computation, vol. 10, pp. 1299-1319, 1998.

[25] A. Ruiz and E. Lopez-de Teruel, "Nonlinear kernel-based statistical pattern analysis," IEEE Trans. on Neural Networks, vol. 12, pp. 16-32, 2001.

[26] H. Kwon and N. M. Nasrabadi, "Kernel matched signal detectors for hyperspectral target detection," in Proc. Int. Conf. Acoustics, Speech, Signal Processing (ICASSP), 2005, vol. 4, pp. 665-668.

[27] K. Jeong, P. Pokharel, J. Xu, S. Han, and J. Principe, "Kernel synthetic discriminant function for object recognition," in Proc. Int. Conf. Acoustics, Speech, Signal Processing (ICASSP), France, May 2006, vol. 5, pp. 765-768.

[28] C. Xie, M. Savvides, and B. V. Kumar, "Kernel correlation filter based redundant class-dependence feature analysis (KCFA) on FRGC2.0 data," in Proc. Int. Workshop Analysis and Modeling of Faces and Gestures (AMFG), Beijing, 2005.

[29] I. Santamaria, P. Pokharel, and J. Principe, "Generalized correlation function: Definition, properties and application to blind equalization," IEEE Trans. Signal Processing, vol. 54, no. 6, pp. 2187-2197, June 2006.

[30] P. Pokharel, R. Agrawal, and J. Principe, "Correntropy based matched filtering," in Proc. IEEE Int. Workshop on Machine Learning for Signal Processing (MLSP), Sept. 2005, pp. 148-155.









[31] J. W. Fisher, Nonlinear Extensions to the Minimum Average Correlation Energy Filter, Ph.D. dissertation, University of Florida, Gainesville, FL, 1997.

[32] B. Boser, I. Guyon, and V. Vapnik, "A training algorithm for optimal margin
classifiers," in Proc. 5th COLT, 1992, pp. 144-152.

[33] V. Vapnik, The Nature of Statistical Learning Theory, Springer-Verlag, 1995.

[34] N. Cristianini and J. S. Taylor, An Introduction to Support Vector Machines,
Cambridge University Press, 2000.

[35] N. Aronszajn, "Theory of reproducing kernels," Trans. Amer. Math. Soc., vol. 68, pp.
337-404, 1950.

[36] E. Parzen, "On the estimation of a probability density function and mode," The Annals of Mathematical Statistics, 1962.

[37] J. Mercer, "Functions of positive and negative type, and their connection with the theory of integral equations," Philosophical Trans. of the Royal Society of London, vol. 209, pp. 415-446, 1909.

[38] "http://www.amp.ece.cmu.edu: Advanced Multimedia Processing Lab at Electrical and Computer Eng., CMU."

[39] E. Parzen, "Statistical methods on time series by Hilbert space methods," Tech. Rep. No. 23, Applied Mathematics and Statistics Laboratory, Stanford University, 1959.

[40] S. Sudharsanan, A. Mahalanobis, and M. Sundareshan, "Unified framework for the synthesis of synthetic discriminant functions with reduced noise variance and sharp correlation structure," Optical Engineering, 1990.

[41] J. C. Principe, D. Xu, and J. Fisher, "Information theoretic learning," in Unsupervised Adaptive Filtering, S. Haykin, Ed., pp. 265-319. John Wiley, 2000.

[42] W. Liu, P. P. Pokharel, and J. C. Principe, "Correntropy: properties and applications in non-Gaussian signal processing," in press, IEEE Trans. on Signal Processing.

[43] W. Liu, P. Pokharel, and J. Principe, "Correntropy: a localized similarity measure," in Proc. IEEE World Congress on Computational Intelligence (WCCI), Canada, July 2006, pp. 10018-10023.

[44] B. W. Silverman, Density Estimation for Statistics and Data Analysis, CRC Press, 1986.

[45] P. Pokharel, J. Xu, D. Erdogmus, and J. Principe, "A closed form solution for a nonlinear Wiener filter," in Proc. Int. Conf. Acoustics, Speech, Signal Processing (ICASSP), France, May 2006, vol. 3, pp. 720-723.









[46] K. Jeong and J. Principe, "The correntropy MACE filter for image recognition," in Proc. IEEE Int. Workshop on Machine Learning for Signal Processing (MLSP), Ireland, July 2006, pp. 9-14.

[47] L. Greengard and J. Strain, "The fast gauss transform," SIAM J. Sci. Statist.
Comput., vol. 12, no. 1, pp. 79-94, Jan. 1991.

[48] T. Gonzalez, "Clustering to minimize the maximum intercluster distance," Theoretical
Computer Science, vol. 38, pp. 293-306, 1985.

[49] T. Ross, S. Worrell, V. Velten, J. Mossing, and M. Bryant, "Standard SAR ATR
evaluation experiments using the MSTAR public release data set," in Proc. SPIE,
April 1998, vol. 3370, pp. 566-573.

[50] R. Gonzalez and R. Woods, Digital Image Processing, Second Edition, Prentice Hall,
2002.

[51] M. Turk and A. Pentland, "Eigenfaces for recognition," Journal of Cognitive
Neuroscience, vol. 3, no. 1, pp. 71-86, 1991.

[52] R. O. Duda, P. E. Hart, and D. G. Stork, Pattern Classification, Second Edition, John Wiley and Sons, 2001.

[53] D. Erdogmus, Y. Rao, H. Peddaneni, A. Hegde, and J. Principe, "Recursive principal components analysis using eigenvector matrix perturbation," EURASIP Journal on Applied Signal Processing, no. 13, pp. 2034-2041, Mar. 2004.

[54] K. Fukunaga, Introduction to Statistical Pattern Recognition, Second Edition, Academic Press Professional, 1990.

[55] S. Kaski, "Dimensionality reduction by random mapping: Fast similarity computation for clustering," in Proc. Int. Joint Conf. on Neural Networks (IJCNN), 1998, pp. 413-418.

[56] D. Fradkin and D. Madigan, "Experiments with random projections for machine learning," in Proc. Conference on Knowledge Discovery and Data Mining, 2003, pp. 517-522.

[57] E. Bingham and H. Mannila, "Random projection in dimensionality reduction: applications to image and text data," in Proc. Conference on Knowledge Discovery and Data Mining, 2001, pp. 245-250.

[58] D. Achlioptas, "Database-friendly random projections," in Symposium on Principles of Database Systems (PODS), 2001, pp. 274-281.

[59] S. Dasgupta, "Experiments with random projection," in Proc. Conference on Uncertainty in Artificial Intelligence, 2000.









[60] E. Candes, J. Romberg, and T. Tao, "Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information," IEEE Trans. on Information Theory, vol. 52, no. 2, pp. 489-509, 2006.

[61] S. Dasgupta and A. Gupta, "An elementary proof of the Johnson-Lindenstrauss lemma," Tech. Rep. TR-99-006, International Computer Science Institute, Berkeley, CA, 1999.

[62] G. H. Golub and C. F. Van Loan, Matrix Computations, North Oxford Academic, Oxford, UK, 1983.

[63] E. J. Candes and T. Tao, "Near-optimal signal recovery from random projections: Universal encoding strategies?," IEEE Trans. on Information Theory, vol. 52, no. 12, pp. 5406-5425, Dec. 2006.

[64] D. Achlioptas, "Database-friendly random projections: Johnson-Lindenstrauss with binary coins," Journal of Computer and System Sciences, vol. 66, pp. 671-687, 2003.

[65] R. Hecht-Nielsen, Context Vectors: General Purpose Approximate Meaning Representations Self-Organized from Raw Data, IEEE Press, 1994.

[66] D. Donoho, "Compressed sensing," IEEE Trans. on Information Theory, vol. 52, no. 4, pp. 1289-1306, 2006.

[67] D. Casasent and N. A., "Confuser rejection performance of EMACH filters for MSTAR ATR," in Proc. SPIE, April 2006, vol. 6245, pp. 62450D1-12.

[68] B. D. Van Veen and K. M. Buckley, "Beamforming: a versatile approach to spatial filtering," IEEE ASSP Magazine, vol. 5, no. 2, pp. 4-22, April 1988.

[69] H. Krim and M. Viberg, "Two decades of array signal processing research: the parametric approach," IEEE Signal Processing Magazine, vol. 13, no. 4, pp. 67-94, July 1996.

[70] M. S. Bartlett, "Smoothing periodograms from time series with continuous spectra," Nature, vol. 161, no. 4096, pp. 686-687, May 1948.

[71] J. Capon, "High-resolution frequency-wavenumber spectrum analysis," Proceedings of the IEEE, vol. 57, no. 8, pp. 1408-1418, August 1969.

[72] R. T. Lacoss, "Data adaptive spectral analysis methods," Geophysics, vol. 36, no. 4, pp. 134-148, August 1971.

[73] P. Stoica, Z. Wang, and J. Li, "Robust Capon beamforming," IEEE Signal Processing Letters, vol. 10, no. 6, pp. 172-175, June 2003.

[74] R. G. Lorenz and S. P. Boyd, "Robust minimum variance beamforming," IEEE Trans. Signal Processing, vol. 53, no. 5, pp. 1684-1696, May 2005.









[75] M. Shao and C. L. Nikias, "Signal processing with fractional lower order moments: Stable processes and their applications," Proceedings of the IEEE, vol. 81, no. 7, pp. 986-1010, July 1993.

[76] R. Adler, R. E. Feldman, and M. S. Taqqu, A Practical Guide to Heavy Tails: Statistical Techniques and Applications, Boston, MA: Birkhauser, 1998.

[77] P. Tsakalides, R. Raspanti, and C. L. Nikias, "Angle/Doppler estimation in heavy-tailed clutter backgrounds," IEEE Trans. Aerospace and Electronic Systems, vol. 35, no. 2, pp. 419-436, April 1999.

[78] T. Lo, H. Leung, and J. Litva, "Nonlinear beamforming," Electronics Letters, vol. 27, no. 4, pp. 350-352, February 1991.

[79] S. Chen, L. Hanzo, and A. Wolfgang, "Kernel-based nonlinear beamforming construction using orthogonal forward selection with the Fisher ratio class separability measure," IEEE Signal Processing Letters, vol. 11, no. 6, pp. 478-481, May 2004.

[80] M. Martinez-Ramon, J. L. Rojo-Alvarez, and G. Camps-Valls, "Kernel antenna array processing," IEEE Trans. Antennas and Propagation, vol. 55, no. 3, pp. 642-650, March 2007.

[81] K. H. Jeong, W. Liu, S. Han, and J. C. Principe, "The correntropy MACE filter," submitted to IEEE Trans. on Pattern Analysis and Machine Intelligence (PAMI).









BIOGRAPHICAL SKETCH

Kyu-Hwa Jeong was born in June 1972 in Korea and received the M.S. degree in electronics engineering from Yonsei University, Seoul, Korea, in 1997, where he focused on adaptive filter theory and its applications to acoustic echo cancellation. From 1997 to 2003, he was a senior research engineer with the Digital Media Research Lab at LG Electronics, Seoul, Korea, where he belonged to the optical storage group and mainly participated in CD/DVD recorder projects. Since 2003, he has been pursuing the Ph.D. degree with the Computational NeuroEngineering Lab in electrical and computer engineering, University of Florida, Gainesville, FL. His research interests are in the fields of signal processing, machine learning, and their applications to image pattern recognition.





PAGE 1

1

PAGE 2

2

PAGE 3

3

PAGE 4

Firstofall,Ithankmyadvisor,Dr.JoseC.Principe,forhisgreatinspiration,supportandguidanceovermygraduatestudies.IwasimpressedbyDr.Principe'sactivethoughtandappreciatedverymuchhissupervisionwhichgavemealotofopportunitiestoexploreonmyresearch.Iamalsogratefultoallthemembersofmyadvisorycommittee:Dr.JohnG.Harris,Dr.K.ClintSlattonandDr.MuraliRaofortheirvaluabletimeandinterestinservingonmysupervisorycommittee,aswellastheircomments,whichhelpedimprovethequalityofthisdissertation.IalsoexpressmyappreciationtoalltheCNELcolleagues,especiallyITLgroupmemberswhoareJianwuXu,PuskalPokharel,SeungjuHan,SudhirRao,AntonioPaiva,andWeifengLiufortheirhelp,collaborationandvaluablediscussionsduringmyPhDstudy.Finally,Iexpressmygreatloveformywife,Inyoungandourtwolovelysons,Hoyeon(Luke)andSeungyeon(Justin).IthankInyoungforherlove,caring,andpatience,whichmadethisstudypossible.AlsoIamgratefultomyparentsfortheirgreatsupportformylife. 4

PAGE 5

page ACKNOWLEDGMENTS ................................. 4 LISTOFTABLES ..................................... 8 LISTOFFIGURES .................................... 9 ABSTRACT ........................................ 11 CHAPTER 1INTRODUCTION .................................. 13 1.1Background ................................... 13 1.2Motivation .................................... 18 2FUNDAMENTALDISTORTIONINVARIANTLINEARCORRELATIONFILTERS ............................. 21 2.1Introduction ................................... 21 2.2SyntheticDiscriminantFunction(SDF) .................... 21 2.3MinimumAverageCorrelationEnergy(MACE)Filter ............ 23 2.4OptimalTrade-oSyntheticDiscriminant(OTSDF)Function ....... 25 3KERNEL-BASEDCORRELATIONFILTERS ................... 27 3.1BriefreviewonKernelMethod ........................ 27 3.1.1Introduction ............................... 27 3.1.2KernelMethod ............................. 27 3.2KernelSyntheticDiscriminantFunction ................... 30 3.3ApplicationoftheKernelSDFtoFaceRecognition ............. 31 3.3.1ProblemDescription .......................... 31 3.3.2SimulationResults ........................... 32 4ARKHSPERSPECTIVEOFTHEMACEFILTER ............... 36 4.1Introduction ................................... 36 4.2ReproducingKernelHilbertSpace(RKHS) .................. 36 4.3InterpretationoftheMACElterintheRKHS ............... 39 5NONLINEARVERSIONOFTHEMACEINANEWRKHS:THECORRENTROPYMACE(CMACE)FILTER ................ 41 5.1CorrentropyFunction .............................. 41 5.1.1Denition ................................ 41 5.1.2SomeProperties ............................. 42 5


  5.2 … ..... 46
  5.3 Implications of the CMACE Filter in the VRKHS ..... 50
    5.3.1 Implication of Nonlinearity ..... 50
    5.3.2 Finite Dimensional Feature Space ..... 51
    5.3.3 The Kernel Correlation Filter vs. the CMACE Filter: Prewhitening in Feature Space ..... 51

6 THE CORRENTROPY MACE IMPLEMENTATION ..... 53
  6.1 The Output of the CMACE Filter ..... 53
  6.2 Centering of the CMACE in Feature Space ..... 54
  6.3 The Fast CMACE Filter ..... 56
    6.3.1 The Fast Gauss Transform ..... 57
    6.3.2 The Fast Correntropy MACE Filter ..... 57

7 APPLICATIONS OF THE CMACE TO IMAGE RECOGNITION ..... 60
  7.1 Face Recognition ..... 60
    7.1.1 Problem Description ..... 60
    7.1.2 Simulation Results ..... 60
  7.2 Synthetic Aperture Radar (SAR) Image Recognition ..... 64
    7.2.1 Problem Description ..... 64
    7.2.2 Aspect Angle Distortion Case ..... 65
    7.2.3 Depression Angle Distortion Case ..... 71
    7.2.4 The Fast Correntropy MACE Results ..... 75
    7.2.5 The Effect of Additive Noise ..... 75

8 DIMENSIONALITY REDUCTION WITH RANDOM PROJECTIONS ..... 79
  8.1 Introduction ..... 79
  8.2 Motivation ..... 80
  8.3 PCA and SVD ..... 81
  8.4 Random Projections ..... 81
    8.4.1 Random Matrices ..... 83
    8.4.2 Orthogonality and Similarity Properties ..... 83
  8.5 Simulations ..... 85

9 CONCLUSIONS AND FUTURE WORK ..... 93
  9.1 Conclusions ..... 93
  9.2 Future Work ..... 95

APPENDIX

A CONSTRAINED OPTIMIZATION WITH LAGRANGE MULTIPLIERS ..... 98
B THE PROOF OF PROPERTY 5 OF CORRENTROPY ..... 100


C … ..... 101
D COMPUTATIONAL COMPLEXITY OF THE MACE AND CMACE ..... 102
E THE CORRENTROPY-BASED ROBUST NONLINEAR BEAMFORMER ..... 103
  E.1 Introduction ..... 103
  E.2 Standard Beamforming Problem ..... 104
    E.2.1 Problem ..... 104
    E.2.2 Minimum Variance Beamforming ..... 105
    E.2.3 Kernel-Based Beamforming ..... 106
  E.3 Nonlinear Beamformer Using Correntropy ..... 107
  E.4 Simulations ..... 109
  E.5 Conclusions ..... 111

LIST OF REFERENCES ..... 117
BIOGRAPHICAL SKETCH ..... 123


Table ..... page

6-1 Estimated computational complexity for training with N images and testing with one image. Matrix inversion and multiplication are considered ..... 59
7-1 Comparison of standard deviations of all the Monte Carlo simulation outputs ..... 63
7-2 Comparison of ROC areas with different kernel sizes ..... 64
7-3 Case A: Comparison of ROC areas with different kernel sizes ..... 72
7-4 Comparison of computation time and error for one test image between the direct method (CMACE) and the FGT method (fast CMACE) with p = 4 and kc = 4 ..... 75
7-5 Comparison of computation time and error for one test image in the FGT method with a different number of orders and clusters ..... 76
8-1 Comparison of the memory and computation time between the original CMACE and CMACE-RP ..... 92


Figure ..... page

1-1 Block diagram for pattern recognition ..... 14
1-2 Block diagram for the image recognition process using a correlation filter ..... 15
2-1 Example of the correlation output plane of the SDF ..... 22
2-2 Example of the correlation output plane of the MACE ..... 24
2-3 Example of the correlation output plane of the OTSDF ..... 25
3-1 The example of kernel method ..... 28
3-2 Sample images ..... 32
3-3 The output peak values when only 3 images are used for training ..... 33
3-4 The comparison of ROC curves with different numbers of training images ..... 34
3-5 The output values of noisy test input images with additive Gaussian noise when 25 images are used for training ..... 35
3-6 The ROC curves of noisy test input images with different SNRs when 10 images are used for training ..... 35
5-1 Contours of CIM(X,0) in 2-D sample space ..... 44
7-1 The averaged test output peak values ..... 61
7-2 The test output peak values with additive Gaussian noise ..... 61
7-3 The comparison of ROC curves with different SNRs ..... 62
7-4 The comparison of standard deviation of 100 Monte Carlo simulation outputs of each noisy false-class test image ..... 64
7-5 Case A: Sample SAR images (64 × 64 pixels) of two vehicle types for a target chip (BTR60) and a confuser (T62) ..... 66
7-6 Case A: Peak output responses of testing images for a target chip and a confuser ..... 67
7-7 Case A: ROC curves with different numbers of training images ..... 67
7-8 Case A: The MACE output plane vs. the CMACE output plane ..... 69
7-9 Sample images of BTR60 of size (64 × 64) pixels ..... 70
7-10 Case A: Output planes with shifted true-class input image ..... 70
7-11 The ROC comparison with different kernel sizes ..... 72


7-12 … ..... 73
7-13 Case B: Peak output responses of testing images for a target chip and a confuser ..... 74
7-14 Case B: ROC curves ..... 74
7-15 Comparison of ROC curves between the direct and the FGT method in case A ..... 76
7-16 Sample SAR images (64 × 64 pixels) of BTR60 ..... 77
7-17 ROC comparisons with noisy test images (SNR = 7 dB) in case A ..... 77
8-1 The comparison of ROC areas with different RP dimensionality ..... 85
8-2 The comparison of ROC areas with different RP dimensionality ..... 86
8-3 ROC comparison with different dimensionality reduction methods for MACE and CMACE ..... 87
8-4 ROC comparison with PCA for MACE and CMACE ..... 89
8-5 Sample images of size 16 × 16 after RP ..... 90
8-6 The cross-correlation vs. cross-correntropy ..... 91
8-7 Correlation output planes vs. correntropy output planes after dimension reduction with random projection ..... 92
E-1 Comparisons of the beampattern for three beamformers in Gaussian noise with 10 dB of SNR ..... 113
E-2 Comparisons of BER for three beamformers in Gaussian noise with different SNRs ..... 114
E-3 Comparisons of BER for three beamformers with different characteristic exponent levels ..... 115
E-4 Comparisons of the beampattern for three beamformers in non-Gaussian noise ..... 116


The major goal of my research was to develop nonlinear methods of the family of distortion invariant filters, specifically the minimum average correlation energy (MACE) filter. The MACE filter is a well known correlation filter for pattern recognition. My research investigated a closed-form solution of the nonlinear version of the MACE filter using the recently introduced correntropy function.

Correntropy is a positive definite function that generalizes the concept of correlation by utilizing higher-order moments of the signal statistics. Because of its positive definite nature, correntropy induces a new reproducing kernel Hilbert space (RKHS). Taking advantage of the linear structure of the RKHS, it is possible to formulate and solve the MACE filter equations in the RKHS induced by correntropy. Due to the nonlinear relation between the feature space and the input space, the correntropy MACE (CMACE) can potentially improve upon the MACE performance while preserving the shift-invariant property (additional computation for all shifts will be required in the CMACE).

To alleviate the computational complexity of the solution, my research also presents the fast CMACE using the Fast Gauss Transform (FGT). Both the MACE and CMACE are basically memory-based algorithms, and due to the high dimensionality of the image data, the computational cost of the CMACE filter is one of the critical issues in practical applications. Therefore, my research also used a dimensionality reduction method based on random projections (RP), which has emerged as a powerful method for dimensionality reduction in machine learning.


We applied the CMACE filter to face recognition using facial expression data and the MSTAR public release Synthetic Aperture Radar (SAR) data set, and experimental results show that the proposed CMACE filter indeed outperforms the traditional linear MACE and the kernelized MACE in both generalization and rejection abilities. In addition, simulation results in face recognition show that the CMACE filter with random projections (CMACE-RP) also outperforms the traditional linear MACE, with small degradation in performance but great savings in storage and computational complexity.


There are a lot of applications for object recognition. In automatic target recognition, the goal is to quickly and automatically detect and classify objects which may be present within large amounts of data (typically imagery) with a minimum of human intervention, such as vehicle vs. non-vehicle, tanks vs. trucks, or one type of tank vs. another type.

Another pattern recognition application, which is an emerging research field, is biometrics, such as face, iris, and fingerprint recognition for human identification and verification. Biometrics technology is rapidly being adopted in a wide variety of security applications such as computer and physical access control, electronic commerce, homeland security, and defense.

Figure 1-1 shows the block diagram for the common pattern recognition process. In the preprocessing block, denoising, normalization, edge detection, pose estimation, etc. are conducted for each application.

Feature extraction involves simplifying the amount of resources required to describe a large set of data accurately. When performing analysis of complex data, one of the major problems stems from the number of variables involved. Analysis with a large number


The goal of classification is to assign the features derived from the input pattern to one of the classes. There are a variety of classifiers, including statistical classifiers, artificial neural networks, support vector machines (SVM), and so on.

Another important pattern recognition methodology is to use the training data directly, instead of extracting some features and performing classification based on those features. While feature extraction works well in many pattern recognition applications, it is not always easy for humans to identify what the good features may be.

Correlation filters have been applied successfully to automatic target detection and recognition (ATR) [1] for SAR imagery [2], [3], [4] and to biometric identification such as face, iris, and fingerprint recognition [5], [6], by virtue of their shift-invariant property [7], which means that if the test image contains the reference object at a shifted location, the correlation output is also shifted by exactly the same amount. Due to this property, there is no need to conduct an additional process of centering the input image prior to recognizing it. Also, in some ATR applications, it is not only desirable to recognize various targets but also to locate them with some degree of accuracy, and the location can be easily found by searching for the peak of the correlation output. Another advantage of correlation filters is that they are linear, and therefore the solution can be computed analytically.

Figure 1-1. Block diagram for pattern recognition.


Figure 1-2 depicts the simple block diagram for the image recognition process using correlation filters. Object recognition can be performed by cross-correlating an input image with a synthesized template (filter); the correlation output is then searched for the peak, which is used to determine whether the object of interest is present or not.

It is well known that matched filters are the optimal linear filters for signal detection under linear channel and white noise conditions [8], [9]. Matched spatial filters (MSF) are optimal in the sense that they provide the maximum output signal-to-noise ratio (SNR) for the detection of a known image in the presence of white noise, under the reasonable assumption of Gaussian statistics [10]. However, the performance of the MSF is very sensitive to even small changes in the reference image, and the MSF cannot be used for multiclass pattern recognition since it is only optimum for a single image. Therefore, distortion invariant composite filters have been proposed in various papers [1].

Distortion invariant composite filters are a generalization of matched spatial filtering from the detection of a single object to the detection of a class of objects, usually in the image domain. Typically the object class is represented by a set of training exemplars. The exemplar images represent the image class through the entire range of distortions of a single object. The goal is to design a single filter which will recognize an object class through the entire range of distortion. Under the design criterion the filter is equally

Figure 1-2. Block diagram for the image recognition process using a correlation filter.


The most well known of such composite correlation filters belong to the synthetic discriminant function (SDF) class [11] and its variations. One of the appeals of the SDF class is that it can be computed analytically and effectively using frequency domain techniques. In the conventional SDF approach, the filter is matched to a composite template that is a linear combination of the training image vectors, such that the cross-correlation output at the origin has the same value for all training images. The hope is that this composite template will correlate equally well not only with the training images but also with distorted versions of the training images, as well as with test images in the same class. One of the problems with the original SDF filters is that, because only the origin of the correlation plane is constrained, it is quite possible that some other value in the correlation plane is higher than the value at the origin, even when the input is centered at the origin. Since the processing of the resulting correlation outputs is based on detecting peaks, we can expect a high probability of false peaks and resulting misclassifications in such situations.

The minimum variance SDF (MVSDF) filter was proposed in [12], taking into consideration additive input noise. The MVSDF minimizes the output variance due to zero-mean input noise while satisfying the same linear constraints as the SDF. One of the major difficulties with the MVSDF is that the noise covariance is not known exactly; even when known, an inversion is required, and it may be computationally demanding.

Another correlation filter that is widely used is the minimum average correlation energy (MACE) filter [13]. The MACE minimizes the average correlation energy of the output over the training images to produce a sharp correlation peak, subject to the same linear constraints as the MVSDF and SDF filters. In practice, the MACE filter performs better than the MVSDF with respect to out-of-class rejection. The MACE filter, however, has been shown to have poor generalization properties; that is, images in the recognition


By minimizing the average energy in the correlation plane, we hope to keep the sidelobes low while maintaining the origin values at prespecified levels. This is an indirect method to reduce the false peak or sidelobe problem. However, in their attempt to produce delta-function-type correlation outputs, MACE filters emphasize high frequencies and yield low correlation outputs with images not in the training set.

Therefore, some advanced MACE approaches, such as the Gaussian MACE (G-MACE) [14], the minimum noise and correlation energy (MINACE) filter [15], and optimal trade-off filters [16], have been proposed to combine the properties of various SDFs. In the G-MACE, the correlation outputs are made to approximate Gaussian-shaped functions. This represents a direct method to control the shape of the correlation outputs. The MINACE and G-MACE variations have improved generalization properties, with a slight degradation in the average output plane variance and in the sharpness of the central peak, respectively.

In most of the previous research on SDF-type filters, linear constraints are imposed on the training images to yield a known value at specific locations in the correlation plane. However, placing such constraints satisfies conditions only at isolated points in the image space and does not explicitly control the filter's ability to generalize over the entire domain of the training images.

A new correlation filter design based on relaxed constraints, called the maximum average correlation height (MACH) filter, was proposed in [17]. The MACH filter adopts a statistical approach in that it does not treat training images as deterministic


The concept of relaxing the correlation constraints and utilizing the entire correlation output for multi-class recognition was first explicitly addressed by the distance classifier correlation filter (DCCF) [18].

A nonlinear version of correlation filters called the polynomial distance classifier correlation filter (PDCCF) has been proposed in [19]. A nonlinear extension to the MACE filter using a neural network topology has also been proposed in [20]. Since the MACE filter is equivalent to a cascade of a linear pre-processor followed by a linear associative memory (LAM) [21], the LAM portion of the MACE can be replaced with a nonlinear associative memory structure, specifically a feed-forward multi-layer perceptron (MLP). It is well known that nonlinear associative memory structures can outperform their linear counterparts on the basis of generalization and dynamic range. However, in general, they are more difficult to design, as their parameters cannot be computed in closed form. Results have also shown that it is not enough to simply train an MLP using backpropagation; careful analysis of the final solution is necessary to confirm reasonable results. Experimental results in [22] showed better generalization and classification performance than the linear MACE on the MSTAR ATR data set (at an 80% probability of detection, the probability of false alarm dropped from 4.37% (MACE) to 2.45% for the nonlinear MACE).

Recently, kernel-based learning algorithms have been applied to classification and pattern recognition due to the fact that they easily produce nonlinear extensions to linear


algorithms [23]. By transforming the data into a high-dimensional reproducing kernel Hilbert space (RKHS) and constructing optimal linear algorithms in that space, kernel-based learning algorithms effectively perform optimal nonlinear pattern recognition in the input space to achieve better separation, estimation, regression, etc. The nonlinear versions of a number of signal processing techniques, such as principal component analysis (PCA) [24], Fisher discriminant analysis [25], and linear classifiers [25], have already been defined in a kernel space. Also, the kernel matched spatial filter (KMSF) has been proposed for hyperspectral target detection in [26], and the kernel SDF has been proposed for face recognition [27]. The kernel correlation filter (KCF), which is the kernelized MACE filter after prewhitening, has been proposed in [28] for face recognition. Similar to Fisher's idea in [20], in the KCF, prewhitening is performed in the input space with linear methods, and this may affect the overall performance. We will later present the difference between the KCF and the proposed method, where all the computation, including prewhitening, is conducted in the feature space.

More recently, a new generalized correlation function, called correntropy, has been introduced by our group [29]. Correntropy is a positive definite function which measures a generalized similarity between random variables (or stochastic processes), and it involves higher-order statistics of the input signals; therefore it is a promising candidate for machine learning and signal processing. Correntropy defines a new reproducing kernel Hilbert space (RKHS), which has the same dimensionality as the one defined by the covariance matrix in the input space, and this simplifies the formulation of analytic solutions in this finite-dimensional RKHS. Applications to the matched filter [30] and to chaotic time series prediction have been presented in the literature.

Based on the promising properties of correntropy and the MACE filter, the main goal of this research is to exploit the generalized nonlinear MACE filter for image recognition. As the first step, we applied the kernel method to the SDF and obtained the kernel SDF. Application of the kernel SDF to face recognition is presented in this research. As


The formulation exploits the linear structure of the RKHS induced by correntropy to formulate the correntropy MACE filter in the same manner as the original MACE, and it solves the problem with virtually the same equations (e.g., without regularization) in the RKHS. Due to the nonlinear relation between the input space and this feature space, the CMACE corresponds to a nonlinear filter in the input space. In addition, the CMACE preserves the shift-invariant property of the linear MACE; however, it requires an additional computation for each input image shift. In order to reduce the computational complexity of the CMACE, the fast CMACE filter based on the Fast Gauss Transform (FGT) is also proposed.

We also introduce the dimensionality reduction method based on random projections (RP) and apply RP to the CMACE in order to decrease the storage and meet more readily available computational resources, and we show that the RP method works well with the CMACE filter for image recognition.

We can say that the CMACE formulation for image recognition is one application of the general class of energy minimization problems. We can also apply the same methodology, which is minimizing the correntropy energy of the output, to the beamforming problem, whose conventional linear solution is obtained by minimizing the output power. Appendix E presents the new application of correntropy to the beamforming problem in wireless communications, with some preliminary results.


The SDF composite filter h is a linear combination of the training images,

    h = Σ_{i=1}^{N} a_i x_i,

where N is the number of training images and the coefficients a_i are chosen to satisfy the following constraints:

    x_j^T h = u_j,  j = 1, 2, ..., N,

where T denotes the transpose and u_j is a desired cross-correlation output peak value. In vector form, we define the training image data matrix X as

    X = [x_1, x_2, ..., x_N],

where the size of the matrix X is d × N. Then the SDF is the solution to the following optimization problem:

    min_h h^T h,  subject to X^T h = u.    (2-4)
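A small numerical sketch of the constrained minimum-norm problem in (2-4), with hypothetical sizes and synthetic data (only the linear algebra is the point here): the minimum-norm filter satisfying the constraints is the standard composite form h = X(X^T X)^{-1}u, a linear combination of the training images.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: N = 5 vectorized images of d = 1024 pixels,
# stacked as columns of the d x N data matrix X as in (2-3).
d, N = 1024, 5
X = rng.standard_normal((d, N))
u = np.ones(N)  # desired correlation value at the origin for every image

# Minimum-norm solution of (2-4): h = X (X^T X)^{-1} u.
h = X @ np.linalg.solve(X.T @ X, u)

# Every training image now correlates to exactly u_j at the origin.
print(np.allclose(X.T @ h, u))  # True
```

Because N ≪ d, the N × N system X^T X is tiny compared with the image dimension, which is what makes the SDF family cheap to synthesize.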


Figure 2-1. Example of the correlation output plane of the SDF [31].

It is assumed that N ≪ d, i.e., there are far fewer training images than pixels.

In the frequency domain, let the training data matrix be X = [X_1, X_2, ..., X_N], where X_i is the DFT of the i-th training image, the size of X is d × N, and N is the number of training images. Let the vector h be the filter in the space domain, and represent by H its Fourier transform vector. We are interested in the correlation between the input image and the filter. The correlation of the i-th image sequence x_i(n) with the filter sequence h(n) can be written as

    g_i(n) = Σ_m x_i(m + n) h(m).

By Parseval's theorem, the correlation energy of the i-th image can be written as a quadratic form,

    E_i = H^H D_i H,

where D_i is a diagonal matrix of size d × d whose diagonal elements are the magnitude squared of the associated elements of X_i, that is, the power spectrum of x_i(n), and the superscript H denotes the Hermitian transpose. The objective of the MACE filter is to minimize the average correlation energy over the image class while simultaneously satisfying an intensity constraint at the origin for each image. The value of the correlation at the origin can be written as

    g_i(0) = x_i^T h = c_i,

for i = 1, 2, ..., N training images, where c_i is the user-specified output correlation value at the origin for the i-th image. Then the average energy over all training images is expressed as

    E_avg = H^H D H,


Figure 2-2. Example of the correlation output plane of the MACE [31].

where

    D = (1/N) Σ_{i=1}^{N} D_i.

The MACE design problem is to minimize E_avg while satisfying the constraint X^H H = c, where c = [c_1, c_2, ..., c_N]^T is an N-dimensional vector. This optimization problem can be solved using Lagrange multipliers, and the solution is

    H = D^{-1} X (X^H D^{-1} X)^{-1} c.

It is clear that the spatial filter h can be obtained from H by an inverse DFT. Once h is determined, we apply an appropriate threshold to the output correlation plane and decide whether the test image belongs to the class of the template or not.

Figure 2-2 shows the general shape of the correlation output plane of the MACE [31]. It shows a sharp peak at the origin, and as a result the abilities of finding the location of the target and of discriminating between true-class and out-of-class images have been
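The closed-form MACE solution is easy to exercise numerically. The sketch below uses hypothetical 1-D "images" (again, only the linear algebra matters) and exploits the fact that D is diagonal, so it can be stored as a vector rather than a d × d matrix.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: N training images flattened to d-point signals.
d, N = 256, 4
x = rng.standard_normal((d, N))
X = np.fft.fft(x, axis=0)              # columns: DFTs of the training images

# D is diagonal (average power spectrum over the training set),
# so only its diagonal is kept.
Dvec = np.mean(np.abs(X) ** 2, axis=1)

# Closed-form MACE solution: H = D^{-1} X (X^H D^{-1} X)^{-1} c.
c = np.ones(N)
DinvX = X / Dvec[:, None]
H = DinvX @ np.linalg.solve(X.conj().T @ DinvX, c)

# The linear constraints hold exactly in the frequency domain.
print(np.allclose(X.conj().T @ H, c))  # True
```

Only an N × N system is inverted; the D^{-1} weighting is what whitens the average spectrum and sharpens the correlation peak relative to the plain SDF.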


Figure 2-3. Example of the correlation output plane of the OTSDF [31].

improved. However, a sharp output plane causes worse distortion tolerance and poor generalization.

The OTSDF filter in the frequency domain is given by

    H = T^{-1} X (X^H T^{-1} X)^{-1} c,

where T = αD + √(1 − α²) C.


An example of the OTSDF output plane is shown in Figure 2-3 [31]. As compared to the MACE filter response, the output peak is not nearly as sharp, but it is still more localized than in the SDF case.


3.1.1 Introduction

The kernel method was popularized by the support vector machine (SVM) [32], and there is now an extensive literature on SVM [33], [34] and on the family of kernel-based algorithms [23].

A kernel-based algorithm is a nonlinear version of a linear algorithm where the data has been previously (and most often nonlinearly) transformed to a higher dimensional space, in which we only need to be able to compute inner products (via a kernel function).

It is clear that many problems arising in signal processing are of a statistical nature and require automatic data analysis methods. Furthermore, the algorithms used in signal processing are usually linear, and their transformation for nonlinear processing is sometimes unclear. Signal processing practitioners can benefit from a deeper understanding of kernel methods, because they provide a different way of taking nonlinearities into account without losing the original properties of the linear method. Another aspect is dealing with the amount of available data in a space of a given dimensionality; one needs methods that can use little data and avoid the curse of dimensionality.

Aronszajn [35] and Parzen [36] were some of the first to employ positive definite kernels in statistics. Later, based on statistical learning theory, the support vector machine [70] and other kernel-based learning algorithms [63], such as kernel principal component analysis [64], kernel Fisher discriminant analysis [58], and kernel independent component analysis [4], were introduced.


Figure 3-1. The example of kernel method (left: input space; right: feature space).

such as linear discrimination, principal component analysis, or least squares regression, make extensive use of the linear structure. Roughly speaking, kernels allow one to naturally derive nonlinear versions of linear algorithms through the implicit nonlinear mapping. The general idea is the following. Given a linear algorithm (i.e., an algorithm which works in a vector space), one first maps the data living in a space Ω (the input space) to a vector space H (the feature space) via a nonlinear mapping Φ(·): Ω → H, and then runs the algorithm on the vector representation Φ(x) of the data. In other words, one performs nonlinear analysis of the data using a linear method. The purpose of the map Φ(·) is to translate nonlinear structures of the data into linear ones in H.

Consider the following discrimination problem (see Figure 3-1), where the goal is to separate two sets of points. In the input space the problem is nonlinear, but after applying the transformation Φ(x_1, x_2) = (x_1², √2 x_1 x_2, x_2²) the two sets become linearly separable in the feature space.

Working directly with the data in the feature space may be difficult because the space can be infinite dimensional, or the transformation only implicitly defined. The basic idea of kernel algorithms is to transform the data x from the input space to a high dimensional feature space of vectors Φ(x), where the inner products can be computed using a positive
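The separability example can be checked numerically. For the quadratic map Φ(x_1, x_2) = (x_1², √2 x_1 x_2, x_2²), the feature-space inner product reduces to a kernel evaluated entirely in the input space, κ(x, y) = (x^T y)², so Φ never has to be formed explicitly; this is the kernel trick in miniature (a toy sketch, not code from the dissertation).

```python
import numpy as np

def phi(x):
    # Explicit feature map for the 2-D example:
    # (x1, x2) -> (x1^2, sqrt(2) x1 x2, x2^2).
    x1, x2 = x
    return np.array([x1 ** 2, np.sqrt(2) * x1 * x2, x2 ** 2])

def k(x, y):
    # The same inner product computed directly in the input space.
    return float(np.dot(x, y)) ** 2

rng = np.random.default_rng(3)
x, y = rng.standard_normal(2), rng.standard_normal(2)

# <phi(x), phi(y)> equals k(x, y) for any pair of input points.
print(np.isclose(np.dot(phi(x), phi(y)), k(x, y)))  # True
```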


definite kernel function (Mercer kernel) [23]. The eigenfunctions of the kernel satisfy

    ∫ φ_k(x) φ_j(x) dx = δ_{kj},

where δ_{kj} is the Kronecker delta function, i.e., equal to 1 or 0 according as k = j or k ≠ j. Then

    κ(x, y) = Σ_{k=1}^{∞} λ_k φ_k(x) φ_k(y),    (3-4)

where the series above converges absolutely and uniformly on T × T [37].

This simple and elegant idea allows us to obtain nonlinear versions of any linear algorithm expressed in terms of inner products, without even knowing the exact mapping function. A particularly interesting characteristic of the feature space is that it is a reproducing kernel Hilbert space (RKHS); i.e., the span of the functions {κ(·, x) : x ∈ Ω} defines a unique functional Hilbert space [35]. The crucial property of these spaces is the reproducing property of the kernel,

    f(x) = ⟨f, κ(·, x)⟩,  f ∈ H.

In particular, we can define our nonlinear mapping from the input space to the RKHS as Φ(x) = κ(·, x); then we have

    ⟨Φ(x), Φ(y)⟩ = ⟨κ(·, x), κ(·, y)⟩ = κ(x, y).


In this research, we use the Gaussian kernel, which is the most widely used Mercer kernel,

    κ_σ(x, y) = (1/(√(2π) σ)) exp(−(x − y)²/(2σ²)),

where x_i is the i-th training image vector, given by x_i = [x_i(1), x_i(2), ..., x_i(d)]^T. Then we can extend the SDF optimization problem to the nonlinear feature space:

    min Φ(h)^T Φ(h),  subject to Φ(X)^T Φ(h) = u,

where the dimensions of the transformed Φ(X) and Φ(h) are ∞ × N and ∞ × 1, respectively, for the Gaussian kernel. Then the solution in kernel space becomes

    Φ(h) = Φ(X) (Φ(X)^T Φ(X))^{-1} u.

We denote K_XX = Φ(X)^T Φ(X), which is an N × N full rank matrix whose (i, j)-th element is given by

    (K_XX)_{ij} = Σ_{k=1}^{d} κ(x_i(k), x_j(k)),  i, j = 1, 2, ..., N.


Let Z be the matrix of vector images for testing, and let the number of testing images be L. We denote K_ZX = Φ(Z)^T Φ(X), which is an L × N matrix whose elements are given by

    (K_ZX)_{ij} = Σ_{k=1}^{d} κ(z_i(k), x_j(k)),  i = 1, 2, ..., L,  j = 1, 2, ..., N.

Then the L × 1 output vector of the kernel SDF is given by

    y = K_ZX (K_XX)^{-1} u.    (3-14)

We can compute K_XX off-line with the given training data. Then K_ZX and y can be computed on-line with a given test image. Given N training images and one test image, the computational complexities are O(dN²) and O(dN) + O(N³) during the off-line and on-line phases, respectively. In general, N ≪ d, so the dominant part of the computational complexity, the matrix inversion in (3-14), O(N³), is not a critical computational issue in the kernel SDF. Also, the required memory for K_XX, which is O(N²), is much less than in the case of depending on the number of image pixels, d.

By applying an appropriate threshold to the output in (3-14), we can detect and recognize the testing data without generating the composite filter in the feature space. In the object recognition and classification senses, the proposed kernel SDF is simpler than the kernel matched filter.

3.3.1 Problem Description

[38]. The database consists of 13 subjects, whose facial images were captured with 75 varying expressions. The size of each image is
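The off-line/on-line split described here can be sketched in a few lines. The data below are synthetic and the kernel size is arbitrary: K_XX is built once from the training images, and each test image only requires one K_ZX row block plus the small solve in (3-14). Test images that are small perturbations of training images should score near the constrained value of 1.

```python
import numpy as np

def kernel_sum(A, B, sigma=1.0):
    # (K)_ij = sum_k kappa_sigma(a_i(k) - b_j(k)): elementwise Gaussian
    # kernels summed over the d pixels, as in the kernel SDF.
    diff = A[:, None, :] - B[None, :, :]
    return (np.exp(-diff ** 2 / (2 * sigma ** 2))
            / (np.sqrt(2 * np.pi) * sigma)).sum(axis=2)

rng = np.random.default_rng(2)
d, N, L = 100, 6, 3
train = rng.standard_normal((N, d))                    # rows: training images
test = train[:L] + 0.01 * rng.standard_normal((L, d))  # perturbed true-class inputs

u = np.ones(N)
K_XX = kernel_sum(train, train)        # N x N, computed off-line
K_ZX = kernel_sum(test, train)         # L x N, computed per test image
y = K_ZX @ np.linalg.solve(K_XX, u)    # kernel SDF outputs, Eq. (3-14)

print(np.abs(y - 1).max())             # small: outputs stay near 1
```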


shown in Figure 3-2. In this research, we tested the proposed kernel SDF method with the original database images as well as with noisy images. Sample images with additive Gaussian noise at a 10 dB SNR are shown in Figure 3-2(c). In order to evaluate the performance of the SDF and kernel SDF filters on this data set, we examined 975 (13 × 75) correlation outputs. From these results and the ones reported in [5], we picked and report the results of the two most difficult cases, which produced the worst performance with the conventional SDF method. We test with all the images of each person's data set, resulting in 75 outputs for each class. The simulation results have been obtained by averaging (Monte Carlo approach) over 100 different training sets (each training set has been chosen randomly) to minimize the problem of performance differences due to splitting the relatively small database into training and testing sets. In this data set, it has been observed that a kernel size around 30%-50% of the standard deviation of the input data is appropriate.

Figure 3-2. Sample images: (a) Person A, (b) Person B, (c) Person A with additive Gaussian noise (SNR = 10 dB).


Figure 3-3. The output peak values when only 3 images are used for training (N = 3). (Top): SDF. (Bottom): Kernel SDF.

Figure 3-3 shows the average output peak values for image recognition when we use only N = 3 images for training. The desired output peak value should be close to one when the test image belongs to the training image class. Figure 3-3 (top) shows that the correlation output peak values of the conventional SDF in both the true and false classes not only overlap but are also close to one. As a result, the system will have great difficulty differentiating these two individuals, because they can be interpreted as belonging to the same class. Figure 3-3 (bottom) shows the output values of the kernel SDF, and we can see that the two images can be recognized well even with a small number of training images. Figure 3-4 shows the ROC curves with different numbers of training images (N). In the kernel SDF with N = 3, the probability of detection at zero false alarm rate is 1. However, the conventional SDF needs at least 25 images for training in order to have the same detection performance as the kernel SDF.


Figure 3-4. The comparison of ROC curves with different numbers of training images.

One of the major problems of the conventional SDF is that its performance can easily be degraded by additive noise in the test image, since the SDF does not have any special mechanism to account for input noise. Therefore, it has a poor ability to reject false-class images. Figure 3-5 (top) shows the noise effect on the conventional SDF. When the class images are seriously distorted by additive Gaussian noise with a very low SNR (−2 dB), the correlation output peaks of some test images become greater than 1, and hence wrong recognitions happen. The results in Figure 3-5 (bottom) are obtained with the kernel SDF. The kernel SDF shows a much better performance even in a very low SNR environment. The comparison of ROC curves between the kernel SDF and the conventional SDF in the case of noisy test input with different SNRs is shown in Figure 3-6. We can see that the kernel SDF outperforms the SDF and achieves robust pattern recognition performance in a very noisy environment.


Figure 3-5. The output values of noisy test input images with additive Gaussian noise when 25 images are used for training (N = 25). (Top): SDF; circle: true class with SNR = 10 dB, cross: false class with SNR = −2 dB, diamond: false class with no noise. (Bottom): Kernel SDF; circle: true class with SNR = 10 dB, cross: false class with SNR = −2 dB.

Figure 3-6. The ROC curves of noisy test input images with different SNRs when 10 images are used for training (N = 10).


Parzen analyzed the connection between the RKHS and second-order random (or stochastic) processes by using the isometric isomorphism [39]. A one-to-one mapping G from a Hilbert space H1 onto a Hilbert space H2 satisfying

    G(αf + βg) = αG(f) + βG(g),    (4-1)
    ⟨G(f), G(g)⟩_{H2} = ⟨f, g⟩_{H1},    (4-2)

for all f, g ∈ H1 and scalars α, β, is said to be an isometric isomorphism, or congruence. The congruence maps both linear combinations of functionals and limit points from H1 into corresponding linear combinations of functionals and limit points in H2.


Any positive definite bivariate function κ(x, y) is a reproducing kernel because of the following fundamental theorem: for every symmetric positive definite function κ on E × E there exists a unique Hilbert space H of functions on E such that

(i) for every x ∈ E, κ(x, ·) ∈ H; and    (4-4)
(ii) for every x ∈ E and f ∈ H, f(x) = ⟨f, κ(x, ·)⟩_H.    (4-5)

Then H := H(κ) is said to be a reproducing kernel Hilbert space with reproducing kernel κ. The properties (i) and (ii) are called the reproducing property of κ(x, y) in H(κ).

Parzen [39] analyzed the connection between RKHSs and orthonormal expansions for second-order stochastic processes, obtaining a general expression for the reproducing kernel inner product in terms of the eigenvalues and eigenfunctions of a certain operator defined on an appropriate Hilbert space. In addition, Parzen showed that there exists an isometric isomorphism between the Hilbert space spanned by the random variables of a stochastic process and the RKHS determined by its covariance function.

Given a zero-mean second-order random vector {x_i : i ∈ I}, with I being an index set, the covariance function is defined as

    R(i, j) = E[x_i x_j].    (4-6)

It is well known that the covariance function R is non-negative definite; therefore it determines a unique RKHS, H(R), according to the Moore-Aronszajn theorem. By the


Mercer theorem [35],

    R(i, j) = Σ_{k=1}^{∞} λ_k φ_k(i) φ_k(j),    (4-7)

where {λ_k, k = 1, 2, ...} and {φ_k(i), k = 1, 2, ...} are a sequence of non-negative eigenvalues and corresponding normalized eigenfunctions of R(i, j), respectively.

First, by (4-7), we have

    ⟨R(i, ·), R(i, ·)⟩_{H(R)} = Σ_k λ_k φ_k(i)² = R(i, i) < ∞.

Therefore, R(i, ·) ∈ H(R) for each i in I. Second, for every function f(·) ∈ H(R) of the form given by f(i) = Σ_{k=0}^{∞} λ_k a_k φ_k(i) and every i in I,

    ⟨f, R(i, ·)⟩_{H(R)} = Σ_k λ_k a_k φ_k(i) = f(i).

By the Moore-Aronszajn theorem, H(R) is a reproducing kernel Hilbert space with R(i, j) as the reproducing kernel. It follows that

    ⟨R(i, ·), R(j, ·)⟩_{H(R)} = R(i, j).

Thus H(R) is a representation of the random vector {x_i : i ∈ I} with covariance function R(i, j).

One may define a congruence G from H(R) onto the linear space L2(x_i, i ∈ I) such that

    G(R(i, ·)) = x_i.

The congruence G can be explicitly represented as
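For a finite index set I, these relations are easy to verify numerically: R is simply a symmetric positive definite matrix, the eigenfunctions come from its eigendecomposition as in (4-7), and the reproducing property holds under the eigenvalue-weighted inner product. The sketch below uses an arbitrary 5 × 5 covariance (a toy check, not code from the dissertation).

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical covariance matrix of a 5-dimensional zero-mean random vector.
A = rng.standard_normal((5, 5))
R = A @ A.T + 0.1 * np.eye(5)       # symmetric positive definite

lam, phi = np.linalg.eigh(R)        # R(i,j) = sum_k lam_k phi_k(i) phi_k(j)

def rkhs_inner(f, g):
    # <f, g>_{H(R)} = sum_k (f^T phi_k)(g^T phi_k) / lam_k.
    return np.sum((phi.T @ f) * (phi.T @ g) / lam)

# Reproducing property: <R(i,.), R(j,.)>_{H(R)} = R(i, j).
i, j = 1, 3
print(np.isclose(rkhs_inner(R[i], R[j]), R[i, j]))  # True
```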


(4-7).

[40], and this helps us understand the RKHS perspective of the MACE. Let us consider the case of one training image and construct the following matrix of shifted versions of the image,

    U = [U_1, U_2, ..., U_d],

where the dimension of the matrix U is (2d − 1) × d. Here we denote the i-th column of the matrix U as U_i. Then the column space of U is


    R(i, j) = ⟨U_i, U_j⟩,

where ⟨·, ·⟩ represents the inner product operation. If all the columns of U are linearly independent, R(i, j) is positive definite and the dimensionality of L2(U) is d. If U is singular, the dimensionality is smaller than d. However, in either case, all the vectors in this space can be expressed as a linear combination of the column vectors. The optimization problem of the MACE is to find a vector g_o = Σ_{i=1}^{d} h_i U_i in the L2(U) space, with coordinates h = [h_1, h_2, ..., h_d]^T, such that g_o^T g_o is minimized subject to the constraint that the d-th component of g_o (which is the correlation at zero lag) is some constant. Formulating the MACE filter from this RKHS viewpoint only provides a new perspective but no additional advantage. However, as explained next, it will help us derive a nonlinear extension to the MACE with a new similarity measure.


5.1.1 Definition

Correntropy was introduced in [29] as a generalized measure of similarity. Its name stresses the connection to correlation, but also indicates that its mean value across time or dimensions is associated with entropy, more precisely with the argument of the log in Renyi's quadratic entropy estimated with Parzen windows, which is called the information potential. The information potential (IP) is the argument of Renyi's quadratic entropy of a random variable $X$ with PDF $f_X(x)$,

$$H_2(X) = -\log \int f_X^2(x)\,dx,$$

where $IP(X) = \int f_X^2(x)\,dx$. A nonparametric estimator of the information potential using Parzen windows from $N$ data samples is

$$\widehat{IP}(X) = \frac{1}{N^2}\sum_{i=1}^N\sum_{j=1}^N \kappa_{\sigma\sqrt 2}(x_i - x_j),$$

where $\kappa$ is the Gaussian kernel in (5-4) [41]. This relation to entropy shows that correntropy contains information beyond second-order moments and can therefore generalize correlation without requiring moment expansions.

The cross correntropy of two random variables $X$ and $Y$ is defined as

$$V_\sigma(X,Y) = E\left[\kappa_\sigma(X-Y)\right], \qquad (5\text{-}3)$$

where $\sigma$ is the kernel size or bandwidth. In practice, given a finite number of data samples $\{(x_i,y_i)\}_{i=1}^d$, the cross correntropy is estimated by

$$\hat V(X,Y) = \frac{1}{d}\sum_{i=1}^d \kappa_\sigma(x_i - y_i). \qquad (5\text{-}5)$$

Correntropy has a number of useful properties [29][42][43]. Here we present, without proofs, only the properties that are relevant to this dissertation.

Property 1 [29]. Applying the Taylor series expansion to the Gaussian kernel, we can rewrite the correntropy function in (5-3) as

$$V_\sigma(X,Y) = \frac{1}{\sigma\sqrt{2\pi}}\sum_{n=0}^{\infty}\frac{(-1)^n}{2^n\,\sigma^{2n}\,n!}\,E\left[(X-Y)^{2n}\right],$$

which contains all the even-order moments of the random variable $X-Y$. The kernel size controls the emphasis of the higher-order moments with respect to the second, since the higher-order terms of the expansion decay faster for larger $\sigma$.
As $\sigma$ increases beyond the value suggested by Silverman's rule [44], correntropy starts to approach correlation. The kernel size has to be chosen according to the application; here this issue will not be addressed further, and Silverman's rule will be used by default.

Property 2 [29]. $V(i,j)$ is a symmetric positive definite function and hence a reproducing kernel. Since $\kappa_\sigma(x_i - x_j)$ is symmetric, it is obvious that $V(i,j)$ is also symmetric. Also, since $\kappa_\sigma(x_i - x_j)$ is positive definite, for any set of $n$ points $\{x_1,\dots,x_n\}$ and nonzero real numbers $\{\alpha_1,\dots,\alpha_n\}$ we have

$$\sum_{i=1}^n\sum_{j=1}^n \alpha_i\,\alpha_j\,\kappa_\sigma(x_i - x_j) > 0.$$

It is true that for any strictly positive function $g(\cdot,\cdot)$ of two random variables $x$ and $y$, $E[g(x,y)] > 0$. Thus

$$E\left[\sum_{i=1}^n\sum_{j=1}^n \alpha_i\,\alpha_j\,\kappa_\sigma(x_i - x_j)\right] = \sum_{i=1}^n\sum_{j=1}^n \alpha_i\,\alpha_j\,V(i,j) > 0,$$

so $V(i,j)$ is both symmetric and positive definite. Now, the Moore-Aronszajn theorem [35] proves that for every real symmetric positive definite function $k$ there exists a unique RKHS with $k$ as its reproducing kernel. Hence $V(i,j)$ is a reproducing kernel. As shown in Property 1, the VRKHS contains higher-order statistical information, unlike the RKHS defined by the covariance function of random processes.
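The estimator (5-5) and the symmetry in Property 2 can be illustrated with a few lines of code. This is a toy sketch with a normalized Gaussian kernel (illustrative names, not the dissertation's code):

```python
import numpy as np

def gaussian_kernel(u, sigma):
    # Normalized Gaussian kernel; kappa(0) = 1/(sigma*sqrt(2*pi))
    return np.exp(-u ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

def correntropy(x, y, sigma=1.0):
    # Sample estimator (5-5): average kernel evaluation of the differences
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.mean(gaussian_kernel(x - y, sigma)))

x = np.array([0.0, 1.0, -1.0, 2.0])
y = np.array([0.1, 0.9, -1.2, 2.5])
v_xy = correntropy(x, y)
v_self = correntropy(x, x)   # maximal value: kernel evaluated at zero
```

Identical arguments give the kernel's peak value $\kappa(0) = 1/(\sigma\sqrt{2\pi})$, and the estimator is symmetric in its arguments, as Property 2 requires.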


Figure 5-1. Contours of CIM(X,0) in 2D sample space (kernel size is set to 1).

Property 3. Correntropy induces a metric, the correntropy induced metric (CIM). Using the sample estimator with kernel size $\sigma' = \sigma\sqrt 2$ [42],

$$CIM(X,Y) = \left(\kappa(0) - \hat V(X,Y)\right)^{1/2},$$

where $\kappa$ is the Gaussian kernel of (5-4) with $\kappa(0) = 1/(\sigma\sqrt{2\pi})$ [42]. Therefore, correntropy can serve as the metric for similarity measurement in feature space.

Figure 5-1 shows the contours of the distance from $X$ to the origin in a two-dimensional space. The interesting observation from the figure is as follows: when $X$ is close to zero, CIM behaves like an $L_2$ norm (5-6); further out, CIM behaves like an $L_1$ norm; eventually, as $X$ departs from the origin, the metric saturates and becomes insensitive to distance (approaching an $L_0$ norm).

Property 4. Let us define $E = X - Y = [e_1, e_2, \dots, e_N]^T$. First, let us take a look at the following limit (with $t = 1/(2\sigma^2)$):

$$\lim_{\sigma\to\infty} 2\sigma^2\left(1 - \exp(-e_i^2/2\sigma^2)\right) = \lim_{t\to 0^+}\frac{1 - \exp(-t\,e_i^2)}{t} = e_i^2,$$
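The norm-like regimes of the CIM described above can be checked numerically. The sketch below assumes the Gaussian kernel of (5-4) with $\sigma = 1$ (illustrative code, not from the dissertation):

```python
import numpy as np

def cim(x, y, sigma=1.0):
    # Correntropy induced metric: sqrt(kappa(0) - V_hat(X, Y))
    x, y = np.asarray(x, float), np.asarray(y, float)
    k0 = 1.0 / (sigma * np.sqrt(2 * np.pi))
    v = k0 * np.mean(np.exp(-(x - y) ** 2 / (2 * sigma ** 2)))
    return np.sqrt(k0 - v)

origin = np.zeros(2)
near = cim(origin, np.full(2, 0.1))      # small errors: L2-like region
far = cim(origin, np.full(2, 1000.0))    # large errors: saturation region
```

For small errors the metric grows with the error; for very large errors it saturates at $\sqrt{\kappa(0)}$ and becomes insensitive to further increases in distance, matching the $L_0$-like behavior of Figure 5-1.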


so that as $\sigma\to\infty$ the CIM becomes proportional to the $L_2$ norm of $E$. Second, look at the following limit:

$$\lim_{\sigma\to 0^+}\left(1 - \exp(-e_i^2/2\sigma^2)\right) = \begin{cases} 0 & \text{if } e_i = 0 \\ 1 & \text{if } e_i \neq 0 \end{cases} \qquad (5\text{-}13)$$

Therefore, as $\sigma\to 0^+$ the CIM approaches a (scaled) count of the nonzero components of $E$, that is, an $L_0$ norm. The proof of Property 5 is given in Appendix B.

According to Property 5, there exists a scalar nonlinear mapping $f$ which makes the correntropy of $x_i$ the correlation of $f(x_i)$:

$$V(i,j) = E\left[f(x_i)\,f(x_j)\right]. \qquad (5\text{-}15)$$

(5-15) allows the computation of the correlation in feature space by the correntropy function in the input space [45][46].

Given the RKHS formulation of the MACE in Chapter 4, we can extend it immediately to the VRKHS. Following Chapter 4, the definition of the correlation in (4-15) shall be substituted by

$$V(i,j) = \frac{1}{2d-1}\sum_{n=1}^{2d-1}\kappa\left(U_{in} - U_{jn}\right), \qquad i,j = 1,\dots,d, \qquad (5\text{-}16)$$

where $U_{in}$ is the $(i,n)$th element of $U$ in (4-13). This function is positive definite and thus induces the VRKHS. According to Mercer's theorem [35], there is a basis $\{\psi_i, i = 1,\dots,d\}$ for this space.
Since it is a $d$-dimensional Hilbert space, it is isomorphic to any $d$-dimensional real vector space equipped with the standard inner product structure. After an appropriate choice of this isomorphism $\{\psi_i, i = 1,\dots,d\}$, which is nonlinearly related to the input space, a nonlinear extension of the MACE filter can readily be constructed on this VRKHS: namely, finding a vector $v_0 = \sum_{i=1}^d f_{h_i}\psi_i$ with $f_h = [f_{h_1}\cdots f_{h_d}]^T$ as coordinates, such that $v_0^T v_0$ is minimized subject to the constraint that the $d$th component of $v_0$ equals some pre-specified constant.

Let the $i$th image vector be $x_i = [x_i(1)\,x_i(2)\cdots x_i(d)]^T$ and the filter be $h = [h(1)\,h(2)\cdots h(d)]^T$, where $T$ denotes transpose. From Property 5, the CMACE filter can be formulated in feature space by applying a nonlinear mapping function $f$ to the data as well as to the filter. We denote the transformed training image matrix and filter vector, whose sizes are $d\times N$ and $d\times 1$ respectively, by

$$F_X = [f_{x_1}\,f_{x_2}\cdots f_{x_N}], \qquad f_h = [f(h(1))\,f(h(2))\cdots f(h(d))]^T,$$

where $f_{x_i} = [f(x_i(1))\,f(x_i(2))\cdots f(x_i(d))]^T$ for $i = 1,\dots,N$. Given data samples, the cross correntropy between the $i$th training image vector and the filter can be estimated as

$$v_{oi}[m] = \frac{1}{d}\sum_{n} f(x_i(n))\,f(h(n-m))$$

for all the lags $m = -d+1,\dots,d-1$. The cross correntropy vector $v_{oi}$ collecting all the lags $v_{oi}[m]$ can then be written as

$$v_{oi} = S_i\,f_h,$$

where $S_i$ is the $(2d-1)\times d$ matrix whose columns are the zero-padded shifts of the transformed image. Since the scale factor $1/d$ has no influence on the solution, it will be ignored throughout the dissertation. The correntropy energy of the $i$th image is given by

$$E_i = v_{oi}^T v_{oi} = f_h^T S_i^T S_i\,f_h.$$

Denoting $V_i = S_i^T S_i$ and using the definition of correntropy in (5-15), each element of the $d\times d$ correntropy matrix $V_i$ is computed without explicit knowledge of the mapping function $f$ by

$$V_i(m, m+l) = \sum_{n}\kappa\left(x_i(n) - x_i(n+l)\right)$$

for $l = 0,\dots,d-1$. The average correntropy energy over all the training data can then be written as

$$V_X = \frac{1}{N}\sum_{i=1}^N V_i.$$
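The lag structure of the correntropy matrix can be sketched directly from kernel evaluations, with no access to the mapping $f$. This is a simplified illustration (unnormalized kernel, naive boundary handling), not the dissertation's implementation:

```python
import numpy as np

def correntropy_matrix(x, sigma=1.0):
    # Toeplitz matrix whose (m, l) entry is the sample correntropy of the
    # image vector x at lag |m - l|, estimated from kernel evaluations only.
    x = np.asarray(x, float)
    d = len(x)
    lag = np.empty(d)
    for l in range(d):
        diffs = x[: d - l] - x[l:]
        lag[l] = np.mean(np.exp(-diffs ** 2 / (2 * sigma ** 2)))
    idx = np.abs(np.arange(d)[:, None] - np.arange(d)[None, :])
    return lag[idx]

V = correntropy_matrix(np.sin(np.linspace(0.0, 3.0, 32)))
```

The resulting matrix is symmetric with unit diagonal (the unnormalized kernel evaluated at zero lag), mirroring the structure used for $V_i$ above.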


Since our objective is to minimize the average correntropy energy in feature space, the optimization problem is formulated as

$$\min_{f_h}\; f_h^T V_X f_h \quad \text{subject to} \quad F_X^T f_h = c, \qquad (5\text{-}28)$$

where $c$ is the desired vector for all the training images. The constraint in (5-28) means that we specify the correntropy values between the training inputs and the filter as desired constants. Since the correntropy matrix $V_X$ is positive definite, there exists an analytic solution to the optimization problem using the method of Lagrange multipliers in the new finite-dimensional VRKHS. The CMACE filter in feature space then becomes

$$f_h = V_X^{-1}F_X\left(F_X^T V_X^{-1} F_X\right)^{-1}c.$$

Unlike the KSDF in (3-11), whose solution lives in the infinite-dimensional RKHS created by the conventional kernel method, the CMACE filter is defined in the finite-dimensional VRKHS, which has the same dimensionality as the input space, with size $d\times 1$. In general, the kernel method creates an infinite-dimensional feature space, so the solution often needs regularization to remain bounded; the KSDF may therefore need additional regularization terms for better performance. Regarding the computational complexity of the CMACE compared to the KSDF, an additional $O(d^3)$ operations and $O(d^2)$ storage are needed for $V_X^{-1}$. This motivates both a fast version of the CMACE and a dimensionality reduction method for practical applications.
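The Lagrange-multiplier solution of (5-28) can be verified numerically on stand-in matrices (a random positive definite $V$ and a random $F$ replace the correntropy quantities, which are not explicitly computable):

```python
import numpy as np

rng = np.random.default_rng(1)
d, N = 8, 3
A = rng.standard_normal((d, d))
V = A @ A.T + d * np.eye(d)          # stand-in positive definite correntropy matrix
F = rng.standard_normal((d, N))      # stand-in constraint matrix F_X
c = np.ones(N)

# min h^T V h  subject to  F^T h = c   =>   h = V^-1 F (F^T V^-1 F)^-1 c
VinvF = np.linalg.solve(V, F)
h = VinvF @ np.linalg.solve(F.T @ VinvF, c)

# Another feasible vector for comparison (least-squares feasible point)
h_alt = F @ np.linalg.solve(F.T @ F, c)
```

The constraint is met exactly, and any other feasible vector has at least as much energy $h^T V h$, which is the defining property of the Lagrange solution.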


5.3.1 Implication of Nonlinearity

From (5-17) and (5-22), we can say that the RKHS induced by correntropy (VRKHS) is a Hilbert space spanned by the basis $\{\psi_i\}_{i=1}^d$ of vectors of size $(2d-1)\times 1$, where each $\psi_i$ is a zero-padded copy of the transformed image $[f(x(1))\cdots f(x(d))]^T$ shifted to start at the $i$th element, with $f(\cdot)$ a nonlinear scalar function. It is obvious that, unlike the RKHS induced by correlation, the VRKHS for the CMACE filter is nonlinearly related to the original input space. This statement can be simply exemplified with the CIM metric. Suppose a vector $x = [x_1, 0, \dots, 0]^T$ and the origin of the input space $y = [0,\dots,0]^T$. Then the Euclidean distance in the input space is $ED(x,y) = x_1$, while the distance in the VRKHS becomes

$$CIM(x,y) = \left(\frac{\kappa(0)}{d}\left(1 - \exp(-x_1^2/2\sigma^2)\right)\right)^{1/2}.$$

As $x_1$ goes to infinity, the Euclidean distance in the input space increases linearly, but the CIM distance saturates. This shows that distances in the VRKHS are not linearly related to distances in the input space. The same behavior can be observed directly in the CIM contours of Fig. 5-1.

In addition, since correntropy differs from correlation in that it involves higher-order statistics of the input signals, inner products in the RKHS induced by correntropy are no longer equivalent to statistical inference on Gaussian processes. The transformation from the input space to the VRKHS is nonlinear, and the inner product structure of the VRKHS provides the possibility of obtaining closed-form optimal nonlinear filter solutions that utilize second- and higher-order statistics.
A related nonlinear approach is the kernel correlation filter (KCFA) [28]. The KCFA is the kernelized version of the linear MACE filter, obtained with the kernel trick after a prewhitening preprocessing step. The correlation output of the MACE filter $h$ for an input image vector $z$ can be expressed as

$$y(z) = z^T D^{-1}X\left(X^H D^{-1}X\right)^{-1}c. \qquad (5\text{-}31)$$

(5-31) is equivalent to the linear SDF applied to prewhitened data, and applying the kernel trick yields the KCFA as follows [27]:

$$y = K_{ZX}\left(K_{XX}\right)^{-1}c, \qquad (5\text{-}32)$$

where the $(i,j)$th elements of the matrices $K_{XX}$ and $K_{ZX}$ are computed on the prewhitened images $\tilde x$ and $\tilde z$ by

$$(K_{XX})_{ij} = \sum_{k=1}^d \kappa\left(\tilde x_i(k) - \tilde x_j(k)\right), \qquad i,j = 1,2,\dots,N, \qquad (5\text{-}33)$$

$$(K_{ZX})_{ij} = \sum_{k=1}^d \kappa\left(\tilde z_i(k) - \tilde x_j(k)\right), \qquad i = 1,2,\dots,L,\; j = 1,2,\dots,N, \qquad (5\text{-}34)$$

where $N$ is the number of training images and $L$ is the number of test input images.

In the CMACE, if we denote $\tilde F_X = V_X^{-1/2}F_X$, we can decompose $f_h$ as

$$f_h = V_X^{-1/2}V_X^{-1/2}F_X\left(F_X^T V_X^{-1/2}V_X^{-1/2}F_X\right)^{-1}c = V_X^{-1/2}\tilde F_X\left(\tilde F_X^T\tilde F_X\right)^{-1}c. \qquad (5\text{-}35)$$

The main difference between the CMACE and the KCFA is the prewhitening process. In the KCFA, prewhitening is conducted in the input space using $D$; in the CMACE, (5-35) implies that the image is implicitly whitened in the feature space by the correntropy matrix $V_X$. In the space-domain MACE filter, the autocorrelation matrix can be used as a preprocessor for prewhitening; since the CMACE filter uses the same formulation in feature space, we can expect the correntropy matrix to play the same prewhitening role there. In practice, however, we cannot obtain the whitened data explicitly, since the mapping function is not explicitly known. In addition, the solution of the KCFA is defined in an infinite-dimensional feature space like the KSDF, so an additional regularization term may be needed for better performance.
Here we denote $T_{ZX} = F_Z^T V_X^{-1}F_X$ and $T_{XX} = (F_X^T V_X^{-1}F_X)^{-1}$. Then the output becomes

$$y = T_{ZX}\,T_{XX}\,c, \qquad (6\text{-}1)$$

where $T_{XX}$ is an $N\times N$ symmetric matrix and $T_{ZX}$ is an $L\times N$ matrix, whose $(i,j)$th elements are expressed by

$$\left(F_X^T V_X^{-1}F_X\right)_{ij} = \sum_{l=1}^d\sum_{k=1}^d w_{lk}\,f(x_i(k))\,f(x_j(l)) \approx \sum_{l=1}^d\sum_{k=1}^d w_{lk}\,\kappa\left(x_i(k) - x_j(l)\right), \qquad i,j = 1,\dots,N, \qquad (6\text{-}3)$$

$$\left(T_{ZX}\right)_{ij} \approx \sum_{l=1}^d\sum_{k=1}^d w_{lk}\,\kappa\left(z_i(k) - x_j(l)\right), \qquad i = 1,\dots,L,\; j = 1,\dots,N, \qquad (6\text{-}4)$$

where $w_{lk}$ is the $(l,k)$th element of $V_X^{-1}$. The final output expressions in (6-3) and (6-4) are obtained by approximating $f(x_i(k))f(x_j(l))$ and $f(z_i(k))f(x_j(l))$ by $\kappa(x_i(k) - x_j(l))$ and $\kappa(z_i(k) - x_j(l))$, respectively, in line with Property 5. Note, however, that (6-3) and (6-4) involve weighted versions of these functionals, so the error in the approximation requires further theoretical investigation.

The CMACE is formulated in the linear VRKHS but has a nonlinear behavior, since the VRKHS is nonlinearly related to the input space. Nevertheless, the CMACE preserves the shift-invariance property of the linear MACE; the proof is given in Appendix C. Although the output of the CMACE gives only one value per test position, it is possible to construct the whole output plane by shifting the test input image, so the shift-invariance property of correlation filters can be exploited at the expense of more computation. Applying an appropriate threshold to the output of (6-1), one can detect and recognize the test data without explicitly generating the composite filter in feature space. As will be shown in the simulation results, even with this approximation the CMACE outperforms the conventional MACE.

Given $d$ data samples $\{x(i)\}_{i=1}^d$, let us denote the mean of the transformed data in feature space as $E[f(x(i))] = m_f$. Then the centered correntropy, which can properly be called the generalized covariance function, is given by

$$V_c(i,j) = E\left[\{f(x(i)) - m_f\}\{f(x(j)) - m_f\}\right] = E\left[f(x(i))\,f(x(j))\right] - m_f^2 = V(i,j) - m_f^2.$$
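The centering operation can be checked numerically: subtracting the information-potential estimate from the (stand-in) correntropy entries gives a matrix whose entries average exactly to zero. A toy sketch with illustrative names:

```python
import numpy as np

def gk(u, sigma=1.0):
    # Unnormalized Gaussian kernel
    return np.exp(-u ** 2 / (2 * sigma ** 2))

x = np.random.default_rng(3).standard_normal(64)
V = gk(x[:, None] - x[None, :])      # stand-in pairwise correntropy entries
m2 = np.mean(V)                      # information-potential estimate m_f^2
Vc = V - m2                          # centered correntropy matrix
```

By construction the mean of the centered entries vanishes, which is the point of subtracting the information potential in (6-9).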


In order to show the validity of (6-6), let us consider the sample estimation of correntropy (ignoring the scale factor $1/d$),

$$\hat m_f^2 = \frac{1}{d^2}\sum_{i=1}^d\sum_{j=1}^d \kappa\left(x(i) - x(j)\right),$$

which is exactly the information potential estimator. We arrange the double summation of (6-6) as an array and sum along the diagonal direction, which yields exactly the autocorrelation function of the transformed data at the different lags; thus the correntropy function of the input data at the different lags can be written as a sum along the diagonals of this array. As we see in (6-8), when the summation is far from the main diagonal, smaller and smaller data sizes are involved, which leads to a poor approximation. Notice that this is exactly the same problem encountered when the autocorrelation function is estimated from windowed data; when $d$ is large, however, the approximation improves. Therefore, in the CMACE output equation (6-1) we can use the centered correntropy matrix $V_{XC}$, obtained by subtracting the information potential from the correntropy matrix $V_X$:

$$V_{XC} = V_X - m_{f,avg}^2\,\mathbf{1}_{d\times d},$$

where $m_{f,avg}^2$ is the average estimated information potential over the $N$ training images and $\mathbf{1}_{d\times d}$ is a $d\times d$ matrix with all entries equal to one. The centered correntropy matrix is then used in place of $V_X$.
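With a correntropy matrix in hand, the test output (6-1) with the kernel approximations (6-3) and (6-4) can be sketched end to end on toy data. The code below uses a simplified stand-in for $V_X$ (averaging pairwise-pixel kernel matrices over the training images rather than the exact lag estimator), so it illustrates the structure of the computation, not the dissertation's implementation. A useful sanity check: when the test image equals training image $j$, the output reproduces $c_j$.

```python
import numpy as np

def gk(u, sigma=1.0):
    return np.exp(-u ** 2 / (2 * sigma ** 2))   # unnormalized Gaussian kernel

rng = np.random.default_rng(2)
d, N = 16, 3
X = rng.standard_normal((d, N))                 # columns: toy training images
c = np.ones(N)

# Simplified stand-in for the correntropy matrix V_X and its inverse W
V = np.mean([gk(x[:, None] - x[None, :]) for x in X.T], axis=0)
W = np.linalg.inv(V + 1e-2 * np.eye(d))         # small ridge for conditioning

def weighted_kernel_sum(a, b):
    # sum_{l,k} W[l,k] * gk(a(k) - b(l)) -- cf. (6-3)/(6-4)
    return np.sum(W * gk(b[:, None] - a[None, :]))

M = np.array([[weighted_kernel_sum(X[:, i], X[:, j]) for j in range(N)]
              for i in range(N)])               # approximates F_X^T V_X^-1 F_X

def cmace_output(z):
    t_zx = np.array([weighted_kernel_sum(z, X[:, j]) for j in range(N)])
    return t_zx @ np.linalg.solve(M, c)         # y = T_ZX T_XX c
```

For a test vector equal to a training image, the row of kernel sums coincides with the corresponding row of $M$, so the output is exactly the constrained value, mirroring the training constraint of the CMACE.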


The computational cost of the CMACE is dominated by (6-3) and (6-4), and depends on the image size and the number of training images. Each element involves a double summation of weighted kernel functions and therefore requires $O(d^2)$ computations, where $d$ is the number of image pixels. When the number of training images is $N$, the total computational complexity for one test output is $O(Nd^2 + N^2)$. A similar argument shows that the computation needed for training is $O(d^2(N^2+1) + N^2)$. On the other hand, the MACE only requires $O(4(d(2N^2+N+2)+N^2)+Nd\log_2 d)$ for training and $O(4d + d\log_2 d)$ for testing one input image. Table 6-1 shows the computational complexity of the MACE and CMACE; more details about the required computation costs are given in Appendix D. Constructing the whole output plane would significantly increase the computational complexity of the CMACE, which quickly becomes too demanding in practical settings. A method to simplify the computation is therefore necessary for practical implementations.

Here the Fast Gauss Transform (FGT) [47] is proposed to reduce the computation time with a very small approximation error. The FGT belongs to a very interesting and important family of fast evaluation algorithms developed over the past decades to enable rapid calculation of weighted sums of Gaussian functions with arbitrary accuracy. In nonparametric probability density estimation with Gaussian kernels, the FGT can reduce the complexity from $O(dM)$ to $O(d+M)$ for $M$ evaluations with $d$ sources.
The FGT evaluates sums of the form

$$G(z) = \sum_{j=1}^d q_j\,\kappa\left(z - x(j)\right), \qquad (6\text{-}10)$$

where $\kappa$ is a kernel function centered at the source points $x(j)$ and the $q_j$ are scalar weighting coefficients. With the Gaussian kernel, (6-10) can be interpreted as a "Gaussian" potential field due to sources of strengths $q_j$ at the points $x(j)$, evaluated at the target point $z$. Suppose that we have $M$ evaluation target points; then the computation of (6-10) requires $O(dM)$ calculations, which constrains the computation bandwidth for large $d$ and $M$ in real-world applications. The FGT reduces the complexity of (6-10) to $O(d+M)$. The basic idea is to cluster the sources and target points using appropriate data structures and the Hermite expansion, and then reduce the number of summations for a given level of precision.

Our problem is very similar to the density estimation problem that evaluates at $d$ targets $z(i)$ given $d$ source samples $x(j)$. However, the weighting factors $w_{ij}$ in (6-11) depend on both target and source, which differs from the original FGT applications, where the weight vector is the same at every evaluation target point. In our case, the weight vector $w_i = [w_{i1},\dots,w_{id}]^T$ varies at every evaluation point $z(i)$. We can say that (6-11) is a more general expression than the original FGT formulation, and it can be written as

$$G(z(i)) = \sum_{j=1}^d w_{ij}\,\kappa\left(z(i) - x(j)\right). \qquad (6\text{-}11)$$

This means that clustering and the Hermite expansion would have to be performed at every target $z(i)$ with a different weight vector $w_i$, which incurs extra computation for clustering. However, since the sources are clustered in the FGT, if one expresses the clustered sources about their center in the Hermite expansion, there is no need to redo the clustering and the Hermite expansion at every evaluation; the only thing necessary is to use a different weight vector at every evaluation point. This process requires no additional complexity compared to the original FGT formulation, except that more storage is required to keep the weight vectors. Using the Hermite expansion around an expansion center $c$, the Gaussian centered at $x(j)$ evaluated at $z(i)$ can be obtained as

$$\exp\left(-\frac{(z(i)-x(j))^2}{2\sigma^2}\right) = \sum_{n=0}^{\infty}\frac{1}{n!}\left(\frac{x(j)-c}{\sigma\sqrt 2}\right)^{n} h_n\!\left(\frac{z(i)-c}{\sigma\sqrt 2}\right),$$

where the Hermite function $h_n(x)$ is defined by

$$h_n(x) = (-1)^n\frac{d^n}{dx^n}\,e^{-x^2}.$$

Also, in this research we use a simple greedy algorithm for clustering [48], which computes a data partition with a maximum radius at most twice the optimum. This clustering method and the Hermite expansion with order $p$ require $O(pd)$. In the case of (6-3) and (6-4), since the numbers of sources and targets are the same, they can be interchanged; that is, the test image can serve as the source, so that the clustering and the Hermite expansion can be performed only once per test image.
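The truncated Hermite expansion at the heart of the FGT can be verified numerically. The sketch below compares a direct weighted Gaussian sum with its order-$p$ expansion about a single cluster center (one cluster, per-target weights omitted for clarity; an illustration under these assumptions, not the dissertation's code):

```python
import math
import numpy as np
from numpy.polynomial.hermite import hermval

rng = np.random.default_rng(4)
sigma = 1.0
delta = 2 * sigma ** 2
x = rng.uniform(-0.3, 0.3, size=50)      # clustered source points
q = rng.standard_normal(50)              # source weights
z = np.linspace(-1.0, 1.0, 7)            # evaluation targets

# Direct O(d*M) evaluation of the weighted Gaussian sum
direct = np.array([np.sum(q * np.exp(-(zi - x) ** 2 / delta)) for zi in z])

# Truncated Hermite expansion about the cluster center c:
# exp(-(z-x)^2/delta) = sum_n (1/n!) ((x-c)/sqrt(delta))^n h_n((z-c)/sqrt(delta)),
# with h_n(t) = exp(-t^2) H_n(t), H_n the physicists' Hermite polynomials.
p, c = 8, x.mean()
s = (x - c) / np.sqrt(delta)
Cn = [np.sum(q * s ** n) / math.factorial(n) for n in range(p)]
t = (z - c) / np.sqrt(delta)
approx = sum(Cn[n] * np.exp(-t ** 2) * hermval(t, np.eye(p)[n]) for n in range(p))
```

The moment-like coefficients $C_n$ are computed once per cluster, after which each target costs only $O(p)$, which is the source of the FGT's speedup.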


Table 6-1. Estimated computational complexity for training with N images and testing with one image. Matrix inversion and multiplication are considered (in this simulation, d=4096, N=60, p=4, kc=4).

  Filter     | Training                           | Testing
  MACE       | O(4(d(2N^2+N+2)+N^2)+Nd log2(d))   | O(4d+d log2(d))
  CMACE      | O(d^2(N^2+1)+N^2)                  | O(d^2 N+N^2)
  Fast CMACE | O(N^2 pd(kc+1)+d^2+N^2)            | O(pd(kc+1)N+N^2)

With the expansion truncated at order $p$ and the sources grouped into $k_c$ clusters, (6-11) can be approximated by

$$G(z(i)) \approx \sum_{B}\sum_{n<p} C_n(B)\,h_n\!\left(\frac{z(i)-s_B}{\sigma\sqrt 2}\right), \qquad (6\text{-}16)$$

where $B$ represents a cluster with center $s_B$ and $C_n(B)$ is given by

$$C_n(B) = \frac{1}{n!}\sum_{x(j)\in B} w_{ij}\left(\frac{x(j)-s_B}{\sigma\sqrt 2}\right)^{n}.$$

From (6-16) we can see that evaluating $k_c$ expansions at all the evaluation points costs $O(pk_cd)$, so the total number of operations is $O(pd(k_c+1))$ per element of (6-3) and (6-4). The final aim is to obtain the output of the CMACE filter with $N$ training images and $L$ test images. In order to compute the output of one test image, the original direct method requires $O(d^2N(N+1))$ operations to obtain $T_{XX}$ and $T_{ZX}$; applying this enhanced FGT reduces the operation count to $O(pd(k_c+1)N(N+1))$. Typically $p$ and $k_c$ are around 4, while $d$ and $N$ are 4,096 and around 100 respectively in our application, which results in a computational saving of roughly 100 times. Additionally, clustering with the test image is performed only once per test, which reduces the computation time even more. However, from Table 6-1 we see that the computational complexity of the CMACE for testing still depends on the number of training images, resulting in more computation than the MACE. More work is necessary to reduce the computation time and memory requirements of the CMACE even further, but the proposed approach enables practical applications with present-day computers.
7.1.1 Problem Description

The first experiment uses a facial expression database [5]. We picked and report the results of the two most difficult cases, which produced the worst performance with the conventional MACE method. We test with all the images of each person's data set, resulting in 75 outputs for each class. The simulation results were obtained by averaging (Monte Carlo approach) over 100 different training sets (each training set consisting of 5 randomly chosen images) to minimize the problem of performance differences due to splitting the relatively small database into training and testing sets. The kernel size $\sigma$ is chosen to be 10 for the correntropy matrix during training and 30 for the test output. In this data set, it has been observed that a kernel size of around 30%-50% of the standard deviation of the input data is appropriate. Moreover, we can control the performance by choosing a different kernel size during training for prewhitening.

Figure 7-1 shows the average test output peak values for image recognition. The desired output peak value should be close to one when the test image belongs to the training image class (true class) and close to zero otherwise. Figure 7-1 (Top) shows that the correlation output peak values of the conventional MACE for false classes are close to zero, which means that the MACE has a good ability to reject the false class. However, some outputs in the test image set, even in the true class, are not recognized as the true class. Figure 7-1 (Bottom) shows the output values of the proposed correntropy MACE.

Figure 7-1. The averaged test output peak values (100 Monte Carlo simulations with N=5), (Top): MACE, (Bottom): CMACE.

Figure 7-2. The test output peak values with additive Gaussian noise (N=5), (Top): MACE, circle: true class with SNR=10dB, cross: false class with SNR=2dB, (Bottom): CMACE, circle: true class with SNR=10dB, cross: false class with SNR=2dB.
Figure 7-3. The comparison of ROC curves with different SNRs.

Compared with the MACE, the generalization and rejection performance of the CMACE are improved; as a result, the two faces can be recognized well even with a small number of training images. One of the problems of the conventional MACE is that its performance is easily degraded by additive noise in the test image, since the MACE has no special mechanism to account for input noise. It therefore has a poor rejection ability when noise is added to a false-class image. Figure 7-2 (Top) shows the noise effect on the conventional MACE: when the class images are seriously distorted by additive Gaussian noise (SNR=2dB), the correlation output peaks of some test images from the false class become greater than those of the true class, and wrong recognitions happen. The results in Figure 7-2 (Bottom) are obtained by the proposed method; the correntropy MACE shows much better performance, especially for rejection, even in a very low SNR environment. Figure 7-3 shows the comparison of ROC curves for different SNRs. For the conventional MACE, the false alarm rate increases as the additive noise power increases; with the proposed method, however, the probability of detection is much less affected.
Figure 7-4. Comparison of standard deviations of all the Monte Carlo simulation outputs (100x75 outputs).

One advantage of the proposed method is that it is more robust than the conventional MACE; that is, the variation of the test output peak value due to a different training set is smaller than that of the MACE. Figure 7-4 shows the standard deviations of the 100 Monte Carlo outputs per test input when the test inputs are noisy false-class images. Table 7-1 shows the comparison of the standard deviation of 750 outputs (100 Monte Carlo outputs for 75 inputs) for each class. From Table 7-1 we can see that the variations of the correntropy MACE outputs due to different training sets are much smaller than those of the conventional MACE, which tells us that the proposed nonlinear version of the MACE outperforms the conventional MACE and achieves robust performance for distortion-tolerant pattern recognition.

Table 7-2 shows the area under the ROC for different kernel sizes in the case of no additive noise. In this simulation, kernel sizes in the range between 0.1 and 15 provide perfect ROC performance. The kernel size obtained by Silverman's rule of thumb [44], given by $\sigma_i = 1.06\,\hat\sigma_i\,d^{-1/5}$, where $\hat\sigma_i$ is the standard deviation of the $i$th training data and $d$ is the number of samples, is 9.63, and it also results in the best performance. As expected from the properties of correntropy, correntropy approaches correlation for large kernel sizes (the ROC area of the MACE is about 0.96).
Table 7-1. Comparison of the standard deviation of the 100 Monte Carlo simulation outputs for each noisy false-class test image.

Table 7-2. Comparison of ROC areas with different kernel sizes.

  Kernel size | ROC area    Kernel size | ROC area
  0.1         | 1           20          | 0.9901
  0.5         | 1           50          | 0.9804
  1.0         | 1           100         | 0.9810
  9.6         | 1           200         | 0.9820
  10.0        | 1           500         | 0.9796
  15.0        | 1           1000        | 0.9518

7.2.1 Problem Description

The second set of experiments uses the MSTAR public release data set [49]. The MSTAR (Moving and Stationary Target Acquisition and Recognition) data is a standard data set in the SAR ATR community, allowing researchers to test and compare their ATR algorithms. The database consists of X-band SAR images with 1 foot by 1 foot resolution at 15, 17, 30 and 45 degree depression angles. The data was collected by Sandia National Laboratory (SNL) using the STARLOS sensor.
This dissertation compares the recognition performance of the proposed CMACE filter against the conventional MACE considering two distortion factors. The first distortion case is due to a different aspect angle between training and testing, and the second case is a different depression angle between test and training data. In the simulations, the performance is measured by observing the test output peak values and creating the ROC (Receiver Operating Characteristic) curve. The kernel size $\sigma$ is chosen to be 0.1 for the estimation of correntropy on the training images and 0.5 for the test output in (6-3) and (6-4). The value of 0.1 for the kernel size corresponds to the standard deviation of the training data, which is consistent with Silverman's rule. Experimentally it was verified that a larger kernel size for testing provided better results.

Figure 7-5(a) shows the training images, which are used to compose the MACE and CMACE filters. In order to evaluate the effect of the aspect angle distortion, training images were selected at every 3 index numbers from a total of 120 exemplar images for each vehicle (most index numbers have a 2 degree difference in aspect angle, and some have a 1 degree difference).

Figure 7-5. Case A: Sample SAR images (64x64 pixels) of two vehicle types for a target chip (BTR60) and a confuser (T62).

The total number of training images used to construct a filter is thus 40 (N=40). Figure 7-5(b) shows test images for the recognition class and (c) represents confuser vehicle images. Testing is conducted with all 120 exemplar images for each vehicle. We are interested only in the center of the output plane, since the images are already centered. The peak output responses over all exemplars in the test set are shown in Figure 7-6. In the simulation, the constraint value for the MACE as well as the CMACE filter is one for training; therefore the desired output peak value should be close to one when the test image belongs to the target class and close to zero otherwise. Figure 7-6 (Top) shows the correlation output peak values of the MACE and Figure 7-6 (Bottom) shows the output peak values of the CMACE filter for both a target and a confuser.

Figure 7-6 illustrates that the results are perfect for both the MACE and the CMACE on the training images. However, for the MACE filter, most of the peak output values on test images are less than 0.5, which shows that the MACE output generalizes poorly.
Figure 7-6. Case A: Peak output responses of testing images for a target chip (circle) and a confuser (cross): (Top) MACE, (Bottom) CMACE.

Figure 7-7. Case A: ROC curves with different numbers of training images.
The corresponding ROC curves are shown in Figure 7-7. From the ROC curves we can see that the detection ability of the proposed method is much better than that of both the MACE and the KCFA. For the KCFA, prewhitened images are obtained by multiplying by $D^{-0.5}$ in the frequency domain, and the kernel trick is applied to the prewhitened images to compute the output in (5-31); a Gaussian kernel with kernel size 5 is used. From the ROC curves in Figure 7-7 we can also see that the CMACE outperforms the nonlinear kernel correlation filter, in particular at high detection probability.

Figure 7-8(a) shows the MACE filter output plane and (b) shows the CMACE filter output plane for a test image of the target class not present in the training set; Figure 7-8(c) and (d) show the case of a confuser (false class) test input. In Figure 7-8(a) and (b) we can see that both the MACE and the CMACE produce a sharp peak in the output plane. However, the peak value at the origin of the CMACE is higher (closer to the desired value) than that of the MACE. Moreover, the CMACE has fewer sidelobes, and the values of the sidelobes around the origin are lower than those of the MACE. This indicates that the detection ability of the proposed method is better than that of the MACE. On the other hand, for the confuser test input in Figure 7-8(c) and (d), the output values around the origin of the CMACE are lower than those of the MACE, which means that the CMACE has better rejection ability.

In order to demonstrate the shift-invariance property of the CMACE, we apply the images of Figure 7-9.
Figure 7-8. Case A: The MACE output plane vs. the CMACE output plane. (a) True class in the MACE, (b) True class in the CMACE, (c) False class in the MACE, (d) False class in the CMACE.

The test image was cropped so that the object is shifted 13 pixels in both the x and y pixel positions. Figure 7-10 shows the output planes of the MACE and CMACE when the shifted image is used as the test input while all the training images are centered. In Figure 7-10, the maximum peak value should occur at position (77,77) of the output plane, since the object is shifted by 13 pixels in both the x and y directions. In the CMACE output plane, the maximum peak occurs at (77,77) and its value is 0.9585. In the MACE, however, the maximum peak occurs at (74,93) with value 0.9442, and the value at position (77,77) is 0.93. In this test, the CMACE shows a better shift-invariance property than the MACE.

Figure 7-9. Sample images of BTR60 of size (64x64) pixels. (a) The cropped image of size (64x64) pixels at the center of the original of size (128x128) pixels. (b) The cropped image of size (64x64) pixels with (x-13, y-13) of the original of size (128x128) pixels.

Figure 7-10. Case A: Output planes with shifted true-class input image. (a) The MACE output plane, (b) The CMACE output plane.
So far the kernel size was set by Silverman's rule [44]. A more principled alternative is to apply cross-validation to find the best kernel size. For cross-validation, we use one image of the training set which is not included in the filter design. Since we are considering images as 1-dimensional vectors, we have N different training data sets; therefore we obtain one kernel size $\sigma$ by averaging N different kernel sizes, $\sigma = \frac{1}{N}\sum_{i=1}^N\sigma_i$. Figure 7-11 shows the ROC curves for the kernel size obtained by Silverman's rule and the one obtained by cross-validation. We see that the ROC performance from Silverman's rule is very close to that of the optimal kernel size found by cross-validation. Also, when we increase the kernel size to 10, the performance becomes similar to that of the MACE; as expected from the properties of correntropy, correntropy approaches correlation for large kernel sizes.

Table 7-3 shows the area under the ROC for different kernel sizes, and we conclude that kernel sizes between 0.01 and 1 provide little change in detectability. This may be surprising when contrasted with the problem of finding the optimal kernel size in density estimation, but in correntropy the kernel size enters in the argument of an expected value and plays a different role in the final solution, namely it controls the balance between the effect of the second-order moments versus the higher-order moments (see Property 1).
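Silverman's rule quoted above is a one-liner; a small illustrative sketch (function name is ours):

```python
import numpy as np

def silverman_kernel_size(x):
    # Silverman's rule of thumb: sigma = 1.06 * std(x) * d^(-1/5)
    x = np.asarray(x, float)
    return 1.06 * x.std() * len(x) ** (-1 / 5)

# Toy data chosen so the answer is easy to check by hand:
# std = 1 and d = 32, so 32^(-1/5) = 0.5 and sigma = 1.06 * 0.5 = 0.53
sigma = silverman_kernel_size(np.tile([-1.0, 1.0], 16))
```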


Figure 7-11. The ROC comparison with different kernel sizes.

Table 7-3. Case A: Comparison of ROC areas with different kernel sizes.

  Kernel size | ROC area    Kernel size | ROC area
  0.01        | 0.9623      0.6         | 0.9806
  0.0185      | 0.9686      0.7         | 0.9771
  0.05        | 0.9631      0.8         | 0.9754
  0.1         | 0.9847      0.9         | 0.9749
  0.2         | 0.9865      1.0         | 0.9602
  0.3         | 0.9797      2.0         | 0.9397
  0.4         | 0.9797      5.0         | 0.9256
  0.5         | 0.9808      10.0        | 0.9033

For the second (Case B) distortion, training data are selected from target images collected at a 30 degree depression angle, and the MACE and CMACE are tested with data taken at a 17 degree depression angle. Figure 7-12 depicts some sample images. As we can see in Figure 7-12(a) and (b), due to the big change in depression angle (13 degrees of depression is considered a huge distortion), test images have more shadows and the image sizes of the vehicles also change, making detection more difficult. In this simulation, we use all the images (120 images
Figure 7-12. Case B: Sample SAR images (64x64 pixels) of two vehicle types for a target chip (2S1) and a confuser (T62).

We use all 120 images covering 180 degrees of pose at the 30 degree depression angle for training, and test with all 120 exemplar images at the 17 degree depression angle. Figure 7-13 (Top) shows the correlation output peak values of the MACE and (Bottom) shows the output peak values of the CMACE filter for a target and a confuser test set. We see that the conventional MACE is very poor in this case, either under- or overshooting the peak value of 1 for the target class, but the CMACE improves the recognition performance because of its better generalization. Figure 7-14 depicts the ROC curves and summarizes the CMACE advantage over the MACE in this large depression angle distortion case. More interestingly, the KCFA performance is closer to the linear MACE, because its input-space whitening is unable to cope with the large distortion.

Figure 7-13. Case B: Peak output responses of testing images for a target chip (circle) and a confuser (cross): (Top) MACE, (Bottom) CMACE.

Figure 7-14. Case B: ROC curves.
Table 7-4. Comparison of computation time and error for one test image between the direct method (CMACE) and the FGT method (Fast CMACE) with p=4 and kc=4.

  Stage       | Direct (sec) | FGT (sec) | Absolute error
  Train: K_XX | 7622.86      | 8.31      | 9.9668e-06

Table 7-4 shows the comparison of computation times for (6-3) and (6-4) between the direct implementation of the CMACE filter and the fast method with a Hermite approximation order of p=4 and kc=4 clusters. The computation time and absolute errors for one test image were obtained by averaging over 120 test images. This simulation shows that the FGT method is about 100 times faster than the direct method with a reasonable error precision. Figure 7-15 presents the comparison in terms of ROC curves of the MACE, the CMACE and the fast CMACE. From the ROC curves we can observe that the approximation with p=4 and kc=4 is very close to the original ROC. Table 7-5 shows the effect of different orders (p) and numbers of clusters (kc) on the computation time and accuracy of the fast CMACE filter. We conclude that the computation time increases roughly proportionally to p and kc, while the absolute error steadily decreases.
Figure 7-15. Comparison of ROC curves between the direct and the FGT method in Case A.

Table 7-5. Comparison of computation time and error for one test image in the FGT method with different numbers of orders and clusters (two expansion orders shown side by side, as in the original layout).

  Cluster | Time (sec) | Error       Cluster | Time (sec) | Error
  2       | 0.8116     | 1.48e-02    2       | 0.7181     | 5.61e-02
  6       | 1.5140     | 8.23e-04    6       | 1.6693     | 3.87e-04
  10      | 2.2119     | 8.58e-06    10      | 2.5595     | 4.71e-05
  14      | 2.8533     | 4.16e-07    14      | 3.5660     | 6.93e-06
  20      | 3.8097     | 1.25e-09    20      | 5.3067     | 1.14e-06

Finally, we test robustness to additive noise. Figure 7-16 shows sample images of the original and of a noisy image with signal to noise ratio (SNR) of 7dB. In this simulation we also compare the CMACE with the optimal trade-off filter (OTSDF), a well-known correlation filter designed to overcome the poor generalization of the MACE when noisy input is presented. The OTSDF filter is given by

$$h = T^{-1}X\left(X^{H}T^{-1}X\right)^{-1}c,$$

where $T = \alpha D + \sqrt{1-\alpha^2}\,C$, with $D$ the average power spectrum of the training images and $C$ the noise covariance. Figure 7-17 shows the comparison of ROC curves of the MACE, the OTSDF and the CMACE with noisy test images.
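The OTSDF differs from the MACE only through the trade-off matrix $T$; the following sketch assumes white noise, so $C = I$ (stand-in spectra, not MSTAR data; with $\alpha = 1$ the MACE is recovered):

```python
import numpy as np

rng = np.random.default_rng(5)
d, N = 64, 4
X = rng.standard_normal((d, N)) + 1j * rng.standard_normal((d, N))  # stand-in training spectra
c = np.ones(N)
alpha = 0.9

# OTSDF: h = T^-1 X (X^H T^-1 X)^-1 c, T = alpha*D + sqrt(1-alpha^2)*I
D = np.mean(np.abs(X) ** 2, axis=1)      # average power spectrum (diagonal of D)
T = alpha * D + np.sqrt(1 - alpha ** 2)  # diagonal matrix stored as a vector
TinvX = X / T[:, None]
h = TinvX @ np.linalg.solve(X.conj().T @ TinvX, c)
```

The identity term in $T$ keeps the filter from over-emphasizing high-variance frequencies, which is what improves noise tolerance relative to the MACE; the training constraint $X^H h = c$ still holds exactly.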


Figure 7-16. Sample SAR images (64x64 pixels) of BTR60.

Figure 7-17. ROC comparisons with noisy test images (SNR=7dB) in Case A (N=40).
Dimensionality reduction provides the compression and coding necessary to avoid excessive memory usage and computation [50].

Principal Component Analysis (PCA) is the most widely known way of reducing dimension, and it is optimal in the mean square error sense. PCA determines the basis vectors by finding the directions of maximum variance in the data, and it minimizes the error between the original data and the data reconstructed from its low-dimensional representation. PCA has been very popular in face recognition [51] and many pattern recognition applications [52]. Finding the principal components is a well-established numerical procedure via eigendecomposition of the data covariance matrix, although it is still expensive to compute. There are less expensive methods [51] based on recursive algorithms [53] for finding only a few eigenvectors and eigenvalues of a large matrix, but the computational complexity is still a burden. Moreover, subspace projections by PCA do not preserve discrimination [54], so there may be a loss of performance. Variants of the Singular Value Decomposition (SVD) are considered for image compression utilizing the Karhunen-Loeve Transformation (KLT); like PCA, SVD methods are also expensive to compute.

The Discrete Cosine Transform (DCT) [50] is a widely used method for image compression, and it can also be used for dimensionality reduction of image data. The DCT is computationally less burdensome than PCA, and its performance approaches that of PCA. The DCT is well matched to the human eye: the distortions introduced occur only at the highest frequencies, and the human eye tends to neglect these as noise. The image is transformed to the DCT domain before the reduction is applied.
Recently, random projection (RP) has emerged as an alternative dimensionality reduction method in machine learning and image compression [55], [56], [57], [58], [59], [60], due to its low complexity and good performance. Many experiments in the literature show that RP is computationally simple while preserving similarity to a high degree. In random projection, the original high-dimensional data is projected onto a lower-dimensional subspace using a random matrix, with only a small distortion of the distances between the points, thus preserving similarity information. Even though the randomly projected data includes the key information of the original data, we still need to extract that information properly; since correntropy has the ability to extract higher-order moments of the data, the CMACE can exploit the information preserved by the projection.
In this chapter we present a dimensionality reduction pre-processor based on random projections (RP) to decrease the storage and better match readily available computational resources, and we show that the RP method works well with the CMACE filter for image recognition.

PCA seeks to reduce the dimensionality of the data by finding a few orthogonal linear combinations of the original variables with the largest variance. Let us suppose that we are given a data matrix $X$ in $R^d$, whose size is $d\times N$, where $N$ is the number of vectors in $d$-dimensional space. The goal is to find a $k$-dimensional subspace ($k < d$) onto which the data is projected. The theoretical basis of random projection is the Johnson-Lindenstrauss (JL) lemma: given $0 < \epsilon < 1$ and an integer $N$, let $k$ be a positive integer with $k = O(\log N/\epsilon^2)$;
then for any set V of N points in R^d, there is a map f: R^d -> R^k such that for all u, v in V,

(1 - ε) ||u - v||^2 <= ||f(u) - f(v)||^2 <= (1 + ε) ||u - v||^2.   (8-3)

Furthermore, this map can be found in polynomial time.

The JL lemma states that any N-point set in d-dimensional Euclidean space can be mapped down onto a k = O(log N / ε^2)-dimensional subspace without distorting the distance between any pair of points by more than a factor of (1 ± ε), for any 0 < ε < 1, with probability O(1/N^2). A proof of this lemma, as well as tighter bounds on ε and k, is given in [61].

Let us suppose that we are given a data matrix X, whose size is d x N, where N is the number of vectors in d-dimensional space. Then the projection of the original data onto a lower k-dimensional subspace can be obtained by

X_RP = P X,

where P is k x d and is called the random projection matrix.

The complexity of RP is very low compared to other dimension reduction methods. RP needs only O(kdN) operations for projecting a d x N data matrix into k dimensions.

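A quick numerical check of the distance-preservation property is easy to write. The sketch below (illustrative only; the scaling by 1/sqrt(k) and all sizes are assumptions, not taken from the dissertation) projects N points with a Gaussian random matrix and verifies that pairwise squared distances concentrate near their original values:

```python
import numpy as np

rng = np.random.default_rng(1)
d, N, k = 2000, 20, 500
V = rng.standard_normal((d, N))                  # N points in R^d
P = rng.standard_normal((k, d)) / np.sqrt(k)     # scaled Gaussian RP matrix
U = P @ V                                        # projected points, k x N

def pairwise_sq_dists(M):
    # squared Euclidean distances between the columns of M
    g = (M * M).sum(axis=0)
    return g[:, None] + g[None, :] - 2.0 * (M.T @ M)

orig = pairwise_sq_dists(V)
proj = pairwise_sq_dists(U)
mask = ~np.eye(N, dtype=bool)                    # ignore zero self-distances
ratios = proj[mask] / orig[mask]                 # should concentrate near 1
```

For this k the distortion ratios stay within a few percent of 1, consistent with the (1 ± ε) guarantee of the lemma.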
62]. 58], [63]. Here we present three such methods.

In most applications, the Gaussian ensemble satisfies the JL lemma well. The other two methods yield significant computational savings [64]. 65]. Thus, vectors having random directions in a high-dimensional space tend to be nearly orthogonal,

The inner product of two vectors x and y that have been obtained by random projection of the vectors u and v with the random matrix R can be expressed as

x^T y = u^T R^T R v.

The matrix R^T R can be decomposed into two terms,

R^T R = I + ε,

where ε_ij = r_i^T r_j for i ≠ j and ε_ii = 0 for all i.

If all the entries in ε are equal to zero, i.e., the vectors r_i and r_j are orthogonal, the matrix R^T R would be equal to I and the similarity between the original data and the projected data would be preserved exactly in the random mapping. In practice the entries in ε will be small but not equal to zero.

Here let us consider the case where the entries of the random matrix are identically and independently sampled from a normal distribution with zero mean and unit variance, and thereafter the length of all the r_i's is normalized. Then it is evident that ε_ij is an estimate of the correlation coefficient between two i.i.d. normally distributed random variables, and if the dimensionality k of the reduced space is large, ε_ij is approximately normally distributed with zero mean, and its variance σ_ε^2 can be approximated by 1/k.

That is, the distortion of the inner product produced by the random projection is zero on average, and its variance is at most the inverse of the dimensionality of the reduced space. This result motivates the scaling factor 1/√k

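The claim that ε_ij has zero mean and variance close to 1/k can be verified empirically. A small sketch (all sizes illustrative, not from the dissertation):

```python
import numpy as np

rng = np.random.default_rng(2)
k, d = 400, 100                      # reduced dimension k, number of columns d
R = rng.standard_normal((k, d))      # i.i.d. N(0, 1) entries
R /= np.linalg.norm(R, axis=0)       # normalize each column r_i to unit length
E = R.T @ R - np.eye(d)              # off-diagonal entries are eps_ij = r_i^T r_j
eps = E[~np.eye(d, dtype=bool)]

emp_mean = eps.mean()                # ~ 0: distortion is zero on average
emp_var = eps.var()                  # ~ 1/k, the predicted variance
```

With k = 400 the empirical variance of ε_ij lands near 1/400 = 0.0025, matching the approximation above.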
66]. However, the methodology for building RP may affect the subsequent algorithms used for processing, and this is an area that is much less studied.

7. We project the original data into a lower dimensional space with random projection and apply the CMACE filter to the reduced-dimensional data. In this simulation, we use the Gaussian ensemble method to generate a random projection matrix. In order to compare the performance of the CMACE after preprocessing with random projection, we compute the area under the ROC curve.

Figure 8-1 shows the ROC area values with different reduced dimensions by RP.

Figure 8-1. The comparison of ROC areas with different RP dimensionality (50 trials with different training images and RP matrices).

The comparison of ROC areas with different RP dimensionality (50 trials with different RP matrices, but fixed training images).

Since RP is in fact a random function, we present results in terms of the mean, maximum, and minimum performance obtained in 50 Monte Carlo simulations. At every trial we use randomly chosen different training images (N = 5) and different random projection matrices. When the MACE is applied to the original data, the average ROC area is 0.96 (the best case is 0.99 and the worst case is 0.8868). In Figure 8-1 the performance of the CMACE with reduced dimensionality k >= 144 is always better than that of the MACE filter with the original data. The range of performance between the best and the worst cases is due to both the effect of different training images and different RP matrices.

In order to monitor only the effect of RP, we fixed the training images and ran 50 Monte Carlo simulations with different RP matrices. The results on ROC areas are shown in Figure 8-2. We can see that the variation of the performance due to different RP matrices is substantially smaller, and the CMACE obtains consistent performance with RP when the image size is above 16 x 16 (dimensionality k = 256).

ROC comparison with different dimensionality reduction methods for MACE and CMACE (reduced image size is 16 x 16).

The comparison among the four dimensionality reduction methods (subsampling, pixel averaging, bilinear interpolation, and Gaussian RP) for images of size 16 x 16 (from 64 x 64) is shown in Figure 8-3. For the CMACE, the Gaussian (RP) and pixel-averaging methods work very well, with subsampling the worst, but still with robust performance. Subsampling is the simplest technique, but it can also lose important detail information. In the MACE case, the Gaussian method is the worst, with the pixel-averaging method still performing some discrimination, but at a much reduced rate (compare with Figure 7-3). It is surprising that local pixel averaging, the simplest method of dimensionality reduction, provides such robust performance in this application for both the MACE and CMACE. It indicates that coarse features are sufficient for discrimination up to a certain level of performance. However, notice that pixel averaging loses with respect to CMACE-RP when the operating point in the ROC is close to 100%, as can be expected (finer detail is needed to discriminate between classes).

Figure 8-4 shows the ROC curves when only true-class images are used for PCA. Even for this case, we had to use all the true-class images (75) to find 75 principal directions, project all the images, and then choose the 5 projected images to compose the MACE and CMACE filters. For testing, we also project the test image onto the subspace obtained from the training set. Since the false-class test images are not used to determine the PCA subspace, the projected data of the false class is not guaranteed to preserve the information of the original images; therefore, the rejection performance becomes very poor. The ROC area values for the MACE and CMACE are 0.4015 and 0.7283, respectively.

We could not obtain reasonable results with the MACE and the RP method, as shown in Figure 8-3. We will explain this MACE behavior through the Gaussian dimensionality reduction procedure, but the explanation partially also applies to the other methods. Although RP preserves similarity in the reduced projections, it changes the statistics of the original data classes. After random projection with the Gaussian ensemble method, all the projected images display statistics very close to white noise with similar variance. This result is shown in Figure 8-5, where sample images of two classes of size 16 x 16 after applying RP to all images are depicted. The first row shows the training image set, while the second

ROC comparison with PCA for MACE and CMACE (reduced dimensionality is k = 75).

row displays the in-class test set, and the third row the out-of-class test images. We see that the projected images in the true class and the false class, although slightly different in detail, seem to have very similar statistics.

The MACE, which extracts only second order information, is unable to distinguish between the projected image sets; however, the CMACE succeeds in this task. In order to explain the effectiveness of the correntropy function, we compare the correlation and correntropy in the projected space. This result is shown in Figure 8-6. We consider the 2D images as long 1D vectors. In Figure 8-6 (a) we show the autocorrelation of one original image vector in the true class, (b) depicts the autocorrelation of one of the training images after RP, which leads us to conclude that the projected image has been whitened (the only peak occurs at zero lag), and (c) shows that the cross correlation between the reduced training image vector and a test image vector in the false class after RP is practically the same as the autocorrelation of the reduced training image vector after RP. Therefore the covariance information of the images after RP is totally destroyed. Since the conventional

Sample images of size 16 x 16 after RP. (a) Training images. (b) True-class images. (c) False-class images.

MACE filter utilizes only the second order information, it is unable to discriminate between in-class and out-of-class images. However, in (e) and (f), we can see that the cross correntropy between in-class and out-of-class images is still preserved after RP, due to the fact that correntropy has the ability to extract higher order information from the reduced-dimensional data. Therefore, the CMACE filter seems very well posed to work with images reduced by random projection, for this and other applications.

We can also see the overall detection and recognition performance of the CMACE-RP through a further analysis of the output plane. Figure 8-7 shows correlation output planes for the MACE and correntropy output planes (CMACE) after dimensionality reduction with k = 64 random projections. Figure 8-7 (a) shows the desirable correlation output plane of the MACE filter given the true-class test image; however, (b) shows the poor rejection ability for the false-class test image. On the other hand, for the CMACE filter, the true- and false-class image output planes in Figure 8-7 (c) and (d) show the expected responses even with such small-dimensional images.

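The point that correntropy sees beyond second order statistics can be demonstrated with a toy experiment (not from the dissertation; the distributions and kernel size are illustrative). Two pairs of signals with identical mean, variance, and cross correlation but different higher order structure are indistinguishable by correlation, yet the sample correntropy separates them:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 20000
# Zero-mean, unit-variance sources: Gaussian vs. Laplacian pairs.
g1, g2 = rng.standard_normal(n), rng.standard_normal(n)
l1 = rng.laplace(scale=1 / np.sqrt(2), size=n)
l2 = rng.laplace(scale=1 / np.sqrt(2), size=n)

def correntropy(x, y, sigma=1.0):
    # sample cross correntropy v(x, y) = E[exp(-(x - y)^2 / (2 sigma^2))]
    return np.mean(np.exp(-(x - y) ** 2 / (2.0 * sigma ** 2)))

corr_g = np.mean(g1 * g2)     # second order: both pairs look uncorrelated
corr_l = np.mean(l1 * l2)
v_g = correntropy(g1, g2)     # correntropy: the two distributions differ
v_l = correntropy(l1, l2)
```

Both correlations are near zero, but v_l exceeds v_g measurably, because the Gaussian kernel mean depends on the full distribution of the difference, not just its variance.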
The cross correlation vs. cross correntropy. (a) Autocorrelation of one of the original training image vectors. (b) Autocorrelation of one of the reduced training image vectors after RP. (c) Cross correlation between one of the reduced training image vectors and a test image vector in the false class after RP. (d) Autocorrentropy of one of the original training image vectors. (e) Autocorrentropy of one of the reduced training image vectors after RP. (f) Cross correntropy between one of the reduced training image vectors and a test image vector in the false class after RP.

The initial idea to use a preprocessor based on random projections was to alleviate the storage and computation complexity of the CMACE. Table 8-1 presents comparisons between the original CMACE and the CMACE with RP. The dominant component for storage is the correntropy matrix (V_X). In single precision (32 bit), 64 Mbytes are needed to store V_X for 64 x 64 pixel images, but only 256 Kbytes with 16 x 16 pixel images after RP. We need an additional 4 Mbytes to perform random projection with the Gaussian ensemble method. In the binary ensemble case, no additional storage for RP is needed. The table also presents the computational complexity of (6-1) with one test image, given N = 5 training images, and clocked with MATLAB version 7.0 on a 2.8 GHz Pentium 4 processor with 2 Gbytes of RAM.

(b) With a false-class test image in the MACE. (c) With a true-class test image in the CMACE. (d) With a false-class test image in the CMACE.

Correlation output planes vs. correntropy output planes after dimension reduction with random projection (reduced image size is 8 x 8).

Table 8-1. Comparison of the memory and computation time between the original CMACE (image size of 64 x 64) and CMACE-RP (16 x 16, with Gaussian ensemble method) for one test image with N = 5.

                               CMACE (d = 4096)    CMACE-RP (k = 256)
  Memory (bytes),              O(4d^2) = 64 MB     O(4k(k + d)) = 4.2 MB
  single precision
  Time (sec)                   58.584              0.4297

Correntropy induces a new RKHS that has the same dimensionality as the input space but is nonlinearly related to it. Therefore, it is different from the conventional kernel methods, in both scope and detail. Here we illustrated that the optimal MACE filter formulation can be directly solved in the VRKHS. This CMACE overcomes the main shortcoming of the MACE, which is poor generalization. We believe this is due to the utilization of higher order statistical information in matching the target class. The CMACE also shows good rejection performance as well as robust results with additive noise. This is due to the prewhitening effect in feature space and the new metric created by correntropy that reduces the effect of outliers. Simulation results show that the detection and recognition performance of the CMACE exhibits better distortion tolerance than the MACE for some kinds of distortions (in face recognition, different facial expressions; in SAR, aspect angle as well as depression). Also, the CMACE outperforms the nonlinear kernel correlation filter, which is the kernelized SDF with prewhitened data in the input space, especially in the large-distortion case. Moreover, the CMACE preserves the shift-invariant property well.

The sensitivity of the CMACE performance to the kernel size is experimentally demonstrated to be small, but a full understanding of this parameter requires further investigation. In addition, there is still an approximation in (6-3) and (6-4) to compute the products of the projected data functionals by a kernel evaluation, which is guaranteed on average. For large images, this approximation seems to be good, but its error needs to be understood and quantified to obtain the best performance of the CMACE filter.

However, this still requires huge storage and is not very competitive with other methods for object recognition. The random projection (RP) method may make the CMACE useful for practical applications using standard computing hardware. RP is a preprocessor that extracts features of the data, but unlike PCA it is very easy to compute, in O(kd). Reducing the data into features has the double effect of addressing both the storage and computation requirements. For instance, instead of 64 Mbytes for 64 x 64 pixel images, the storage for images reduced with RP to 16 x 16 pixels is 4.2 Mbytes (256 Kbytes in the binary ensemble case). Computational speed improves by more than 100 times. The method of random projections and its impact on subsequent pattern recognition algorithms is still poorly understood. Here we verified that the MACE is incompatible with the Gaussian method of random projections, since it destroys the second order statistics that make the MACE work. The pixel-averaging method seems to preserve second order statistics to a certain degree. However, the CMACE combined with RP is a better alternative, and it is less sensitive to the method of data reduction. This can be understood if we remember

These tests with the CMACE and data reduction clearly showed a new application domain for correntropy in signal processing. The conventional data reduction methods average data locally or globally and tend to destroy the mean and variance, but apparently they preserve some of the higher order information contained in the data that can still be utilized by correntropy. Therefore, in applications where data reduction at the front end is a necessity, correntropy may still provide usable algorithms in cases where second order methods fail. This argument is also very relevant in compressive sampling (CS), where convex optimization needs to be utilized to minimize the l1 norm, since the l2 norm creates many artifacts in reconstruction. We think that the correntropy induced metric (called CIM in [42]) can be a candidate to simplify the reconstruction in CS. We have, however, to fully understand why correntropy is still able to distinguish between images or signals that have been heavily distorted, because we can then perhaps even propose new data reduction procedures that preserve the discriminability of correntropy.

First, the proposed correntropy MACE filter has hard constraints on the center of the output plane. As with the traditional SDF-type filters, linear constraints are imposed on the training images to yield a known value at specific locations in the output

Second, the computation of the correntropy MACE output requires an approximation. Unfortunately, (6-3) and (6-4) involve weighted versions of the functionals; therefore, the error in the approximation should be addressed, and further investigation is required for a good approximation.

Finally, my research presented simulation results on applications to face recognition and SAR image recognition. In addition to face recognition, the proposed algorithm can be applied to biometric verification such as iris and fingerprint recognition. Also, in the SAR application, there is a three-class (BMP2, BTR70, and T72) object classification task in the MSTAR/IU public release data set [67]. Most of the literature applies algorithms to this three-class problem to compare performance. Therefore, in order to convince other researchers, we need to evaluate our algorithm on the three-class problem as well.

Summarizing the future work:

If y is a weighted sum of variables, y = a^T x, then dy/dx = a. The general quadratic form for y in matrix notation is y = x^T A x, where A = {a_ij} is an N x N matrix of weights. Here, we assume that x is a real vector; then the vector of partial derivatives of y with respect to each variable is dy/dx = (A + A^T) x. If A is symmetric, then dy/dx = 2Ax.

The method of Lagrange multipliers is useful for minimizing a quadratic function subject to a set of linear constraints. Suppose that B = [b_1 b_2 ... b_M] is an N x M matrix with vectors b_i of length N as its columns, and c = [c_1 c_2 ... c_M]^T is a vector of M constants. We want to find the vector x which minimizes the quadratic term y = x^T A x while satisfying the linear equations B^T x = c. If A is positive semi-definite, then y is convex and there is at least one solution. We form the cost function

J = x^T A x - 2 [λ_1 (b_1^T x - c_1) + λ_2 (b_2^T x - c_2) + ... + λ_M (b_M^T x - c_M)],   (A-1)

where the scalar parameters λ_1, λ_2, ..., λ_M are known as the Lagrange multipliers.

Setting the gradient of J with respect to x to zero yields

2Ax - 2(λ_1 b_1 + λ_2 b_2 + ... + λ_M b_M) = 0.   (A-2)

Defining m = [λ_1 λ_2 ... λ_M]^T, (A-2) can be expressed as

2Ax - 2Bm = 0,   (A-3)

or

x = A^{-1} B m.   (A-4)

Substituting (A-4) for x into the constraint B^T x = c yields

B^T A^{-1} B m = c.   (A-5)

Using( A{4 )and( A{6 )weobtainthefollowingsolutiontotheconstraintoptimizationproblem 99

PAGE 100

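The closed-form minimizer derived above is easy to verify numerically. The sketch below (illustrative sizes, not from the appendix) computes x = A^{-1} B (B^T A^{-1} B)^{-1} c and checks both feasibility and optimality by perturbing along a feasible direction:

```python
import numpy as np

rng = np.random.default_rng(3)
n, M = 6, 2
Q = rng.standard_normal((n, n))
A = Q @ Q.T + n * np.eye(n)              # symmetric positive definite A
B = rng.standard_normal((n, M))          # constraint columns b_1 ... b_M
c = rng.standard_normal(M)               # constraint values

Ainv = np.linalg.inv(A)
m = np.linalg.solve(B.T @ Ainv @ B, c)   # Lagrange multipliers
x = Ainv @ B @ m                         # minimizer x = A^-1 B (B^T A^-1 B)^-1 c

# A feasible perturbation dz must satisfy B^T dz = 0 and cannot lower the cost.
z = rng.standard_normal(n)
dz = z - B @ np.linalg.solve(B.T @ B, B.T @ z)   # project z onto null(B^T)
```

At the solution the gradient 2Ax equals 2Bm, so it is orthogonal to every feasible direction, which is why the perturbed cost can only grow by dz^T A dz >= 0.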
Let p_ij(x, y) be the joint PDF of (x_i, x_j), admitting the eigenfunction expansion p_ij(x, y) = Σ_i λ_i φ_i(x) φ_i(y), where φ_i(·) and λ_i are the eigenfunctions and the eigenvalues of p_ij(x, y), respectively. Here, α_i = ∫ φ_i(x) f(x) dx. Now, let ψ_i(·) and μ_i be the eigenfunctions and the eigenvalues of the kernel κ. Observing (5-15) and (B-1), we can construct f by choosing the coefficients α_i from the eigenvalues μ_j. Then there exists f(x) = Σ_i α_i φ_i(x) satisfying (5-15).

A shift-invariant system is one for which a shift or delay of the input sequence causes a corresponding shift in the output sequence. All the components of the CMACE output are determined by the kernel function, so by proving that the Gaussian kernel is shift-invariant, we can conclude that the CMACE is shift-invariant.

Let the output of the Gaussian kernel be

y(n) = exp(-||x_1(n) - x_2(n)||^2 / (2σ^2)).

Start with a shift of the inputs, x_1s(n) = x_1(n - s_0) and x_2s(n) = x_2(n - s_0); then the response y_1(n) to the shifted inputs is

y_1(n) = exp(-||x_1(n - s_0) - x_2(n - s_0)||^2 / (2σ^2)).

Now the shift of the output, defined as y(n - s_0), becomes

y(n - s_0) = exp(-||x_1(n - s_0) - x_2(n - s_0)||^2 / (2σ^2)).

Then clearly y_1(n) = y(n - s_0); therefore, the Gaussian kernel is shift-invariant.

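The same property can be checked numerically for circular shifts. In the sketch below (illustrative, not from the appendix), the Gaussian kernel is evaluated between x_1 and every circular shift of x_2; circularly shifting one input then circularly shifts the resulting output sequence:

```python
import numpy as np

def kernel_outputs(x1, x2, sigma=1.0):
    # Gaussian kernel evaluated between x1 and every circular shift of x2
    n = len(x1)
    return np.array([np.exp(-np.sum((x1 - np.roll(x2, m)) ** 2)
                            / (2.0 * sigma ** 2)) for m in range(n)])

rng = np.random.default_rng(4)
x1 = rng.standard_normal(32)
x2 = rng.standard_normal(32)
s0 = 5

y = kernel_outputs(x1, x2)
y_shifted = kernel_outputs(x1, np.roll(x2, s0))   # shift one input by s0
```

The shifted-input output equals the original output circularly shifted by s0, mirroring the y_1(n) = y(n - s_0) identity above.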
Here, only the computational complexity of the matrix inversions and multiplications in both the MACE and CMACE is considered. Let us assume that all the elements of the matrices are real.

In order to construct the MACE template in the frequency domain with a given training image set, O(d) multiplications are needed for the inversion of the diagonal matrix D of size d x d and O(N^2) for the inversion of the Toeplitz matrix (X^H D^{-1} X) of size N x N. The numbers of multiplications are O(N^2) for D^{-1}X, O(dN^2 + d) for (X^H D^{-1} X), and O(dN^2) for D^{-1}X (X^H D^{-1} X)^{-1}. In addition, the FFT needs O((N + 1) d log2(d)) multiplications with N training images. In reality the elements of the matrices are complex valued; therefore, the MACE requires a total of O(4(d(2N^2 + N + 2) + N^2) + N d log2(d)) multiplications to compose the template for the true class in the frequency domain. For the test of one input image after building a template, the MACE requires only O(4d + d log2(d)) multiplications.

The CMACE needs O(d^2) and O(N^2) multiplications for the inversions of the Toeplitz matrix V_X of size d x d and of T_XX of size N x N, respectively, and O((Nd)^2) to compute T_XX; therefore, the total number of multiplications in off-line mode with the given training image set is O(d^2(N^2 + 1) + N^2). For the testing of one image, O(N^2) multiplications for the output and O((Nd)^2) operations for obtaining T_ZX are needed; therefore, the total computational complexity of the CMACE for one test image requires O(d^2 N + N^2) multiplications.

The fast CMACE with the FGT reduces the computational complexity to O(N^2 √d (k_c + 1) + d^2 + N^2) for the training set and to O(√d (k_c + 1) N + N^2) for one test image.

68], [69], and has found numerous applications in radar, sonar, seismology, radio astronomy, medical imaging, speech processing, and wireless communications. The classical approach to beamforming is a natural extension of Fourier-based spectral analysis to spatio-temporally sampled data, which is called the conventional Bartlett beamformer [70]. This algorithm maximizes the energy of the beamforming output for a given input signal. Because it is independent of the signal characteristics and depends only on a certain direction, its major difficulties are low spatial resolution and high sidelobes. In an attempt to alleviate the limitations of the conventional beamformer, the Capon beamformer was introduced [71], [72].

The Capon beamformer attempts to minimize the output energy contributed by interference coming from directions other than the "look direction." Moreover, it maintains a fixed constant gain in the "look direction" (normalized to one) in order not to risk the loss of the signal containing the information. The Capon beamformer is sensitive to the mismatch between the assumed and actual array steering vectors, which occurs often in practice. Recently, a robust beamformer was proposed by extending the Capon beamformer to the case of uncertain array steering vectors [73], [74].

From a statistical point of view, most of these techniques are based on linear models, which make use of only the first and second order moment information (e.g., the mean and the variance) of the data. Therefore, they are not an appropriate choice for non-Gaussian distributed data, such as impulsive noise scenarios. In order to deal with more realistic situations, further research into signal modeling has led to the realization that many natural phenomena can be better represented by distributions of a more impulsive nature.

75]. Alpha-stable distributions have been used to model diverse phenomena such as random fluctuations of gravitational fields, economic market indexes [76], and radar clutter [77].

To overcome the limitation of the linear model in the non-Gaussian statistics case, a nonlinear beamformer has been proposed in [78], but most nonlinear beamforming methods involve complicated weight vector computations. Recently, kernel based learning algorithms have been heavily researched due to the fact that linear algorithms can be easily extended to nonlinear versions through kernel methods [23]. Some kernel based methods have been presented in [79], [80], [26] for beamforming and target detection problems.

The correntropy MACE (CMACE) filter [46], [81], which is the nonlinear version of the correlation filter, has been shown to possess good generalization and rejection performance in image recognition applications.

In this appendix, we apply correntropy to the beamforming problem and exploit the linear structure of the RKHS induced by correntropy to formulate the correntropy beamformer. Due to the fact that it involves higher-order statistics through the nonlinear relation between the input and feature spaces, the correntropy beamformer shows better performance than the Capon and kernel methods and is robust to impulsive noise scenarios.

E.2.1 Problem

and n_k is the M x 1 vector of additive white noise. Also, the beamformer output is given by

y_k = w^H x_k,

where w ∈ C^{M x 1} is a vector of weights and H denotes the conjugate transpose. The goal is to satisfy w^H a(θ) = 1 and minimize the effect of the noise (w^H n_k), in which case y_k recovers s_k.

Besides, we also assume that each element of n_k follows a symmetric α-stable (SαS) distribution described by the characteristic function

φ(t) = exp(jμt - γ|t|^α),   (E-4)

where α is the characteristic exponent restricted to the values 0 < α <= 2, μ (-∞ < μ < ∞) is the location parameter, and γ (γ > 0) is the dispersion of the distribution. The value of α is related to the degree of impulsiveness of the distribution. Smaller values of α correspond to heavier-tailed distributions and hence to more impulsive behavior, while as α increases, the tails are lighter and the behavior is less impulsive. The special case of α = 2 corresponds to the Gaussian distribution (N(μ, σ^2)), while α = 1 corresponds to the Cauchy distribution.

min_w E[y_k^2]  subject to  w^H a(θ) = 1.   (E-5)

71], [72]. Equation (E-5) has an analytical solution given by

w_capon = R_x^{-1} a(θ) / (a(θ)^H R_x^{-1} a(θ)),   (E-6)

where R_x denotes the covariance matrix of the array output vector. In practical applications, R_x is replaced by the sample covariance matrix

R̂_x = (1/N) Σ_{k=1}^{N} x_k x_k^H,   (E-7)

with N denoting the number of snapshots. Substituting w_capon into equation (E-3) gives the constrained least squares estimate of the look-direction output.

The kernel trick [23], κ(x_i, x_j) = <Φ(x_i), Φ(x_j)>, is a simple and elegant idea that allows us to obtain nonlinear versions of any linear algorithm expressed in terms of inner products, without even knowing the exact mapping. Using the constrained least-squares approach explained in the previous section, it can easily be shown that the equivalent solution w_kernel in the feature space is given by

w_kernel = R_Φ(x)^{-1} Φ(a(θ)) / (Φ(a(θ))^H R_Φ(x)^{-1} Φ(a(θ))),   (E-9)

where R_Φ(x) is the correlation matrix in the feature space. The estimated correlation matrix is given by

y_k = Φ(x_k)^H R_Φ(x)^{-1} Φ(a(θ)) / (Φ(a(θ))^H R_Φ(x)^{-1} Φ(a(θ))).   (E-11)

Due to the high dimensionality of the feature space, equation (E-11) cannot be directly implemented in the feature space. It needs to be converted into kernel function evaluations through the eigenvector decomposition procedure of kernel PCA [26]. The kernelized version of the beamformer output is expressed in terms of K = X^H X, an N x N Gram matrix whose entries are the dot products κ(x_i, x_j) = <Φ(x_i), Φ(x_j)>.

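The Capon solution (E-6) with the sample covariance (E-7) can be sketched numerically. The setup below is a toy half-wavelength uniform linear array with a BPSK source plus one interferer (all scenario parameters are illustrative assumptions, not taken from the appendix):

```python
import numpy as np

rng = np.random.default_rng(5)
M, N = 8, 1000                                   # sensors, snapshots

def steer(theta_deg):
    # steering vector of a half-wavelength uniform linear array
    return np.exp(1j * np.pi * np.arange(M) * np.sin(np.deg2rad(theta_deg)))

a = steer(20.0)                                  # look direction
ai = steer(-40.0)                                # interference direction
s = np.sign(rng.standard_normal(N))              # BPSK source symbols
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
x = np.outer(a, s) + 0.7 * np.outer(ai, rng.standard_normal(N)) + noise

Rx = x @ x.conj().T / N                          # sample covariance, cf. (E-7)
Ria = np.linalg.solve(Rx, a)
w = Ria / (a.conj() @ Ria)                       # Capon weights, cf. (E-6)
y = w.conj() @ x                                 # look-direction output
ber = np.mean(np.sign(y.real) != s)              # symbol error rate
```

The distortionless constraint w^H a(θ) = 1 holds by construction, and with enough snapshots the interferer is suppressed, so the BPSK symbols are recovered almost error-free in this Gaussian-noise scenario.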
for k = 1, 2, ..., N. Given data samples, the cross correntropy between the received signal at the kth snapshot and the filter can be estimated for all the lags m = -M + 1, ..., M - 1.

The correntropy energy of the kth received signal output is characterized by the M x M correntropy matrix V_xk, where each element of the matrix is computed without explicit knowledge of the mapping function f, for l = 0, ..., M - 1.

The average correntropy energy over all the received data can be written in terms of V_X = (1/N) Σ_{k=1}^{N} V_xk. The correntropy beamformer weights then solve

min_{f_w} f_w^H V_X f_w  subject to  f_w^H f_a(θ) = 1.   (E-23)

Then the output is given by expressions (E-26) and (E-27), where w_ij is the (i, j)th element of V_X^{-1}, x_k(i) is the ith element of the received signal at the kth snapshot, and a(i) is the ith element of the steering vector.

The final output expressions in (E-26) and (E-27) are obtained by approximating f(a(j))f(a(i)) and f(x_k(j))f(a(i)) by κ(a(j) - a(i)) and κ(x_k(j) - a(i)), respectively, which is similar to the kernel trick and holds on average because of Property 5.

Figure E-1 shows the beam patterns of the Capon, kernel, and correntropy beamformers with N = 100 and 1000 for the case where the signal-to-noise ratio (SNR) is 10 dB. The Capon beamformer has poor performance, i.e., higher sidelobes, for a small number of snapshots N, while the kernel method and correntropy beamformer show a good beam pattern even with a small N. It is well known that one of the problems of the standard Capon beamformer is poor performance with a small amount of training data. In Figure E-2, we show the BER performance with N = 100 and 1000 for SNRs between 5 and 15 dB. Figure E-2 (a) shows that for N = 100 the Capon beamformer exhibits a high BER floor, but the proposed beamformer has a much better BER performance than the Capon and kernel beamformers. For N = 1000 in Figure E-2 (b), when the SNR is under 9 dB the Capon beamformer shows better BER performance than the other two methods, but as the SNR increases, the BER of the correntropy beamformer becomes the best.

Next, we test the robustness of the Capon, kernel, and correntropy beamformers to impulsive noise with N = 1000. We select γ such that the SNR is 10 dB for α-stable noise with α = 2 and μ = 0 (Gaussian noise). Figure E-3 shows the BER performance of the three beamformers at different α levels. The correntropy beamformer displays superior performance for decreasing α, that is, increasing strength of impulsiveness. From this result, we can say that the proposed method is robust in terms of BER in impulsive noise environments for wireless communications.

Figures E-4 (a) and (b) show the beam patterns of the three beamformers at α = 1.5 and α = 1.0, respectively. When α = 1.5 in Figure E-4 (a), the beam pattern of Capon is similar to that of the kernel method, and the gain of its sidelobe is higher than that of correntropy by 2 dB. As α decreases, the gap in sidelobe gain between Capon and correntropy increases, as shown in Figure E-4 (b).

(b) N = 1000

Figure E-1. Comparisons of the beam pattern for the three beamformers in Gaussian noise with 10 dB of SNR.

(b) N = 1000

Figure E-2. Comparisons of BER for the three beamformers in Gaussian noise with different SNRs.

Figure E-3. Comparisons of BER for the three beamformers with different characteristic exponent (α) levels.

(b) α = 1.0

Figure E-4. Comparisons of the beam pattern for the three beamformers in non-Gaussian noise.

[1] B. V. Kumar, "Tutorial survey of composite filter designs for optical correlators," Appl. Opt., vol. 31, pp. 4773-4801, 1992.
[2] A. Mahalanobis, A. Forman, M. Bower, R. Cherry, and N. Day, "Multi-class SAR ATR using shift invariant correlation filters," special issue of Pattern Recognition on correlation filters and neural networks, vol. 27, pp. 619-626, 1994.
[3] A. Mahalanobis, B. Vijaya Kumar, D. W. Carlson, and S. Sims, "Performance evaluation of distance classifier correlation filters," in Proc. SPIE, 1994, vol. 2238, pp. 2-13.
[4] R. Shenoy and D. Casasent, "Correlation filters that generalize well," in Proc. SPIE, March 1998, vol. 3386, pp. 100-110.
[5] M. Savvides, B. V. Kumar, and P. Khosla, "Face verification using correlation filters," in Proc. Third IEEE Automatic Identification Advanced Technologies, Tarrytown, NY, 2002, pp. 56-61.
[6] B. V. Kumar, M. Savvides, C. Xie, and K. Venkataramani, "Biometric verification with correlation filters," Applied Optics, vol. 43, no. 2, pp. 391-402, Jan. 2004.
[7] B. V. K. V. Kumar, A. Mahalanobis, and R. Juday, Correlation Pattern Recognition, Cambridge University Press, 2005.
[8] G. Turin, "An introduction to matched filters," IEEE Trans. Information Theory, vol. 6, pp. 311-329, 1960.
[9] S. M. Kay, Fundamentals of Statistical Signal Processing, Volume II: Detection Theory, Prentice-Hall, 1998.
[10] A. VanderLugt, "Signal detection by complex spatial filtering," IEEE Trans. Information Theory, no. 10, pp. 139-145, 1964.
[11] C. Hester and D. Casasent, "Multivariant technique for multiclass pattern recognition," Appl. Opt., vol. 19, pp. 1758-1761, 1980.
[12] B. V. Kumar, "Minimum variance synthetic discriminant functions," J. Opt. Soc. Am. A, vol. 3, no. 10, pp. 1579-1584, 1986.
[13] A. Mahalanobis, B. V. Kumar, and D. Casasent, "Minimum average correlation energy filters," Appl. Opt., vol. 26, no. 17, pp. 3633-3640, 1987.
[14] D. Casasent and G. Ravichandran, "Advanced distortion-invariant minimum average correlation energy (MACE) filters," Appl. Opt., vol. 31, no. 8, pp. 1109-1116, 1992.
[15] G. Ravichandran and D. Casasent, "Minimum noise and correlation energy filters," Appl. Opt., vol. 31, no. 11, pp. 1823-1833, 1992.

[16] P. Refregier and J. Figue, "Optimal trade-off filter for pattern recognition and their comparison with Wiener approach," Opt. Computer Process., vol. 1, pp. 3-10, 1991.
[17] A. Mahalanobis, B. V. Kumar, S. Song, S. Sims, and J. Epperson, "Unconstrained correlation filters," Appl. Opt., vol. 33, pp. 3751-3759, 1994.
[18] A. Mahalanobis, B. V. Kumar, and S. Sims, "Distance-classifier correlation filters for multiclass target recognition," Appl. Opt., vol. 35, no. 17, pp. 3127-3133, June 1996.
[19] M. Alkanha and B. V. Kumar, "Polynomial distance classifier correlation filter for pattern recognition," Appl. Opt., vol. 42, no. 23, pp. 4688-4708, Aug. 2003.
[20] J. Fisher and J. Principe, "A nonlinear extension of the MACE filter," Neural Networks, vol. 8, pp. 1131-1141, 1995.
[21] J. Fisher and J. Principe, "Formulation of the MACE filter as a linear associative memory," in Proc. Int. Conf. on Neural Networks, 1994, vol. 5.
[22] J. Fisher and J. Principe, "Recent advances to nonlinear MACE filters," Optical Engineering, vol. 36, no. 10, pp. 2697-2709, Oct. 1998.
[23] B. Scholkopf and A. J. Smola, Learning with Kernels, The MIT Press, 2002.
[24] B. Scholkopf, A. J. Smola, and K. Muller, "Kernel principal component analysis," Neural Computation, vol. 10, pp. 1299-1319, 1998.
[25] A. Ruiz and E. Lopez-de Teruel, "Nonlinear kernel-based statistical pattern analysis," IEEE Trans. on Neural Networks, vol. 12, pp. 16-32, 2001.
[26] H. Kwon and N. M. Nasrabadi, "Kernel matched signal detectors for hyperspectral target detection," in Proc. Int. Conf. Acoustics, Speech, Signal Processing (ICASSP), 2005, vol. 4, pp. 665-668.
[27] K. Jeong, P. Pokharel, J. Xu, S. Han, and J. Principe, "Kernel synthetic discriminant function for object recognition," in Proc. Int. Conf. Acoustics, Speech, Signal Processing (ICASSP), France, May 2006, vol. 5, pp. 765-768.
[28] C. Xie, M. Savvides, and B. V. Kumar, "Kernel correlation filter based redundant class-dependence feature analysis (KCFA) on FRGC2.0 data," in Proc. 2nd Int. Workshop Analysis Modeling of Faces Gesture (AMFG), Beijing, 2005.
[29] I. Santamaria, P. Pokharel, and J. Principe, "Generalized correlation function: Definition, properties and application to blind equalization," IEEE Trans. Signal Processing, vol. 54, no. 6, pp. 2187-2197, June 2006.
[30] P. Pokharel, R. Agrawal, and J. Principe, "Correntropy based matched filtering," in Proc. IEEE Int. Workshop on Machine Learning for Signal Processing (MLSP), Sept. 2005, pp. 148-155.

[31] J. W. Fisher, Nonlinear Extensions to the Minimum Average Correlation Energy Filter, Ph.D. dissertation, University of Florida, Gainesville, FL, 1997.
[32] B. Boser, I. Guyon, and V. Vapnik, "A training algorithm for optimal margin classifiers," in Proc. 5th COLT, 1992, pp. 144-152.
[33] V. Vapnik, The Nature of Statistical Learning Theory, Springer Verlag, 1995.
[34] N. Cristianini and J. S. Taylor, An Introduction to Support Vector Machines, Cambridge University Press, 2000.
[35] N. Aronszajn, "Theory of reproducing kernels," Trans. Amer. Math. Soc., vol. 68, pp. 337-404, 1950.
[36] E. Parzen, "On the estimation of probability density function and mode," The Annals of Mathematical Statistics, 1962.
[37] J. Mercer, "Functions of positive and negative type, and their connection with the theory of integral equations," Philosophical Trans. of the Royal Society of London, vol. 209, pp. 415-446, 1909.
[38] http://www.amp.ece.cmu.edu: Advanced Multimedia Processing Lab, Electrical and Computer Engineering, CMU.
[39] E. Parzen, "Statistical methods on time series by Hilbert space methods," Tech. Rep. No. 23, Applied Mathematics and Statistics Laboratory, Stanford University, 1959.
[40] S. Sudharsanan, A. Mahalanobis, and M. Sundareshan, "Unified framework for the synthesis of synthetic discriminant functions with reduced noise variance and sharp correlation structure," Optical Engineering, 1990.
[41] J. C. Principe, D. Xu, and J. Fisher, "Information theoretic learning," in Unsupervised Adaptive Filtering, S. Haykin, Ed., pp. 265-319, John Wiley, 2000.
[42] W. Liu, P. P. Pokharel, and J. C. Principe, "Correntropy: properties and applications in non-Gaussian signal processing," in press, IEEE Trans. on Signal Processing.
[43] W. Liu, P. Pokharel, and J. Principe, "Correntropy: a localized similarity measure," in Proc. 2006 IEEE World Congress on Computational Intelligence (WCCI), Canada, July 2006, pp. 10018-10023.
[44] B. W. Silverman, Density Estimation for Statistics and Data Analysis, CRC Press, 1986.
[45] P. Pokharel, J. Xu, D. Erdogmus, and J. Principe, "A closed form solution for a nonlinear Wiener filter," in Proc. Int. Conf. Acoustics, Speech, Signal Processing (ICASSP), France, May 2006, vol. 3, pp. 720-723.

K.JeongandJ.Principe,\ThecorrentropyMACElterforimagerecognition,"inProc.IEEEInt.WorkshoponMachineLearningforsignalProcessing(MLSP),Ireland,July2006,pp.9{14. [47] L.GreengardandJ.Strain,\Thefastgausstransform,"SIAMJ.Sci.Statist.Comput.,vol.12,no.1,pp.79{94,Jan.1991. [48] T.Gonzalez,\Clusteringtominimizethemaximuminterclusterdistance,"TheoreticalComputerScience,vol.38,pp.293{306,1985. [49] T.Ross,S.Worrell,V.Velten,J.Mossing,andM.Bryant,\StandardSARATRevaluationexperimentsusingtheMSTARpublicreleasedataset,"inProc.SPIE,April1998,vol.3370,pp.566{573. [50] R.GonzalezandR.Woods,DigitalImageProcessing,SecondEdition,PrenticeHall,2002. [51] M.TurkandA.Pentland,\Eigenfacesforrecognition,"JournalofCognitiveNeuroscience,vol.3,no.1,pp.71{86,1991. [52] R.O.Duda,P.E.Hart,andS.D.G,PatternClassication,SecondEdition,JohnWillyandsons,2001. [53] D.Erdogmus,Y.Rao,H.Peddaneni,A.Hegde,andJ.Principe,\Recursiveprincipalcomponentsanalysisusingeigenvectormatrixperterbation,"EURASIPJournalonAppliedSignalProcessing,,no.13,pp.2034{2041,Mar.2004. [54] K.Fukunaga,Introductiontostatisticalpatternrecognition,SecondEdition,AcademicPressProfessional,1990. [55] S.Kaski,\Dimensionalityreductionbyrandommapping:Fastsilimaritycomputationforclustering,"inProc.Int.JointConf.onNeuralNetworks(IJCNN),1998,pp.413{418. [56] D.FradkinandD.Madigan,\Experimentswithrandomprojectionformachinelearning,"inProc.ConferenceonKnowledgeDiscoveryandDataMining,2003,pp.517{522. [57] E.BrighamandH.Maninila,\Randomprojectionindimensionalityreduction:applicationstoimageandtextdata,"inProc.ConferenceonKnowledgeDiscoveryandDataMining,2001,pp.245{250. [58] D.Achlioptas,\Database-friendlyrandomprojections,"inSymposiumonPrinciplesofDatabaseSystems(PODS),2001,pp.274{281. [59] S.Dasgupta,\Experimentswithrandomprojection,"inProc.ConferenceonUncertaintyinArticialIntelligence,2000. 120
[60] E. Candes, J. Romberg, and T. Tao, "Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information," IEEE Trans. on Information Theory, vol. 52, no. 2, pp. 489-509, 2006.
[61] S. Dasgupta and A. Gupta, "An elementary proof of the Johnson-Lindenstrauss lemma," Tech. Rep. TR-99-006, International Computer Science Institute, Berkeley, CA, 1999.
[62] G. H. Golub and C. F. Van Loan, Matrix Computations, North Oxford Academic, Oxford, UK, 1983.
[63] E. J. Candes and T. Tao, "Near-optimal signal recovery from random projections: Universal encoding strategies?," IEEE Trans. on Information Theory, vol. 52, no. 12, pp. 5406-5425, Dec. 2006.
[64] D. Achlioptas, "Database-friendly random projections: Johnson-Lindenstrauss with binary coins," Journal of Computer and System Sciences, vol. 66, pp. 671-687, 2003.
[65] R. Hecht-Nielsen, Context Vectors: General Purpose Approximate Meaning Representations Self-Organized from Raw Data, IEEE Press, 1994.
[66] D. Donoho, "Compressed sensing," IEEE Trans. on Information Theory, vol. 52, no. 4, pp. 1289-1306, 2006.
[67] D. Casasent and N. A., "Confuser rejection performance of EMACH filters for MSTAR ATR," in Proc. SPIE, April 2006, vol. 6245, pp. 62450D1-12.
[68] B. D. Van Veen and K. M. Buckley, "Beamforming: a versatile approach to spatial filtering," IEEE ASSP Magazine, vol. 5, no. 2, pp. 4-22, April 1988.
[69] H. Krim and M. Viberg, "Two decades of array signal processing research: the parametric approach," IEEE Signal Processing Magazine, vol. 13, no. 4, pp. 67-94, July 1996.
[70] M. S. Bartlett, "Smoothing periodograms from time series with continuous spectra," Nature, vol. 161, no. 4096, pp. 686-687, May 1948.
[71] J. Capon, "High-resolution frequency-wavenumber spectrum analysis," Proceedings of the IEEE, vol. 57, no. 8, pp. 1408-1418, August 1969.
[72] R. T. Lacoss, "Data adaptive spectral analysis methods," Geophysics, vol. 36, no. 4, pp. 134-148, August 1971.
[73] P. Stoica, Z. Wang, and J. Li, "Robust Capon beamforming," IEEE Signal Processing Letters, vol. 10, no. 6, pp. 172-175, June 2003.
[74] R. G. Lorenz and S. P. Boyd, "Robust minimum variance beamforming," IEEE Trans. Signal Processing, vol. 53, no. 5, pp. 1684-1696, May 2005.
[75] M. Shao and C. L. Nikias, "Signal processing with fractional lower order moments: Stable processes and their applications," Proceedings of the IEEE, vol. 81, no. 7, pp. 986-1010, July 1993.
[76] R. Adler, R. E. Feldman, and M. S. Taqqu, A Practical Guide to Heavy Tails: Statistical Techniques and Applications, Boston, MA: Birkhauser, 1998.
[77] P. Tsakalides, R. Raspanti, and C. L. Nikias, "Angle/Doppler estimation in heavy-tailed clutter backgrounds," IEEE Trans. Aerospace and Electronic Systems, vol. 35, no. 2, pp. 419-436, April 1999.
[78] T. Lo, H. Leung, and J. Litva, "Nonlinear beamforming," Electronics Letters, vol. 27, no. 4, pp. 350-352, February 1991.
[79] S. Chen, L. Hanzo, and A. Wolfgang, "Kernel-based nonlinear beamforming construction using orthogonal forward selection with the Fisher ratio class separability measure," IEEE Signal Processing Letters, vol. 11, no. 6, pp. 478-481, May 2004.
[80] M. Martinez-Ramon, J. L. Rojo-Alvarez, and G. Camps-Valls, "Kernel antenna array processing," IEEE Trans. Antennas and Propagation, vol. 55, no. 3, pp. 642-650, March 2007.
[81] K. H. Jeong, W. Liu, S. Han, and J. C. Principe, "The correntropy MACE filter," submitted to IEEE Trans. on Pattern Analysis and Machine Intelligence (PAMI).
Kyu-Hwa Jeong was born in June 1972 in Korea. He received the M.S. degree in electronics engineering from Yonsei University, Seoul, Korea, in 1997, where he focused on adaptive filter theory and its applications to acoustic echo cancellation. From 1997 to 2003, he was a senior research engineer with the Digital Media Research Lab at LG Electronics, Seoul, Korea, where he was a member of the optical storage group and participated mainly in CD/DVD recorder projects. Since 2003, he has been pursuing the Ph.D. degree with the Computational NeuroEngineering Laboratory in electrical and computer engineering at the University of Florida, Gainesville, FL. His research interests are in the fields of signal processing, machine learning, and their applications to image pattern recognition.