
Nonlinear Signal Processing Based on Reproducing Kernel Hilbert Space

Permanent Link: http://ufdc.ufl.edu/UFE0021159/00001

Material Information

Title: Nonlinear Signal Processing Based on Reproducing Kernel Hilbert Space
Physical Description: 1 online resource (161 p.)
Language: english
Creator: Xu, Jianwu
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2007

Subjects

Subjects / Keywords: correntropy, dependence, information, kernel, pca, pitch, rkhs, synchronization
Electrical and Computer Engineering -- Dissertations, Academic -- UF
Genre: Electrical and Computer Engineering thesis, Ph.D.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, territorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

Notes

Abstract: My research analyzes the recently proposed correntropy function and presents a new centered correntropy function from both time-domain and frequency-domain approaches. It demonstrates that the correntropy and centered correntropy functions not only capture the time and space structures of signals, but also partially characterize the higher-order statistical information and nonlinearity intrinsic to random processes. Correntropy and centered correntropy functions have rich geometrical structures. Correntropy is positive definite and centered correntropy is non-negative definite; hence, by the Moore-Aronszajn theorem, each uniquely induces a reproducing kernel Hilbert space. Correntropy and centered correntropy combine the data-dependent expectation operator with data-independent kernels to form another data-dependent operator. They can be formulated as "generalized" correlation and covariance functions on random signals nonlinearly transformed via the data-independent kernel functions. If isotropic kernel functions are used, those nonlinearly transformed signals lie on a sphere in the reproducing kernel Hilbert space induced by the kernel functions. The other approach is to work directly with the reproducing kernel Hilbert space induced by the correntropy and centered correntropy functions. The nonlinearly transformed signals in this reproducing kernel Hilbert space are no longer stochastic but rather deterministic. The reproducing kernel Hilbert space induced by the correntropy and centered correntropy functions includes the expectation operator as embedded vectors. The two views further our understanding of the correntropy and centered correntropy functions from a geometrical perspective. The two reproducing kernel Hilbert spaces, induced by the kernel functions and the correntropy functions respectively, represent stochastic and deterministic functional analysis.
The correntropy dependence measure, based on the correntropy coefficient, is proposed as a novel statistical dependence measure. The new measure satisfies all the fundamental desirable properties postulated by Rényi. We apply the correntropy concept to pitch determination and nonlinear component analysis. The correntropy coefficient is also employed as a novel similarity measure to quantify the inter-dependencies of multi-channel signals.
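The abstract's view of correntropy as a "generalized" correlation via a data-independent kernel can be made concrete with the standard sample estimators from the correntropy literature. The sketch below is not part of this record; it is a minimal Python illustration assuming a Gaussian kernel, where correntropy averages the kernel over paired differences and centered correntropy subtracts the mean over all cross pairs (the analogue of removing the mean in a covariance):

```python
import numpy as np

def gaussian_kernel(u, sigma=1.0):
    # Normalized Gaussian kernel evaluated at the difference u
    return np.exp(-u**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def correntropy(x, y, sigma=1.0):
    # Sample correntropy: average kernel over paired differences x_i - y_i
    return np.mean(gaussian_kernel(x - y, sigma))

def centered_correntropy(x, y, sigma=1.0):
    # Centered correntropy: subtract the average kernel over ALL cross
    # pairs (x_i, y_j), which estimates E_X E_Y[k(X - Y)]
    cross_mean = np.mean(gaussian_kernel(x[:, None] - y[None, :], sigma))
    return correntropy(x, y, sigma) - cross_mean
```

For identical signals, correntropy attains the kernel's peak value k(0), and centered correntropy is positive whenever paired samples are more similar than randomly matched ones.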
General Note: In the series University of Florida Digital Collections.
General Note: Includes vita.
Bibliography: Includes bibliographical references.
Source of Description: Description based on online resource; title from PDF title page.
Source of Description: This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Statement of Responsibility: by Jianwu Xu.
Thesis: Thesis (Ph.D.)--University of Florida, 2007.
Local: Adviser: Principe, Jose C.

Record Information

Source Institution: UFRGP
Rights Management: Applicable rights reserved.
Classification: lcc - LD1780 2007
System ID: UFE0021159:00001



5f3d2015dd05c941c5bbf8563ccb89f5089f5e85
56254 F20101129_AAAJWN xu_j_Page_080.jpg
e0f34cd484038517f864352098c6d1e8
5124dbd31d77dbe6d9850a7c5c29a3bba477a2d4
91278 F20101129_AAAJVY xu_j_Page_064.jpg
961dce59f9819ca07ca789f60f521681
cbc2489cc5d088c3f5e6a6b29e64b93b26eaabd9
1051965 F20101129_AAAKCH xu_j_Page_112.jp2
fd14b9ff9aa9e2fa53b74230bc84ec88
a67aaa8bbfdcb19ebd128441fa2cf4e76a086f55
21488 F20101129_AAAKZQ xu_j_Page_065.QC.jpg
00e8accd723be411793ab5422b9e4c5e
fe248208d88f5128b67d147deb4f36a5ea7ca72b
70593 F20101129_AAAJXC xu_j_Page_096.jpg
bd963add18fa17398e1e8c0283b3d597
130207e54ab1f41b3687ac7b3ede52ac03665092
73168 F20101129_AAAKBU xu_j_Page_097.jp2
0dfcdb27ae1922bceacb0ba2aae1cd26
6651225c79b46ade1dee469163ce3a28698c640a
71846 F20101129_AAAJWO xu_j_Page_082.jpg
99a2b2ec8c713f3187291c0d22cf3707
af343b0bf7b9db8fcb8870b1faf72a05b2c98b22
71414 F20101129_AAAJVZ xu_j_Page_065.jpg
4a7b66a619f1eaee100204579c0b9915
591a7a70317b7f7b80b4e2342015998928df0ee1
689918 F20101129_AAAKCI xu_j_Page_116.jp2
b8d99ec6690b06bd8f5f19f1a0db12bf
8d230ed6002fbc3a9031bac7bfdd083a50108362
11734 F20101129_AAAKZR xu_j_Page_067.QC.jpg
e7345e80161e116c73d4429302b8c790
669fa28a17a61b388bb0fd9046c485ca1c9ca070
49474 F20101129_AAAJXD xu_j_Page_097.jpg
d1b99ab95e9e5a476f3c4fccea36c549
fca6485c2a646b55fa1fdf7ec8721ff9262f7691
759300 F20101129_AAAKBV xu_j_Page_098.jp2
b692dc5a9163067282f503ed4f498cbc
2942f19b7f67b05dd24e066bb5ce0e79c7925511
62150 F20101129_AAAJWP xu_j_Page_083.jpg
3b055475eedc92a7a70bd6421fb1fb12
51452ae510a83cf8b69fdb36727b3ddf6c4f006e
678693 F20101129_AAAKCJ xu_j_Page_117.jp2
be734c2d6be25f133eb10f8a56b1aae0
ba93a7ea7c791e74d17a652ade64c220ee89280c
3794 F20101129_AAAKZS xu_j_Page_067thm.jpg
f61e6485448a7ea512e7d4c7e1b3c2f9
35d8699f7e9dda393264826d9206e64e2ed690ee
54389 F20101129_AAAJXE xu_j_Page_098.jpg
70238a47ceb6f793cca0695965abb8ed
245d30060eb5898c2516b0c52becaad11daa194e
668503 F20101129_AAAKBW xu_j_Page_099.jp2
26405e01871159f2f36c854222b6e893
204f3c5bf9624c84ea26d8dc17929ed89e62d608
90119 F20101129_AAAJWQ xu_j_Page_084.jpg
f5068a0a5598b60caf7b5bfa3577f3dd
76021e4108adf65da8c7b065bfeb315ec930c666
774140 F20101129_AAAKCK xu_j_Page_118.jp2
ff7fd78dbb8730bcb28b8fdd948c974c
f6611d770be738582db9e89e7510b892b7fc73fb
5180 F20101129_AAAKZT xu_j_Page_069thm.jpg
77a03374ea4f4dc7df69dd3908f3287c
1539fd279c1b9671dc0bcb7599bf427237281b97
47757 F20101129_AAAJXF xu_j_Page_099.jpg
50cab19f8288f3e6838850733a9921ac
309b5865e5d8b3992d423c7ce4e9f3ade77642f8
815195 F20101129_AAAKBX xu_j_Page_100.jp2
9fea5f7078ab35321be6f5d302033c65
9c699bd044128f8d14bf59f5df490b238beed543
69529 F20101129_AAAJWR xu_j_Page_085.jpg
f4580ba4eef5550a381dc2216db3f081
1a82d9d4f92a5532bbb2385eab6027e584d45f95
999694 F20101129_AAAKDA xu_j_Page_138.jp2
9e88f575930cb123c9a300996811539f
28f887a2ad3a1e07c0291a93be3df3842a017e6f
845355 F20101129_AAAKCL xu_j_Page_119.jp2
ded3d614f68d85d7685eb73e45e25742
dc29a0f66e5e071fad6a58f90bd433ebf7450c6a
12507 F20101129_AAAKZU xu_j_Page_070.QC.jpg
f50da8f18a99e409650cc132c98935b8
940a5d2bbbf59dd62d16e359eb8180684dfdf23b
58959 F20101129_AAAJXG xu_j_Page_100.jpg
f77583f143a06cccdcfcd8344f947533
31038a657457da6e700bd9a8dc451b268af34416
970699 F20101129_AAAKBY xu_j_Page_101.jp2
2a4635ee432833490bab2f9c378096c8
2ca7036f4f4cf1cba3163a5eca9d1886df8cb9bd
39393 F20101129_AAAJWS xu_j_Page_086.jpg
5ca486684af9daad7f5e5276eeca68e7
cf075c3c73a6297303668cb9cc9d81bfd6a99165
1051895 F20101129_AAAKDB xu_j_Page_140.jp2
e74f40586abb63c30ad30e96a3d26ca0
1975811fc618239c8a8fba8cf5996eacd8af1827
784185 F20101129_AAAKCM xu_j_Page_122.jp2
337d4d288927eddc6fa599c9d73113f3
272f58e20e1c15e5399260ebec50ed7042247fec
17401 F20101129_AAAKZV xu_j_Page_071.QC.jpg
60aa10271c858033ca5e7001c1b1fd4c
479d07314100c8190c94430393f02b6bc4a1df0d
67781 F20101129_AAAJXH xu_j_Page_101.jpg
f25fa408d47f470c3cab56ac41c8a804
8375ec33df2cd20f67332a3e177a4058701753db
F20101129_AAAKBZ xu_j_Page_103.jp2
4f3ec311a4a7e005d1ed02c5b442e03d
6f2477d0997197783d157fa50dec95f3a4468ba2
60286 F20101129_AAAJWT xu_j_Page_087.jpg
ebacee8e18153b0ff2de053e3d5de098
2e46c16d50d8ec8cc4259c36b6bb509a0861cabc
1028556 F20101129_AAAKDC xu_j_Page_141.jp2
1589da4425cfa09f743874e3ad13cb86
9214ca8e8d178f90172d2a5147698c39c5b3d58c
723908 F20101129_AAAKCN xu_j_Page_123.jp2
9acebf69779b7f23e4695886dc6017a7
a8b0017ccdd3d2e1dde83bf65a00e06a700c02bc
5102 F20101129_AAAKZW xu_j_Page_071thm.jpg
de53a7a2d6f555bfb46338f8c6bf45c5
fd20263a8cb5146b2898ef9e6482684dcb12b263
87850 F20101129_AAAJXI xu_j_Page_102.jpg
2a49b87faedf2a4bdbcb42e726bd525b
933cb9cdcc65d27b9c1df342bae2688341621ff0
78525 F20101129_AAAJWU xu_j_Page_088.jpg
5e2fddf37dbd335d00d7467a7bf528f3
14fb5431d00646e72b06b5a277f71ea622d890ef
88672 F20101129_AAAKDD xu_j_Page_142.jp2
4068022520a732aaa16ee9c886c65d38
a10e594b9668166ce998ec103fa8da58eb9b9285
955305 F20101129_AAAKCO xu_j_Page_124.jp2
4c8c8c3b45f47326f3a04e308c17b0d2
a98f26f44a68e431639bafc82136c1ed59686fec
3386 F20101129_AAAKZX xu_j_Page_072thm.jpg
0c5b498f410aae25a6d3b02a1b668301
9dcd61eaba465d37857fb9d9dbda1da260301172
91428 F20101129_AAAJXJ xu_j_Page_103.jpg
ffbe4d66ab7c400ecbacdbaa0ca3854c
367c17cd6abe03d14ceee7061d0b6d2849f0067f
75746 F20101129_AAAJWV xu_j_Page_089.jpg
d0a63dcf8fb20323bc5e3af41a9751fe
d47632d940ae8b44641c99c71c782ee0e9e44f0a
977962 F20101129_AAAKCP xu_j_Page_125.jp2
d4a180657c1071186256637675fea6e6
e1991e1413362f499b1ce969174985cf0675b85f
6747 F20101129_AAAKZY xu_j_Page_073thm.jpg
9eea7114912a901cd2c3663205a2e3be
ea7168e323767d7fb2dc9bd56b70d42be9b0050d
58863 F20101129_AAAJXK xu_j_Page_104.jpg
28af26277144993773f0daf0a4866f5b
e0e14c22390e634c00d18f892c8edfe7bc36f560
61588 F20101129_AAAJWW xu_j_Page_090.jpg
85345b148989fff7e803b65c1b44c99f
ea845f8a2f9ff2bff6760d38001b1fd77b6ebd89
F20101129_AAAKDE xu_j_Page_143.jp2
cb5277fdee7ed52e566cef4406684f90
316ace29319b9418b92bb214c6cd5a26fee27878
89984 F20101129_AAAKCQ xu_j_Page_126.jp2
cc129511fb75e524d3e41622c4d1fcae
923018741c5eae840af8ade901d05d3789bb6a40
25385 F20101129_AAAKZZ xu_j_Page_074.QC.jpg
7023d8a2465ba4b6aee109618e512f1f
d0e2f8ff32d03970dcebf3bc2d8aae10ae8087d1
45408 F20101129_AAAJXL xu_j_Page_105.jpg
078df85599ba4bc97ae53f336ef1ab6d
12618de51b0854d71044d655dcd47710b42c70de
71070 F20101129_AAAJWX xu_j_Page_091.jpg
36268c907fb78c55d2a39a416fe99f12
e434db2e46b40dfdc3092e4d5e3a11803e32590c
1051939 F20101129_AAAKDF xu_j_Page_144.jp2
a13fdabfd4da152bdbb7c9dd5e8c2c62
b1fa580967a28e377c828653a861ac524a6e253e
F20101129_AAAKCR xu_j_Page_127.jp2
733fffa19b7ba39f0a7137f24de9fa92
6a977ec4cb4b139af8e2e429c8b1c69a11bb6a5c
59778 F20101129_AAAJXM xu_j_Page_106.jpg
08bc55b899050767903264d2b3665245
c607c22d8c479474dcf7d71c96fb410465d97c07
F20101129_AAAKDG xu_j_Page_145.jp2
56f6af60e8fcc42c6ab433c998a6364b
e877fc2f84d79eb07409afbf9d6db1f24cfbbf73
69327 F20101129_AAAJYA xu_j_Page_125.jpg
b96cae73607d2f77d16151833882871e
dedcccdad1d0944a153cf407f2e1809fc7bb19cd
994964 F20101129_AAAKCS xu_j_Page_128.jp2
2aa20c4f75d1deb358bd3ff87b0de9b2
4785f0c82a8fb4b188b92b2a28a96fd9c1f80b41
74611 F20101129_AAAJXN xu_j_Page_107.jpg
a4b0b15078699de01e151c7cf93fd2db
8036f99708a592edc6a0932bfca91eea53f67f00
81658 F20101129_AAAJWY xu_j_Page_092.jpg
082a1fb037c05777396ced3212f8d846
feaa5387d3a731fc7d21c3f5195bb73316c3e137
F20101129_AAAKDH xu_j_Page_147.jp2
70ff2247d5920c57fd703f0857443246
bc54c96eb1e52bf5802bf070108f4d270ec3cd15
61682 F20101129_AAAJYB xu_j_Page_126.jpg
84b3fba22b68c3a8484a66d2eadf7c77
d1bc3d8701468d8c2624dca88ae26b6315933648
1018432 F20101129_AAAKCT xu_j_Page_129.jp2
25d060626045d3bac75f26bfeff0cf73
12b669b7e75bb0c8b932c0fdd09c6fac3844cfd7
31526 F20101129_AAAJWZ xu_j_Page_093.jpg
167cc93a1897916dcf78d85bfa612e88
e0fe0b06d189bf4d7842e253d7532e5a95bd91c7
71118 F20101129_AAAKDI xu_j_Page_150.jp2
aa3977824eac51262ea51d5e6e1d3163
0cf31581faa70a37c2bdd773fae47a757a303042
69872 F20101129_AAAJYC xu_j_Page_128.jpg
d1d7913c80cdb47a2658a7162986d7a6
bde16f9e9b9ae7ba1ef6f0a58e0e59630c8a57fa
F20101129_AAAKCU xu_j_Page_130.jp2
bffc87d2d15cfe1ab872d505020d4733
55adfba35e0044bf5c841e6e650be86341a76358
74108 F20101129_AAAJXO xu_j_Page_108.jpg
15ac4f99f78167340d6517785058ec96
d576efd6b07acb66bc544202efaa01cedbeebe66
130033 F20101129_AAAKDJ xu_j_Page_151.jp2
2e2f3ae4e59718126ec6b6be727876d3
8ddf6fb686ff27664bb9f59eafa7a17a2462457f
74337 F20101129_AAAJYD xu_j_Page_129.jpg
c326d8c3c2d1256afd40122b8669673b
5320ea233a8ef0226fcf7ee703215f66a7426979
16457 F20101129_AAAKCV xu_j_Page_131.jp2
e0cc3773899798ebea0ae4c53b458484
f80bf54eb801e7330d4d2c0b28facff477ac7055
73671 F20101129_AAAJXP xu_j_Page_109.jpg
1e280e7790fd212c9915b41a88f8dd43
8301d8595892d032d5dc1e310cd774072bfe09e1
132080 F20101129_AAAKDK xu_j_Page_152.jp2
fa9073230536a02bee0b96469d4a36cd
a6938cc384bbf21a5094d6d2f2a93f51a97872b9
91242 F20101129_AAAJYE xu_j_Page_130.jpg
041a6c248e467c7d9bb630aea1064220
7a92457691eb07842c46527e1f9fec3558bf1d69
1051966 F20101129_AAAKCW xu_j_Page_133.jp2
3c688010e69444c714f0281658e9f9b6
7096712f836fa4fd6437def839ca6caaab9241db
86363 F20101129_AAAJXQ xu_j_Page_111.jpg
55a49f81fffafad0ab60d7e238b0ca77
a8543cfcfa15599ca47154bdaef191e28bac058a
25271604 F20101129_AAAKEA xu_j_Page_010.tif
4b3642b84022f005b2712f7ee3fdf46d
246f2a3c52f6dbf76446932c3ee871aa364e86fa
137568 F20101129_AAAKDL xu_j_Page_153.jp2
d764fee9f894d44cb331190876997b5d
d11e969702e1ea80a3e4d17a8470a3c22d7d1cec
14545 F20101129_AAAJYF xu_j_Page_131.jpg
1429feaa567d8b15e40fc966f4ca8a53
c07b3e2e8734a48b71100312e88f3b0a8772d1dc
1015073 F20101129_AAAKCX xu_j_Page_134.jp2
0e2453b6ecd70ef28bbacf56fcc883f3
afe2ef95ff204ad8c55c6cf8b0b262f6404cad26
85391 F20101129_AAAJXR xu_j_Page_113.jpg
f074f3786b5d4d47d7704b1e28080a88
581b5efb1ffa60f091fc9c384ac396edec7e6a4e
1053954 F20101129_AAAKEB xu_j_Page_011.tif
6999cfc75699f72e8e7b144dad05f2c9
197015ff6472d8f25980d28db8dcb6072db0f9f2
132650 F20101129_AAAKDM xu_j_Page_154.jp2
6bd436e91a82e9d6556fe9bc5c823269
7d350284ef3cc623b3f2cefbcd66da4178e42da7
79990 F20101129_AAAJYG xu_j_Page_132.jpg
0cf9eaed197dac318ea55f451459f7b5
e223c7a6b045cccfe12e5e1746198a2da51758e0
1039358 F20101129_AAAKCY xu_j_Page_135.jp2
2eba8718707b2c16cd5d250f916a6a35
d1f05dff7bc1f26708331eea49815c290198be83
59289 F20101129_AAAJXS xu_j_Page_115.jpg
3c3f48f9a4aa6af5b92765a0434d24f2
e8c00adb290a8b28f940cd9feae7576b7e948e62
F20101129_AAAKEC xu_j_Page_012.tif
084c5c3fb1105d6b8f2419b82cbd1e45
0580f029d6f93ed5956f37a8f3aab4fc6eb5cd9e
132889 F20101129_AAAKDN xu_j_Page_155.jp2
622738bfffb369af800ae51941485295
54250d8dfc79cef97d12e0c6976c10eaf5a910db
81165 F20101129_AAAJYH xu_j_Page_133.jpg
77c36d456f6d55a0bc2925be6bb22443
02d96e1e51131e0a5175972760c02a2843b440be
1037036 F20101129_AAAKCZ xu_j_Page_136.jp2
2133d960e1f387d0e16f3ba40f081fba
aadc05a8188145c2ec7faf85cd457a7f7498c215
52638 F20101129_AAAJXT xu_j_Page_116.jpg
ee490edf2970a110dc02dfe291e6bd39
fbcbd4f1ede44671ec2ccaa95b07bc66ec2221c1
F20101129_AAAKED xu_j_Page_013.tif
3d00558cda51685a962af97ef959163e
63ce8f942ef772039164a5dc76df63a5c393c2ea
141027 F20101129_AAAKDO xu_j_Page_157.jp2
0bec01161b7cce3689e757e02e0c4954
f5cb035f81ba168bb922a0a770c2aae8e27184d7
74379 F20101129_AAAJYI xu_j_Page_134.jpg
5fac74bfb41aa6ed2c8093010b37e489
539f23e5610d0ed592476e6e297c989e2e853d53
56619 F20101129_AAAJXU xu_j_Page_117.jpg
9750de815d61b20f29e1851d8619786e
73421af54a5030a6473e9defb3e97fa433ab706b
F20101129_AAAKEE xu_j_Page_014.tif
b89cf38eb05c5cf7bd87938aeb66e3b1
f7ab906a99493753686a0166c7d97f25227fafa8
139294 F20101129_AAAKDP xu_j_Page_158.jp2
025cca54ea9a540521af62066790e57f
e04db73429ddc78775c09a9979af2a05e6da7e66
72475 F20101129_AAAJYJ xu_j_Page_136.jpg
047f3302e458767b7cffb3387fa3e769
b6542f6b26c5a96f24e2b24bbc882072fcb3f43f
61134 F20101129_AAAJXV xu_j_Page_119.jpg
d3d7bec59be11c8123c7470218107e02
68c0305787e8d10c4952dc6a73420243cd31abea
F20101129_AAAKDQ xu_j_Page_159.jp2
7176148f742a73734fd250cc82b9d348
39c13c594f1e2237419290a604d3d3ea423a7d96
76133 F20101129_AAAJYK xu_j_Page_137.jpg
f6f07d3632f4df0f78045b1d0d0ca104
88162c18f953e76976c5056812b765a19df79228
57513 F20101129_AAAJXW xu_j_Page_121.jpg
b7bf2296fee4f91de9269838cfc04cc3
ab16f27924fa029080398702f7a288dbc743650e
F20101129_AAAKEF xu_j_Page_015.tif
7716b1303d3e3bdee4dd7d89d3e3a468
9074b712e5a740249e82714fc3017cf272999fba
107901 F20101129_AAAKDR xu_j_Page_160.jp2
907a92961427fcab853a08f57556b98f
aa91e14e6b0922e9c41db0f11d6552aae75f6e04
69488 F20101129_AAAJYL xu_j_Page_138.jpg
834452739a37f96d74ee049f2581b733
fc96077a97b4e4e0dbaeb3d28d61cdd3f813ab6c
56138 F20101129_AAAJXX xu_j_Page_122.jpg
257e5bbc96b6bd6bb386102323160c2b
e1f8fe186e571bdcd458e0ae598c754bd3e0cf80
F20101129_AAAKEG xu_j_Page_016.tif
df3b8533a78404b25779df44695c64fc
93de7b7723da0160288dfe1a922e3d4eb8ca5560
68288 F20101129_AAAJZA xu_j_Page_160.jpg
e32d57a1841558966be621c0be7c1b99
a09c44b8f0805b296cf8ffaf3775ab314ab8744d
52863 F20101129_AAAKDS xu_j_Page_161.jp2
5c30599b08e53ede509654cfd4e043cc
5d8bb8fb0e9f2ea445dacb4b69fa41759a97ff11
72889 F20101129_AAAJYM xu_j_Page_139.jpg
a408648a07141498c197a690b37fd4b0
ecac8dad3542acaea914840d12cd3940660ee39e
56328 F20101129_AAAJXY xu_j_Page_123.jpg
6f0fe388b301817a7a5549ba54e205fe
4a9f0afdacb94172b7687cac3900bb5ca113e154
F20101129_AAAKEH xu_j_Page_017.tif
2d4314a94349afe342c1b8708279ae9d
78bd0b3e50186008399156b94896814334db4874
39088 F20101129_AAAJZB xu_j_Page_161.jpg
6dc2a488629f2af09b54898f3b74e312
962cae019f08241ea08973c03d3d72a2c1e79a80
F20101129_AAAKDT xu_j_Page_002.tif
12ec6d2c113cf000902140457d0dccef
25edf0e3f22fd04964a4a0ebb6ca372bfb70bb06
74718 F20101129_AAAJYN xu_j_Page_140.jpg
6e8b2580ace38fb8f89623c0d28a424c
9e35b3f102f60fdb955e464569bb70051ee8d7a7
F20101129_AAAKEI xu_j_Page_018.tif
9f9d0e7fe8f8c6727d639bd5c21e1b1b
6d97581ec76a26cc125079d03b69fe674a81a17b
4722 F20101129_AAAJZC xu_j_Page_002.jp2
cd002f19721806ac516d501729986e4b
ef0ec045a360adcef9c6c3fb72c754e65d8c3dd8
F20101129_AAAKDU xu_j_Page_004.tif
19a52ecc113350c0bb110077a0ce1c6b
cfccedfa2ecb5e4a302ad07962991655bcfd49d1
90533 F20101129_AAAJYO xu_j_Page_143.jpg
a9fd9eaa89589104731f1fd61162e7af
bb77ef556c3719337cbc775a3964d3ccb22144c7
69086 F20101129_AAAJXZ xu_j_Page_124.jpg
18cdbe6f694369a276b6bdcb1fe8f2a0
a25fba9cdad8c6031a5842aa70e7717c7cd84b75
F20101129_AAAKEJ xu_j_Page_019.tif
d9a2e1a231e83b243c3c170d382f015d
57054bb489e581b738f5411dca3724b8e16b71d6
5962 F20101129_AAAJZD xu_j_Page_003.jp2
e7cf7dd644da947245bbb5eedadf56f3
cebae929c92f51e19c830955df45aa125b78c4b8
F20101129_AAAKDV xu_j_Page_005.tif
7997607548cc92dd770c636f21e0b3d6
e5ebe10d6eb329e4ec60c64de018226624562d86
69465 F20101129_AAAJYP xu_j_Page_144.jpg
8f7da5986376be98c0799dfafe828390
58acef93bd3add8a783da09043c17ef9b68aa24f
F20101129_AAAKEK xu_j_Page_020.tif
709b4c2837f5590cf741f23ce5f5350f
1d271b59f988d9282afa406c932cabdd37b1def3
115308 F20101129_AAAJZE xu_j_Page_004.jp2
7259434a5f8be12c93269196dcd9ef7a
6d8c02f636768aac9107864b74fd18eda0b61980
F20101129_AAAKDW xu_j_Page_006.tif
112096f675ff50cba33d7cd3a7ee51c1
42b8341477cf5f608e80db76a17ae57027334060
86383 F20101129_AAAJYQ xu_j_Page_148.jpg
b72a090242deb9ec3d2d4cfcd3cec268
12c57419eb16f8f0c25232d84e7e0d02bd39aa67
14585 F20101129_AAAJZF xu_j_Page_005.jp2
dcb4c760b2ff4e344b35836181e04bda
9a5566eb10ae05f689c8ac18acf7905a3d3eb973
F20101129_AAAKDX xu_j_Page_007.tif
5ff43d924f59b864f5a50132b1ba5d1d
5257737d3b5570ca9e04fab7541f7f7f937448bd
87478 F20101129_AAAJYR xu_j_Page_149.jpg
d34d5fa67b4b970c586e481744357a22
eede1b2bb98d11ee423e14cc13b83941fff8e74b
F20101129_AAAKFA xu_j_Page_046.tif
0f49aac4d7d2b85b30e051c7773830ea
ab18f8c630cbd856e425a3cb61018b1a79a92f8b
F20101129_AAAKEL xu_j_Page_025.tif
a7ed592f0bae6135fbeca65ba147354e
7d10fbd0fff1f3e69c263faf39066b40f74523b2
F20101129_AAAJZG xu_j_Page_007.jp2
0616a8afbee06d5e60d4ef50fc184593
e373da5ea30a65b769fdb8af0de63b443b7613b9
F20101129_AAAKDY xu_j_Page_008.tif
fc2987112dceb14573ebfe53c1a5edcb
52b9392dab6a0e124246e8ddc4186058e3cbc89d
46192 F20101129_AAAJYS xu_j_Page_150.jpg
aa413c03c410441592a6b1b337511f43
8b669e4b24af58709737c0bf04b11b1893a9339d
F20101129_AAAKFB xu_j_Page_048.tif
6e45ca7aeed8b50e1fc706527e7bb8f2
ff035284fc80368c95459b4d93e9a29f7ac5920b
F20101129_AAAKEM xu_j_Page_026.tif
fe3d0921730d91f4f25be01474ce11e5
9f9c501f427cd7fa7ca2b4445d63337bf0cc5d17
160101 F20101129_AAAJZH xu_j_Page_008.jp2
0cdb852c0aba04a0e3ed6d9278aeff4b
669b6901e6c5237ac5a1520eddd736603a0f19f0
F20101129_AAAKDZ xu_j_Page_009.tif
07f607d1f37f222e7f41fb5601305442
5f04d6e0767c7b277b3b8be3210c45b4db09b161
80006 F20101129_AAAJYT xu_j_Page_151.jpg
121e6c8c959079e8ca6bebdf60804e0c
fca940949e03cce850d2571ef0cd0b1ae5d7c63c
F20101129_AAAKFC xu_j_Page_050.tif
17d0f58e396d3f7fd72fbd376bbaa1c6
4b2a7fdac7ad229619f488cdf039fe5881dfcdee
F20101129_AAAKEN xu_j_Page_027.tif
0c16e3fe0ab208a8abd0965ded0e66af
d9d6630ffd64339c612a971da9ee6a4eacdd7d7b
110839 F20101129_AAAJZI xu_j_Page_011.jp2
93b8f486341d55a9e5733c0b305f5bf4
d64e1592ed1e5bd9e686c08e6c0a5fb6a4ac5eb1
82609 F20101129_AAAJYU xu_j_Page_154.jpg
cb24ae3e9dcc594f00d3c3f7e5e2ef4e
884e86faa7ed8077e9a608cd67a2b864334fe0eb
F20101129_AAAKFD xu_j_Page_051.tif
5a83b0abe2de1d67fd1e8836e18a684d
c1b4b4ded5dc0174a6494e8233007a3ca29b1b6b
F20101129_AAAKEO xu_j_Page_028.tif
348cf91f0c093fd5364c726c4f6554da
b5aaab9753c316ccdfbde2e9a18d348ac152a854
46303 F20101129_AAAJZJ xu_j_Page_012.jp2
f5e14e2848438a6180046984dbab8c78
ae6803ce641aed74b78171b3ee84e1fa52d0ee57
80867 F20101129_AAAJYV xu_j_Page_155.jpg
c4bc2ee3e344c165ee979d3fbd199453
4655cc6b6c87c3514606b47bbc9c5f790acfd01e
F20101129_AAAKFE xu_j_Page_052.tif
430a5d398caee0e1c75d06c18d01b394
36ea761f6d6fd8abea987ec868bbfb1c2a7dab7b
F20101129_AAAKEP xu_j_Page_030.tif
ea59ee4e456d05f1bef7147c95838ba2
875f38dd9ac8a121a4ff348d0d902654a4ec09c3
1012389 F20101129_AAAJZK xu_j_Page_013.jp2
725b5458d7cd9a0189bab101ef216d40
27b44ae55792529383594022005b256572a3f40a
81266 F20101129_AAAJYW xu_j_Page_156.jpg
0709ca1942468d61c1d539c1786252c1
d721001efa0af88eeb08c7603609942064901c83
F20101129_AAAKFF xu_j_Page_053.tif
446d30fa143a1f8531c3ccf1cd56ee68
33922aa4185251f60c7fddd5d1d95e4466e4d066
F20101129_AAAKEQ xu_j_Page_032.tif
044bfa0916319002553fb758966dfd50
90f5e5a96a2e980489df356ad42330860bf8798e
F20101129_AAAJZL xu_j_Page_014.jp2
c8a4b84b45b1aaab482aca16e370b5e2
e06360bcb9ff8faf8696885effab3bc187d703e2
85986 F20101129_AAAJYX xu_j_Page_157.jpg
a9a0d7e5dc80be779ff3869421f55589
51203ad9c21c51d0f1b200bd2551cef562aec20e
F20101129_AAAKER xu_j_Page_033.tif
063a0c7760d76bc549772b2ac4d74a4b
6af91350bc2e040bb7db0897f65cbfd59c2b4a64
1021324 F20101129_AAAJZM xu_j_Page_015.jp2
75d38600071bd14d4fcbc09aadf96d54
dc840a51aa7b4fc766377eb27cc00772b71e6975
85358 F20101129_AAAJYY xu_j_Page_158.jpg
b92a1e93b3bc1c9434cd44f810f0c722
ec2c4d1869b386797e20289b3988ecb842ec5668
F20101129_AAAKFG xu_j_Page_055.tif
ad55b52ce61909af3d096a985084779f
94ab0fc83dc29800d0ef8b390562c156fed5bba6
F20101129_AAAKES xu_j_Page_034.tif
0f6d8008f009ee67411fc923c87300d2
e192c6256550c20a1099232456374791f1d11b7d
F20101129_AAAJZN xu_j_Page_017.jp2
83c984ba81f82fb7c04744c1525becc4
71dc12343a1da4c714fa986652e5bccdc4f0d475
98771 F20101129_AAAJYZ xu_j_Page_159.jpg
77d1f456b8ed1fbad36cf8de0abe95fe
72120c522f324f1ef3508c9ed80b90cd65d858ef
F20101129_AAAKFH xu_j_Page_056.tif
88913728dff73eaff760619833aa677d
d8ac8ba1e077c129959538ff3458704e48b8807c
F20101129_AAAKET xu_j_Page_036.tif
a2733731af95f63662c893523151aa06
1d2efc0ecdf6f53e678263c52513c713af3eca4f
F20101129_AAAJZO xu_j_Page_021.jp2
f084d789745295f4b64537eb31cdcfa0
2e968c9962b7583b930f4857fae5375ba4a1cd26
F20101129_AAAKFI xu_j_Page_057.tif
044e51f724ea7aa740be1022f9532b0d
d9a8004f01a4805ecb0419ec67619f7f967e20e0
F20101129_AAAKEU xu_j_Page_037.tif
f98bb9d0ac07db0f2b85460a1f01fcc8
33e817e9284eb91eb5630eeb4307a8d6fac0d838
F20101129_AAAJZP xu_j_Page_022.jp2
881fdbad3606792474c7dbfea2a9eab2
c5e27cfa5a34bf620b37aae1b1757de4119df968
F20101129_AAAKFJ xu_j_Page_059.tif
af75bcb1abdf9de43b113611380a3c05
f1ab1fc0165c214aff7c21f6bf1a1acf4a5ffa2b
F20101129_AAAKEV xu_j_Page_039.tif
fa81e932962c03103de31725683b6456
ce1b18a6a611366ba7a644beadaf7910a8383f72
1051980 F20101129_AAAJZQ xu_j_Page_023.jp2
72e09112cacaf6fa1fffe2073f35af77
c84b37a732e3e95ae364f00d77c736774713782f
F20101129_AAAKFK xu_j_Page_061.tif
b2560b5fa35b4e11a06700a3dcfb9280
2ec3399a9061b874b0cb99b0a90f0cab2a66462d
F20101129_AAAKEW xu_j_Page_040.tif
526e62c9ad29fd5d119be5f2950bf541
ba0243d7617a3cac80b7e183a8a83c5c60319a69
F20101129_AAAJZR xu_j_Page_024.jp2
4bdfb430212a9bc6610c8c5f1b279d57
ad2ca6ce4ba33bea2c466212312ebc0c45a25bd5
F20101129_AAAKGA xu_j_Page_080.tif
66bb29e157b131119103eeadb5494a00
9c39263ba23bbd1d70341b9f37f7067b4bf6fd85
F20101129_AAAKFL xu_j_Page_062.tif
a2ec7370c892c43d4f57c0e815cd6416
ca7d16121a6c9d24cb3967622e317fd0aa6bc642
F20101129_AAAKEX xu_j_Page_042.tif
6fb32557a113c3fdc877ccc195cb5fd0
ab34e5c4be7a6ed22129033e92b195f1239ade03
F20101129_AAAJZS xu_j_Page_025.jp2
253df794a603464ee8cac3f243186244
6eadb4bb7bbf48731c694bf023def8e8ea6ac9f2
F20101129_AAAKGB xu_j_Page_081.tif
cdd1900781dfefa55dbab163e98f74b3
393eae9deafbcab46c80bb536ee2763110e7a189
F20101129_AAAKFM xu_j_Page_063.tif
14d44ff25b3c8fb153a825d01c9916d9
0cb1fc3670c65fc7303659f689331b36d94ab47c
F20101129_AAAKEY xu_j_Page_044.tif
15f0631f8b477d9593b78e89bfd60bab
bf4977b30bf8bcde271dcefb104e8b34cd47d390
1051981 F20101129_AAAJZT xu_j_Page_026.jp2
618a80582295672c6dec090d0ec18df5
b103fd2c35fb5e7cdaad7d31b68a9a79fbe2cc87
F20101129_AAAKGC xu_j_Page_082.tif
22d6aa6ad92e0ac095a2a64fcb66bc0a
c49d8cc665f9f67533178abe9f219bad4e2c9f30
F20101129_AAAKFN xu_j_Page_064.tif
a6100686c901366ff07d11b65d506823
b61bddd862f7dcbe5c2fadc4859acc7985ffbdf5
F20101129_AAAKEZ xu_j_Page_045.tif
dfab0ea19e29f3b4aa7154e9d9120d49
0d0da59065dca9cfc2672583d8cf57d575ffc6e0
F20101129_AAAJZU xu_j_Page_027.jp2
22fb586a72b36515672fa1d8c7600d54
54310cbb871d985eca0c1598eb6cf345f2cbd1de
F20101129_AAAKGD xu_j_Page_083.tif
3e8b9cd2feca70596c4c41aafae76d9f
9557b9a097582978a8bbcec0ad5121a95c56678b
F20101129_AAAKFO xu_j_Page_065.tif
e8c37f0774cc778ff4243bdc4775810d
ddb647c2f8781fa437b3205bb8f7af14ef3378de
1003352 F20101129_AAAJZV xu_j_Page_028.jp2
292af8a49df61c4dbe274532f17927ff
d02096472c87857a10f5b3aa26c238132b0caea9
F20101129_AAAKGE xu_j_Page_084.tif
d3192c3955b40c5784aa3616f6a289dd
0e0fa215bf4f49c96325de5a0737fe1969096aa3
F20101129_AAAKFP xu_j_Page_066.tif
bcad00a17d95473afcc2d55b3d2b9d4a
9cec4254e9e36af881f66f94f1db05c4a54e1579
929352 F20101129_AAAJZW xu_j_Page_029.jp2
2302982a6814446a7edfa0aecf774505
f92fec1a40bf4fe3944d4c53a3b0eb61c533fd16
F20101129_AAAKGF xu_j_Page_085.tif
4f816acba4a2e86ec94b9cfb18329a7f
65245c4b865e1206a27d121fc6af4a2152da3b0f
F20101129_AAAKFQ xu_j_Page_068.tif
1a313c51218cf0ea6fc889f96ed55987
4dcbad81f632bc1058345532b12230280567e3ff
F20101129_AAAJZX xu_j_Page_030.jp2
9d86f8e88732c9c7d5b0d0f49ce0e803
eea001f91a3b8b16fb1f5737e6d1c73d7fccfa81
F20101129_AAAKGG xu_j_Page_086.tif
c1dc957a82cac3aada2673dbea177887
8e279c8596d9b14aa1072a4c67d692688de21689
F20101129_AAAKFR xu_j_Page_069.tif
79ae423d1f9daee5349aa5e19926e628
4a36457441148f69269175d1a83901617a44f99c
1051963 F20101129_AAAJZY xu_j_Page_031.jp2
4cb95f52d9a086917cba21bb13b2bdf7
06241aa6ef5c2ee71f15ddd4d8719480232086fe
F20101129_AAAKFS xu_j_Page_070.tif
684e15285815f0b8d8abc7f965514494
1e78fc89133136469614f97e402d1e72c7e10cf5
F20101129_AAAJZZ xu_j_Page_032.jp2
ddd99cad69a4b483697c66f7505cbea8
2336a53f489f38ddf5ce9fffe06ca31a7feb6bb2
F20101129_AAAKGH xu_j_Page_087.tif
1a4e76f3162e4526b5b2f995be218444
f6d8780848dcf274e01809682b0aea8efa67da35
F20101129_AAAKFT xu_j_Page_071.tif
5671ef0aa31327a309062db1f2c650b7
9c62bd8ac27ccf2cefef35561d2d6ad61e19dbc4
F20101129_AAAKGI xu_j_Page_088.tif
cf693f30ef8040f1cc85feafb7b3afe6
92da62c570cf6bf122b45fe362ed3a19a507a6ae
F20101129_AAAKFU xu_j_Page_072.tif
39f25034846ba03d2e177ff693ca8b11
91c8fbf35b1c7c5e3d08c636b15d1612aa05db59
F20101129_AAAKGJ xu_j_Page_089.tif
ab454b5d311d3c8c0974f12a7dea8b1c
e3e1357069ff29ca6d28cb39ff1fbae380f58015
F20101129_AAAKFV xu_j_Page_073.tif
27572f13ae149224da01fbc280481a4b
c544afb838ae18c31d748aaa0ca2c4b1172dd7ff
F20101129_AAAKGK xu_j_Page_090.tif
6a6941cff965b57380eb824f2a879d2f
d9ba3f756008836011816e069de5b2e7ed415a50
F20101129_AAAKFW xu_j_Page_076.tif
1bd499faafcd729b2106fe3dfd88eb69
ac39e728223153a2c380dcdd502a27ed0a8a86e5
F20101129_AAAKGL xu_j_Page_091.tif
6f24161528d0ed4d6de675e02a7d3e1f
3fe5b7be908e60bec51e5979631cc323a59ce06e
F20101129_AAAKFX xu_j_Page_077.tif
2fc94423d5f05b76d6976460e1ba7532
ac4fc9b72b6e68109540f6b76a4b56f650907492
F20101129_AAAKHA xu_j_Page_109.tif
89426d6d2a4aa06257fb7f9322f39eff
4165dcc7703c05125a03a666ba60ad8ced1078f6
F20101129_AAAKGM xu_j_Page_092.tif
65809d5446e609831be997df8e6e901f
60c36cbcb8de19615327a457519aa819f21f9bc9
F20101129_AAAKFY xu_j_Page_078.tif
aa2d9cd713811f7a61434acf745bde52
8fc3c34983c577e3b48210e6817e41cf4cd29470
F20101129_AAAKHB xu_j_Page_110.tif
c0c0a4e4e0922ad2c0efed65df141152
f7418b64546afdeb4af08a3d12ad5163b34638ba
F20101129_AAAKGN xu_j_Page_093.tif
d79deb7260ad91381c7f835fb71c6b26
751541037700e418d3fb505e55edfdaee9fdb16b
F20101129_AAAKFZ xu_j_Page_079.tif
b7affb9a3ebdc311b441c51a63783a08
e4254afc76a072284ef51ebc7dd94f332a49ed74
F20101129_AAAKHC xu_j_Page_111.tif
ff71d7a108e40f07e523aaa67253d7e9
524dc910c501ee5c6e6522a5d52fa7d5f9b76bcc
F20101129_AAAKGO xu_j_Page_094.tif
e36ae6e2bcd51365c6a5cb6a9c71f110
d06dc1537c340f66255a9e4e089a7fae6630f292
F20101129_AAAKHD xu_j_Page_113.tif
7f02a09f40a92ad21e325f7b102a93cc
1ae65a582183693a828e42c0a47ee6351fff7711
F20101129_AAAKGP xu_j_Page_095.tif
0d5c99c100d3c798983b5e6d1c713a70
4c5e6682d949f90bbeba37943d134056d7a69d11
F20101129_AAAKHE xu_j_Page_115.tif
89026b593d042e339b6c42cf3704d5cf
37a032cc7535cf3139dbda33bc4272e9e6e10429
F20101129_AAAKGQ xu_j_Page_096.tif
46171e440caea742074bd653c61b2400
915e83a5f644a7dcee984d991be966de760317d5
F20101129_AAAKHF xu_j_Page_117.tif
9c0b4cdada75274b7f24d7db7bb2205f
ed091302f540aac2e8cc5d777ab98492b0abb65b
F20101129_AAAKGR xu_j_Page_097.tif
d7f66d44e0645dc68e2f9f63a81b7026
25ba780b65d57ed8b72e4df76367d2b1e5b74815
F20101129_AAAKHG xu_j_Page_119.tif
e26c0022221d980fe76cf6193f09da8b
2fc2b36163dae2924a66cd84822822d2ef16d55e
F20101129_AAAKGS xu_j_Page_098.tif
0016f52c82908d3684e69ee2983ab4a7
8420604f8adaced65db3a2095530b99afa4d083e
F20101129_AAAKHH xu_j_Page_120.tif
9be9f355291a05d30d500b7fa7418198
e8e3e37d8496bf07a05c00b5bfbe6e5895aa666e
F20101129_AAAKGT xu_j_Page_099.tif
690bd0ae26acd5ddd4ac452520635e34
ed1d84204b0c04460bd3489cb41b4362d0785cda
F20101129_AAAKGU xu_j_Page_101.tif
950602c4598708ebf250b92aaa2b5af9
575ff6aa44e2f0f8837d829acae290df7e80a772
F20101129_AAAKHI xu_j_Page_121.tif
56541eb461b08e6305c6eeea7bd1a544
995e47ac38a9ad2d41764786944119e60e4e883c
F20101129_AAAKGV xu_j_Page_102.tif
01ce17016b6ef78ccc775486749e9e09
a5376c0ac0f3626715d462343d14fbe09ed0d92c
8423998 F20101129_AAAKHJ xu_j_Page_123.tif
a35624bab0ecab37a1c7cd8ae058cfca
66417e1f8972f0a5a40defae1eac3226a2304aa4
F20101129_AAAKGW xu_j_Page_103.tif
7bf70c5d375616f7fe5979f771854eb9
23aeac6fbe9b5205ad3a7a83cf2bf112e984de8c
F20101129_AAAKHK xu_j_Page_124.tif
ef9a45dadd9a9eddc31c19a81e6dbc2d
f91ecdf27a2850d3e0d3b59e1923ee2a176dfc3f
F20101129_AAAKGX xu_j_Page_104.tif
4d09d70b696fe29df9e7f7b1a1010e80
f93371509f4a3784e8b12e37e27f7db08745ec4e
F20101129_AAAKIA xu_j_Page_141.tif
2d67d08f679774ee80a68d46c89cc425
3a78efc1ba8b4cf65ab733e5d25cfcc319fc1f77
F20101129_AAAKHL xu_j_Page_125.tif
5a6a0d6aad260539700b0870236644d7
d22ac488fbd93a60f01eac018e3e44e473dfc902
F20101129_AAAKGY xu_j_Page_107.tif
a93c4a80d3a57f3a23bc1deded406a86
72776a2f2245ed3b26d5d85ae5a7a6d32b9bd478
F20101129_AAAKIB xu_j_Page_142.tif
d1fa03bd59ceafddc59f2de1826d0898
6f85c954fcadfe94537996467f2badf451de73bb
F20101129_AAAKHM xu_j_Page_126.tif
18e71d761d96842f3dbfef2f0acfd460
0150ca12425cc8980cf954f4e2e96e9fdb1936c0
F20101129_AAAKGZ xu_j_Page_108.tif
eef39fe120f62435ce3b3ee54d547dac
3edd84a60d2093cc2b36561d22c6b34d89cc887b
F20101129_AAAKIC xu_j_Page_143.tif
7480211ea42b7a0e1b4a951eb67f61d4
ec2aed56d7c76d229a4ee0d5b26aa14271ebd151
F20101129_AAAKHN xu_j_Page_127.tif
64ec7676733c443ee94a4d3307b236a3
da306479a703d2d96da68ce837e203a05c7c505a
F20101129_AAAKID xu_j_Page_145.tif
a2ef9fe22607cfd27146230651afaf40
a57dc4ecfffd313a9c0d40b2c47383811e80f86f
F20101129_AAAKHO xu_j_Page_128.tif
5bcf2f2272d99690355e460e579e7603
7140c31388d5eca6a50f6ed5b427244b
bcbf409eb30b848a608599688b85be52aae9c38e
1752 F20101129_AAAKQY xu_j_Page_091.txt
d2aceb1d0a7fb36490e3551b63e23115
adfeaf08233d83349f313fdc47cb0a5b4b765555
5219 F20101129_AAAJOL xu_j_Page_104thm.jpg
e216faf051812c038a80453a0c3d7f3b
da56024af40ee639392e40dc1f4414b6557659c8
F20101129_AAAJNW xu_j_Page_154.tif
386797698f90c94ad064fa02dabf8346
48590bd2e9d0833c3daafedb2f88e055f400b0ed
1915 F20101129_AAAKSC xu_j_Page_129.txt
f7f088c57a2e3a9d4fd0a4757b150b2a
b85a63bf6dcd1509bbdf69f249b4774f361dfa2e
2323 F20101129_AAAKRO xu_j_Page_113.txt
6485dd5eb51b067942a088023973bc16
fe8a8c88393dd079c20ffeac02137c215f62410f
659 F20101129_AAAKQZ xu_j_Page_093.txt
bc57d478c6d889c5d23d9daf7fdc25cb
e79fa3f81ecce47de41d307f0249e7a5c1c6244f
2440 F20101129_AAAJPA xu_j_Page_095.txt
ca58d8b816140586eddb8b4f1126c6d1
71f335d4c1b925e360975474b3179a6bb2432de4
76669 F20101129_AAAJOM xu_j_Page_110.jpg
5e8ca8b856af0bd50470760e92acf048
e9c2f25f357b7ebdc7a350550524a11cce5f3d9d
127386 F20101129_AAAJNX xu_j_Page_092.jp2
73e3d84c7c8f41ee1db46f71c92531b9
2a1ecf620cd411d7cf05f81e0f173023f893d3a7
2385 F20101129_AAAKSD xu_j_Page_130.txt
fb402116071ac5afc53a41b4de94da7f
71412e8278e48c0c7d1abd7fd59bf1cb43927bac
1859 F20101129_AAAKRP xu_j_Page_114.txt
b6a6492d10e7b66ee75d5a29ca6185a8
b706448d2d80c2891fdd75d5766407c7d8495020
82149 F20101129_AAAJPB xu_j_Page_081.jpg
c6aa80a2865b10fb7182640512a45fa0
d5fc03d1f6057c57725b8fbe2db430424f3bcf1b
1823 F20101129_AAAJON xu_j_Page_005thm.jpg
b90bcd3a6a9a90738e4c90ce51dfee33
cd9a5e8d35a8848533e85f6082eacdbdde952aad
F20101129_AAAJNY xu_j_Page_058.tif
1e0169e45fac0376ba5700cb699fe1ec
f4b7f2e5b3a9f315006bbd4386c1740bda5ec343
253 F20101129_AAAKSE xu_j_Page_131.txt
fd8963764ad2d13d75f4f7f0fb0dd1a5
eecc722d15ac83ad6bf7211319f84b034423e8c8
1576 F20101129_AAAKRQ xu_j_Page_115.txt
bf0df9985d23526489dce0bb0cce02ce
c43c53962f0f93231d637fd9327d19ce099e5bc4
2225 F20101129_AAAJPC xu_j_Page_068.txt
51604b93700572eeb284f4e858565b3c
ed148a68fda3d19d48185af4502ffa4bbc26edf8
4279 F20101129_AAAJOO xu_j_Page_066thm.jpg
f5975af357ad007631e0bf8fc4e2f29b
73607147d98593db959893cc7fde583fadfb03bd
24287 F20101129_AAAJNZ xu_j_Page_058.QC.jpg
88e5f8ae9a47aa29e89e18aed95449a4
e06331d9e104a98798bfce3327c4616f2d62cbec
2110 F20101129_AAAKSF xu_j_Page_132.txt
41fb6f93e2237b5a1365e21366761483
9f9fb0b2ec4aa8919469357f32898fa9bda19a7a
1119 F20101129_AAAKRR xu_j_Page_117.txt
d2c63b48f93e117e92dd7d8a5585dc21
9b10ff7b4512caf60c7d38a1dc1afdb49bbac335
17070 F20101129_AAAJPD xu_j_Page_053.QC.jpg
1c46d39552b0ba44779e73f4bd455da6
754008ec51865c020e9fedd90b790f6b9abf3983
4973 F20101129_AAAJOP xu_j_Page_117thm.jpg
7e684187faa996c56d892a3641e472af
6aba8e27cf531b2323a18531f3e9cbccaaf12f89
2170 F20101129_AAAKSG xu_j_Page_133.txt
bc66eb712d6fefb9edb66580c1ff19f2
409a5bcdbbafc4a65aece79f7543389d76236877
744694 F20101129_AAAJPE xu_j_Page_060.jp2
5d260add20152dbf232334dfdc2522da
406818b1a15e57c0faa07971cb0eaa16f0cbf219
1925 F20101129_AAAKSH xu_j_Page_134.txt
c1c138f59637dd8cd1d8e8bdbf9ef364
77cc3b3596b249e0da8fb2faf841ab189dc1ba95
1457 F20101129_AAAKRS xu_j_Page_119.txt
ca675cfee2447094881de30efc3b108a
654b2ac6b9871697e2735e3736732d5713581fea
25548 F20101129_AAAJPF xu_j_Page_032.QC.jpg
c0e811f62470a1499fe25440dfb806d6
248effbed50e27c5db6a0cc87d676b820c3127ca
18698 F20101129_AAAJOQ xu_j_Page_087.QC.jpg
0c1a62ed5fcd56b89b54801ba09627b3
d8ae0d2e1d13d5519dc5131d5de6fb546b5c9ef0
2132 F20101129_AAAKSI xu_j_Page_135.txt
ef45c3eb9ba093106984587e22909077
a70776a195bc320f33f4cafa20254be7100fc155
1278 F20101129_AAAKRT xu_j_Page_120.txt
267b06e71ca674911426c98fecaacc84
766b7d5f28f3fad0438f9eb73951f98679817a76
27939 F20101129_AAAJPG xu_j_Page_084.QC.jpg
a3f8ba60fff55c4570c30c7b6059280b
2de1d3af3ae28ade12a4bede3b6e6bba85941c06
58617 F20101129_AAAJOR xu_j_Page_020.jpg
aed6f39ab5199aa09ed02553a5f77770
51472506c0ee3f0946ad79cc88503af4a7fd23a1
2158 F20101129_AAAKSJ xu_j_Page_136.txt
c3d63552b23f76854abfa268815ed85c
b722639e9f1a0e67727415258d413a4332b8d237
1325 F20101129_AAAKRU xu_j_Page_121.txt
451a8a8df84708309b3d6af5aa5b18b5
014e66c8ceec12711d761d6a632e55af194ba3d5
44677 F20101129_AAAJPH xu_j_Page_016.pro
9cd5540d9a1d67794bff4bdf881e7862
49d9f65b8869787e0580675450f3f04aac8f21f5
1599 F20101129_AAAJOS xu_j_Page_107.txt
aee5e97e5f6962830ef5e868084c69f7
689b08369c1e529d43be5e4466310e1e04bd379f
2716 F20101129_AAAKSK xu_j_Page_137.txt
3ed16284883c32a9288843e40e1891c7
276201836143627ed46c1754e548b32bc5e98e2d
1296 F20101129_AAAKRV xu_j_Page_122.txt
1041c855a18f141c37d830b0e7b74aed
e97b8dbfad48046a7540f0f4927a8ce15f834b9d
17674 F20101129_AAAJPI xu_j_Page_098.QC.jpg
ecad6f0a6eaad040d37130e181b4db85
5ce0dac3c3b99c2d62dafc929c0e42256bdec6ce
1051982 F20101129_AAAJOT xu_j_Page_010.jp2
aa5fe978c9909824d4bd3aed78ed7cce
221b312636ace1db4fd0acd7de1e9641f1465dce
1923 F20101129_AAAKSL xu_j_Page_138.txt
6fe34b4b83319795bf7449f47e870e7d
9bc8afd73403aa53f26a1379cb27f7c98dafe788
1196 F20101129_AAAKRW xu_j_Page_123.txt
545592ff74f0c77b9dbea743df4f048e
f51e7132030769aa071994bd15481fecf7e0d8b9
22495 F20101129_AAAJPJ xu_j_Page_076.QC.jpg
9146e9fb5d2112d2f22bd63729827b33
6b9a06a9f53e0ccc4a6ec4a94d791c710ff0589e
2441 F20101129_AAAJOU xu_j_Page_092.txt
6c27262005294560cdbe87dec98e0765
167a8b085b8712bccd775a18c0f6af221a46c3c5
2592 F20101129_AAAKTA xu_j_Page_156.txt
878a576eb57c0656da9d7ce6ab971a3a
3a6b7e8a4ec53ff641a47ae0413bd5af3d73e54a
1843 F20101129_AAAKSM xu_j_Page_139.txt
e4edd9aca35a2fb783e9876d225cae69
47e31d3191a19956842e3e62fa29dca5630f30b7
2019 F20101129_AAAKRX xu_j_Page_124.txt
2e3f69ed4c473599df3fde44cb2273bb
7402d123085ba85b3dc338be0708f2f71d3a2668
23717 F20101129_AAAJPK xu_j_Page_117.pro
5ef850fe53a1228b54ca9caf64b30ac6
f3ce053c567c74431de056c1922644c8bd9632f4
6217 F20101129_AAAJOV xu_j_Page_081thm.jpg
61a768a478ff922b45eaf30b5474ba13
0b8bc5f7dcde8035996047bad5ff0fce363db659
2756 F20101129_AAAKTB xu_j_Page_157.txt
337c864d9ba5821b5c4dd327e8b6d30b
cd4f425780b5e4243231593772944e62c91f4df8
1738 F20101129_AAAKSN xu_j_Page_140.txt
fc1ebcd68b6af3246151ef334143eb7d
ffc7ca422f53918688ad1e2699f022a1af159c81
1770 F20101129_AAAKRY xu_j_Page_125.txt
0c1596ab0c99a0cff99a337141911da9
73b5b97b79b3c0f11f31f490e8c1fb94db8e2168
92220 F20101129_AAAJPL xu_j_Page_021.jpg
5b538251174293352a7b0f3a5772780a
c4773c71b59a67d75579bbf77476095b6adfbc54
1637 F20101129_AAAJOW xu_j_Page_083.txt
39802eb5568da1be6f37112fdfc8e071
0da32df680d865528e7d9881339a475d87f0096a
2710 F20101129_AAAKTC xu_j_Page_158.txt
9a661140f06b7438f2261ac8d8b85fb1
8fc887a41860c6d3427f0b0bc24f0320b6c88617
2091 F20101129_AAAKSO xu_j_Page_141.txt
1555e4c292dace836720d66adda86704
a6b2d559a1b06984ce6fa278f1b54be1e328e990
1674 F20101129_AAAKRZ xu_j_Page_126.txt
36066e867cdd10a25d3b9404abc917ff
9ddb4977566c7f1ae46341d72e5fead1eddc8304
26899 F20101129_AAAJPM xu_j_Page_113.QC.jpg
0a9100d0781a7402487a64e919a14d8a
4b3df7acdbf9ff94fe5690a1db0281a4abc2cc81
F20101129_AAAJOX xu_j_Page_067.tif
0397987e3debb2d66e5493a10c7a69da
ad1230225e208bb2caf89996a8a9e7a834decb14
F20101129_AAAJQA xu_j_Page_021.tif
6f9ae1ae016a68ad81c17ec4524f4489
2f1936d280e349f767e05074a48fd2de0363e9be
2157 F20101129_AAAKTD xu_j_Page_160.txt
811f6314d98dee5a5bc3910157ac9c34
daba7810a4cf73428c6ffe9a2b223c4fd455ed55
1869 F20101129_AAAKSP xu_j_Page_142.txt
c0a4dd14350fee571e32e1fccb925fea
07f5117eed2772b8420b381525568e3706213101
62139 F20101129_AAAJPN xu_j_Page_064.pro
4d63378c9383ba714a6cd7353182f418
9d11c3bfc734995c8b45a2c06efc06cbb6838d00
86724 F20101129_AAAJOY xu_j_Page_147.jpg
fb5208e4370cff34d8e8f77dcdf584b4
1b2641f3cd908d1eed96dc3a4ade135c76682662
19081 F20101129_AAAJQB xu_j_Page_069.QC.jpg
75a24ed962a158eea889cf22eafecd11
273febdd891926d27c7a51af627e287be3a6bdcd
966 F20101129_AAAKTE xu_j_Page_161.txt
f68bc5223ec462854e7420dff523a0f5
d2ee6f9b00f76eeed30856e613236f1192791cc9
2418 F20101129_AAAKSQ xu_j_Page_143.txt
ba41b3ba06b922f224c79125c2d980de
618bf9960d11c96f1f57a1a4922636df61ac1224
72057 F20101129_AAAJPO xu_j_Page_141.jpg
b3c5a4f5774ca9271a3d39e8ccfa2f27
833f77c0a165bd0e29033ad2fca8253b002096f2
28088 F20101129_AAAJOZ xu_j_Page_145.pro
df63b396a5ecca489fb48f9c7d957592
dd4f9a221720d68b6c8a93742e7c4c0259ff657b
43158 F20101129_AAAJQC xu_j_Page_101.pro
172db79b5870220252a1d692c1635434
a4d28d65ae5dea63dbd08a12c85206cebb514156
1132035 F20101129_AAAKTF xu_j.pdf
f10dfcd701e12a09abd037a1e007d12e
1b8d8d563c296dcef6ceb87fe437d3e0f09b9858
1446 F20101129_AAAKSR xu_j_Page_144.txt
4a04bc08b7143729211fec9c9b026d46
a5acaafff7f7e9945a3acbd56121d1b102d927d4
974281 F20101129_AAAJPP xu_j_Page_082.jp2
012af013ee829bce1c1d330fca33d50a
f4fc24fe231e3e1e50a1b43de544cfbe3102f9f8
53729 F20101129_AAAJQD xu_j_Page_045.pro
b0cdcf026f3e6e78ea6bbec706953f3b
2b79e93e84f325c06fbbfa83e5507404fb09c514
28829 F20101129_AAAKTG xu_j_Page_031.QC.jpg
544e1695ed9ca2b1dd93716eb4a05855
e8b8d424e1fecdb26774d59a6dbd5c61816f1b4c
1346 F20101129_AAAKSS xu_j_Page_145.txt
d74f610e4fc43adc91a30cbf0111d7f6
cc1b6883a70b09eb58307c81b6894b9e7d62f7bc
F20101129_AAAJPQ xu_j_Page_122.tif
08fc151b750293d89bfe1f26c078fe6c
6fdfebd7a18baa74c5e1e8a1ca6a9ef79c377e88
82214 F20101129_AAAJQE xu_j_Page_053.jp2
667a9c6067fb59044353991692158172
80ceecfe239ccb71fbc1f3ccff8f75f4cd1bfe4c
6279 F20101129_AAAKTH xu_j_Page_155thm.jpg
07f324c6dcb0f070b2302df683228ae4
85ddb527dbaa9cd59169e0b1e7db4eeb943b48ac
F20101129_AAAJQF xu_j_Page_024.tif
8ea429d5f97b1e5fd1840eff8a4f9bfb
13ab771f6585c3d448d9b1719b2d172a05197eec
5923 F20101129_AAAKTI xu_j_Page_137thm.jpg
72d96b53ff093724f2e1b3b620e9e552
e843f1e3926bb6c8ea8ac23a3aa0446db319b5d9
2222 F20101129_AAAKST xu_j_Page_146.txt
6da83b7d4d13c4437d81397049652c04
5ac8adb6d5b5e49cdf27fa67b0682c790d4e0467
32770 F20101129_AAAJPR xu_j_Page_065.pro
411532b88a43fe7c5fcb1f0074b2eec1
36be827e97004b1ae90bd5500c27ddad8e9ec858
F20101129_AAAJQG xu_j_Page_105.tif
f6db5f02850f88d00594f5bb7325c39d
49e825cd4bd135d9926b41f16be5b7d67205015f
5852 F20101129_AAAKTJ xu_j_Page_059thm.jpg
5feeb7b6007bae068a9b72c2ddfccd83
183a0e2131dc92a0f964c6c8d96ccbc8f1773bb7
2296 F20101129_AAAKSU xu_j_Page_148.txt
a74c07681ffc3378b983fbda7457d314
dd697af6a4b2896f25207eb4d80fb53999673790
F20101129_AAAJPS xu_j_Page_029.tif
db6fcbfa15c8da173f1e9d407337d90a
ea0541926feffb9cdd4dac1f0cae455f21c97a3f
F20101129_AAAJQH xu_j_Page_054.tif
c561fe2ccb7f91feb93aef60d811560e
0fbc7cc9cc2cd5b5aa976be76b13e1f4681ff1e4
25988 F20101129_AAAKTK xu_j_Page_068.QC.jpg
4cc6b219f854565f6af46c9803678858
aefe313fe408d3f39644f76f0755a8866241a28d
2295 F20101129_AAAKSV xu_j_Page_149.txt
c116c90fe770f71f8109e6ab2f576cc1
f5adf0ab0a0df921232c78d60e9b5a067c6b8605
15873 F20101129_AAAJPT xu_j_Page_093.pro
fbfb9f3aed6867f9656ed95853e30e0e
1cf01e8a25088c02b2032b7eaf8b1e16e3c79f22
F20101129_AAAJQI xu_j_Page_132.tif
5b67210816168d758c5d3e6671e6895e
0a5b01fd3587eb4322b676be7882d90bdbeb8e84
15433 F20101129_AAAKTL xu_j_Page_099.QC.jpg
28e4706dca92470928b9eed7822b8e39
e3f1bcd70822edcb19731d6ce346be0c50f4aec6
1352 F20101129_AAAKSW xu_j_Page_150.txt
eb67913ec7dd29ff4c546ab77a0c8e21
4571b0d3e2231207cd360ad397d8b7cdebf8653c
19923 F20101129_AAAJPU xu_j_Page_090.QC.jpg
b92e950adde5bef2f12a523a8463f564
597c8f94a0ee9284dba21aad55eed724ab4d68df
20251 F20101129_AAAJQJ xu_j_Page_160.QC.jpg
2ec97bc35fe9bb76c613329d0da52974
3ade48237cf9c89aafbde0489683be4fda42cfce
5680 F20101129_AAAKUA xu_j_Page_129thm.jpg
76d82d146b0a28d74095b764298f2df0
d53f850777338c4f3e66b688a45482b592a33081
18612 F20101129_AAAKTM xu_j_Page_100.QC.jpg
091e78137cc4714de1a854fc1b591b1a
ed0ff49ea6b08f9183140b961057ff3845646494
2496 F20101129_AAAKSX xu_j_Page_152.txt
9e624f3e4d016370f049105588635e4b
5f450ed0bec87da54c301a2d8d7a62ab86cf98ac
10244 F20101129_AAAJPV xu_j_Page_012.QC.jpg
cf1a77f6030208a5d795491b484bed54
0543432522cb94b5216723b6d5e6cabfc65cfab6
F20101129_AAAJQK xu_j_Page_006.jp2
8407fb7f986d2f9182632354956c292f
3b86ce08e0801db2d1eda343ee533f5b533f832b
21420 F20101129_AAAKUB xu_j_Page_082.QC.jpg
707af451eb463c06ebf927bff69d66cf
d0b0aa7d494b222b6745f00fb76388292ab2578b
23539 F20101129_AAAKTN xu_j_Page_027.QC.jpg
72998930e13ac99a2d5cd1254b8464e5
61712b973666957df86650d2d83da59e1e172b2f
2574 F20101129_AAAKSY xu_j_Page_154.txt
e0b77eb224a8c6ae238b29bc238a0c4c
d788afda2ae131bf01465e7f3f0dbfa9fc791fe3
658182 F20101129_AAAJPW xu_j_Page_057.jp2
b8601bc134786132956f3095d7dae7e9
3a0c6031c076f7df9a84660d315d94d2eeb6bc5c
1191 F20101129_AAAJQL xu_j_Page_118.txt
79bda935619f34119eeeaeab8263e1ff
181f20a7d899e87f9d5dd1872fab930b9c0c176a
23969 F20101129_AAAKUC xu_j_Page_152.QC.jpg
39f9528b1d4c59173ac5624f4ae27890
e95635a75e7d22a0b5b82198ad6c1be0acc78217
6275 F20101129_AAAKTO xu_j_Page_040thm.jpg
34dfa893b58645ed9d10a806bbbfa875
f54adbc547ae0ebc33a5f52cd8742036e1f67211
2598 F20101129_AAAKSZ xu_j_Page_155.txt
10656b372b75ec0de92f790048542f25
b741a6daf8a2bc87df267c8e030991f30cf71cdb
65853 F20101129_AAAJPX xu_j_Page_034.jpg
6b07e10658a28e794f62df85cf159085
b409ca7e22967e964b17d43d6dc735f1d44d870c
49303 F20101129_AAAJRA xu_j_Page_058.pro
8d970c96ee9a179739ae75f5a6231fff
235b223ae8dc630eea6e0ec566310473aa254760
18680 F20101129_AAAJQM xu_j_Page_018.QC.jpg
6a6b1f7bc564373da19119ffa311c071
8fcde578eb1e4543287eefaae7c1a8d58d46e8b8
27131 F20101129_AAAKUD xu_j_Page_148.QC.jpg
2d6854c140f0687e3ac895265e9aa720
e4d76af30b5243b38667341b15bc3ea80d308a9f
19595 F20101129_AAAKTP xu_j_Page_043.QC.jpg
ad5ffdef70044a2a7eb9568f440541f9
f8740de6bdd1a37115d4c1c5de453d693c5e0eae
1038023 F20101129_AAAJPY xu_j_Page_114.jp2
974ec9d461b17629e72cfd3a6285c9c1
116b5e99d7b7a336dd9040cf8e70b6120c18799c
24120 F20101129_AAAJRB xu_j_Page_089.QC.jpg
bcb9677ee28b0303690187000b3d25e4
c1ee8e2782d684b0fc05e071ebe88a10a735b7a2
5282 F20101129_AAAJQN xu_j_Page_119thm.jpg
f5abc5a677c49a970eda6e83e83b0fd7
b3f0702b9c080f55d29f343f482e086f3225d1b6
29054 F20101129_AAAKUE xu_j_Page_021.QC.jpg
02f93f6a8d900a789e2be8ccc10497dc
0e32ac68ee421a2e4c7d3961baca0498f7175ea7
5185 F20101129_AAAKTQ xu_j_Page_115thm.jpg
65568776a3096fce366a75d7fd4faf7c
ec7d7c047f834453eeac1c4867a286553d378671
6536 F20101129_AAAJPZ xu_j_Page_068thm.jpg
41f2488d99a11ac1e5798509580af102
0159a845ddc1d888ed1a60ee7af82f8906b01a8c
29228 F20101129_AAAJRC xu_j_Page_077.QC.jpg
6013b9c2b0aa46573578ab82a233820e
ca5006132458753cc6d62b24521347b3dc86103d
2632 F20101129_AAAJQO xu_j_Page_153.txt
8c84bcfad3b843924886e680f13db9c5
d8684a56fe79d7cabe9eec3074011c534268fedb
15825 F20101129_AAAKUF xu_j_Page_057.QC.jpg
4d167e15c57b3cb5a1ed82745bf2ca0b
5495fefc649c34118eea9a1489b05af0edb45088
19841 F20101129_AAAKTR xu_j_Page_059.QC.jpg
127a8b02dfd038e9d0612e42d91936ee
9a7c8dc14756cd9e5361f7b927f95b20854dd8bd
72786 F20101129_AAAJRD xu_j_Page_135.jpg
d47df6ef43afc46af366059c563beed2
45fa4e3f42886da6f2347e1120d5f445da547317
84203 F20101129_AAAJQP xu_j_Page_153.jpg
83735cd3766f55a18cabb825efbc7643
77cf49d5b3957d566ec6bee12d402031d3fdaf44
5929 F20101129_AAALAA xu_j_Page_075thm.jpg
8a0cce4e138c2c6e2b47913cf43fe049
054f4d04a58dc360c0c121e9ec41cf2f2d6ade1c
2206 F20101129_AAAKUG xu_j_Page_047thm.jpg
3ce6da6bf35a99ad792e970b592d1991
cd093922c1d3ca1fc92d064b5b5c6070bd1b0cd5
F20101129_AAAKTS xu_j_Page_016thm.jpg
3e27124770ef6c3b2758db72cb9c6445
cb4fa997322d3a107316c25e928015cffe5acf7a
6559 F20101129_AAAJRE xu_j_Page_042thm.jpg
95bcf47378b4a67bd39ba5652172503d
5ea08bbeab54d05f31488d7cc95f391ad2d43deb
1051958 F20101129_AAAJQQ xu_j_Page_137.jp2
b00ec582959cd667f88bdd7dbc0ff79c
148a5c476965c23e08edb99fdeab6a818c11a2c9
7045 F20101129_AAALAB xu_j_Page_077thm.jpg
d28b14037c860ddbea8e728d5de977ea
1c82f60c3be1b54add136b9beac85d332000d762
5714 F20101129_AAAKUH xu_j_Page_078thm.jpg
e0fd1e07438b802e1045d6ace933bf63
09cca96aecd33516573ca5b12e5b34cca4bd2e90
28247 F20101129_AAAKTT xu_j_Page_014.QC.jpg
e03a01ccd646804f4196a972152641a9
856967131d81c71535b4652172566a28b6a20ec6
F20101129_AAAJRF xu_j_Page_038.tif
e2da051691f896ba050ea36fc228cf4d
c8b7c3a18607cd3eacd625d5792a6a25242c40a6
F20101129_AAAJQR xu_j_Page_144.tif
ea4ff164734c48f7e61bf638a68b4571
715e07d8cd6612bd5c32adbf2cf03009506a35a6
21939 F20101129_AAALAC xu_j_Page_078.QC.jpg
c6812a19bdb960cb217dd512d8081c65
809df6bf0a97e421acc58d8da2e85a3bb510d29c
17476 F20101129_AAAKUI xu_j_Page_122.QC.jpg
c2605819d88a74ce4ed1a25d6797cb00
9b09fd70118478e240eb937e7d66d43eda2786fe
20770 F20101129_AAAJRG xu_j_Page_144.QC.jpg
41c0bec07f3521e0aa4a5a4e53aebada
1eec106cf72f46d2fb6dab5b02a8de13a05a8875
15240 F20101129_AAALAD xu_j_Page_079.QC.jpg
6a147f4409aec6ac2d1c3904cdb95f78
e3d4cc03b54bb7314d68421c7cf98ea3d886dd12
23200 F20101129_AAAKUJ xu_j_Page_015.QC.jpg
84e62eab9756d9b304a910e7cc4f3c8d
37c83a49dae1993e3cf1c62ce4fe78e84e1e6544
13891 F20101129_AAAKTU xu_j_Page_066.QC.jpg
6a9c3d0ffd65b9c7f78d50a0a28426f3
8b6c8eeb7cc2fc9ec6258693ad4aaa3c284566e2
2384 F20101129_AAAJRH xu_j_Page_026.txt
9e62beb33f5f2f214f8a2a995cc35897
f051f1903eafe78681472b7f1bf7c782959b5640
F20101129_AAAJQS xu_j_Page_045.jp2
49ffe97528bcec190c4d0c0e7624b4c3
a2a0709ad9eca9248a3497736dcf0bf8f968771d
17611 F20101129_AAALAE xu_j_Page_080.QC.jpg
fe05e88e3f25cdc0172dc3d632970b3e
51205fa7d04c6b87327d082ea6d6f48c19f5c8a6
6796 F20101129_AAAKUK xu_j_Page_047.QC.jpg
eaed6c57649779e24ce3a9146027bc5b
339bf48ece62c275a8a97ec2db2f46424244795d
22289 F20101129_AAAKTV xu_j_Page_136.QC.jpg
9c09c03f885763d509a9d2fbe3418747
b51c760e2255a6831d0614776ed30af31be99c6f
822641 F20101129_AAAJRI xu_j_Page_054.jp2
f9ac886729b67d49177980b71dd1cb58
411ad178545b7bb41b556b10f197ea187b0fa389
22775 F20101129_AAAJQT xu_j_Page_135.QC.jpg
79b0970fef4676306feab58827eaf821
3947043af52d9bd7db91e14d79e8385289d3f566
25250 F20101129_AAALAF xu_j_Page_081.QC.jpg
fc0417cee1251369328c9cc8397f7faa
006e0b5d7f113625dd481ae04cf0aa767a7f54d9
4096 F20101129_AAAKUL xu_j_Page_150thm.jpg
430ea6519bcc1b2073cb3795fe0e0169
a3b7174b53a8bba380dcb243717c4d7f10d69017
6246 F20101129_AAAKTW xu_j_Page_017thm.jpg
61f7eb7da5fb68ad3be3f7e22c7f9607
48350030b2531f32ad9bad87f954b30aacdbc0a4
1348 F20101129_AAAJRJ xu_j_Page_066.txt
7890f37ed23c2b200da1963d72e16102
10b6c0b2a98c84e6278b1f3f5bf897b94513fb8b
1051970 F20101129_AAAJQU xu_j_Page_146.jp2
726cb3576f7cf9b1e43874aed6931d16
9f1b6bce2f9b98bc4e1d8e76a6016a34830500c7
5822 F20101129_AAALAG xu_j_Page_082thm.jpg
365f77cc5270f05d28b5a5650217df81
64d2dfc492a62ecc6627641d69fc3e7816b5e97e
10634 F20101129_AAAKVA xu_j_Page_072.QC.jpg
c5253f4b5c3941bad9507825792c688f
086c3c6666c0822895869909610b85b518283331
6660 F20101129_AAAKUM xu_j_Page_046thm.jpg
156f8ee7457310b38402a1a2b1e6585a
8688401e4ed1fb9153e68c17f196c603db911d22
1850 F20101129_AAAKTX xu_j_Page_131thm.jpg
701ad741999e09a94167a4897a52ccbd
236d99fed7c6bd387723f6796db89c98284ff8eb
F20101129_AAAJRK xu_j_Page_009.jp2
5a96cd2cdb44e2b4a7b3bd26bc2f6be9
51daa9c544112edc96e59404dacc37e86db3fa36
F20101129_AAAJQV xu_j_Page_047.tif
7293d1252404ac420f92132c788a6f46
3edcb572f45aa684fafe9ff57e3173f8523c7aed
19128 F20101129_AAALAH xu_j_Page_083.QC.jpg
b52c9665d7c5777ed2e59b405fab1364
4911a443ed8f809df92eff606e339e3e4c55c311
3279 F20101129_AAAKVB xu_j_Page_161thm.jpg
ab6c9b98172a2b02568cff04273540ee
0dc32b0a76a3a42c1746a8a116564c3d5a998e48
5132 F20101129_AAAKUN xu_j_Page_080thm.jpg
65b2fc613972776b0d595ec7d707f972
19ab7ae3cd952cbe3eed37bdc3e3ed12274ade28
17336 F20101129_AAAKTY xu_j_Page_038.QC.jpg
13fb0381f321a39cc412d23a7e60fff7
27b5c83ea3d371eb5ba6c9d8c5ea8b14beaf12b4
22613 F20101129_AAAJRL xu_j_Page_140.QC.jpg
ecb1bd52c7c6471130e17677ae61286a
270811e345a5409530db50ad2189edc895f7425d
1051976 F20101129_AAAJQW xu_j_Page_068.jp2
d40e0bed2411f7d60a1d42d5569d0334
0137c608bcf7a12393b01a15ea9c9060f790a314
5248 F20101129_AAALAI xu_j_Page_083thm.jpg
64147176c5e8cd46440e61ed136029d9
9c2fc2f4c746108aedc0ef8bfab9b045499af158
22002 F20101129_AAAKVC xu_j_Page_007.QC.jpg
4974b9ae6bd53b436afe46d3c45193ad
9bcae707e992780e1d7e22b267f4a5807bd9add1
5830 F20101129_AAAKUO xu_j_Page_034thm.jpg
a14811e0b150aff64df59912c7e548d0
19f92661594e557b7a8efc50ba7c7e2197133aec
6482 F20101129_AAAKTZ xu_j_Page_058thm.jpg
ee9266d30d437366a1a786b68650cf09
e6be50dd8ca300d922ad612062df4c214e07c790
45616 F20101129_AAAJSA xu_j_Page_052.pro
4708e6d8f26bd210ed22db8507ad83cc
313ca9003a203d9568ffd069e3f4c14458d4a9a8
1870 F20101129_AAAJRM xu_j_Page_096.txt
f248e24a1900d0ff56f62db9aa858f0a
0e915ef3b7d63f79dd4a252eb4f7bab9edda69a7
5432 F20101129_AAAJQX xu_j_Page_051thm.jpg
61ed304381611db0db8e8bcf2b885837
bc14f25ee72cad4027d398a169a1d11dbbb1b4f7
21456 F20101129_AAALAJ xu_j_Page_085.QC.jpg
ba5e1f4025ffb149173f440c654ab76d
964e623686f96a7273d7f12111b9f97386ce7525
5735 F20101129_AAAKVD xu_j_Page_114thm.jpg
a6f89d7c48c943cb1c8f31d15b5d41a3
824dbf800aaec268cb423d694b6734e3be471996
23045 F20101129_AAAKUP xu_j_Page_035.QC.jpg
c3567f1409baeb42a412d1d278a1173a
422f7781006c56c8274805578b0efb434315ae89
F20101129_AAAJSB xu_j_Page_040.txt
035111f1d4cd7d04adfba4952d5f50e0
80400f8ef27e04f1441f81af1811730aaa7ec5a5
25065 F20101129_AAAJRN xu_j_Page_158.QC.jpg
9337610c2986b7eb14580d0a72ac436b
221c6b7a28c60ba9856ab79d6fd91aa738057ae5
F20101129_AAAJQY xu_j_Page_022.tif
1516317408b99318dc33f36600b0e5df
649145c5baefd4b5d06279e6374db2fa689e7b4c
5777 F20101129_AAALAK xu_j_Page_085thm.jpg
b28e302ee3c4e7a75ec0d11d7bdb61ec
6180d1e60b7ede386dde1eba1ded8485aa6ed273
5826 F20101129_AAAKVE xu_j_Page_138thm.jpg
674b7f350f9cf3641e57749265250fe6
55e162d7a5e7d93078c57afedceea2ea4dc883c4
5844 F20101129_AAAKUQ xu_j_Page_027thm.jpg
c10b70ffb6f3ce5d0c097dbf705a7a26
4446b8558b2eecff7f26b5e610b5b03b6d8ca582
F20101129_AAAJSC xu_j_Page_151.txt
c2c046e7efa2fe34b4862411d5a74bbe
8685fbef9e4c9dbcdf96cf2d21983282b1ec03bc
F20101129_AAAJRO xu_j_Page_096.jp2
966dc74be1a37627213a55fedb924614
3ba9a5f9e0ea9d0656ccb327707f51955339d2f2
2293 F20101129_AAAJQZ xu_j_Page_147.txt
cb459170c68e8bcdf4e809301fc2c62e
d795c79247c5a1fb9a1e74f4d73fddd86094eb6f
5892 F20101129_AAALBA xu_j_Page_101thm.jpg
15b0852f2a67eefa6c47af83371eecf6
91c9b14dd996fb1a85c902d74b63d7dc83049ee3
3761 F20101129_AAALAL xu_j_Page_086thm.jpg
46d23d3b275f070ed251953bc55a6900
36e11e5545b0c181acfbc94c8441734355b2bd58
5709 F20101129_AAAKVF xu_j_Page_136thm.jpg
f2ad770f39fb72fadf9e5e2d6da20e3f
b9218f222ab988dcf830159b3124b4e6cf3d2e89
24615 F20101129_AAAKUR xu_j_Page_063.QC.jpg
8e8e42417d87d65e2be511ff6731e12b
30a58ca887da7b993eedf34c8262bad69a44bc7c
14533 F20101129_AAAJSD xu_j_Page_105.QC.jpg
9a19e23b5f5396a60543e69ec6a72d23
74d04e691932302e0ee524d97b343936fb941bde
27035 F20101129_AAAJRP xu_j_Page_094.QC.jpg
d97889cf7cc66f6ffeeb076ffdded88a
6d273897c478880ed7921dc99ad24a70d5a1d3b8
24187 F20101129_AAALAM xu_j_Page_088.QC.jpg
19ffeef870c32f8a6d21fe9dcd8bfe18
09e2be6f4076e251ef3dff01a00dbf6c4a2cd9f8
22946 F20101129_AAAKVG xu_j_Page_110.QC.jpg
a0cee800a22c947884e5ba705845cd0a
993b5b59dffa1e48535fb797223f5bf71df7c2d7
5697 F20101129_AAAKUS xu_j_Page_039thm.jpg
33ec9046b2a7bfcfe0e01fe568578960
5c7629c7e1a3e85bb51f16ddea4766211223ac1d
60491 F20101129_AAAJSE xu_j_Page_006.pro
a842e3cecba65a1fdeb85ff62bf44d8b
7af53d1a47b1123afc93e6d50596090e8c482839
F20101129_AAAJRQ xu_j_Page_118.tif
792468e509db9bd0f771c86a90152d85
d5e074f283a40a346858234ea356aee24e2a4f40
28038 F20101129_AAALBB xu_j_Page_103.QC.jpg
93ac7de02dfb21c750bfe9dfbd00d983
0647abd7d5b9369a6f249811cf74fd4f436ac810
6200 F20101129_AAALAN xu_j_Page_088thm.jpg
9d5c5298bf685c6c9b525581a64961ab
aba78f0da6008ac32533c5a46745259c2b38d4ec
6667 F20101129_AAAKVH xu_j_Page_111thm.jpg
af6cfce4a40fd214639310ea0831be2f
d982c0d104921ef72dcebd7724b9fa2d6f5e5ed3
F20101129_AAAKUT xu_j_Page_018thm.jpg
99e041d51ae1342f1002907ca08efeaa
4621eac344abcfeccc2a31b17f5fe47e757e6a88
74900 F20101129_AAAJSF xu_j_Page_004.jpg
7ede5765ee354550247507b6790deeb7
ee0e3cb89aa99345da66a6de0e3a76c25d9dd57b
F20101129_AAAJRR xu_j_Page_074.tif
9904e952e3347cb1796ebf0d7256d525
2a38785f8363146edab1915e2f87ea8da091184c
18593 F20101129_AAALBC xu_j_Page_104.QC.jpg
0e252fef8ec324ee034190edc565b9b6
205821d5ce46667ce1e8d5ccb9c564c839ce2514
6162 F20101129_AAALAO xu_j_Page_089thm.jpg
9460df53b50af9146ba06ea3d0c364b4
ebdce1737fea4816a8411fa111d6fd2c4f24d28b
5716 F20101129_AAAKVI xu_j_Page_128thm.jpg
ef196678ad9c52984f8c24e1ea8d28ed
457764d4e721b22ead8efc7439174fb064f3a4a8
18567 F20101129_AAAKUU xu_j_Page_121.QC.jpg
f2358aec22565adc492dcbf6e5f98054
8652f39707020999eb22a39a5d10f56f837ce115
34502 F20101129_AAAJSG xu_j_Page_060.pro
93ea7885af1f8ca18bc9d579e920a3d6
dbb1e3f4040ea9dad21ba5910cdab325e22942b1
78931 F20101129_AAAJRS xu_j_Page_017.jpg
603d99f6268bbdefb8c8ebef11d23f66
c299fde43ff70801f9cadc345a60e6e3e5b0b4b9
4741 F20101129_AAALBD xu_j_Page_105thm.jpg
99573a8908aa445d4e16337f8bb13deb
85de17cff3f7141afbf2f18dc122c9609591bcfd
5993 F20101129_AAALAP xu_j_Page_091thm.jpg
2e8ee6dba35752d4bec1d29789563e55
ed492af7bc8279fb0343ead4060421f494aa78ba
12141 F20101129_AAAKVJ xu_j_Page_161.QC.jpg
c86ff08abaebd91d82e21a34d078d802
e983ec7d9a6df23dd2dc3ea569e72ea812c9c75d
F20101129_AAAJSH xu_j_Page_003.tif
84b7fba49e71f5715d47ac1e17f17c4e
3681f931e09a0b3c67a26c6df26021811b70c344
19797 F20101129_AAALBE xu_j_Page_106.QC.jpg
99dd7a12e411ed9a332eeda13841d2b2
1ba9f19ee88beb48e61916acd7aec59b74c655db
9700 F20101129_AAALAQ xu_j_Page_093.QC.jpg
614c92054435c90b808953fa08a42383
d55bf06a986d1fc971ef597930143e62bec66489
12386 F20101129_AAAKVK xu_j_Page_086.QC.jpg
88e73e98d12dd542d60581595bff7166
e9d2455b3de7752f9b2f1d11e9d27e8b57e38066
7060 F20101129_AAAKUV xu_j_Page_130thm.jpg
5d10b56fe1ae1f3478d06a496d303979
4de03452ef001a2b5383fbfbab5bc7543ab1dc59
F20101129_AAAJSI xu_j_Page_139.jp2
c9011517a851a34ab0074f639fd05a1b
1e6885c846e7bd1019a9c225654066d2d0e2ba8d
F20101129_AAAJRT xu_j_Page_156.tif
f37230e792e6b76cdbb28e82b7b923bb
a0f577f4a358b9827786dd0e4f049961f468f67b
22019 F20101129_AAALBF xu_j_Page_107.QC.jpg
bbafa56f9c7b29bb9aabaad63c707797
d526bf6020edded4a7d59d0f7965f1a9ec55957f
2901 F20101129_AAALAR xu_j_Page_093thm.jpg
63f8f1425275f6e696cecc432f84af52
5981570c95e48a43bfe401f11aa2ada2657130da
6500 F20101129_AAAKVL xu_j_Page_074thm.jpg
282bcb007461a3e02c511b53e93690cf
9808967b1e5bcbda038451df099a6fa190245247
5896 F20101129_AAAKUW xu_j_Page_109thm.jpg
d1fbe7243a1fe86d78db6558ce3fffd1
c7162f0b07c585748321e402a21c8ee7ba4386d6
82954 F20101129_AAAJSJ xu_j_Page_146.jpg
fbfd0ad57ab825390513f2da9f7fc72d
12e33abe96c6e58049e563099ecee903461751be
6489 F20101129_AAAJRU xu_j_Page_045thm.jpg
2f8674e4eb9262b23f1f5c9b65063faa
4de78e82c7c4363d9e942388c6f287c67591fd7d
6212 F20101129_AAALBG xu_j_Page_108thm.jpg
6d1aed2410b2e5fdba1e7f5790c86d03
38612304ac2830b79481202f341ceacfc9ad0830
29415 F20101129_AAALAS xu_j_Page_095.QC.jpg
4591e1967ac813fb740f196cbc1ee424
679ff46f0d0ee2fd9ff157b780a521212df10db3
6891 F20101129_AAAKWA xu_j_Page_031thm.jpg
17e5baaead05561d5d600d5fd2c51ade
73b955f1e3eae1bc72ec947ad8e5bd7fff008c21
26702 F20101129_AAAKVM xu_j_Page_073.QC.jpg
d82ee6db84ee8dd313910d528fb896e0
acd6a6a5b2955a223de16e2dc32dc05dc9489c8c
6121 F20101129_AAAKUX xu_j_Page_065thm.jpg
6c1473097f434b78a4b40744930db84f
206649484d74f4997d3b2718a900391a26f5fa8a
72647 F20101129_AAAJSK xu_j_Page_112.jpg
af479dc7089122924b3fca6e4540a2d0
b8bd9e468cf30c05eaff17c78c1a38512d0db2cc
F20101129_AAAJRV xu_j_Page_149.jp2
d6c703942c2ae6caa3282bb2979d63f1
820399983596be8d31407b37223371b71cb89068
5882 F20101129_AAALBH xu_j_Page_110thm.jpg
4d449ffe4e5ccac609dcdd86649f3dc1
1df290f966ff6624e681c8ecb14df97ceed1c8e9
7082 F20101129_AAALAT xu_j_Page_095thm.jpg
9721c5f4bbb6b63e656ab688670a1869
92957118564a4f81601e56842d7e8d775d3bef42
26184 F20101129_AAAKWB xu_j_Page_024.QC.jpg
2e9e5ba5da91225bd67813a023f4ff3d
747e4c09bd747a4eb9349cdd36dbe718ccf691f9
22457 F20101129_AAAKVN xu_j_Page_091.QC.jpg
bd0b2c0ff5e2d2b1b5add994c14e79ec
84b98d9e02d09efa0663a41c65ba6ff36ca35c6f
6404 F20101129_AAAKUY xu_j_Page_158thm.jpg
7e7eed8b26c99526e2f115de8a076195
14658b2f3a7b8c052eb59bfcbf6d4beaabb462fd
F20101129_AAAJSL xu_j_Page_116.tif
1afdcc7b382ef9990eb28274f3f249d0
b2f3c0e4b077efdc589972253ef2ba041a4c011e
808281 F20101129_AAAJRW xu_j_Page_019.jp2
5ad782fc35165974f9ed20e2364f1678
13726c9f2240c6b28d1733bdc00b38a170ce1d8a
22832 F20101129_AAALBI xu_j_Page_112.QC.jpg
7df7b111a98ebe882c47fd990fa4a7f4
28acb31900365a71ad1547230d1dec51e82e4812
22503 F20101129_AAALAU xu_j_Page_096.QC.jpg
7b2f2dae4e2be7670f75baeef0fab63b
f1dca287a8cbc8b157f7da0e46308aa8992bc4d4
4993 F20101129_AAAKWC xu_j_Page_053thm.jpg
a61d47dca0033669cd48480c05eec0bd
d48f0fb8a0b77c458c357c798902bb7bc5a0b6d5







NONLINEAR SIGNAL PROCESSING BASED ON REPRODUCING KERNEL
HILBERT SPACE


















By
JIANWU XU


A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2007


































© 2007 Jianwu Xu




































To my parents, friends and teachers










ACKNOWLEDGMENTS

First and foremost, I express my sincere gratitude to my Ph.D. advisor Dr. Jose Principe for his encouraging and inspiring style, which made the completion of this work possible. Without his guidance, imagination, enthusiasm, and passion, which I admire, this dissertation would not have been possible. His philosophy of autonomous thinking and the importance of asking good questions molded me from a Ph.D. student into an independent researcher.

I also thank my committee member Dr. Murali Rao for his great help and valuable discussions on reproducing kernel Hilbert space. His mathematical rigor refines this dissertation. I express my sincere appreciation to Dr. John M. Shea for serving as my committee member and taking time to criticize, proofread, and improve the quality of this dissertation. I thank my committee member Dr. K. Clint Slatton for providing me valuable comments and constructive advice.

I am also grateful to Dr. Andrzej Cichocki from the Laboratory for Advanced Brain Signal Processing at the RIKEN Brain Science Institute in Japan for his guidance and words of wisdom during my summer school there. The collaboration with Dr. Andrzej Cichocki, Hovagim Bakardjian, and Dr. Tomasz Rutkowski made Chapter 8 of this dissertation possible. I thank all of them for their great help and insightful discussions on biomedical signal processing. The hospitality in the lab made my stay in Japan a memorable and wonderful experience.

During the course of my Ph.D. research, I interacted with many CNEL colleagues and benefited from valuable discussions on research and life at large. I especially thank former and current group members Dr. Deniz Erdogmus, Dr. Yadu Rao, Dr. Puskal Pokharel, Dr. Kyu-Hwa Jeong, Dr. Seungju Han, Weifeng Liu, Sudhir Rao, Il Park, Antonio Paiva, and Riiii Ilro; Li, whose contributions to this research are tremendous. Certainly those sleepless nights together with Rui Yan, Mustafa Can Ozturk, and Anant Hegde for homework and projects are as unforgettable as the joy and frustration experienced through Ph.D. research. The friendship and scholarship are rewarding and far-reaching.

Last but not least, I thank my parents for their love and support throughout all my life.











TABLE OF CONTENTS


page

ACKNOWLEDGMENTS ........................................................ 4

LIST OF TABLES ......................................................... 8

LIST OF FIGURES ........................................................ 9

ABSTRACT ............................................................... 11

CHAPTERS

1 INTRODUCTION ......................................................... 13

  1.1 Definition of Reproducing Kernel Hilbert Space (RKHS) .......... 13
  1.2 RKHS in Statistical Signal Processing .......................... 15
  1.3 RKHS in Statistical Learning Theory ............................ 23
  1.4 A Brief Review of Information-Theoretic Learning (ITL) ......... 26
  1.5 Recent Progress on Correntropy ................................. 30
  1.6 Study Objectives ............................................... 32

2 AN RKHS FRAMEWORK FOR ITL ........................................... 33

  2.1 The RKHS Based on ITL .......................................... 33
      2.1.1 The L2 Space of PDFs ..................................... 34
      2.1.2 RKHS H_V Based on L2(E) .................................. 35
      2.1.3 Congruence Map Between H_V and L2(E) ..................... 38
      2.1.4 Extension to Multi-dimensional PDFs ...................... 39
  2.2 ITL Cost Functions in RKHS Framework ........................... 40
  2.3 A Lower Bound for the Information Potential .................... 43
  2.4 Discussions .................................................... 44
      2.4.1 Non-parametric vs. Parametric ............................ 44
      2.4.2 Kernel Function as a Dependence Measure .................. 45
  2.5 Conclusion ..................................................... 46

3 CORRENTROPY AND CENTERED CORRENTROPY FUNCTIONS ...................... 48

  3.1 Autocorrentropy and Crosscorrentropy Functions ................. 48
  3.2 Frequency-Domain Analysis ...................................... 68

4 CORRENTROPY ANALYSIS BASED ON RKHS APPROACH ......................... 73

  4.1 RKHS Induced by the Kernel Function ............................ 74
      4.1.1 Correntropy Revisited from Kernel Perspective ............ 75
      4.1.2 An Explicit Construction of a Gaussian RKHS .............. 77
  4.2 RKHS Induced by Correntropy and Centered Correntropy Functions . 83
      4.2.1 Geometry of Nonlinearly Transformed Random Processes ..... 85
      4.2.2 Representation of RKHS by Centered Correntropy Function .. 89
  4.3 Relation Between Two RKHS

5 CORRENTROPY DEPENDENCE MEASURE ...................................... 94

  5.1 Parametric Correntropy Function ................................ 96
  5.2 Correntropy Dependence Measure ................................. 99

6 CORRENTROPY PRINCIPAL COMPONENT ANALYSIS ............................ 102

7 CORRENTROPY PITCH DETERMINATION ALGORITHM ........................... 110

  7.1 Introduction ................................................... 110
  7.2 Pitch Determination Based on Correntropy ....................... 113
  7.3 Experiments .................................................... 120
      7.3.1 Single Pitch Determination ............................... 121
      7.3.2 Double Pitches Determination ............................. 123
      7.3.3 Double Vowels Segregation ................................ 126
      7.3.4 Benchmark Database Test .................................. 128
  7.4 Discussions .................................................... 129
  7.5 Conclusion ..................................................... 130

8 CORRENTROPY COEFFICIENT AS A NOVEL SIMILARITY MEASURE ............... 132

  8.1 Introduction ................................................... 132
  8.2 Experiments .................................................... 133
      8.2.1 Two Unidirectionally Coupled Hénon Maps .................. 133
            8.2.1.1 Variation of Correntropy Coefficient with
                    Coupling Strength ................................ 134
            8.2.1.2 Robustness Against Measurement Noise ............. 136
            8.2.1.3 Sensitivity to Time-dependent Dynamical Coupling . 137
            8.2.1.4 Effect of Kernel Width ........................... 139
            8.2.1.5 Ability to Quantify Nonlinear Coupling ........... 140
      8.2.2 EEG Signals .............................................. 142
  8.3 Discussions .................................................... 146
      8.3.1 Kernel Width ............................................. 146
      8.3.2 Scaling Effect ........................................... 148
  8.4 Conclusion ..................................................... 148

9 CONCLUSIONS AND FUTURE WORK

  9.1 Conclusions
  9.2 Future Work

LIST OF REFERENCES

BIOGRAPHICAL SKETCH










LIST OF TABLES

Table page

7-1 Gross error percentage of PDAs evaluation .......................... 129

8-1 Z-score for the surrogate data ..................................... 143










LIST OF FIGURES


Figure page

3-1 Correntropy and centered correntropy for i.i.d. and filtered signals versus
    the time lag ...................................................... 59

3-2 Autocorrelation and correntropy for i.i.d. and ARCH series versus the time lag 60

3-3 Autocorrelation and correntropy for i.i.d. and linearly filtered signals and
    the Lorenz dynamic system versus the time lag ..................... 62

3-4 Correntropy for i.i.d. signal and Lorenz time series with different kernel widths 65

3-5 Separation coefficient versus kernel width for Gaussian kernel .... 66

3-6 Correntropy for i.i.d. signal and Lorenz time series with different kernel functions 67

4-1 Square error between a Gaussian kernel and the constructed kernel in Eq. (4-7)
    versus the order of polynomials ................................... 83

4-2 Two vectors in the subspace S ...................................... 86

6-1 Linear PCA versus correntropy PCA for a two-dimensional mixture of Gaussian
    distributed data .................................................. 107

6-2 Kernel PCA versus correntropy PCA for a two-dimensional mixture of Gaussian
    distributed data .................................................. 108

7-1 Autocorrelation, narrowed autocorrelation with L = 10 and correntropy functions
    of a sinusoid signal .............................................. 114

7-2 Fourier transform of autocorrelation, narrowed autocorrelation with L = 10 and
    correntropy functions of a sinusoid signal ........................ 115

7-3 Correlogram (top) and summary (bottom) for the vowel /a/ ........... 116

7-4 Autocorrelation (top) and summary (bottom) of third order cumulants for the
    vowel /a/ ......................................................... 117

7-5 Narrowed autocorrelation (top) and summary (bottom) for the vowel /a/ 118

7-6 Correntropy-gram (top) and summary (bottom) for the vowel /a/ ...... 119

7-7 Correlogram (top) and summary (bottom) for a mixture of vowels /a/ and /u/ 120

7-8 Third order cumulants (top) and summary (bottom) for a mixture of vowels
    /a/ and /u/ ....................................................... 121

7-9 Narrowed autocorrelations (top) and summary (bottom) ............... 122

7-10 Correntropy-gram (top) and summary (bottom) for a mixture of vowels /a/ and
     /u/ .............................................................. 123

7-11 The ROC curves for the four PDAs based on correntropy-gram, autocorrelation,
     narrowed autocorrelation (L = 15), and autocorrelation of 3rd order cumulants
     in the double vowels segregation experiment ...................... 124

7-12 The percentage performance of correctly determining pitches for both vowels
     for the proposed PDA based on the correntropy function and a CASA model 125

7-13 Summary of correntropy functions with different kernel sizes for a single vowel
     /a/ .............................................................. 126

7-14 Summary of correntropy functions with different kernel sizes for a mixture of
     vowels /a/ and /u/ ............................................... 128

8-1 Averaged correntropy coefficient for unidirectionally identical (a) and nonidentical
    (b) coupled Hénon maps ............................................ 135

8-2 Influence of different noise levels on correntropy coefficient ..... 136

8-3 Influence of different noise levels on correntropy coefficient ..... 137

8-4 Time dependence of correntropy coefficient ......................... 138

8-5 Effect of different kernel widths on correntropy coefficient for unidirectionally
    coupled identical Hénon maps ...................................... 139

8-6 Effect of different kernel widths on correntropy coefficient for unidirectionally
    coupled non-identical Hénon maps .................................. 140

8-7 Comparison of correlation coefficient, correntropy coefficient and similarity index 141

8-8 Comparison of the correntropy coefficient for the original data and the surrogate
    data for the unidirectionally coupled non-identical Hénon map ..... 142

8-9 Comparison of correlation coefficient and correntropy coefficient in synchronization
    detection among auditory cortex for audio stimuli EEG signal ...... 144

8-10 Comparison of correlation coefficient and correntropy coefficient in characterization
     of synchronization among occipital cortex for visual stimulus EEG signal 145










Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy

NONLINEAR SIGNAL PROCESSING BASED ON REPRODUCING KERNEL
HILBERT SPACE

By

Jianwu Xu

December 2007

Chair: Jose C. Principe
Major: Electrical and Computer Engineering

My research analyzes the recently proposed correntropy function and presents a new centered correntropy function from time-domain and frequency-domain approaches. It demonstrates that correntropy and centered correntropy functions not only capture the time and space structures of signals, but also partially characterize the higher order statistical information and nonlinearity intrinsic to random processes.

Correntropy and centered correntropy functions have rich geometrical structures. Correntropy is positive definite and centered correntropy is non-negative definite; hence, by the Moore-Aronszajn theorem, they uniquely induce reproducing kernel Hilbert spaces. Correntropy and centered correntropy functions combine the data-dependent expectation operator and data-independent kernels to form another data-dependent operator. Correntropy and centered correntropy functions can be formulated as "generalized" correlation and covariance functions on nonlinearly transformed random signals via the data-independent kernel functions. Those nonlinearly transformed signals lie on a sphere in the reproducing kernel Hilbert space induced by the kernel functions if isotropic kernel functions are used. The other approach is to work directly with the reproducing kernel Hilbert space induced by the correntropy and centered correntropy functions. The nonlinearly transformed signals in this reproducing kernel Hilbert space are no longer stochastic but rather deterministic. The reproducing kernel Hilbert space induced by the correntropy and centered correntropy functions includes the expectation operator as embedded vectors. The two views further our understanding of correntropy and centered correntropy functions from a geometrical perspective. The two reproducing kernel Hilbert spaces, induced by kernel functions and by correntropy functions respectively, represent stochastic and deterministic functional analysis.

The correntropy dependence measure is proposed based on the correntropy coefficient as a novel statistical dependence measure. The new measure satisfies all the fundamental desirable properties postulated by Renyi. We apply the correntropy concept to pitch determination and nonlinear component analysis. The correntropy coefficient is also employed as a novel similarity measure to quantify the interdependencies of multi-channel signals.









CHAPTER 1
INTRODUCTION

1.1 Definition of Reproducing Kernel Hilbert Space (RKHS)

A reproducing kernel Hilbert space (RKHS) is a special Hilbert space associated with a kernel that reproduces (via an inner product) each function in the space or, equivalently, in which every point-evaluation functional is bounded. Let H be a Hilbert space of functions on some set E, define an inner product (·, ·)_H in H, and let κ(x, y) be a complex-valued bivariate function on E × E. The function κ(x, y) is said to be positive definite if, for any finite point set {x_1, x_2, ..., x_n} ⊂ E and for any corresponding complex numbers {α_1, α_2, ..., α_n}, not all zero,

    Σ_{i=1}^{n} Σ_{j=1}^{n} α_i α_j^* κ(x_i, x_j) ≥ 0,    (1-1)

where ^* denotes complex conjugation. Any positive definite bivariate function κ(x, y) is a reproducing kernel because of the following fundamental theorem.

Moore-Aronszajn Theorem: Given any positive definite function κ(x, y), there exists a uniquely determined (possibly finite dimensional) Hilbert space H consisting of functions on E such that

    (i) for every x ∈ E, κ(x, ·) ∈ H, and    (1-2)

    (ii) for every x ∈ E and f ∈ H, f(x) = (f, κ(x, ·))_H.    (1-3)

Then H := H(κ) is said to be a reproducing kernel Hilbert space with reproducing kernel κ. The properties (1-2) and (1-3) are called the reproducing property of κ(x, y) in H(κ).
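The definiteness condition in Eq. (1-1) can be checked numerically for any concrete kernel. Below is a minimal sketch using the Gaussian kernel (a standard positive definite kernel) with an arbitrary point set and real coefficients; the kernel width and sample size are illustrative choices, not values from the text:

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian kernel kappa(x, y) = exp(-(x - y)^2 / (2 sigma^2))."""
    return np.exp(-(x - y) ** 2 / (2.0 * sigma ** 2))

# Any finite point set yields a positive semi-definite Gram matrix,
# which is Eq. (1-1) restricted to real coefficients.
rng = np.random.default_rng(0)
x = rng.standard_normal(50)
K = gaussian_kernel(x[:, None], x[None, :])

eigvals = np.linalg.eigvalsh(K)
assert eigvals.min() > -1e-10        # no negative eigenvalues (up to round-off)

# The quadratic form in Eq. (1-1) is non-negative for arbitrary coefficients.
alpha = rng.standard_normal(50)
assert alpha @ K @ alpha >= -1e-10
```

For real-valued kernels the eigenvalue test and the quadratic form are equivalent finite-sample statements of Eq. (1-1).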

The reproducing kernel Hilbert space terminology has existed for a long time, since all the Green's functions of self-adjoint ordinary differential equations and some bounded Green's functions in partial differential equations belong to this type. But it was not until 1943 that N. Aronszajn [1] systematically developed the general theory of RKHS and coined the term "reproducing kernel." The expanded paper [2] on his previous work became the standard reference for RKHS theory. Around the same time, some independent work on RKHS appeared in the Soviet Union. For instance, A. Povzner derived many of the basic properties of RKHS in [3] and presented some examples in [4]. Meanwhile, M. G. Krein also derived some RKHS properties in his study of kernels with certain invariance conditions [5]. Other works studying RKHS theory include Hille [6], Meschkowski [7], Shapiro [8], Saitoh [9] and Davis [10]. Bergman introduced reproducing kernels in one and several variables for the classes of harmonic and analytic functions [11]. He applied the kernel functions to the theory of functions of one and several complex variables, conformal mappings, pseudo-conformal mappings, invariant Riemannian metrics and other subjects. A more abstract development of RKHS appears in a paper by Schwartz [12].

As discussed in the Moore-Aronszajn Theorem, RKHS theory and the theory of positive definite functions are two sides of the same coin. In 1909, J. Mercer examined the positive definite functions satisfying Eq. (1-1) in the theory of integral equations developed by Hilbert. Mercer proved that positive definite kernels have nice properties among all the continuous kernels of integral equations [13]. This was the celebrated Mercer's Theorem, which became the theoretical foundation for the application of RKHS in statistical signal processing and machine learning. E. H. Moore also studied those kernels in his general analysis context under the name of positive Hermitian matrix and discovered the fundamental theorem above [14]. Meanwhile, S. Bochner examined continuous functions φ(x) of a real variable x such that κ(x, y) = φ(x − y) satisfies condition Eq. (1-1) in his study of the Fourier transform. He named such functions positive definite functions [15]. Later A. Weil, I. Gelfand, D. Raikov and R. Godement generalized the notion in their investigations of topological groups. These functions were also applied to general metric geometry by I. Schoenberg [16, 17], J. von Neumann and others. In [18], J. Stewart provides a concise historical survey of positive definite functions and their principal generalizations, as well as a useful bibliography.









From a mathematical perspective, the one-to-one correspondence between RKHS and positive definite functions relates operator theory to the theory of functions. It finds useful applications in numerous fields, which include orthogonal polynomials, Gaussian processes, harmonic analysis on semigroups, approximation in RKHS, inverse problems, interpolation, and zero counting of polynomials. The book [19] offers a review of recent advances of RKHS in many mathematical fields.

More relevant to this work are the RKHS methods in probability theory, random processes and statistical learning theory. I present a brief review of these in the following sections.

1.2 RKHS in Statistical Signal Processing

Almost all the literature dealing with RKHS methods in statistical signal processing considers only second-order random processes. The rationale behind this is that random processes can be approached by purely geometric methods when they are studied in terms of their second-order moments, variances and covariances [20].

Given a probability space (Ω, F, P), we can define a linear space L2(Ω, F, P) to be the set consisting of all the random variables X whose second moment satisfies

    E[X^2] = ∫_Ω X^2 dP < ∞.    (1-4)

Furthermore, we can impose an inner product between any two random variables X and Y in L2(Ω, F, P) as

    (X, Y) = E[XY] = ∫_Ω XY dP.    (1-5)

Then L2(Ω, F, P) becomes an inner product space. Moreover, it possesses the completeness property in the sense that every Cauchy sequence converges in the space itself [21]. Hence the inner product space L2(Ω, F, P) of all square integrable random variables on the probability space (Ω, F, P) is a Hilbert space.
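The L2 inner product in Eq. (1-5) can be estimated from samples by replacing the expectation with a sample mean. The following is a small sketch; the particular pair of random variables is an arbitrary illustration, not an example from the text:

```python
import numpy as np

# Two square-integrable random variables on the same probability space:
# X ~ N(0, 1) and Y = X^2 + noise.  Their L2 inner product is E[XY].
rng = np.random.default_rng(1)
n = 200_000
X = rng.standard_normal(n)
Y = X ** 2 + 0.5 * rng.standard_normal(n)

inner_xy = np.mean(X * Y)            # sample estimate of (X, Y) = E[XY]
norm_x = np.sqrt(np.mean(X ** 2))    # ||X|| = E[X^2]^{1/2}
norm_y = np.sqrt(np.mean(Y ** 2))

# E[X * X^2] = E[X^3] = 0 for a symmetric density, so (X, Y) is near zero
# here, and the Cauchy-Schwarz bound must hold for any square-integrable pair.
assert abs(inner_xy) <= norm_x * norm_y
assert abs(inner_xy) < 0.05          # consistent with E[XY] = 0 up to Monte Carlo error
```

Note that X and Y here are strongly dependent yet orthogonal in L2(Ω, F, P): the inner product of Eq. (1-5) only captures second-order (correlation) structure, which is precisely the limitation correntropy addresses in later chapters.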









Consider a second-order random process {x_t : t ∈ T} defined on a probability space (Ω, F, P) satisfying

    E[x_t^2] < ∞  for all t ∈ T    (1-6)

for the second moment of all the random variables x_t. Then for each t ∈ T the random variable x_t can be regarded as a point in the Hilbert space L2(Ω, F, P). Hilbert space theory can thus be used to study random processes. In particular, we can consider constructing a Hilbert space spanned by a random process.

We define a linear manifold for a given random process {x_t : t ∈ T} to be the set of all random variables X which can be written as linear combinations

    X = Σ_{k=1}^{n} c_k x_{t_k}    (1-7)

for any n ∈ N and c_k ∈ C. Close the set in Eq. (1-7) topologically according to convergence in the mean, using the norm

    ||Y − Z|| = E[|Y − Z|^2]^{1/2},    (1-8)

and denote the set of all linear combinations of random variables together with its limit points by L2(x_t, t ∈ T). By the theory of quadratically integrable functions, the linear space L2(x_t, t ∈ T) forms a Hilbert space when an inner product is imposed by the definition of Eq. (1-5) with the corresponding norm of Eq. (1-8). Notice that L2(x_t, t ∈ T) is included in the Hilbert space of all quadratically integrable functions on (Ω, F, P); hence

    L2(x_t, t ∈ T) ⊆ L2(Ω, F, P).

Indeed, it can be a proper subset. Therefore, by studying the Hilbert space L2(x_t, t ∈ T) we can gain knowledge of the Hilbert space L2(Ω, F, P). One of the theoretical foundations for employing the RKHS approach to study second-order random processes is that the covariance function of a random process induces a reproducing kernel Hilbert space, and there exists an isometric isomorphism (congruence) between L2(x_t, t ∈ T) and the RKHS determined by its covariance function. It was Kolmogorov who first used Hilbert space theory to study random processes [22], but it was not until the late 1940s that Loève established the first link between random processes and reproducing kernels [23]. He pointed out that the covariance function of a second-order random process is a reproducing kernel and vice versa. Loève also presented the basic congruence (isometric isomorphism) relationship between the RKHS induced by the covariance function of a random process and the Hilbert space of linear combinations of random variables spanned by the random process [24].

Consider two abstract Hilbert spaces H_1 and H_2 with inner products denoted as (f_1, f_2)_1 and (g_1, g_2)_2 respectively. H_1 and H_2 are said to be isomorphic if there exists a one-to-one and surjective mapping Ψ from H_1 to H_2 satisfying the following properties:

    Ψ(f_1 + f_2) = Ψ(f_1) + Ψ(f_2)  and  Ψ(αf) = αΨ(f)    (1-9)

for all functionals in H_1 and any real number α. The mapping Ψ is called an isomorphism between H_1 and H_2. The Hilbert spaces H_1 and H_2 are said to be isometric if there exists a mapping Ψ that preserves inner products,

    (f_1, f_2)_1 = (Ψ(f_1), Ψ(f_2))_2,    (1-10)

for all functions in H_1. A mapping Ψ satisfying both properties Eq. (1-9) and Eq. (1-10) is said to be an isometric isomorphism, or congruence. The congruence Ψ maps both linear combinations of functionals and limit points from H_1 into corresponding linear combinations of functionals and limit points in H_2 [20].
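A congruence of this kind can be illustrated concretely on a finite index set: map a coefficient vector c to the random variable Σ_k c_k x_k, and equip the coefficient space with the inner product c^T R d, where R is the covariance matrix of the process. The sketch below uses a synthetic covariance R = A A^T; the matrix A and the vectors c, d are arbitrary illustrative choices:

```python
import numpy as np

# Sketch of a congruence: map coefficient vectors c to random variables
# sum_k c_k x_k spanned by a (finite-index) process.  With <c, d> := c^T R d,
# where R[j, k] = E[x_j x_k], the map preserves inner products, since
# E[(sum_j c_j x_j)(sum_k d_k x_k)] = sum_{j,k} c_j d_k R[j, k].
rng = np.random.default_rng(2)
n_samples = 500_000
A = np.array([[1.0, 0.0, 0.0],
              [0.7, 0.5, 0.0],
              [0.2, 0.3, 0.9]])
x = rng.standard_normal((n_samples, 3)) @ A.T   # correlated zero-mean samples
R = A @ A.T                                     # exact covariance E[x x^T]

c = np.array([1.0, -2.0, 0.5])
d = np.array([0.3, 0.4, -1.0])

lhs = np.mean((x @ c) * (x @ d))   # L2 inner product of the two random variables
rhs = c @ R @ d                    # <c, d> in the coefficient space
assert abs(lhs - rhs) < 0.05       # equal up to Monte Carlo error
```

The same identity, extended to limits of such linear combinations, is what underlies the congruence between L2(x_t, t ∈ T) and the RKHS induced by the covariance function.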

Given a second-order random process {x_t : t ∈ T} satisfying Eq. (1-6), we know that the mean value function μ(t) is well defined according to the Cauchy-Schwarz inequality. We can always assume that μ(·) ≡ 0; if not, we can preprocess the random process to remove the DC component. The covariance function is defined as

R(t, s) = E[x_t x_s],    (1-11)









which in the zero-mean case also equals the autocorrelation function. It is well known that the covariance function R is non-negative definite; therefore it determines a unique RKHS, H(R), according to the Moore-Aronszajn theorem. We can construct the RKHS induced by the covariance function R by the following procedure. First, a series expansion of the covariance function R is given by Mercer's theorem.

Mercer's Theorem: Suppose R(t, s) is a continuous, symmetric, non-negative definite function on a closed finite interval T × T. Denote by {λ_k, k = 1, 2, ...} the sequence of non-negative eigenvalues of R(t, s) and by {φ_k(t), k = 1, 2, ...} the sequence of corresponding normalized eigenfunctions; that is, for all s, t ∈ T,

∫_T R(t, s) φ_k(s) ds = λ_k φ_k(t),    (1-12)

∫_T φ_k(t) φ_j(t) dt = δ_kj,    (1-13)

where δ_kj is the Kronecker delta function, equal to 1 or 0 according as k = j or k ≠ j. Then

R(t, s) = Σ_{k=1}^∞ λ_k φ_k(t) φ_k(s),    (1-14)

where the series above converges absolutely and uniformly on T × T [13].
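Mercer's expansion can be checked numerically on a discretized covariance: on a finite grid over T, the eigendecomposition of the covariance matrix plays the role of Eq. (1-12)-(1-14). The following Python sketch uses an illustrative Gaussian-shaped covariance (an assumption made only for this example, not a covariance from the dissertation).

```python
import numpy as np

# Grid over T and an example covariance R(t, s) (illustrative choice).
t = np.linspace(0.0, 1.0, 50)
R = np.exp(-(t[:, None] - t[None, :]) ** 2 / 0.1)

# Eigendecomposition of the kernel matrix: lam ~ lambda_k, columns of phi ~ phi_k.
lam, phi = np.linalg.eigh(R)

# Non-negative definiteness: every eigenvalue is (numerically) >= 0.
assert lam.min() > -1e-8

# The Mercer series sum_k lambda_k phi_k(t) phi_k(s) reconstructs R(t, s).
R_hat = (phi * lam) @ phi.T
assert np.allclose(R, R_hat)
```

The same check works for any symmetric non-negative definite matrix, which is the discrete analogue of the Moore-Aronszajn condition.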

We can then define a function f on T of the form

f(t) = Σ_{k=1}^∞ a_k φ_k(t),    (1-15)

where the sequence {a_k, k = 1, 2, ...} satisfies the following condition

Σ_{k=1}^∞ a_k² / λ_k < ∞.    (1-16)

Let H(R) be the set of functions f(·) which can be represented in the form of Eq. (1-15) in terms of the eigenfunctions φ_k and eigenvalues λ_k of the covariance function R(t, s).









Furthermore, we may define an inner product of two functions in H(R) as

⟨f, g⟩ = Σ_{k=1}^∞ a_k b_k / λ_k,    (1-17)

where f and g are of the form of Eq. (1-15) and a_k, b_k satisfy property Eq. (1-16). One can also show that H(R) is complete. Let f_n(t) = Σ_{k=1}^∞ a_k^(n) φ_k(t) be a Cauchy sequence in H(R) such that each coefficient sequence {a_k^(n), n = 1, 2, ...} converges to a limit point a_k. Then the Cauchy sequence converges to f(t) = Σ_{k=1}^∞ a_k φ_k(t), which belongs to H(R). Therefore H(R) is a Hilbert space. H(R) has two important properties which make it a reproducing kernel Hilbert space. First, let R(t, ·) be the function on T with value at s in T equal to R(t, s); then by the Mercer eigen-expansion of the covariance function, Eq. (1-14), we have

R(t, ·) = Σ_{k=1}^∞ λ_k φ_k(t) φ_k(·).    (1-18)

The coefficients a_k = λ_k φ_k(t) satisfy Eq. (1-16), since Σ_{k=1}^∞ (λ_k φ_k(t))² / λ_k = Σ_{k=1}^∞ λ_k φ_k(t)² = R(t, t) < ∞. Therefore R(t, ·) ∈ H(R) for each t in T. Second, for every function f(·) ∈ H(R) of the form given by Eq. (1-15) and every t in T,

⟨f, R(t, ·)⟩ = Σ_{k=1}^∞ a_k λ_k φ_k(t) / λ_k = Σ_{k=1}^∞ a_k φ_k(t) = f(t).    (1-19)
By the Moore-Aronszajn theorem, H(R) is a reproducing kernel Hilbert space with R(t, s) as the reproducing kernel. It follows that

⟨R(t, ·), R(s, ·)⟩ = Σ_{k=1}^∞ λ_k φ_k(t) φ_k(s) = R(t, s).    (1-20)

Thus H(R) is a representation of the random process {x_t : t ∈ T} with covariance function R(t, s). One may define a congruence Ψ from H(R) onto L2(x_t, t ∈ T) such that

Ψ(R(t, ·)) = x_t.    (1-21)









In order to obtain an explicit representation of Ψ, we define an orthogonal random variable sequence {ξ_m, m = 1, 2, ...} by ξ_m = ∫_T x_t φ_m(t) dt, so that

E[ξ_k ξ_m] = λ_k δ_km, i.e., E[ξ_k ξ_m] = 0 for k ≠ m,

where λ_k and φ_k(·) are the eigenvalues and eigenfunctions associated with the kernel function R(t, s) by Mercer's theorem. We achieve an orthogonal decomposition of the random process as

x_t = Σ_{k=1}^∞ ξ_k φ_k(t).    (1-22)

Note that the congruence map can be characterized as the unique mapping Ψ from H(R) onto L2(x_t, t ∈ T) satisfying the condition that for every functional f in H(R),

E[Ψ(f) x_t] = ⟨f, R(t, ·)⟩ = f(t).    (1-23)

It is obvious that the Ψ in Eq. (1-21) fulfills the condition Eq. (1-23). The congruence map can then be represented explicitly as

Ψ(f) = Σ_{k=1}^∞ (a_k / λ_k) ξ_k, ∀ f ∈ H(R),    (1-24)

where the coefficients a_k satisfy condition Eq. (1-16). To verify that the representation Eq. (1-24) is a valid and unique map, we substitute Eq. (1-22) and Eq. (1-24) into Eq. (1-23) and obtain

E[Ψ(f) x_t] = Σ_{k=1}^∞ Σ_{m=1}^∞ (a_k / λ_k) φ_m(t) E[ξ_k ξ_m] = Σ_{k=1}^∞ a_k φ_k(t) = f(t).    (1-25)
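The orthogonal decomposition above can be illustrated with a Monte Carlo sketch: projecting sample paths of a process onto the eigenfunctions of its covariance yields uncorrelated coefficients with E[ξ_k ξ_m] = λ_k δ_km. Brownian motion on a grid, and all numerical choices below, are illustrative assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n_grid, n_paths = 64, 20000
t = np.arange(1, n_grid + 1) / n_grid

# Brownian-motion paths on the grid: covariance R(t, s) = min(t, s).
dW = rng.standard_normal((n_paths, n_grid)) / np.sqrt(n_grid)
X = np.cumsum(dW, axis=1)

# Discretized covariance operator (the 1/n factor approximates the integral).
K = np.minimum(t[:, None], t[None, :]) / n_grid
lam, phi = np.linalg.eigh(K)

# Projection coefficients xi_k ~ integral of x_t phi_k(t) dt, for every path.
xi = X @ phi / np.sqrt(n_grid)
C = (xi.T @ xi) / n_paths          # empirical E[xi_k xi_m]

# Diagonal matches the eigenvalues; off-diagonal entries vanish.
assert np.allclose(np.diag(C), lam, rtol=0.1)
assert np.abs(C - np.diag(np.diag(C))).max() < 0.01
```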




Parzen applied Loève's results to statistical signal processing, particularly estimation, regression, and detection problems, in the late 1950s. Parzen clearly illustrated that the RKHS approach offers an elegant general framework for minimum variance unbiased estimation of regression coefficients, least-squares estimation of random variables, and detection of known signals in Gaussian noise. Indeed, the solutions to all these problems can be written in terms of RKHS inner products. Parzen [25] derived the basic RKHS formula for the likelihood ratio in detection through sampling representations of the observed random process. The nonsingularity condition for the known-signal problem was also presented. In [26], Parzen provided a survey of the wide range of RKHS applications in statistical signal processing and random process theory. The structural equivalences among problems in control, estimation, and approximation are briefly discussed. These research directions have been developed further since 1970. Most of Parzen's results can be found in [25-28]. The book [29] contains other papers published by Parzen in the 1960s.

Meanwhile, the Czechoslovak statistician Hájek established the basic congruence relation between the Hilbert space of random variables spanned by a random process and the RKHS determined by the covariance function of the random process, unaware of the work of Loève, Parzen, and even Aronszajn. In a remarkable paper [30], he showed that estimation and detection problems can be approached by inverting the basic congruence map for stationary random processes with rational spectral densities. Hájek also derived the likelihood ratio using only the individual RKHS norms under a strong nonsingularity condition.

In the early 1970s, Kailath presented a series of papers on the RKHS approach to detection and estimation problems [31-35]. In [31], Kailath discusses the RKHS approach in great detail to demonstrate its superiority in computing likelihood ratios, testing for nonsingularity, bounding signal detectability, and determining detection stability. A simple but formal expression for the likelihood ratio using the RKHS norm is presented in [32], along with a test that can verify that the likelihood ratio obtained from the formal RKHS expressions is correct. The RKHS approach to detection problems is based on the fact that the statistics of a zero-mean Gaussian random process are completely characterized by its covariance function, which turns out to be a reproducing kernel. In order to extend the framework to the detection of non-Gaussian random processes, the characteristic function is used to represent the non-Gaussian process, since it completely specifies the statistics and is symmetric and non-negative definite, and thus a reproducing kernel. Duttweiler and Kailath generalized the RKHS work to non-Gaussian processes in [34]. Paper [35] considers variance bounds for unbiased estimates of parameters determining the mean or covariance of a Gaussian random process. An explicit formula is also provided for estimating the arrival time of a step function in white Gaussian noise.

The RKHS method has also been applied to more difficult aspects of random processes. For instance, Hida and Ikeda studied the congruence relation between the nonlinear span of an independent-increment process and the RKHS determined by its characteristic function; orthogonal expansions of nonlinear functions of such processes can be derived from this relation [36]. Kallianpur presented a nonlinear span expression for a Gaussian process as a direct sum of tensor products [37]. Another important RKHS application area is the canonical, or innovations, representations of Gaussian processes. Hida was the first to present the connection between RKHS and canonical representations [38]. Kailath presented generalized innovations representations for non-white-noise innovations processes [33]. RKHS theory has also been applied to the Markovian properties of multidimensional Gaussian processes (random fields). Papers [39, 40] provide RKHS developments on multidimensional Brownian motion and conditions for more general Gaussian fields to be Markovian.

Besides the successful applications of RKHS in estimation, detection, and other areas of statistical signal processing, there has been extensive research on applying RKHS to a wide variety of problems in optimal approximation, including interpolation and smoothing by spline functions in one or more dimensions (curve and surface fitting). In [41], Weinert surveys the one-dimensional case in an RKHS formulation of recursive spline algorithms and connections with least-squares estimation. Optimality properties of splines are developed, and an explicit expression for the reproducing kernel in the polynomial case is proposed by de Boor in [42]. Schumaker presents a survey of applications of RKHS to multidimensional spline fitting in [43]. Wahba presents extensive results on splines in [44]. Figueiredo took a different approach to applying RKHS in nonlinear system and signal analysis. He built the RKHS from the bottom up using arbitrarily weighted Fock spaces [45]. These spaces are composed of polynomials or power series in either scalar or multidimensional variables, and can be extended to infinite or finite Volterra functional or operator series. The generalized Fock spaces have been applied to nonlinear system approximation, semiconductor device characteristic modeling, and other problems [45].

The RKHS approach has enjoyed successful application to a wide range of problems in statistical signal processing since the 1940s, and it continues to bring new perspectives and methods to old and new problems. The essential idea is that there exists a congruence map between the Hilbert space of random variables spanned by a random process and the RKHS uniquely determined by its covariance function. The RKHS framework provides a natural link between stochastic and deterministic functional analysis.

1.3 RKHS in Statistical Learning Theory

Statistical learning theory is the mathematical foundation for a broad range of learning problems, including pattern recognition, regression estimation, and density estimation. The general learning problem can be stated as follows. Given a set of independent identically distributed (i.i.d.) input random variables x drawn from a fixed but unknown distribution P(x), a corresponding output random variable y for each input x according to a fixed but unknown conditional distribution P(y|x), and a learning machine that can implement a set of functions f(x, λ), λ ∈ Λ, the problem of learning from examples is to select the function f(x, λ) that predicts the output response in the best possible way. One employs a loss, or discrepancy, measure L(y, f(x, λ)) between the output y given the input x and the response f(x, λ) of the learning machine to select the best function. In statistical learning theory, developed mainly by Vapnik and Chervonenkis [46, 47], the risk minimization criterion is used to search for the best function. The risk functional which characterizes the loss measure is given by

R(λ) = ∫ L(y, f(x, λ)) dP(x, y).    (1-26)

The objective is to find the optimal function f(x, λ_0) such that the risk functional R(λ) is minimized over all possible functions when the joint probability distribution P(x, y) is fixed but unknown and the only available information is the data set.
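Since P(x, y) is unknown, in practice the risk functional of Eq. (1-26) is replaced by an empirical average over the data set. A minimal sketch, assuming a squared loss and a linear learning machine f(x, w) = w_0 + w_1 x (both illustrative choices, not from the dissertation):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500
x = rng.uniform(-1.0, 1.0, n)
y = 2.0 * x + 0.5 + 0.1 * rng.standard_normal(n)   # data from an unknown P(x, y)

# Empirical risk for squared loss: (1/n) sum (y_i - f(x_i, w))^2.
# For a linear machine its minimizer is ordinary least squares.
A = np.column_stack([np.ones(n), x])
w = np.linalg.lstsq(A, y, rcond=None)[0]
emp_risk = np.mean((y - A @ w) ** 2)

# The minimizer recovers the generating parameters up to noise.
assert abs(w[0] - 0.5) < 0.05 and abs(w[1] - 2.0) < 0.05
assert emp_risk < 0.02
```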

The evolution of statistical learning theory has undergone three periods. In the 1960s, efficient linear algorithms were proposed to detect linear relations between the input and response. One example was the perceptron algorithm, introduced in 1958 [48]. The major research challenge at that time was how to detect nonlinear relations. In the mid-1980s, the field of statistical learning underwent a nonlinear revolution with the almost simultaneous introduction of backpropagation-trained neural networks and efficient decision-tree learning algorithms. This nonlinear revolution drastically changed the field of statistical learning, and new research directions such as bioinformatics and data mining emerged. However, these nonlinear algorithms were mainly based on gradient descent, greedy heuristics, and other numerical optimization techniques, so they suffered from local minima and related problems. Because their statistical behavior was not well understood, they also suffered from overfitting. A third stage in the evolution of statistical learning theory took place in the mid-1990s with the introduction of the support vector machine [47] and other kernel-based learning algorithms [49] such as kernel principal component analysis [50], kernel Fisher discriminant analysis [51], and kernel independent component analysis [52]. The new algorithms offered efficiency in analyzing nonlinear relations from computational, statistical, and conceptual points of view, and made it possible to do so in high-dimensional feature spaces without the dangers of overfitting. The problems of local minima and overfitting that were typical of neural networks and decision trees were thus overcome.










The RKHS plays a crucial role in kernel-based learning algorithms. It follows from Mercer's theorem, Eq. (1-14), that any symmetric positive definite function κ(x, y) can be rewritten as an inner product between two vectors in the feature space, i.e.,

κ(x, y) = ⟨Φ(x), Φ(y)⟩,  Φ_k(x) = √λ_k φ_k(x),  k = 1, 2, ....    (1-27)

Several different kernels are used in statistical learning theory. For example, among others there are:

* Gaussian kernel: κ(x, y) = exp(−||x − y||² / 2σ²), where σ is the kernel width.

* Polynomial kernel: κ(x, y) = (1 + x^T y)^d, where d is the polynomial degree.

* Sigmoid kernel: κ(x, y) = tanh(a x^T y + β), where the parameters a and β are specified a priori.
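The kernels above take only a few lines to implement. For the polynomial kernel with d = 2 and scalar inputs, the feature map of Eq. (1-27) is finite-dimensional and can be written out explicitly, which makes the identity κ(x, y) = ⟨Φ(x), Φ(y)⟩ easy to verify (a sketch; the sigmoid parameterization shown is one common convention):

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def polynomial_kernel(x, y, d=2):
    return (1.0 + x @ y) ** d

def sigmoid_kernel(x, y, a=1.0, beta=1.0):
    return np.tanh(a * (x @ y) + beta)

# For d = 2 and scalar x, (1 + x y)^2 = 1 + 2 x y + x^2 y^2, so the explicit
# feature map Phi(x) = (1, sqrt(2) x, x^2) reproduces the kernel value.
def phi(x):
    return np.array([1.0, np.sqrt(2.0) * x, x ** 2])

x, y = 0.7, -1.3
assert np.isclose(polynomial_kernel(np.array([x]), np.array([y])),
                  phi(x) @ phi(y))
```

For the Gaussian kernel no such finite list of features exists; its feature space is infinite-dimensional, which is exactly why the kernel trick discussed below is needed.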

Kernel-based learning algorithms use the above idea to map the data from the original input space into a high-dimensional, possibly infinite-dimensional feature space. By the Moore-Aronszajn theorem of the previous section, there exists a unique RKHS corresponding to the symmetric positive definite kernel κ(x, y). Therefore the feature space where the transformed data reside is a reproducing kernel Hilbert space, in which the nonlinear mapping Φ constitutes the basis. Instead of considering a given learning problem in the input space, one can work with the transformed data {Φ_k(x), k = 1, 2, ...} in the feature space. When a learning algorithm can be expressed in terms of inner products, this nonlinear mapping becomes particularly interesting and useful, since one can employ the kernel trick to compute inner products in the feature space via the kernel function without knowing the exact nonlinear mapping. The essence of kernel-based learning algorithms is that the inner product of the transformed data can be efficiently computed in the RKHS without explicitly using, or even knowing, the nonlinear mapping Φ. Hence, by applying kernels one can elegantly build a nonlinear version of any linear algorithm based on inner products. One of the rationales for nonlinearly mapping the data into a high-dimensional RKHS is Cover's theorem on the separability of patterns [53]. Cover's theorem, in qualitative terms, states that a complex statistical learning problem cast nonlinearly in a high-dimensional space is more likely to be linearly separable than in a low-dimensional space. By transforming the data into this high-dimensional RKHS and constructing optimal linear algorithms in that space, kernel-based learning algorithms effectively perform optimal nonlinear pattern recognition in the input space to achieve better separation, estimation, and regression.

Research on kernel-based learning algorithms became very active after Vapnik's seminal work on support vector machines was published in the 1990s. Researchers started to kernelize most of the linear algorithms that can be expressed in terms of inner products. One of the drawbacks of kernel-based learning algorithms is their computational complexity. Most kernel-based learning algorithms eventually result in operations on a Gram matrix whose dimension equals the number of data samples. For instance, computing the eigenvalues and eigenvectors of a thousand-dimensional Gram matrix demands a great deal of computation. Therefore, a large number of optimization algorithms based on numerical linear algebra have been developed to address this issue. On the other hand, since kernel-based learning from data usually poses an ill-posed problem, regularization through nonlinear functionals becomes necessary and mandatory. Hence, cross-validation is needed to choose an optimal regularization parameter [54].

1.4 A Brief Review of Information-Theoretic Learning (ITL)

In parallel to the developments in kernel-based methods, a research topic called information-theoretic learning (ITL) has independently emerged [55], where kernel-based density estimators form the essence of the learning paradigm. Information-theoretic learning is a signal processing technique that combines information theory and adaptive systems to implement information filtering without requiring a model of the data distributions. ITL applies the concept of Parzen windowing to Renyi's entropy definition to obtain a sample-by-sample algorithm that estimates entropy directly from pairwise sample interactions. By utilizing Renyi's measure of entropy and approximations to the Kullback-Leibler probability density divergence, ITL is able to extract information beyond second-order statistics directly from data in a non-parametric manner.

Information-theoretic learning has achieved excellent results in a number of learning scenarios, e.g., blind source separation [56], supervised learning [57], and others [55]. One of the most commonly used cost functions in information-theoretic learning is the quadratic Renyi's entropy [58]. Renyi's entropy is a generalization of Shannon's entropy. Given a PDF f(x) for a random variable x, the quadratic Renyi's entropy is defined as

H(x) = −log ∫ f²(x) dx = −log E[f(x)].

Since the logarithm function is monotonic, the quantity of interest in adaptive filtering is its argument

I(x) = ∫ f²(x) dx,    (1-28)

which is called the information potential, so named due to a similarity with the potential energy field in physics [55]. The concept and properties of the information potential have been mathematically studied, and a new criterion based on the information potential, called MEE (Minimum Error Entropy), has been proposed to adapt linear and nonlinear systems [59]. MEE serves as an alternative to the conventional MSE (Mean Square Error) criterion in nonlinear filtering, with several advantages in performance.

A non-parametric, asymptotically unbiased and consistent estimator of a given PDF f(x) is defined as [60]

f̂(x) = (1/N) Σ_{i=1}^N κ(x, x_i),    (1-29)

where κ(·, ·) is called the Parzen window, or kernel, which is the same symmetric non-negative definite function used in kernel-based learning theory, such as the Gaussian kernel, polynomial kernel, and others [61]. Then, by approximating the expectation by the sample mean in Eq. (1-28), we can estimate the information potential directly from the data:

Î(x) = (1/N²) Σ_{i=1}^N Σ_{j=1}^N κ(x_i, x_j),    (1-30)

where {x_i}_{i=1}^N is the data sample and N is the total number of samples. According to Mercer's theorem [13], any symmetric non-negative definite kernel function has an eigen-decomposition κ(x, y) = ⟨Φ(x), Φ(y)⟩_{H_κ}, where Φ(x) is the nonlinearly transformed data in the RKHS H_κ induced by the kernel function and the inner product is performed in H_κ. Therefore, we can rewrite the estimate of the information potential as

Î(x) = (1/N²) Σ_{i=1}^N Σ_{j=1}^N ⟨Φ(x_i), Φ(x_j)⟩_{H_κ} = || (1/N) Σ_{i=1}^N Φ(x_i) ||²_{H_κ}.

However, the RKHS H_κ is data independent, since the kernel is pre-designed regardless of the data. Therefore, only the estimate of the information potential, not the information potential itself, can be formulated as such. Statistical inference in the RKHS H_κ does not yield an intrinsic geometric interpretation of the statistical information of signals required by ITL.
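A direct implementation of the estimator of Eq. (1-30) with a Gaussian Parzen window (the kernel choice and width below are illustrative; by the Gaussian convolution property, the pairwise interactions use width σ√2):

```python
import numpy as np

def information_potential(x, sigma=1.0):
    """Estimate I(x) = integral f^2 via pairwise Gaussian kernel interactions."""
    d = x[:, None] - x[None, :]
    s2 = 2.0 * sigma ** 2                      # variance of the convolved kernel
    k = np.exp(-d ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    return k.mean()

def renyi_quadratic_entropy(x, sigma=1.0):
    """Quadratic Renyi entropy estimate: -log of the information potential."""
    return -np.log(information_potential(x, sigma))

rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
# For a standard normal, integral f^2 = 1/(2 sqrt(pi)) ~ 0.282; the Parzen
# estimate targets a smoothed version of it.
v = information_potential(x, sigma=0.3)
assert 0.2 < v < 0.3
```

Note the O(N²) pairwise sum, which is exactly the Gram-matrix cost discussed in the previous section.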

ITL has also been used to characterize the divergence between two random variables. In information theory, mutual information is one quantity that quantifies the divergence between two random variables. Another well-known divergence measure is the Kullback-Leibler divergence [62]. However, the Kullback-Leibler measure is difficult to evaluate in practice without imposing simplifying assumptions about the data, so numerical methods are required to evaluate the integrals. In order to integrate non-parametric PDF estimation via Parzen windowing and thereby provide an efficient estimator, two divergence measures for random variables, based respectively on the Euclidean distance between vectors and on the Cauchy-Schwarz inequality, have been proposed [55].









The divergence measure based on the Euclidean distance is defined as

D_ED(f, g) = ∫ (f(x) − g(x))² dx = ∫ f(x)² dx − 2 ∫ f(x) g(x) dx + ∫ g(x)² dx.    (1-31)

The divergence measure based on the Cauchy-Schwarz inequality is given by

D_CS(f, g) = −log [ (∫ f(x) g(x) dx)² / (∫ f(x)² dx ∫ g(x)² dx) ].    (1-32)

Notice that both D_ED(f, g) and D_CS(f, g) are non-negative, and they equal zero if and only if f(x) = g(x).
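Both divergences reduce to combinations of information potentials and cross information potentials, so they admit the same pairwise plug-in estimators. The sketch below uses Gaussian Parzen windows with illustrative widths:

```python
import numpy as np

def _cross_ip(x, y, sigma):
    """Plug-in estimate of integral f(x) g(x) dx from two sample sets."""
    d = x[:, None] - y[None, :]
    s2 = 2.0 * sigma ** 2
    return (np.exp(-d ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)).mean()

def d_ed(x, y, sigma=0.5):
    # Eq. (1-31): self terms minus twice the cross term.
    return _cross_ip(x, x, sigma) - 2.0 * _cross_ip(x, y, sigma) + _cross_ip(y, y, sigma)

def d_cs(x, y, sigma=0.5):
    # Eq. (1-32): -log of the squared cross term over the product of self terms.
    return -np.log(_cross_ip(x, y, sigma) ** 2
                   / (_cross_ip(x, x, sigma) * _cross_ip(y, y, sigma)))

rng = np.random.default_rng(2)
same = rng.standard_normal(1000)
shifted = rng.standard_normal(1000) + 3.0

# Both vanish on identical samples and are positive for separated distributions.
assert abs(d_ed(same, same)) < 1e-12 and abs(d_cs(same, same)) < 1e-9
assert d_ed(same, shifted) > 0.0 and d_cs(same, shifted) > 0.0
```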

The Euclidean, Eq. (1-31), and Cauchy-Schwarz, Eq. (1-32), divergence measures can easily be extended to two-dimensional random variables. As a special case, if we substitute for the marginal PDFs f and g in Eq. (1-31) and Eq. (1-32) the joint PDF f_{1,2}(x_1, x_2) and the product of marginal PDFs f_1(x_1) f_2(x_2) respectively, the Euclidean quadratic mutual information is given by [55]

I_ED(f_1, f_2) = ∫∫ (f_{1,2}(x_1, x_2) − f_1(x_1) f_2(x_2))² dx_1 dx_2,    (1-33)

and the Cauchy-Schwarz quadratic mutual information is defined as [55]

I_CS(f_1, f_2) = −log [ (∫∫ f_{1,2}(x_1, x_2) f_1(x_1) f_2(x_2) dx_1 dx_2)² / (∫∫ f_{1,2}(x_1, x_2)² dx_1 dx_2 ∫∫ f_1(x_1)² f_2(x_2)² dx_1 dx_2) ].    (1-34)

As can be seen from the above, I_ED(f_1, f_2) ≥ 0 and I_CS(f_1, f_2) ≥ 0, with equality if and only if the two random variables are statistically independent. Basically, the Euclidean quadratic mutual information measures the Euclidean difference between the joint PDF and the factorized marginals, and likewise for the Cauchy-Schwarz










quadratic mutual information. Hence minimization of these two measures leads to minimization of the mutual information between two random variables. The Cauchy-Schwarz divergence measure has been applied to independent component analysis [63] and to clustering [64].

One of the limitations of ITL is that it does not convey the time structure of signals, because it assumes i.i.d. data. In practice, however, most signals in engineering have correlation in time, i.e., temporal structure. It would be helpful to incorporate the temporal structure while still capturing higher-order statistics, for instance when working with coded source signals in digital communications.

1.5 Recent Progress on Correntropy

From the previous two brief introductions to RKHS in statistical signal processing and in statistical learning algorithms, we notice that there are two different operators, the expectation over random processes and the positive definite kernel on static data, which uniquely determine two different reproducing kernel Hilbert spaces. The expectation operator is data dependent because it operates on the random process, and hence the RKHS induced by the correlation function is embedded with statistics. The kernel operator in statistical learning algorithms, on the other hand, is data independent: it is specified by the designer to be the Gaussian kernel, polynomial kernel, sigmoid kernel, or another choice, and hence the RKHS it induces depends only on the specific kernel and does not contain the statistical information of the data. Conventional statistical signal processing viewed from the perspective of the RKHS induced by the correlation function provides new understanding of second-order random processes, but it does not offer new results, because all of conventional statistical signal processing had already been carried out without the RKHS tool. On the other hand, the application of data-independent kernels in statistical learning theory requires regularization to make the solution unique, because all the learning algorithms involve computation of a Gram matrix whose dimension equals the number of data samples.










A natural question is whether these two reproducing kernel Hilbert spaces can be combined in some way, by proposing a new operator composed of the expectation operator and the pre-designed kernel operator. If successful, we might address nonlinearity in statistical signal processing, because the pre-designed kernel can nonlinearly map the random process into a high-dimensional RKHS. Moreover, the pre-designed kernels involve operations beyond second order on the variables, which might also provide a new tool for non-Gaussian statistical signal processing. The new operator would also capture the time structure of signals, because it has the same spirit as the conventional autocorrelation, while still preserving higher-order statistical information. This can overcome the limitation of information-theoretic learning. The problem of regularization can also be implicitly solved, since the new operator employs the expectation.

Recently, a new generalized correlation function, called correntropy, has been proposed to combine these two kernels and to characterize both the temporal structure and the statistical information of random processes [65]. Correntropy has been applied to various signal processing and machine learning problems and has produced promising results. The correntropy-based matched filter outperforms the conventional matched filter in impulsive noise scenarios [66]. The correntropy MACE filter has been proposed for image recognition [67]. Since correntropy induces a new RKHS, it is able to bring nonlinearity into traditional statistical signal processing. The correntropy Wiener filter nonlinearly transforms the original random process into the high-dimensional RKHS induced by the kernel function while minimizing the mean square error between the desired and output signals. The output signal is represented by the inner product between the nonlinearly transformed input signal and the weights in the RKHS. The correntropy Wiener filter exhibits much better performance than the conventional Wiener filter and the multilayer perceptron [68]. These up-to-date developments of correntropy clearly demonstrate its promising features and applicable areas, and effectively introduce a new nonlinear signal processing paradigm based on reproducing kernel Hilbert space. Unlike the recent advances in kernel-based learning in the computer science field, correntropy defines a new data-dependent RKHS that adapts to the intrinsic structure of the data.
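As an illustration of the robustness reported for correntropy-based filtering [66], the following sketch estimates a generalized correlation of the form V(τ) = E[κ(x_t, x_{t−τ})] with a Gaussian kernel. The precise definition of correntropy is developed in chapter 3; the signal, noise model, and kernel parameters here are all illustrative assumptions.

```python
import numpy as np

def correntropy(x, lag, sigma=1.0):
    """Sample estimate of E[kappa(x_t, x_{t-lag})] with a Gaussian kernel."""
    a, b = x[lag:], x[:len(x) - lag]
    return np.mean(np.exp(-(a - b) ** 2 / (2.0 * sigma ** 2)))

rng = np.random.default_rng(3)
n = 5000
t = np.arange(n)
# A sinusoid of period 25 contaminated by sparse, large impulses.
x = np.sin(2 * np.pi * t / 25) + 5.0 * (rng.random(n) < 0.02) * rng.standard_normal(n)

v0 = correntropy(x, 0)          # maximum at zero lag
v_period = correntropy(x, 25)   # the signal period survives the impulses
v_off = correntropy(x, 12)      # an off-period lag is clearly smaller
assert v0 > v_period > v_off
```

Because the Gaussian kernel saturates for large differences, outlier pairs contribute little, which is the intuition behind the impulsive-noise robustness mentioned above.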

1.6 Study Objectives

In this dissertation, we analyze the newly proposed correntropy function [65] and present another generalized covariance function, called centered correntropy. The correntropy and centered correntropy functions are essentially a combination of the expectation operator and a pre-designed kernel operator. It can easily be seen that the new operators are also symmetric positive definite and thus induce other reproducing kernel Hilbert spaces, which drastically change the structure of the reproducing kernel Hilbert spaces induced by the conventional autocorrelation function and by the pre-designed kernel function. Although correntropy and centered correntropy have been applied to several different signal processing and machine learning problems, further theoretical analysis and experimental work are needed to fully elucidate the new concept and evaluate its properties. This dissertation strives to serve as one of these efforts.

This dissertation is organized as follows. In chapter 3, the definitions of the generalized correlation and covariance functions, called correntropy and centered correntropy respectively, are proposed and analyzed from time-domain and frequency-domain perspectives. Chapter 4 addresses the geometric structure of the reproducing kernel Hilbert space induced by the centered correntropy. A new explicit construction of the RKHS induced by the Gaussian kernel is presented. A parametric correntropy function is proposed in chapter 5 to quantify a dependence measure. An application of centered correntropy to principal component analysis is presented in chapter 6. I also apply correntropy to pitch determination in chapter 7 and to a nonlinear coupling measure in chapter 8. I conclude the work and present some future directions in chapter 9.









CHAPTER 2
AN RKHS FRAMEWORK FOR ITL

In this chapter, we propose a reproducing kernel Hilbert space (RKHS) framework for information-theoretic learning (ITL). The RKHS is uniquely determined by a symmetric non-negative definite kernel function defined as the cross information potential (CIP) in ITL. The cross information potential, as an integral of the product of two probability density functions, characterizes the similarity between two random variables. We also prove the existence of a one-to-one congruence mapping between the presented RKHS and the Hilbert space spanned by probability density functions. All the cost functions in the original information-theoretic learning formulation can be rewritten as algebraic computations on functional vectors in the reproducing kernel Hilbert space. We prove a lower bound for the information potential based on the presented RKHS. The proposed RKHS framework offers an elegant and insightful geometric perspective on information-theoretic learning.

From the definitions of the various cost functions in information-theoretic learning, we see that the most fundamental quantity is the integral of the product of two probability density functions (PDFs), ∫ f(x) g(x) dx, which is called the cross information potential (CIP) [55]. The cross information potential measures the similarity between two PDFs, while the information potential, Eq. (1-28), is nothing but a measure of self-similarity. The CIP appears in both the Euclidean and the Cauchy-Schwarz divergence measures. In this chapter, we develop the reproducing kernel Hilbert space framework for information-theoretic learning based on the cross information potential.

2.1 The RKHS based on ITL

The RKHS framework for ITL, based on the PDFs of the data, is proposed in this section. We first focus on the development for the one-dimensional case; the extension to multiple dimensions is straightforward. We form an L2 space consisting of all one-dimensional PDFs and define an inner product on it. Since the inner product is symmetric non-negative definite, it uniquely determines a reproducing kernel Hilbert space H_V. We then prove that the inner product itself is indeed a reproducing kernel in H_V.

2.1.1 The L2 Space of PDFs

Let E be the set that consists of all square-integrable one-dimensional probability density functions, i.e., f_i(x) ∈ E, ∀i ∈ I, where ∫ f_i(x)² dx < ∞ and I is an index set. We then form the linear manifold

Σ_i a_i f_i(x)    (2-1)

for any finite subset of indices in I and a_i ∈ R. Close the set in Eq. (2-1) topologically according to convergence in the mean, using the norm

||f_i(x) − f_j(x)|| = √( ∫ (f_i(x) − f_j(x))² dx ),  i, j ∈ I,    (2-2)

and denote the set of all linear combinations of PDFs and its limit points by L2(E). L2(E) is an L2 space on PDFs. Moreover, by the theory of quadratically integrable functions, we know that the linear space L2(E) forms a Hilbert space if an inner product is imposed accordingly. Given any two PDFs f_i(x) and f_j(x) in E, we can define an inner product as

⟨f_i(x), f_j(x)⟩ = ∫ f_i(x) f_j(x) dx,  ∀ i, j ∈ I.    (2-3)

Notice that this inner product is exactly the cross information potential [55], and it induces the norm of Eq. (2-2). Hence L2(E), equipped with the inner product Eq. (2-3), is a Hilbert space. However, it is not a reproducing kernel Hilbert space, because the inner product does not satisfy the reproducing property in L2(E). Next we show that the inner product Eq. (2-3) is symmetric non-negative definite, and that by the Moore-Aronszajn theorem it uniquely determines a reproducing kernel Hilbert space.










2.1.2 RKHS H_V Based on L2(E)

First, we define a bivariate function on the set E as

    V(f_i, f_j) = \int f_i(x) f_j(x) dx, \quad \forall i, j \in I.    (2-4)

This function is also the definition of the inner product Eq. (2-3), and the cross
information potential between two PDFs. It will be the kernel function in the RKHS
H_V constructed below. In reproducing kernel Hilbert space theory, the kernel function is
a measure of similarity between functionals. As pointed out earlier, the cross information
potential is a similarity measure between two probability density functions, hence it is
natural and meaningful to define the kernel function as such. Next, we show that the function
Eq. (2-4) is symmetric non-negative definite in E.

Property 1 (Non-Negative Definiteness): The function Eq. (2-4) is symmetric
non-negative definite: E x E -> R.

Proof: The symmetry is obvious. Given any positive integer N, any set of PDFs
{f_1(x), f_2(x), ..., f_N(x)} \in E and any real numbers {\alpha_1, \alpha_2, ..., \alpha_N}, not all zero, by
definition we have

    \sum_{i=1}^{N} \sum_{j=1}^{N} \alpha_i \alpha_j V(f_i, f_j) = \sum_{i=1}^{N} \sum_{j=1}^{N} \alpha_i \alpha_j \int f_i(x) f_j(x) dx = \int ( \sum_{i=1}^{N} \alpha_i f_i(x) )^2 dx \geq 0.

Hence, V(f_i, f_j) is symmetric non-negative definite, and it is also a kernel function.
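The proof above can be checked numerically: for any finite set of discretized PDFs, the quadratic form built from the Gram matrix of V(f_i, f_j) equals \int (\sum_i \alpha_i f_i(x))^2 dx and is therefore non-negative. A minimal sketch, in which the grid resolution and the three example densities are assumptions:

```python
# Three simple PDFs discretized on a common grid over [0, 1] (assumed examples).
GRID = 200
DX = 1.0 / GRID
xs = [(i + 0.5) * DX for i in range(GRID)]

def normalize(vals):
    z = sum(vals) * DX
    return [v / z for v in vals]

f1 = normalize([1.0 for x in xs])             # uniform density
f2 = normalize([x for x in xs])               # triangular density
f3 = normalize([(x - 0.5) ** 2 for x in xs])  # U-shaped density

def cip(p, q):
    """Kernel V(p, q) = int p(x) q(x) dx, evaluated by the rectangle rule."""
    return sum(a * b for a, b in zip(p, q)) * DX

pdfs = [f1, f2, f3]
V = [[cip(p, q) for q in pdfs] for p in pdfs]

def quadratic_form(alpha):
    """sum_ij alpha_i alpha_j V(f_i, f_j); equals int (sum_i alpha_i f_i)^2 dx >= 0."""
    return sum(alpha[i] * alpha[j] * V[i][j]
               for i in range(len(pdfs)) for j in range(len(pdfs)))
```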

According to the Moore-Aronszajn theorem, there is a unique reproducing kernel
Hilbert space, denoted by H_V, associated with the symmetric non-negative definite
function Eq. (2-4). We construct the RKHS H_V from the bottom up. Since the function Eq.
(2-4) is symmetric and non-negative definite, it also has an eigen-decomposition by
Mercer's theorem [13] as

    V(f_i, f_j) = \sum_{k=1}^{\infty} \lambda_k \psi_k(f_i) \psi_k(f_j),    (2-5)

where {\psi_k(f), k = 1, 2, ...} and {\lambda_k, k = 1, 2, ...} are the sequences of eigenfunctions and
corresponding eigenvalues of the kernel function V(f_i, f_j), respectively. The series above
converges absolutely and uniformly on E x E [13].
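Restricted to a finite set of PDFs, Mercer's expansion reduces to the eigen-decomposition of the finite Gram matrix. The sketch below works the two-PDF case in closed form and verifies that the matrix is reconstructed from non-negative eigenvalues; the toy Gram values are assumptions, not quantities from the dissertation.

```python
import math

def eig_sym_2x2(a, b, c):
    """Eigen-decomposition of the symmetric matrix [[a, b], [b, c]].

    Returns (eigenvalues, unit eigenvectors) so that the matrix equals
    sum_k lam[k] * outer(v[k], v[k]), a finite analogue of Eq. (2-5).
    """
    disc = math.sqrt((a - c) ** 2 + 4.0 * b * b)
    lam = [(a + c + disc) / 2.0, (a + c - disc) / 2.0]
    vecs = []
    for l in lam:
        if abs(b) > 1e-15:
            # Row 2 of (M - l*I) gives b*v1 + (c - l)*v2 = 0, so v = (l - c, b).
            v = (l - c, b)
        else:
            # Diagonal matrix: eigenvectors are the coordinate axes.
            v = (1.0, 0.0) if abs(a - l) <= abs(c - l) else (0.0, 1.0)
        n = math.hypot(v[0], v[1])
        vecs.append((v[0] / n, v[1] / n))
    return lam, vecs
```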

Then we define a space H_V consisting of all functionals F(·) whose evaluation at any
given PDF f_i(x) \in E is defined as

    F(f_i) = \sum_{k=1}^{\infty} a_k \lambda_k \psi_k(f_i),    (2-6)

where the sequence {a_k, k = 1, 2, ...} satisfies the condition

    \sum_{k=1}^{\infty} \lambda_k a_k^2 < \infty.    (2-7)

Furthermore, we define an inner product of two functionals in H_V as

    (F, G)_{H_V} = \sum_{k=1}^{\infty} \lambda_k a_k b_k,    (2-8)

where F and G are of the form Eq. (2-6), and the sequences a_k and b_k satisfy the
condition Eq. (2-7).

It can be verified that the space H_V equipped with the kernel function Eq. (2-4) is
indeed a reproducing kernel Hilbert space, and the kernel function V(f_i, ·) is a reproducing
kernel, because of the following two properties:

1. V(f_i, f_j), as a function of f_j(x), belongs to H_V for any given f_i(x) \in E, because we
can rewrite V(f_i, ·) in the form of Eq. (2-6) as

    V(f_i, f_j) = \sum_{k=1}^{\infty} \psi_k(f_i) \lambda_k \psi_k(f_j).

That is, the constants {b_k, k = 1, 2, ...} become the eigenfunctions {\psi_k(f_i), k =
1, 2, ...} in the definition Eq. (2-6). Therefore,

    V(f_i, ·) \in H_V, \quad \forall f_i(x) \in E.

2. Given any F \in H_V, the inner product between the reproducing kernel and F yields
the functional evaluation itself, by the definition Eq. (2-8):

    (F, V(f_i, ·))_{H_V} = \sum_{k=1}^{\infty} \lambda_k a_k \psi_k(f_i) = F(f_i).

This is the so-called reproducing property.

Therefore, H_V is a reproducing kernel Hilbert space with the kernel function and inner
product defined above.

By the reproducing property, we can rewrite the kernel function Eq. (2-5) as

    V(f_i, f_j) = (V(f_i, ·), V(f_j, ·))_{H_V}.    (2-9)

The reproducing kernel nonlinearly maps the original PDF f_i(x) into the RKHS H_V via
f_i -> V(f_i, ·).

We emphasize here that the reproducing kernel V(f_i, f_j) is data-dependent, by which
we mean that the norm of the nonlinearly transformed vector in the RKHS H_V depends on
the PDF of the original random variable, because

    ||V(f_i, ·)||^2 = (V(f_i, ·), V(f_i, ·))_{H_V} = \int f_i(x)^2 dx.

This is very different from the reproducing kernel \kappa(x, y) used in kernel-based learning
theory. The norm of the nonlinearly projected vector in the RKHS H_\kappa does not rely on the
statistical information of the original data, since

    ||\Phi(x)||^2 = (\Phi(x), \Phi(x)) = \kappa(x, x) = \kappa(0)

if we use a translation-invariant kernel function [61]. The value of \kappa(0) is a constant
regardless of the original data. Consequently, the reproducing kernel Hilbert spaces H_V
and H_\kappa, determined by V(f_i, f_j) and \kappa(x, y) respectively, are very different in nature.
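This contrast can be checked directly: the H_V norm \int f(x)^2 dx changes with the spread of the PDF, while a translation-invariant kernel assigns every point the same squared norm \kappa(0). A sketch with two assumed uniform densities:

```python
import math

def kappa(x, y, sigma=1.0):
    """Translation-invariant Gaussian kernel used in kernel-based learning."""
    return math.exp(-(x - y) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

def norm_sq_in_HV(pdf, lo, hi, n=2000):
    """||V(f, .)||^2 = int f(x)^2 dx, by the rectangle rule on [lo, hi]."""
    dx = (hi - lo) / n
    return sum(pdf(lo + (i + 0.5) * dx) ** 2 for i in range(n)) * dx

def wide(x):
    return 0.25 if 0.0 <= x <= 4.0 else 0.0   # uniform density on [0, 4]

def narrow(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0    # uniform density on [0, 1]
```

The narrow density has the larger H_V norm, while kappa(x, x) is identical for every sample point.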









2.1.3 Congruence Map Between H_V and L2(E)

We have presented two Hilbert spaces, the Hilbert space L2(E) of PDFs and the
reproducing kernel Hilbert space H_V. Even though their elements are very different, there
actually exists a one-to-one congruence mapping \Psi (an isometric isomorphism) from the
RKHS H_V onto L2(E) such that

    \Psi(V(f_i, ·)) = f_i(x).    (2-10)

Notice that the mapping \Psi preserves isometry between H_V and L2(E), since by the
definitions of the inner products Eq. (2-3) in L2(E) and Eq. (2-9) in H_V,

    (V(f_i, ·), V(f_j, ·))_{H_V} = (f_i(x), f_j(x))_{L2} = (\Psi(V(f_i, ·)), \Psi(V(f_j, ·)))_{L2}.

That is, the mapping \Psi maintains the inner products in both H_V and L2(E).
In order to obtain an explicit representation of \Psi, we define an orthogonal function
sequence {\xi_k(x), k = 1, 2, ...} satisfying

    \int \xi_k(x) \xi_m(x) dx = 0, \quad k \neq m,

and

    \int \xi_k(x)^2 dx = \lambda_k,    (2-11)

where \lambda_k and \psi_k(f) are the eigenvalue and eigenfunction associated with the kernel
function V(f_i, f_j) by Mercer's theorem Eq. (2-5). We achieve an orthogonal decomposition
of the probability density function as

    f_i(x) = \sum_{k=1}^{\infty} \psi_k(f_i) \xi_k(x), \quad \forall f_i(x) \in E.    (2-12)

The normality condition is fulfilled by the assumption Eq. (2-11).










Note that the congruence map \Psi can be characterized as the unique mapping from
H_V into L2(E) satisfying the condition that for every functional F in H_V and every j in I,

    \int \Psi(F)(x) f_j(x) dx = (F, V(f_j, ·))_{H_V} = F(f_j).    (2-13)

It is obvious that \Psi in Eq. (2-10) fulfills the condition Eq. (2-13). The congruence
map can then be represented explicitly as

    \Psi(F)(x) = \sum_{k=1}^{\infty} a_k \xi_k(x),    (2-14)

where a_k satisfies condition Eq. (2-7).

To prove that the representation Eq. (2-14) is a valid and unique map, we substitute Eq.
(2-12) and Eq. (2-14) into Eq. (2-13) and obtain

    \int \Psi(F)(x) f_j(x) dx = \int ( \sum_{k=1}^{\infty} a_k \xi_k(x) ) ( \sum_{m=1}^{\infty} \psi_m(f_j) \xi_m(x) ) dx = \sum_{k=1}^{\infty} \lambda_k a_k \psi_k(f_j) = F(f_j).

In summary, we have provided an explicit representation for the congruence map \Psi from
the RKHS H_V into L2(E). These two spaces are equivalent in a geometrical sense.
However, it should be emphasized that their constituting elements are very different in
nature. The RKHS isometry framework offers a natural link between stochastic and
deterministic functional analysis. Hence, it is more appealing to use the RKHS H_V for
information-theoretic learning, as we will show in the next section.

2.1.4 Extension to Multi-dimensional PDFs

Extension of H_V to multi-dimensional PDFs is straightforward, since the definitions
and derivations in the previous section can be easily adapted to multi-dimensional
probability density functions. Let E_m be the set that consists of all square integrable
m-dimensional probability density functions, i.e., f_{i,m}(x_1, ..., x_m) \in E_m, \forall i \in I and m \in N,
where \int f_{i,m}(x_1, ..., x_m)^2 dx_1 ... dx_m < \infty and I is the index set. We only need to
change the definition of the kernel function Eq. (2-4) to

    V(f_{i,m}, f_{j,m}) = \int f_{i,m}(x_1, ..., x_m) f_{j,m}(x_1, ..., x_m) dx_1 ... dx_m, \quad \forall i, j \in I.

All the definitions and derivations of the previous section can then be modified
accordingly. Let H_{V(m)} denote the reproducing kernel Hilbert space determined by
the kernel function for m-dimensional PDFs. The proposed RKHS framework is thus
consistent with the dimensionality of the PDFs.

2.2 ITL Cost Functions in RKHS Framework

In this section, we re-examine the ITL cost functions in the proposed RKHS
framework.

First, as the kernel function V(f, g) in H_V is defined as the cross information
potential between two PDFs, we immediately have

    \int f(x) g(x) dx = (V(f, ·), V(g, ·))_{H_V}.    (2-15)

That is, the cross information potential is the inner product between two nonlinearly
transformed functionals in the RKHS H_V. The inner product quantifies the similarity
between two functionals, which is consistent with the definition of the cross information
potential. The information potential can thus be specified as the inner product of a
functional with itself:

    \int f(x)^2 dx = (V(f, ·), V(f, ·))_{H_V} = ||V(f, ·)||^2.    (2-16)

The information potential appears as the squared norm of the nonlinearly transformed
functional in the RKHS H_V. Therefore, minimizing the error entropy in ITL turns out to
be maximization of a squared norm in the RKHS H_V (because the information potential is
the argument of the log in Renyi's quadratic entropy). As stated in ITL, MEE employs
higher-order statistics in nonlinear adaptive systems training since it is based on Renyi's
quadratic entropy [59]. We observe here that the higher-order statistics in MEE become
a squared norm in the RKHS H_V. However, the nonlinearly transformed functional V(f, ·)
is deterministic in H_V. Hence, the proposed RKHS framework provides a link between
stochastic and deterministic transformations. The conventional mean square error has
also been rewritten as the squared norm of projected vectors in the RKHS H_R induced by
the covariance function [25]. But the RKHS H_R only takes second-order statistics
into account, i.e., the mean square error. The RKHS H_V implicitly embeds higher-order
statistics. Compared to the RKHS H_\kappa induced by the pre-designed kernel function
used in machine learning, our framework is theoretically more elegant because it
corresponds to the definition of the information potential directly, without employing any
kernel-based PDF estimator. From the computational point of view, the estimate of the
information potential based on the Parzen window PDF estimator yields a direct calculation
of the information quantity from data. However, from the theoretical perspective, it is more
appropriate to define the RKHS framework based on the information potential itself
instead of its estimate using a kernel-based PDF estimator.

Based on the reformulations of the cross information potential Eq. (2-15) and the
information potential Eq. (2-16) in the RKHS H_V, we are ready to rewrite the
one-dimensional Euclidean Eq. (1-31) and Cauchy-Schwartz Eq. (1-32) divergence
measures in terms of operations on functionals in H_V. First,

    D_ED(f, g) = ||V(f, ·) - V(g, ·)||^2.

That is, the Euclidean divergence measure is in fact the squared norm of the difference
between two corresponding functionals in H_V. This interpretation resembles the
conventional definition of Euclidean distance more closely than the original description
Eq. (1-31) does. The Cauchy-Schwartz divergence measure can be phrased as

    D_CS(f, g) = -log [ (V(f, ·), V(g, ·))_{H_V} / ( ||V(f, ·)|| ||V(g, ·)|| ) ] = -log(cos \theta),

where \theta is the angle between the two functional vectors V(f, ·) and V(g, ·). Therefore, the
Cauchy-Schwartz divergence measure truly depicts the separation of two functional vectors
in the RKHS H_V. When the two vectors lie in the same direction and the angle \theta = 0°,
D_CS(f, g) = 0. If the two vectors are perpendicular to each other (\theta = 90°), D_CS(f, g) = \infty.
The RKHS H_V supplies rich geometric insight into the original definitions of the two
divergence measures.
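Both divergence measures follow from the single kernel V through the norm and angle interpretations. The sketch below estimates them from samples using a Gaussian-Parzen plug-in for the inner products; the sample sizes, bandwidth, and function names are illustrative assumptions.

```python
import math

def _g(u, s):
    return math.exp(-u * u / (2 * s * s)) / (math.sqrt(2 * math.pi) * s)

def cip(x, y, sigma=1.0):
    """(V(f,.), V(g,.)): Gaussian-Parzen plug-in cross information potential."""
    s2 = math.sqrt(2.0) * sigma
    return sum(_g(xi - yj, s2) for xi in x for yj in y) / (len(x) * len(y))

def euclidean_divergence(x, y, sigma=1.0):
    """D_ED(f, g) = ||V(f,.) - V(g,.)||^2, expanded into three inner products."""
    return cip(x, x, sigma) - 2.0 * cip(x, y, sigma) + cip(y, y, sigma)

def cauchy_schwartz_divergence(x, y, sigma=1.0):
    """D_CS(f, g) = -log(cos theta) for the angle between V(f,.) and V(g,.)."""
    cos_theta = cip(x, y, sigma) / math.sqrt(cip(x, x, sigma) * cip(y, y, sigma))
    return -math.log(cos_theta)
```

Identical sample sets give zero divergence, and both measures grow as the two sample sets separate.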

To extend the same formulation to the Euclidean and Cauchy-Schwartz quadratic
mutual information Eq. (1-33) and Eq. (1-34), consider the products of marginal PDFs
f_1(x_1) f_2(x_2) as a special subset A_2 of the set E_2 of 2-dimensional square integrable
PDFs, the subset in which the joint PDF can be factorized into the product of marginals,
i.e., A_2 \subset E_2. Then both measures characterize different geometric information
between the joint PDF and the factorized marginal PDFs. The Euclidean quadratic mutual
information Eq. (1-33) can be expressed as

    I_ED(f_1, f_2) = ||V(f_{1,2}, ·) - V(f_1 f_2, ·)||^2,

where V(f_{1,2}, ·) is the functional in H_{V(2)} corresponding to the joint PDF f_{1,2}(x_1, x_2),
and V(f_1 f_2, ·) is the functional for the product of the marginal PDFs f_1(x_1) f_2(x_2).
Similarly, the Cauchy-Schwartz quadratic mutual information can be rewritten as

    I_CS(f_1, f_2) = -log [ (V(f_{1,2}, ·), V(f_1 f_2, ·))_{H_V} / ( ||V(f_{1,2}, ·)|| ||V(f_1 f_2, ·)|| ) ] = -log(cos \gamma).    (2-17)

The angle \gamma is the separation between the two functional vectors in H_{V(2)}. When the two
random variables are independent, f_{1,2}(x_1, x_2) = f_1(x_1) f_2(x_2), so \gamma = 0° and the
divergence measure I_CS(f_1, f_2) = 0 since the two vectors coincide. If \gamma = 90°, the two
vectors in H_{V(2)} are orthogonal and the joint PDF is singular with respect to the product
of marginals. In this case, the divergence measure is infinite.

The proposed RKHS framework provides an elegant and insightful geometric
perspective on information-theoretic learning. All the cost functions in ITL can
now be re-expressed in terms of algebraic operations on functionals in the RKHS H_V.









2.3 A Lower Bound for the Information Potential

Based on the proposed RKHS framework for information-theoretic learning, we
derive a lower bound for the information potential Eq. (1-28) in this section. First we cite
the projection theorem in Hilbert space that we will use in the following proof.

Theorem 2 (Projection in Hilbert Space): Let H be a Hilbert space, M be a
Hilbert subspace of H spanned by N linearly independent vectors u_1, u_2, ..., u_N, s be a
vector in H, and d be the quantity

    d = inf ||s - u||, \quad \forall u \in M.

Then there exists a unique vector in M, denoted as P(s|M), such that

    P(s|M) = \sum_{i=1}^{N} \sum_{j=1}^{N} (s, u_i) K^{-1}(i, j) u_j,    (2-18)

where K is the N x N Gram matrix whose (i, j) entry is given by (u_i, u_j). The projected
vector P(s|M) also satisfies the following conditions:

    ||s - P(s|M)|| = d = min ||s - u||,    (2-19)

    (s - P(s|M), u_i) = 0, \quad \forall u_i \in M,

    (P(s|M), u_i) = (s, u_i), \quad \forall u_i \in M.    (2-20)

The geometrical explanation of the theorem is straightforward. Readers can refer
to [21] for a thorough proof. Now we state the proposition on a lower bound for the
information potential.

Proposition (Lower Bound for the Information Potential): Let V(f, ·) be a
vector in the RKHS H_V induced by the kernel V, and let M be a subspace of H_V spanned
by N linearly independent vectors V(g_1, ·), V(g_2, ·), ..., V(g_N, ·) \in H_V. Then

    \int f(x)^2 dx \geq \sum_{i=1}^{N} \sum_{j=1}^{N} V(f, g_i) G^{-1}(i, j) V(f, g_j),    (2-21)

where G is the N x N Gram matrix whose (i, j) entry is defined as (V(g_i, ·), V(g_j, ·))_{H_V} = V(g_i, g_j).

Proof: First, by the projection theorem Eq. (2-18), we can find the orthogonal projection
of V(f, ·) onto the subspace M as

    P(V(f, ·)|M) = \sum_{i=1}^{N} \sum_{j=1}^{N} (V(f, ·), V(g_i, ·))_{H_V} G^{-1}(i, j) V(g_j, ·).

Since the Gram matrix is symmetric and positive definite, the inverse always exists. Next,
we calculate the squared norm of the projected vector by Eq. (2-20),

    ||P(V(f, ·)|M)||^2 = (V(f, ·), P(V(f, ·)|M))_{H_V} = \sum_{i=1}^{N} \sum_{j=1}^{N} V(f, g_i) G^{-1}(i, j) V(f, g_j).

On the other hand, the projection residual defined in Eq. (2-19) satisfies

    d^2 = ||V(f, ·)||^2 - ||P(V(f, ·)|M)||^2 \geq 0.    (2-22)

Combining Eq. (2-22) with ||V(f, ·)||^2 = \int f(x)^2 dx from Eq. (2-16), we arrive at the
conclusion of the proposition, Eq. (2-21).

The proposition generalizes the Cramér-Rao inequality in statistical estimation
theory. It can also be viewed as an approximation to the functional norm by a set of
orthogonal bases. Equation (2-21) offers a theoretical lower bound for the minimization of
the information potential.
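The bound of Eq. (2-21) can be verified numerically by discretizing the integrals: with G the Gram matrix of the spanning PDFs and v_i = V(f, g_i), the quadratic form v^T G^{-1} v never exceeds \int f(x)^2 dx. A sketch with assumed example densities on [0, 1]:

```python
# Discretize PDFs on [0, 1]: f is the target, g1 and g2 span the subspace M.
GRID = 400
DX = 1.0 / GRID
xs = [(i + 0.5) * DX for i in range(GRID)]

def normalize(vals):
    z = sum(vals) * DX
    return [v / z for v in vals]

f  = normalize([x * (1.0 - x) for x in xs])   # bump-shaped target density
g1 = normalize([1.0 for x in xs])             # uniform
g2 = normalize([x for x in xs])               # triangular

def ip(p, q):
    """V(p, q) = int p(x) q(x) dx by the rectangle rule."""
    return sum(a * b for a, b in zip(p, q)) * DX

G = [[ip(g1, g1), ip(g1, g2)], [ip(g2, g1), ip(g2, g2)]]   # Gram matrix
v = [ip(f, g1), ip(f, g2)]                                 # cross terms V(f, g_i)

# Closed-form 2x2 inverse; g1 and g2 are linearly independent, so det != 0.
det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
Ginv = [[G[1][1] / det, -G[0][1] / det], [-G[1][0] / det, G[0][0] / det]]

lower_bound = sum(v[i] * Ginv[i][j] * v[j] for i in range(2) for j in range(2))
```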

2.4 Discussions

In this section, we relate our work to the concepts of information geometry and

probability product kernels.

2.4.1 Non-parametric vs. Parametric

The RKHS framework presented in this chapter elucidates the geometric structure of
the space of all probability density functions. Since it does not assume any model
for the PDFs, it is non-parametric and infinite-dimensional. In statistics, information
geometry studies the intrinsic geometry of a finite-dimensional, parametric statistical
manifold formed by all PDFs [69]. An extension to infinite-dimensional non-parametric
sub-manifolds has been advanced [70]. For finite-dimensional, parametric families of PDFs,
the only invariant metric on the tangent space is the Riemannian structure defined by the
Fisher information [69, 71],

    g_{ij}(\theta) = E[ (\partial log f(x; \theta) / \partial \theta_i) (\partial log f(x; \theta) / \partial \theta_j) ],

in component form. The Riemannian metric coincides infinitesimally with twice the
Kullback-Leibler divergence. More interestingly, the Fisher information is a symmetric
non-negative definite function defined on the parameter space. Therefore, it uniquely
determines a reproducing kernel Hilbert space. But it is very different from our approach,
because the proposed RKHS framework assumes non-parametric probability density
functions. The non-negative definite kernel function Eq. (2-4) is defined directly in the
PDF space, whereas the kernel function in information geometry is defined in the
parameter space, since it aims at estimating model parameters from data. Hence, the two
methodologies define non-parametric and parametric reproducing kernel Hilbert spaces,
respectively, to tackle problems of interest from different perspectives.

2.4.2 Kernel Function as a Dependence Measure

The kernel function we defined characterizes relationships between probability
density functions. For instance, the one-dimensional kernel function V(f, g) quantifies
how similar one PDF is to the other. The two-dimensional function V(f_{1,2}, f_1 f_2) specifies
the relationship between the joint PDF and the product of marginal PDFs; therefore it also
measures how dependent one random variable is on the other. The Cauchy-Schwartz
quadratic mutual information Eq. (2-17) was applied to independent component analysis
based on this interpretation [63]. Using probability distributions to measure similarity is
nothing new. One customary quantity is the Kullback-Leibler divergence. However, it is
neither positive definite nor symmetric, and hence does not have a reproducing kernel
Hilbert space associated with it. Therefore, the KL divergence lacks the geometric
advantage of an RKHS that our kernel function possesses.

Recently, several probability product kernels have been proposed in the machine learning
field to use ensembles instead of individual data points to capture dependence between
generative models [72]. It has been shown that the Bhattacharyya coefficient, defined by

    k(f, g) = \int \sqrt{f(x)} \sqrt{g(x)} dx,

is a reproducing kernel [73]. The expected likelihood kernel in [73] is exactly the cross
information potential. But because the probability product kernel was proposed purely from
a machine learning point of view, it was not related to a broader information-theoretic
framework. Our contribution in this chapter is to independently propose a reproducing
kernel Hilbert space framework for information-theoretic learning, construct the RKHS
from the bottom up, and prove its validity mathematically. Therefore the kernel function has
a rich information-theoretic interpretation. Moreover, as ITL is mainly applied to adaptive
signal processing, we employ non-parametric methods to compute the cross information
potential kernel and other quantities without an explicit probability density function
estimation. In contrast, a parametric generative model is assumed in order to calculate the
kernel in their approach [72, 73].

2.5 Conclusion

In this chapter, we presented a geometric structure for the information-theoretic
learning methodology. The proposed reproducing kernel Hilbert space framework is
determined by the symmetric non-negative definite kernel function, which is defined as
the cross information potential. The kernel function quantifies the similarity between two
transformed functionals in the RKHS H_V. We can rewrite all the cost functions in ITL in
terms of algebraic operations on the functionals in the proposed reproducing kernel Hilbert
space. These formulations offer a rich geometric interpretation of the original
information-theoretic learning algorithms. Compared to a previous RKHS framework based
on a pre-designed kernel function, ours is built directly on the probability density functions
and contains the statistical information of the data. Hence, the RKHS H_V provides an
elegant geometric structure intrinsic to the data statistics. Future work includes deriving a
least projection theorem in the RKHS H_V so that we may obtain a minimum information
potential estimator and related estimators directly from the functionals.









CHAPTER 3
CORRENTROPY AND CENTERED CORRENTROPY FUNCTIONS

3.1 Autocorrentropy and Crosscorrentropy Functions

A random process is completely specified by its n-dimensional joint probability
distribution functions for every positive integer n. But in reality, it is not practical,
or even possible, to determine all of its n-dimensional joint probability distribution
functions. Therefore, we often use only the 2-dimensional distribution functions to
partially characterize a given random process. In many engineering applications, the
2-dimensional distribution functions offer substantial information to tackle most
problems. The most important descriptors of the 2-dimensional distributions are the
mean, autocorrelation, and autocovariance functions. This way of partial (second-order)
characterization provides practical measurements of random processes and also suits
linear operations on random processes well [74].

But when an application must deal with nonlinear systems or non-Gaussian
signals, second-order statistics are not sufficient because they fail to capture the
nonlinearity and higher-order statistics intrinsic to the problems. There have been many
attempts to tackle nonlinear and non-Gaussian signal processing. For instance, the Wiener
and Hammerstein models are early methods proposed to implement optimal nonlinear
system identification [75]. In these models, a static nonlinearity chosen a priori is placed
in front of (or after) a linear time-invariant system, and the optimal solution is obtained
via the Wiener-Hopf equation [75]. Others include the Volterra series [76]. Recently,
Principe et al. applied information-theoretic quantities, such as entropy and mutual
information, to adaptive signal processing, an approach named information-theoretic
learning (ITL) [55]. But ITL methods lack the time-structure information of the random
processes. To incorporate both the time structure and the higher-order statistics, a new
generalized correlation function, called correntropy, was recently proposed by Santamaria
et al. [65]. This generalized correlation function can also be viewed as a correlation
function for the transformed random process in a high-dimensional reproducing kernel
Hilbert space, via the nonlinear mapping imposed by the kernel. Linear operations on the
transformed random process in the feature space become nonlinear operations on the
original random process in the input space by means of the inverse nonlinear mapping.
Therefore, this generalized correlation function also conveys the nonlinearity in the
random processes.

Definitions: Given a random process {x_t : t \in T}, with t typically denoting
time and T being an index set of interest, the generalized correlation function, named
autocorrentropy, is defined as

    V(t, s) = E[\kappa(x_t, x_s)],    (3-1)

and the generalized covariance function, named centered autocorrentropy, is defined as

    U(t, s) = E_{x_t, x_s}[\kappa(x_t, x_s)] - E_{x_t} E_{x_s}[\kappa(x_t, x_s)],    (3-2)

for each t and s in T, where E denotes the statistical expectation operator and \kappa(·, ·)
is a symmetric positive definite kernel function. Notice that the correntropy is the joint
expectation of \kappa(x_t, x_s), while the centered correntropy is the difference between the
joint expectation and the marginal expectations of \kappa(x_t, x_s).

In the literature, several symmetric positive definite kernel functions have been
proposed for machine learning, function approximation, density estimation, support
vector machines, and so on; examples include the sigmoidal kernel, Gaussian kernel,
polynomial kernel, and spline kernel, to name a few. The most commonly used kernel
function is the Gaussian kernel, given by

    \kappa(x_t, x_s) = \frac{1}{\sqrt{2\pi} \sigma} \exp( - \frac{(x_t - x_s)^2}{2\sigma^2} ).    (3-3)
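For a single realization, the ensemble average in Eq. (3-1) is commonly replaced by a time average over pairs at a fixed lag (anticipating the stationarity and ergodicity discussion later in this chapter). A sketch with the Gaussian kernel; the synthetic test signal and the function names are assumptions:

```python
import math

def gaussian_kernel(a, b, sigma=1.0):
    """Gaussian kernel of width sigma, as in Eq. (3-3)."""
    d = a - b
    return math.exp(-d * d / (2 * sigma * sigma)) / (math.sqrt(2 * math.pi) * sigma)

def autocorrentropy(x, lag, sigma=1.0):
    """Estimate V(lag) = E[kappa(x_t, x_{t+lag})] from one realization,
    replacing the ensemble average with a time average (assumes stationarity)."""
    n = len(x) - lag
    return sum(gaussian_kernel(x[i], x[i + lag], sigma) for i in range(n)) / n

# Illustration on a synthetic (assumed) signal: two incommensurate sinusoids.
signal = [math.sin(0.7 * i) + 0.3 * math.sin(2.3 * i) for i in range(2000)]
v0 = autocorrentropy(signal, 0)   # equals kappa(0) = 1/(sigma * sqrt(2*pi)) exactly
v5 = autocorrentropy(signal, 5)
```

At lag 0 every kernel term attains its maximum kappa(0), so V(0) bounds V(lag) for all other lags.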

By Mercer's theorem in Chapter 1, Eq. (1-14), any symmetric positive definite
function \kappa(x_t, x_s) can be rewritten as an inner product between two vectors in the
feature space, i.e.,

    \kappa(x_t, x_s) = (\Phi(x_t), \Phi(x_s)).    (3-4)

The nonlinear transformation \Phi maps the random process {x_t : t \in T} into another
random process {\Phi(x_t) : t \in T} in the high-dimensional feature space, which is a
reproducing kernel Hilbert space induced by the kernel according to the Moore-Aronszajn
theorem Eq. (1-2). Therefore, we can rewrite the autocorrentropy in terms of the nonlinear
mapping \Phi as

    V(t, s) = E[(\Phi(x_t), \Phi(x_s))].    (3-5)

Likewise, the centered autocorrentropy can also be expressed as

    U(t, s) = E[(\Phi(x_t) - E[\Phi(x_t)], \Phi(x_s) - E[\Phi(x_s)])].    (3-6)

It can be seen that the correntropy function is a "conventional" correlation function
for the transformed random process in the high-dimensional RKHS, while the centered
correntropy is nothing but the correntropy of the zero-mean (centered) random process
{\Phi(x_t) - E[\Phi(x_t)] : t \in T}. This way of defining the generalized correlation and
covariance functions is in the same spirit as the standard correlation and covariance
functions.

Correntropy can be applied either to one time series, giving the autocorrentropy
as defined above, or to a pair of (possibly multidimensional) random variables, giving
the crosscorrentropy. The definitions of the crosscorrentropy and centered
crosscorrentropy functions are straightforward; they compute the generalized correlation
across the space structure instead of the time structure.

Definition: Given two random variables x and y, the crosscorrentropy function is
defined as

    V(x, y) = E[\kappa(x, y)] = E[(\Phi(x), \Phi(y))],    (3-7)

and the centered crosscorrentropy is defined as

    U(x, y) = E_{x,y}[\kappa(x, y)] - E_x E_y[\kappa(x, y)]
            = E[(\Phi(x) - E[\Phi(x)], \Phi(y) - E[\Phi(y)])]
            = \int \int \kappa(x, y) ( f_{x,y}(x, y) - f_x(x) f_y(y) ) dx dy,    (3-8)

where E denotes the statistical expectation operator, \kappa(·, ·) is a symmetric positive
definite kernel function, and \Phi is the nonlinear transformation associated with the kernel
\kappa(·, ·) Eq. (1-27). Notice that the crosscorrentropy is the joint expectation of \kappa(x, y),
while the centered crosscorrentropy is the difference between the joint expectation and the
marginal expectations of \kappa(x, y).

The correntropy and centered correntropy functions for random variables share the
same properties as those for random processes, since they can be viewed as one instance
of the correntropy and centered correntropy functions for random processes. Without
ambiguity, we will refer to both the autocorrentropy and the crosscorrentropy as
correntropy functions in what follows.
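From paired samples {(x_i, y_i)}, the joint expectation in Eq. (3-8) can be estimated by the sample mean over pairs, and the product of marginal expectations by the full double sum. The sketch below implements this plug-in estimator; the data generation and the 0.1 noise level are illustrative assumptions.

```python
import math, random

def kappa(a, b, sigma=1.0):
    d = a - b
    return math.exp(-d * d / (2 * sigma * sigma)) / (math.sqrt(2 * math.pi) * sigma)

def centered_crosscorrentropy(x, y, sigma=1.0):
    """U(x, y): sample joint expectation of kappa minus the sample
    product-of-marginals expectation, from paired samples (x_i, y_i)."""
    n = len(x)
    joint = sum(kappa(xi, yi, sigma) for xi, yi in zip(x, y)) / n
    marginal = sum(kappa(xi, yj, sigma) for xi in x for yj in y) / (n * n)
    return joint - marginal

# Illustration on synthetic (assumed) data: a dependent and an independent pair.
random.seed(2)
x = [random.gauss(0.0, 1.0) for _ in range(400)]
y_dep = [xi + 0.1 * random.gauss(0.0, 1.0) for xi in x]
y_ind = [random.gauss(0.0, 1.0) for _ in range(400)]
u_dep = centered_crosscorrentropy(x, y_dep)
u_ind = centered_crosscorrentropy(x, y_ind)
```

Strong dependence drives the estimate well above zero, while independent samples yield a value near zero, consistent with Eq. (3-8) vanishing when the joint PDF factorizes.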

If the random process {x_t : t \in T} is pair-wise independent, in other words,

    f_{x_t, x_s}(x_t, x_s) = f_{x_t}(x_t) f_{x_s}(x_s), \quad \forall t \neq s \in T,    (3-9)

where f_{x_t, x_s}(x_t, x_s) is the joint probability density function (PDF) and f_{x_t}(x_t),
f_{x_s}(x_s) are the marginal PDFs, then the correntropy at t \neq s becomes

    V(t, s) = E[(\Phi(x_t), \Phi(x_s))] = (E[\Phi(x_t)], E[\Phi(x_s)]) \approx \frac{1}{N^2} \sum_{i=1}^{N} \sum_{j=1}^{N} \kappa(x_i, x_j).    (3-10)

We have used the independence property and the sample mean to estimate the statistical
expectation in the computation above, where {x_1, x_2, ..., x_N} is one realization of the
random process. The quantity Eq. (3-10) is called the information potential and corresponds
to the argument of the logarithm of quadratic Renyi's entropy when a Parzen window
estimator is used [55]. Hence the generalized correlation function is called correntropy.

Under this pair-wise independence condition, the centered correntropy reduces to zero,
since it is defined as the difference between the correntropy and the information potential.
For conventional second-order random processes, only pair-wise uncorrelatedness is
required to zero the covariance function. The condition of pair-wise independence is much
stronger than pair-wise uncorrelatedness; this shows that the centered correntropy requires
higher-order statistical information in order to attain zero. This can also be seen from the
following observation.

Applying the Taylor series expansion to the Gaussian kernel, we can rewrite the
correntropy function as

    V(t, s) = \frac{1}{\sqrt{2\pi} \sigma} \sum_{k=0}^{\infty} \frac{(-1)^k}{2^k \sigma^{2k} k!} E[(x_t - x_s)^{2k}],    (3-11)

which contains all the even-order moments of the random variable (x_t - x_s). Obviously,
different kernel functions would yield different expansions, but all the kernel functions
mentioned above involve higher-order statistical information about the random processes.
Therefore, the correntropy and centered correntropy partially characterize the higher-order
statistics of random processes. The correntropy and centered correntropy satisfy the
following properties.

Property 1: The correntropy is positive definite and the centered correntropy is
positive semi-definite.

Given a positive definite kernel function \kappa(·, ·), for any positive integer n, any points
t_1, ..., t_n in T, and any real numbers \alpha_1, ..., \alpha_n, not all zero, by definition we have

    \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j \kappa(x_{t_i}, x_{t_j}) > 0.

The expectation of any positive definite function is always positive definite. Thus we have

    E[ \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j \kappa(x_{t_i}, x_{t_j}) ] > 0,

which equals

    \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j E[\kappa(x_{t_i}, x_{t_j})] = \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j V(t_i, t_j) > 0.    (3-12)

Therefore the correntropy is positive definite. Similarly, for the centered correntropy,

    \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j U(t_i, t_j) = E[ ( \sum_{i=1}^{n} \alpha_i (\Phi(x_{t_i}) - E[\Phi(x_{t_i})]), \sum_{j=1}^{n} \alpha_j (\Phi(x_{t_j}) - E[\Phi(x_{t_j})]) ) ] \geq 0.    (3-13)

Hence the centered correntropy is positive semi-definite.

The positive definiteness properties of the correntropy and centered correntropy are
the most fundamental properties of these two functions, because all the following
properties can be deduced from positive definiteness and, more importantly, it guarantees
that the correntropy and centered correntropy uniquely determine two reproducing kernel
Hilbert spaces. This positive definiteness property opens to these two generalized functions
a wide range of potential applications in statistical signal processing.

Property 2: V(t, s) and U(t, s) are symmetric: V(t, s) = V(s, t), and U(t, s) =
U(s, t).

This is a direct consequence of the symmetric kernel function used in the definitions of
the correntropy and centered correntropy functions.

Property 3: V(t, t) > 0 and U(t, t) \geq 0.

Since \kappa(x_t, x_t) > 0 by the positive definiteness of the kernel function, accordingly
V(t, t) > 0. For U(t, t), let \alpha_1 = 1 and n = 1 in Eq. (3-13), and the result follows.

Property 4: |V(t, s)| \leq \sqrt{V(t, t) V(s, s)} and |U(t, s)| \leq \sqrt{U(t, t) U(s, s)}.

Let n = 2 in Eq. (3-12) and Eq. (3-13); the two expressions reduce to

    \alpha_1^2 V(t, t) + \alpha_2^2 V(s, s) \geq 2 \alpha_1 \alpha_2 |V(t, s)|,    (3-14)

    \alpha_1^2 U(t, t) + \alpha_2^2 U(s, s) \geq 2 \alpha_1 \alpha_2 |U(t, s)|.    (3-15)

We can substitute

    \alpha_1^2 = \sqrt{V(s, s) / V(t, t)}, \quad \alpha_2^2 = \sqrt{V(t, t) / V(s, s)}

and

    \alpha_1^2 = \sqrt{U(s, s) / U(t, t)}, \quad \alpha_2^2 = \sqrt{U(t, t) / U(s, s)}

into Eq. (3-14) and Eq. (3-15) respectively to obtain the properties above.

These properties are very similar to those of the conventional correlation and covariance
functions, but the correntropy and centered correntropy functions partially provide the
higher-order statistics of random processes, while the conventional correlation and covariance
functions offer only second-order statistical information.
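Properties 3 and 4 survive in plug-in estimates as well: with the Gaussian kernel, every term \kappa(a, b) is bounded by \kappa(0), so the sample versions of V(t, t), U(t, t), and the Cauchy-Schwartz bound behave as stated. A sketch on an assumed pair of correlated sinusoidal signals:

```python
import math

def kappa(a, b, sigma=1.0):
    d = a - b
    return math.exp(-d * d / (2 * sigma * sigma)) / (math.sqrt(2 * math.pi) * sigma)

def V_hat(a, b, sigma=1.0):
    """Sample correntropy between paired samples of x_t and x_s."""
    return sum(kappa(ai, bi, sigma) for ai, bi in zip(a, b)) / len(a)

def U_hat(a, b, sigma=1.0):
    """Sample centered correntropy: joint average minus product-of-marginals average."""
    n = len(a)
    return V_hat(a, b, sigma) - sum(kappa(ai, bj, sigma) for ai in a for bj in b) / (n * n)

# Correlated pair of signals (assumed example): a sinusoid and a phase-shifted copy.
xt = [math.sin(0.9 * i) for i in range(300)]
xs = [math.sin(0.9 * i + 0.3) for i in range(300)]
```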

A random process {x_t : t \in T} is said to be strictly stationary if the finite-dimensional
joint probability density function is shift-invariant for each n and each choice of t_1, ..., t_n
in T,

    f_n(x_1, ..., x_n; t_1, ..., t_n) = f_n(x_1, ..., x_n; t_1 + t_0, ..., t_n + t_0),    (3-16)

for all t_0 in T. For a strictly stationary random process, the correntropy and centered
correntropy functions satisfy the following properties,

    E[\Phi(x_t)] = \mu_\Phi(0),    (3-17)

    V(t + \tau, t) = V(\tau, 0),    (3-18)

and

    U(t + \tau, t) = U(\tau, 0),    (3-19)

for all t and \tau in T. We denote \mu_\Phi(0), V(\tau, 0) and U(\tau, 0) by \mu_\Phi, V(\tau) and
U(\tau), respectively, whenever Eq. (3-17) through Eq. (3-19) hold for all t and \tau. It may
happen that the functions \mu_\Phi, V(\tau) and U(\tau) are shift-invariant in the sense of Eq.
(3-17) through Eq. (3-19), yet the random process {x_t : t \in T} is not strictly stationary.
Since those equations represent properties of the random process that are of interest in
their own right, we can define two forms of stationarity that are, in general, much weaker
than the strict stationarity defined in Eq. (3-16), but much stronger than the conventional
wide-sense stationarity and covariance stationarity, since by Eq. (3-11) all the even-order
moments must be time-shift invariant in order to obtain a univariate correntropy function.

Definitions: A random process is said to be correntropy-sense stationary (CSS) if Eq. (3-17) and Eq. (3-18) hold for all t and τ; the process is said to be centered correntropy stationary if Eq. (3-19) holds for all t and τ.

Suppose t = s + τ in Eq. (3-6); then the centered correntropy becomes

U(s + τ, s) = V(s + τ, s) − ⟨E[Φ(x_{s+τ})], E[Φ(x_s)]⟩. (3-20)

The right-hand side does not depend on s for a correntropy-sense stationary process. Consequently, U(s + τ, s) does not depend on s. Therefore we have the following important fact:

A correntropy-sense stationary process is also centered correntropy stationary.










However, a centered correntropy stationary process is not necessarily correntropy-sense stationary. For a correntropy-sense stationary random process, properties 2, 3 and 4 become

V(τ) = V(−τ), (3-21)

V(0) > 0, (3-22)

|V(τ)| ≤ V(0). (3-23)




The expectations, or ensemble averages, of a random process are averages across the realizations of the process. In practice, we do not have infinite realizations of random processes. Accordingly, we might use time averages to approximate the ensemble averages. In order to be rigorous in using this approach, we have to show that the time averages converge to the corresponding ensemble averages in some statistical sense. Consider a correntropy-sense stationary discrete-time random process {x_n : n ∈ Z+}. Though the mean of the nonlinearly transformed random process {Φ(x_n) : n ∈ Z+} is not required in applications, it is of interest to investigate the relationship between the ensemble average and the time average of the mean. In this regard, denote the mean of the process {Φ(x_n) : n ∈ Z+}, E[Φ(x_n)], by μ and define the time average to the mean as

μ̂(N) = (1/N) Σ_{n=0}^{N−1} Φ(x_n), (3-24)

where N is the number of available samples used in the estimation. The estimator Eq. (3-24) is an unbiased estimator of the ensemble average of the process since

E[μ̂(N)] = μ for all N. (3-25)


Furthermore, the process is said to be mean-norm ergodic in the mean-norm-square-error sense if the mean-norm-square value of the error between the ensemble average μ and the time average μ̂(N) approaches zero as the number of samples N tends to infinity,

lim_{N→∞} E[‖μ − μ̂(N)‖²] = 0. (3-26)

Substituting Eq. (3-24) into Eq. (3-26), we can rewrite

E[‖μ − μ̂(N)‖²] = E[‖μ − (1/N) Σ_{n=0}^{N−1} Φ(x_n)‖²]

= (1/N²) Σ_{n=0}^{N−1} Σ_{k=0}^{N−1} E[⟨Φ(x_n) − μ, Φ(x_k) − μ⟩]

= (1/N²) Σ_{n=0}^{N−1} Σ_{k=0}^{N−1} U(n − k), (3-27)

where U(n − k) is the centered correntropy at time lag n − k for the correntropy-sense stationary process. Letting m = n − k, the double summation in Eq. (3-27) can be simplified as follows:

E[‖μ − μ̂(N)‖²] = (1/N) Σ_{m=−N+1}^{N−1} (1 − |m|/N) U(m). (3-28)

Hence, the necessary and sufficient condition for the process {x_n : n ∈ Z+} to be mean-norm ergodic in the mean-norm-square-error sense is that

lim_{N→∞} (1/N) Σ_{m=−N+1}^{N−1} (1 − |m|/N) U(m) = 0. (3-29)

The time average μ̂(N) of the process converges to the ensemble average μ in the mean-norm-square-error sense if the process {x_n : n ∈ Z+} is asymptotically correntropy-sense uncorrelated in the sense of Eq. (3-29).










The correntropy and centered correntropy can be estimated using time averages for a discrete-time correntropy-sense stationary random process as follows:

V̂(m) = (1/(N − m + 1)) Σ_{n=m}^{N} κ(x_n, x_{n−m}) (3-30)

and

Û(m) = (1/(N − m + 1)) Σ_{n=m}^{N} κ(x_n, x_{n−m}) − (1/N²) Σ_{n=0}^{N−1} Σ_{l=0}^{N−1} κ(x_n, x_l), (3-31)

where (1/N²) Σ_{n=0}^{N−1} Σ_{l=0}^{N−1} κ(x_n, x_l) is the estimate of the information potential, which is a constant depending on the signal and the specific kernel function used.
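The estimators in Eq. (3-30) and Eq. (3-31) are straightforward to implement. The following is a minimal sketch with a Gaussian kernel; the function names and the default kernel width are my own choices, not part of the text.

```python
import numpy as np

def gaussian_kernel(u, sigma=1.0):
    # Gaussian kernel evaluated on the difference u = x_n - x_{n-m}
    return np.exp(-u**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

def correntropy(x, lag, sigma=1.0):
    # Time-average estimator of V(m), Eq. (3-30)
    x = np.asarray(x, dtype=float)
    return gaussian_kernel(x[lag:] - x[:len(x) - lag], sigma).mean()

def information_potential(x, sigma=1.0):
    # (1/N^2) sum_n sum_l kappa(x_n, x_l), the constant subtracted in Eq. (3-31)
    x = np.asarray(x, dtype=float)
    return gaussian_kernel(x[:, None] - x[None, :], sigma).mean()

def centered_correntropy(x, lag, sigma=1.0):
    # U(m) = V(m) - information potential, Eq. (3-31)
    return correntropy(x, lag, sigma) - information_potential(x, sigma)
```

For i.i.d. unit-variance Gaussian data with unit kernel width, V̂ at any non-zero lag should settle near the theoretical information potential 1/√(6π) ≈ 0.23, and the centered correntropy at non-zero lags should be near zero.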

In what follows, we conduct several simulations to demonstrate some of the properties of correntropy and centered correntropy. First, we would like to illustrate the relation between the correntropy and centered correntropy functions. Three sets of 100000 i.i.d. data samples, Gaussian, exponential, and Gamma distributed respectively, were generated. The Gaussian source is set to be zero mean and unit variance. The parameter of the exponential distribution is set to 20. The parameters of the Gamma distribution are set to 0.2 and 1 respectively. Then we pass these data through an IIR filter with transfer function

H(z) = (1 + 0.2 z^{-1}) / (1 − 1.5 z^{-1} + 0.8 z^{-2}). (3-32)

The i.i.d. signal and filtered signal are both normalized to zero mean and unit variance.
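As a sketch, the filter of Eq. (3-32) can be realized directly from its difference equation; `iir_filter` and `normalize` are hypothetical helper names introduced here for illustration.

```python
import numpy as np

def iir_filter(x):
    # Direct-form implementation of H(z) = (1 + 0.2 z^-1) / (1 - 1.5 z^-1 + 0.8 z^-2):
    # y[n] = x[n] + 0.2 x[n-1] + 1.5 y[n-1] - 0.8 y[n-2]
    y = np.zeros(len(x))
    for n in range(len(x)):
        y[n] = x[n]
        if n >= 1:
            y[n] += 0.2 * x[n - 1] + 1.5 * y[n - 1]
        if n >= 2:
            y[n] -= 0.8 * y[n - 2]
    return y

def normalize(s):
    # zero mean, unit variance, as done for both the i.i.d. and filtered signals
    return (s - s.mean()) / s.std()
```

The poles of H(z) sit at the roots of z² − 1.5z + 0.8, whose squared magnitude is 0.8, so the filter is stable.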

A Gaussian kernel with unit kernel width is used to estimate the correntropy and centered correntropy. In Fig. 3-1, we plot the correntropy [65] and centered correntropy functions for the original and filtered signals respectively. As proved in Eq. (3-10), under the pair-wise independence assumption, the value of correntropy at non-zero lag is the information potential, and the centered correntropy at non-zero lag reduces to zero since it is defined as the difference between correntropy and information potential. The left plot clearly illustrates the point. The estimated information potentials are approximately 0.23, 0.26098 and 0.31452 for the Gaussian, exponential and Gamma distributed signals respectively. Likewise,


































Figure 3-1. Correntropy and centered correntropy for i.i.d. and filtered signals versus the
time lag


the right plot shows the correntropy and centered correntropy for the filtered signals. We

will only plot the correntropy function in the following simulations since the difference

between correntropy and centered correntropy is only the information potential which is a

constant depending on the kernel and signal (Eq. (1-31)).

Our second simulation demonstrates the effectiveness of correntropy and centered correntropy in the partial characterization of the higher-order statistics of random processes. We compare the conventional autocorrelation and correntropy functions for two data sets. The first data set is 100000 i.i.d. samples of a Gaussian distribution with zero mean and unit variance. The second data set is generated by an ARCH (autoregressive conditional heteroskedasticity) model, which is used in econometrics to predict asset return volatility [77]. The ARCH time series is uncorrelated (second-order statistics) but not i.i.d. The time series model is












Figure 3-2. Autocorrelation and correntropy for i.i.d. and ARCH series versus the time lag



defined as

x_n = e_n √(α_0 + Σ_{i=1}^{r} α_i x_{n−i}²), (3-33)

where {e_n} is a white noise stochastic process and α_0 > 0, α_i ≥ 0 for all i = 1, ..., r. We choose {e_n} to be i.i.d. Gaussian with zero mean and unit variance, r = 1, α_0 = 0.2 and α_1 = 1. A Gaussian kernel with unit kernel width is used to estimate

the correntropy. In Fig. 3-2, the autocorrelation functions for the i.i.d. Gaussian data and the uncorrelated but dependent ARCH time series are given in the top two plots. It is expected that the conventional autocorrelation functions for an i.i.d. signal and an uncorrelated signal are the same, since the autocorrelation function only specifies the second-order statistics. The bottom two plots show the correntropy functions for the same i.i.d. and uncorrelated signals respectively. As has been pointed out previously, the correntropy partially characterizes the higher-order statistics. Accordingly, unlike the conventional correlation function, the correntropy is different for i.i.d. and uncorrelated signals.

We define a significance region to quantify the significant non-zero lag values. The maximum value of the significance region is the correntropy value at zero lag, which is 0.3989 in this specific simulation; the minimum value of the region is 10 percent of the difference between the zero-lag correntropy and the estimate of the information potential, above the estimate of the information potential, which is 0.3140. If any non-zero-lag correntropy value falls into this region, we call it significant, which indicates that there exists a "correlation" at that time lag. In this simulation, the correntropy values at time lags 1 and 2 are 0.3232 and 0.3156 respectively, which fall into the significance region. This result clearly suggests that there exists a "significant correlation" for this ARCH time series under the correntropy quantification. Therefore correntropy specifies more information than the second-order statistics captured by the conventional autocorrelation function.
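A minimal sketch of generating the ARCH series of Eq. (3-33) with the parameters used here (α₀ = 0.2, α₁ = 1, Gaussian innovations); the helper names and the seed are assumptions of this sketch.

```python
import numpy as np

def arch1(n, alpha0=0.2, alpha1=1.0, seed=0):
    # x_n = e_n * sqrt(alpha0 + alpha1 * x_{n-1}^2), with e_n i.i.d. N(0, 1)
    rng = np.random.default_rng(seed)
    e = rng.standard_normal(n)
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = e[i] * np.sqrt(alpha0 + alpha1 * x[i - 1] ** 2)
    return x

def autocorr(x, lag):
    # sample autocorrelation at a given lag (biased normalization)
    x = (x - x.mean()) / x.std()
    return np.mean(x[lag:] * x[:len(x) - lag])
```

The series is serially uncorrelated by construction, yet x_n² depends on x_{n−1}², which is exactly the kind of dependence that correntropy can register and autocorrelation cannot.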

The third simulation investigates the effectiveness of correntropy in capturing the PDF and nonlinearity information of random processes. We generate 3 sets of 100000 i.i.d. data samples, Gaussian, exponential, and Gamma distributed respectively, to test correntropy's ability to characterize the PDF information of random processes. The Gaussian source is set to be zero mean and unit variance. The parameter of the exponential distribution is set to 20. The parameters of the Gamma distribution are set to 0.2 and 1 respectively. Then the data sets are passed through the IIR filter of Eq. (3-32). A Gaussian kernel with unit kernel width is used to estimate the correntropy. In Fig. 3-3 (a-d), we plot the conventional autocorrelation function and correntropy function for the original and filtered signals respectively. The conventional autocorrelation function only represents the time structure of random processes and contains no information about the PDF of the random variables. Consequently, the autocorrelation functions for random processes with different PDFs are the same. This is demonstrated in plots (a) and (c) for i.i.d. data of different distributions and for the filtered signals. However,






















Figure 3-3. Autocorrelation and correntropy for i.i.d. and linearly filtered signals and
Lorenz dynamic system versus the time lag










the correntropy function not only captures the time structure of random processes, but also reflects the PDFs of the random variables. The differences among the random processes with different PDFs can be easily seen in plot (b). The correntropy values at non-zero lags are the estimates of the information potential for the different distributions, Eq. (1-31), which are certainly different for different PDFs. Correntropy also contains the time structure of signals. Comparing plots (c) and (d), we can see that the shape of correntropy is very similar to that of the autocorrelation function. Moreover, the correntropy functions are different for the 3 different sources. The reason that the separation of the correntropy functions for the 3 filtered signals is not as obvious as in the i.i.d. case (plot (b)) is that the distributions of the 3 filtered signals become more similar. In plots (e) and (f), we demonstrate the difference between the conventional correlation function and correntropy in capturing the nonlinearity intrinsic to systems. The Lorenz dynamic system is used here for illustration. The system function of the Lorenz time series is given by
given by


ẋ = σ(y − x),

ẏ = −xz + Rx − y,

ż = xy − bz, (3-34)

where R = 28, σ = 10 and b = 8/3. This set of parameters makes the Lorenz dynamic system exhibit chaotic behavior. 10000 samples of the Lorenz dynamic system are generated by solving the equations with the 4th-order Runge-Kutta method with integration step 0.01.
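The generation procedure can be sketched as follows; the initial condition is an assumption, since the text does not state one.

```python
import numpy as np

def lorenz_rk4(n_samples, dt=0.01, R=28.0, sigma=10.0, b=8.0 / 3.0):
    # Integrate the Lorenz equations (3-34) with the classical 4th-order Runge-Kutta method.
    def f(s):
        x, y, z = s
        return np.array([sigma * (y - x), -x * z + R * x - y, x * y - b * z])

    s = np.array([1.0, 1.0, 1.0])      # assumed initial condition
    out = np.empty((n_samples, 3))
    for i in range(n_samples):
        k1 = f(s)
        k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2)
        k4 = f(s + dt * k3)
        s = s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
        out[i] = s
    # normalize each component to zero mean and unit variance, as in the text
    return (out - out.mean(axis=0)) / out.std(axis=0)
```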

Afterwards, each component is normalized to zero mean and unit variance. In Fig. 3-3 (e-f), we plot the conventional autocorrelation function and correntropy for the 3 components of the dynamic system. Observe that the correntropy functions for the 3 state variables follow the same trend. They peak and drop around the same time lags. The dynamic equations show that x is nonlinearly coupled into y and z. The periodic similarities in one state variable affect the other states. This is clearly demonstrated in the correntropy









functions. However, the conventional autocorrelation functions do not exhibit any periodic similarities among the three states. This simulation suggests that correntropy is able to extract the nonlinear coupling information embedded in the time structure of the time series while the conventional autocorrelation fails.

In the previous simulations, we used a Gaussian kernel with the same kernel width throughout. In fact, the kernel function and kernel width play a crucial role in the correntropy and centered correntropy functions. Obviously, different kernel widths result in different estimates of the information potential. Moreover, the kernel width controls the ability of correntropy to capture the nonlinearity embedded in systems. From the RKHS perspective, different kernel functions determine different reproducing kernel Hilbert spaces, and the kernel width defines the norm and inner product of the RKHS. We will explore this issue further in the next chapter. Here we first present an experiment on different kernel widths for the Gaussian kernel function. The same data sets of linear i.i.d. signals and the Lorenz time series from the previous example are used here. In Fig. 3-4, we plot the correntropy functions for the i.i.d. and Lorenz time series using Gaussian kernels with kernel widths ranging from 0.01 to 15. It can be seen from the top two plots that correntropy loses the ability to differentiate time series of different PDFs and to detect the nonlinearity embedded in the Lorenz time series when the kernel width is too small (σ = 0.01 and 0.1 in this experiment). If the kernel width is too big (σ = 5 and 15 in this experiment), the correntropy functions approach the conventional autocorrelation. The shapes of correntropy in the bottom two plots are very similar to those of Fig. 3-3 (a) and (e). This can also be verified by the Taylor series expansion of the correntropy with the Gaussian kernel in Eq. (3-11): if the kernel width is too large, the values of the higher-order terms decay rapidly, and therefore correntropy approaches the conventional autocorrelation. The middle two plots give the correntropy functions for an appropriate kernel width. It can be seen that correntropy can nicely separate the i.i.d. signals of different PDFs and detect the embedded nonlinear coupling in the Lorenz time series.













Figure 3-4. Correntropy for i.i.d. signal and Lorenz time series with different kernel width


To further quantify the relationship between the kernel width and the ability of correntropy to separate i.i.d. signals of different PDFs, we define a separation coefficient S(σ) as a function of the kernel width σ among the signals of different PDFs by

S(σ) = Σ_{i=1}^{L} Σ_{j=i+1}^{L} |IP(i) − IP(j)| / V_σ(0), (3-35)

where L is the number of i.i.d. signals, IP(i) is the estimate of the information potential of the ith signal (Eq. (1-31)), and V_σ(0) is the correntropy at zero lag. The correntropy values at zero lag for different i.i.d. signals are the same since the Gaussian kernel is isotropic. The separation coefficient basically computes the sum of the normalized distances between the information potentials of the different i.i.d. signals. Fig. 3-5 plots the separation coefficient as a function of the kernel width σ. It can be easily seen that the best separation coefficient is achieved at σ = 1.
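Reading Eq. (3-35) as the sum of pairwise information-potential distances normalized by V_σ(0), a sketch follows; the exact form of S(σ) is reconstructed from the description above, so treat it as an assumption.

```python
import numpy as np

def gaussian_kernel(u, sigma):
    return np.exp(-u**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

def information_potential(x, sigma):
    # (1/N^2) sum over all sample pairs of the kernel, Eq. (1-31)
    return gaussian_kernel(x[:, None] - x[None, :], sigma).mean()

def separation_coefficient(signals, sigma):
    # Sum of pairwise distances between information potentials, normalized by
    # the common zero-lag correntropy V_sigma(0) = kappa_sigma(0).
    ips = [information_potential(np.asarray(s, dtype=float), sigma) for s in signals]
    v0 = 1.0 / (np.sqrt(2.0 * np.pi) * sigma)
    return sum(abs(ips[i] - ips[j])
               for i in range(len(ips)) for j in range(i + 1, len(ips))) / v0
```

Sweeping σ over a grid and plotting `separation_coefficient` reproduces the qualitative shape of Fig. 3-5: a peak at moderate widths and decay as σ grows.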



















Figure 3-5. Separation coefficient versus kernel width for Gaussian kernel


Next, we investigate the effect of different kernel functions used in the correntropy functions. The same data sets are used again, and we compare six different kernel functions: Gaussian, polynomial, sigmoid, wave, exponential and inverse multiquadrics. For a complete description, the six kernel function expressions are given here.

1. Gaussian kernel: κ(x_t, x_s) = (1/(√(2π)σ)) exp(−(x_t − x_s)²/(2σ²)) (3-36)

2. Polynomial kernel: κ(x_t, x_s) = (1 + x_t x_s)^d (3-37)

3. Sigmoid kernel: κ(x_t, x_s) = tanh(β_0 x_t x_s + β_1) (3-38)

4. Wave kernel: κ(x_t, x_s) = (θ/(x_t − x_s)) sin((x_t − x_s)/θ) (3-39)

5. Exponential kernel: κ(x_t, x_s) = λ exp(−λ|x_t − x_s|) (3-40)

6. Inverse multiquadrics kernel: κ(x_t, x_s) = 1/√((x_t − x_s)² + θ²) (3-41)
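As a compact reference, the six kernels can be sketched as NumPy one-liners; the parameter defaults are the values selected for the experiment below, and the `np.sinc` rewriting of the wave kernel is my own, using NumPy's normalized convention sinc(z) = sin(πz)/(πz).

```python
import numpy as np

# Kernel functions of Eq. (3-36)-(3-41); defaults follow the parameters chosen in the text.
kernels = {
    "gaussian":          lambda xt, xs, s=1.0: np.exp(-(xt - xs) ** 2 / (2 * s ** 2)) / (np.sqrt(2 * np.pi) * s),
    "polynomial":        lambda xt, xs, d=2: (1 + xt * xs) ** d,
    "sigmoid":           lambda xt, xs, b0=10.0, b1=0.5: np.tanh(b0 * xt * xs + b1),
    # (theta/u) sin(u/theta) with u = xt - xs; np.sinc handles u = 0 gracefully
    "wave":              lambda xt, xs, th=2.5: np.sinc((xt - xs) / (np.pi * th)),
    "exponential":       lambda xt, xs, lam=2.5: lam * np.exp(-lam * np.abs(xt - xs)),
    "inv_multiquadrics": lambda xt, xs, th=0.7: 1.0 / np.sqrt((xt - xs) ** 2 + th ** 2),
}
```

All six are symmetric in their two arguments, as a kernel must be; only the polynomial and sigmoid kernels depend on the product x_t x_s rather than on the difference x_t − x_s, so they are not isotropic.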

In order to have a fair comparison, we choose a suitable parameter for each of the kernel functions for this specific data set. We select σ = 1 for the Gaussian kernel, d = 2 for the polynomial kernel, β_0 = 10 and β_1 = 0.5 for the sigmoid kernel, θ = 2.5 for the wave kernel,













Figure 3-6. Correntropy for i.i.d. signal and Lorenz time series with different kernel functions










λ = 2.5 for the exponential kernel and θ = 0.7 for the inverse multiquadrics kernel. The correntropy functions for the i.i.d. data of Gaussian, exponential and Gamma distributions and for the Lorenz time series are plotted using the different kernel functions in Fig. 3-6. Notice that all the kernel functions except the polynomial kernel can separate the different PDF data. The sigmoid kernel fails to detect the nonlinear coupling embedded in the Lorenz time series. The wave, exponential and inverse multiquadrics kernels exhibit reasonable performance compared to the Gaussian kernel.

3.2 Frequency-Domain Analysis

We have already presented the definitions and properties of the correntropy and centered correntropy functions in the previous sections. If we view the correntropy and centered correntropy functions as generalized correlation and covariance functions, we can certainly proceed to the frequency-domain analysis. Similar to the conventional (power) spectral density function defined for a wide-sense stationary random process, we can also define a (correntropy) spectral density function for a correntropy-sense stationary random process. Fourier transformation techniques offer an alternative to the time-domain analysis of correntropy for random processes.

Definition: Given a correntropy-sense stationary random process {x_t : t ∈ T} with centered correntropy function U(τ), the correntropy spectral density function is defined by

P(ω) = ∫_{−∞}^{∞} U(τ) e^{−jωτ} dτ (3-42)

whenever the integral exists. In other words, the correntropy spectral density function is the Fourier transform of the centered correntropy function.
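For a finite data record, a discrete-time analogue replaces the integral with a sum over estimated centered-correntropy lags. The following sketch makes that concrete; the truncation to ±max_lag and the Gaussian kernel are assumptions of this sketch, not part of the definition.

```python
import numpy as np

def centered_correntropy_lags(x, max_lag, sigma=1.0):
    # U(m) for m = 0..max_lag, estimated by time averages with a Gaussian kernel
    kappa = lambda u: np.exp(-u ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)
    ip = kappa(x[:, None] - x[None, :]).mean()          # information potential
    return np.array([kappa(x[m:] - x[:len(x) - m]).mean() - ip
                     for m in range(max_lag + 1)])

def correntropy_spectral_density(x, max_lag, n_freq=257, sigma=1.0):
    # P(w) ~ sum_m U(m) e^{-jwm} over the symmetric lag window m = -max_lag..max_lag
    u = centered_correntropy_lags(x, max_lag, sigma)
    full = np.concatenate([u[:0:-1], u])                # U(-max_lag), ..., U(0), ..., U(max_lag)
    lags = np.arange(-max_lag, max_lag + 1)
    w = np.linspace(-np.pi, np.pi, n_freq)
    p = np.exp(-1j * np.outer(w, lags)) @ full
    return w, p.real                                    # imaginary part vanishes since U is even
```

Because the estimated U(m) sequence is extended symmetrically, the resulting estimate is real and even in ω, matching Properties 1 and 2 proved below.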

The variable ω in Eq. (3-42) is usually called the radian frequency and is measured in radians per second. The frequency ω/2π, denoted as f, is referred to as the "usual" frequency and measured in hertz (Hz). To convert an expression for the correntropy spectral density as a function of ω into an expression for the same spectral density as a function of f, one needs to replace ω by 2πf. Formally, the correntropy spectral density as a function of frequency in Hz is defined by

P(f) = ∫_{−∞}^{∞} U(τ) e^{−j2πfτ} dτ. (3-43)

Since P(ω) = F{U(τ)}, we can obviously define the inverse Fourier transform of the correntropy spectral density function, U(τ) = F^{−1}{P(ω)}, as

U(τ) = (1/2π) ∫_{−∞}^{∞} P(ω) e^{jωτ} dω (3-44)

= ∫_{−∞}^{∞} P(f) e^{j2πfτ} df. (3-45)


In particular, when the time lag τ = 0, the correntropy spectral density function gives

U(0) = E[κ(x_t, x_t)] − E_{x_t} E_{x_s}[κ(x_t, x_s)] = E[‖Φ(x_t)‖²] − ‖E[Φ(x_t)]‖²

= (1/2π) ∫_{−∞}^{∞} P(ω) dω = ∫_{−∞}^{∞} P(f) df. (3-46)

This equation shows that the difference between the expectation of the norm square and the norm square of the expectation of the nonlinearly transformed random process is the total area under the correntropy spectral density function P(f) of the process. In second-order random process analysis, the power in the random process is the total area under the power spectral density function. By a similar approach, we define the difference between the expectation of the norm square and the norm square of the expectation of the nonlinearly transformed random process, in other words, the centered correntropy at zero lag U(0), as the correntropy power. From a geometrical perspective, U(0) can also be considered as the generalized variance, or generalized power, of the nonlinearly transformed random process in the feature space. Physically, P(ω) represents the density of correntropy power at frequency ω radians/sec, and P(f) plays the same role for frequency in Hz. The correntropy power in any frequency band is obtained by integrating the correntropy spectral density function over the range of frequencies that build up the band. Specifically, the correntropy power in a frequency band from ω₁ to ω₂ radians/sec, or from f₁ to f₂ Hz, is defined as

(1/2π) ∫_{ω₁}^{ω₂} P(ω) dω = ∫_{f₁}^{f₂} P(f) df. (3-47)

The correntropy spectral density function for a correntropy-sense stationary random process satisfies the following properties:

Property 1: P(ω) = P(−ω), for all ω.

Because U(τ) = U(−τ) for a correntropy-sense stationary random process, we then have

P(−ω) = ∫_{−∞}^{∞} U(τ) e^{jωτ} dτ = ∫_{−∞}^{∞} U(−τ') e^{−jωτ'} dτ' = ∫_{−∞}^{∞} U(τ') e^{−jωτ'} dτ' = P(ω). (3-48)

Property 2: P(ω) = [P(ω)]*, for all ω.

[P(ω)]* denotes the complex conjugate of P(ω). To prove this property, notice that U(τ) is real, so

[P(ω)]* = ∫_{−∞}^{∞} U(τ) e^{jωτ} dτ = P(−ω) = P(ω). (3-49)

This property shows that the correntropy spectral density function is real.

Property 3: P(w) > 0, for all w.









Observe that for any T > 0,





/T TT


/T rT


Hence, this last double integral is nonnegative. Since T > 0, certainly, T-l is also greater
than 0, and this results in


T- UOC(t s)e-iwi"-s)dids; > 0 (3-50)

for all T > 0. We make a change-of-variable transformation in Eq. (3-50) by letting
t s = -r. The transformation maps the original region of integration { (t, s) : 0 <
t < T,0 < s < T}onto the region {(t, -) : 0 < t < T,t T < -r
word, the original region of square is changed into a parallelogram, which can also be
decompoI~sed intoU two regions with I =~ {(t,7) : 7 < t < T,O < 7 < T} and

G2 = {(t, -) : 0 < t < T + 7r, -T < -r < 0}. Then Eq. (3-50) can be rewritten as the sum
of an integral over the region G1 and an integral over the region G2 aS follows:


T- U111i(t )e-yw".t-did

/iT T/11, 0- d + T+7r[(~l~l i
T- Ure-"dtd +UrJ-"'td



IJT T U(r)e-"' dr. (3-51)









Let us define the function UT(r) as


Ur (r) = 1 qUr), |75T (3-52)


Then for any fixed 7r, we have

lim U (7) = U(r). (3-53)

Therefore, the limit of Eq. (3-51) becomes

lim U(r)e ""'dr = lim UT((T)e ""'d-r

=lim UT(-r)e-""'d-r


= U~r(7)e-'"7d7 P(W). (3-54)

The interchange of order of the limit and the integration is validated by the dominated

convergence theorem [21]. Eq. (3-51) and Eq. (3-54) established that P(w) is the limit of

a sequence of nonnegative numbers for any w, and the limit of a sequence of nonnegative
numbers must be nonnegative. Consequently, P(w) > 0.










CHAPTER 4
CORRENTROPY ANALYSIS BASED ON RKHS APPROACH

From the definitions of the correntropy and centered correntropy functions, we know that there are two reproducing kernel Hilbert spaces embedded in the correntropy and centered correntropy functions. The first RKHS is induced by the symmetric positive definite data-independent kernel functions, like the Gaussian kernel, sigmoid kernel and others, Eq. (3-36)-Eq. (3-41). These kernel functions nonlinearly transform the original random processes into a high-dimensional RKHS, which is called the feature space. Linear operations on the transformed random processes become nonlinear operations on the original random processes when mapped back by the inverse nonlinear transformation.

The second RKHS is induced directly by the symmetric positive definite data-dependent correntropy and centered correntropy functions. Unlike the conventional autocorrelation and covariance functions, which operate directly on the original random processes and determine data-dependent RKHSs, correntropy and centered correntropy operate on the expectation of nonlinearly transformed random processes, and hence induce reproducing kernel Hilbert spaces that are very different from the ones determined by data-independent kernel functions and by conventional autocorrelation and covariance functions. The inclusion of data-independent kernel functions inside the expectation operation makes the correntropy and centered correntropy functions different from kernel functions and the expectation operation alone. This also changes the dynamics of the RKHS induced by the correntropy and centered correntropy and makes it unique.

In this chapter, we investigate the correntropy and centered correntropy functions from a geometrical perspective by analyzing the reproducing kernel Hilbert spaces induced by them directly and the one induced by the kernel functions alone. These two approaches to analyzing the correntropy and centered correntropy will lead us to a better understanding of the geometrical structure of the RKHS induced by the kernel functions, correntropy and centered correntropy respectively.










By Mercer's theorem, any symmetric positive definite function possesses an eigen-decomposition. Even without explicit knowledge of the eigenfunctions and eigenvalues, kernel methods solve problems that can be expressed in terms of inner products, because the evaluation of the inner product is equivalent to the value of the kernel function. In this chapter, we present a new method to explicitly build the RKHS with the Gaussian kernel based on polynomial functions. Knowledge of the bases that construct the RKHS offers potential applications of the RKHS beyond the inner product.

4.1 RKHS Induced by the Kernel Function

By Mercer's theorem in chapter 1, Eq. (1-14), any symmetric positive definite function κ(x_t, x_s) can be rewritten as an inner product between two vectors in the feature space, i.e.,

κ(x_t, x_s) = Σ_k λ_k φ_k(x_t) φ_k(x_s) = ⟨Φ(x_t), Φ(x_s)⟩, (4-1)

Φ(x_t) = [√λ_1 φ_1(x_t), √λ_2 φ_2(x_t), ...], k = 1, 2, ..., and t ∈ T,

where λ_k and φ_k are the eigenvalues and orthogonal eigenfunctions of the kernel κ(·,·) respectively. The nonlinear transformation Φ maps the random process {x_t : t ∈ T} into another random process {Φ(x_t) : t ∈ T} in the high-dimensional feature space, which is a reproducing kernel Hilbert space induced by the kernel according to the Moore-Aronszajn theorem, Eq. (1-2). The construction of the RKHS based upon the eigenvalues and eigenfunctions λ_k and φ_k follows the same approach from Eq. (1-14) to Eq. (1-20) by substituting the autocorrelation function R(t, s) with the kernel function κ(x_t, x_s). Notice that if isotropic kernel functions are used, specifically the Gaussian kernel Eq. (3-36), wave kernel Eq. (3-39), exponential kernel Eq. (3-40) and inverse multiquadrics kernel Eq. (3-41), then the norm ‖Φ(x_t)‖² of the transformed random process is a constant equal to κ(0). This shows that the transformed random process {Φ(x_t) : t ∈ T} resides on the sphere. The isotropic kernel functions transform the random processes such









that the instantaneous power becomes constant. This approach offers potential applications in communications and signal processing.
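A quick numeric check of this observation: for an isotropic (shift-invariant) kernel, ⟨Φ(x), Φ(x)⟩ = κ(x, x) = κ(0) is the same for every sample, while for the non-isotropic polynomial kernel it is sample dependent. The kernel forms and parameters below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.standard_normal(5)

gaussian = lambda a, b: np.exp(-(a - b) ** 2 / 2.0) / np.sqrt(2.0 * np.pi)
polynomial = lambda a, b: (1.0 + a * b) ** 2

# ||Phi(x_t)||^2 = kappa(x_t, x_t): constant kappa(0) for the isotropic Gaussian kernel ...
norms_gaussian = np.array([gaussian(v, v) for v in samples])
# ... but sample dependent for the polynomial kernel, which is not isotropic
norms_polynomial = np.array([polynomial(v, v) for v in samples])
```

So the Gaussian-transformed samples all lie on a sphere of radius √κ(0) in the feature space, whereas the polynomial-transformed samples do not.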

The data-independent kernel functions include the Gaussian kernel, sigmoid kernel and others, Eq. (3-36)-Eq. (3-41). These data-independent kernel functions are embedded inside the correntropy and centered correntropy functions. An analysis of correntropy and centered correntropy from the kernel perspective offers new insight.

4.1.1 Correntropy Revisited from Kernel Perspective

According to the inner product expression in Eq. (4-1), we can rewrite the correntropy in terms of the nonlinear mapping Φ as

V(t, s) = E[⟨Φ(x_t), Φ(x_s)⟩]. (4-2)

Likewise, the centered correntropy can also be expressed as

U(t, s) = E[⟨Φ(x_t) − E[Φ(x_t)], Φ(x_s) − E[Φ(x_s)]⟩]. (4-3)

It can be seen that the correntropy function is a "conventional" correlation function for the transformed random process in the high-dimensional RKHS, while the centered correntropy is nothing but the correntropy of the zero-mean (centered) random process {Φ(x_t) − E[Φ(x_t)] : t ∈ T}. This way of defining the generalized correlation and covariance functions follows the same spirit as the standard correlation and covariance functions. The higher-order statistics of any random process in the input space turn out to be "second-order" statistics in the feature space. Kernel-based learning algorithms employ the nonlinear mapping Φ to treat nonlinear algorithms in a linear way if the problems can be expressed in terms of inner products [49]. This suggests that we can deal with nonlinear systems efficiently and elegantly in a linear fashion when applying the correntropy and centered correntropy functions.










In fact, all the previous properties of the correntropy and centered correntropy functions in section 2.1 can be derived in a kernel framework. For example, property 3 can be shown as V(t, t) = E[‖Φ(x_t)‖²], which means that V(t, t) is nothing but the expectation of the norm square of the transformed random process. The centered correntropy at zero lag U(t, t) can be rewritten as

U(t, t) = E[‖Φ(x_t) − E[Φ(x_t)]‖²] = E[‖Φ(x_t)‖²] − ‖E[Φ(x_t)]‖². (4-4)

Therefore, the centered correntropy can be viewed as the expectation of the norm square of the zero-mean (centered) transformed random process, or the generalized covariance of the transformed random process. Similarly, property 4 can be cast in terms of the nonlinearly transformed random process. The inequality can be re-expressed as

|E[⟨Φ(x_t), Φ(x_s)⟩]|² ≤ E[‖Φ(x_t)‖²] E[‖Φ(x_s)‖²],

which is called the generalized Cauchy-Schwarz inequality in the reproducing kernel Hilbert space.

From chapter 3.1, we know that if the random process is pair-wise independent, then the correntropy at non-zero lags becomes the information potential and the centered correntropy at non-zero lags reduces to zero. Under this condition, the nonlinearly transformed random process becomes "uncorrelated" in the feature space. The condition of pair-wise independence in the original input space implies pair-wise uncorrelatedness in the feature space, but not vice versa. This offers correntropy and centered correntropy a wide range of potential applications in machine learning and signal processing.

The kernel function perspective on correntropy and centered correntropy suggests that we can treat the nested kernel function inside the expectation operator as an implicit nonlinear transformation that maps the original random process into the RKHS induced by the kernel function, where the transformed random process resides on the sphere if isotropic kernel functions are used. Then the correntropy and centered correntropy functions are nothing but generalized autocorrelation and covariance functions of the nonlinearly transformed random process. This nonlinear transformation of random processes brings a range of advantages, as demonstrated in chapter 2, but it also poses challenges for operations on those transformed random processes, since only the inner product of the transformed random processes is known, without any knowledge of the individual nonlinear transformations. A method for the explicit construction of the nonlinear functionals would alleviate this problem.

4.1.2 An Explicit Construction of a Gaussian RKHS

Mercer's theorem does not explicitly provide the basis φ that builds the RKHS. Rather, it only offers the inner product of the eigenfunctions, because that is sufficient for the statistical learning problems which can be expressed in terms of inner products.

One of the most fundamental issues in learning problems is the selection of the data representation. In kernel-based learning algorithms, this translates into the choice of the functionals, or the appropriate feature space RKHS. The reason is that the nonlinear mapping has a direct impact on the kernel and thus on the solution of the given learning problems. Different kernels (polynomial, sigmoid, Gaussian) will very likely result in different performances. The functional form of the RKHS is still of great interest, both to optimize performance and to gain insight into the appropriateness of the data representation. Ultimately, this will allow us to utilize the RKHS structure and expand the class of algorithms, beyond inner products, that can be developed in kernel space. The advantage of kernel-based learning algorithms thus also becomes a disadvantage: the general question of how to select the ideal kernel for a given problem remains an open issue. Recently, there have been attempts to explicitly construct an RKHS. A. Rakotomamonjy et al. proposed a method of building an RKHS and its associated kernels by means of frame theory [78]. Any vector in that RKHS can be represented by a linear combination of the frame elements. However, a frame is not necessarily linearly independent, although it results in a stable representation.

In this section, we take the polynomial space approach to explicitly construct an RKHS associated with one of the most popular kernels, the Gaussian kernel. By transforming a generalized Fock space [45] with a positive operator, we build an RKHS associated with the Gaussian kernel. The functionals are explicitly given by polynomials. Unlike the Mercer's theorem approach, these functionals are not necessarily orthonormal. More importantly, we can gain control over the dimension of the RKHS by selecting the polynomial degree. The simulation suggests that the effective dimension of the RKHS with Gaussian kernel is relatively small.

The definitions of the functionals and inner product of a general Hilbert space are given first. Then, a kernel function is imposed on this general Hilbert space to make it a reproducing kernel Hilbert space. This approach of building an RKHS with polynomials can also be found in [45], where it is called a generalized Fock space. Our contribution is the RKHS associated with the Gaussian kernel, which we explicitly construct by introducing new definitions of the functionals and kernel function.

First we construct an inner product space H by defining the functionals and the inner product. The evaluation of a functional f at any given x is given by

f(x) = e^{-x^2/(2a_0)} Σ_{k=0}^{n} (f_k / k!) x^k,    (4-5)

where a_0 is a constant and the (n + 1)-tuple (f_0, f_1, ..., f_n) contains the coefficients which uniquely characterize the polynomial f. Then the inner product between any two functionals f and h can be specified in the form

⟨f, h⟩ = Σ_{k=0}^{n} f_k h_k / a_k,    (4-6)

where the f_k and h_k are the coefficients of f and h respectively, and a = (a_0, a_1, ..., a_n) is a set of positive constants chosen a priori. It can easily be seen that this inner product space H is complete, thus forming a Hilbert space.

In order to make H a reproducing kernel Hilbert space, we impose a kernel function κ on H in the following form

κ(x, y) = e^{-(x^2 + y^2)/(2a_0)} Σ_{k=0}^{n} a_k (xy)^k / (k!)^2.    (4-7)

It can be verified that the Hilbert space H, equipped with such a κ, is a reproducing kernel Hilbert space and the kernel function κ(x, ·) is a reproducing kernel because of the following two properties of κ(x, y):

1. κ(x, y) as a function of y belongs to H for any fixed x, because we can rewrite κ(x, y) as

κ(x, y) = e^{-y^2/(2a_0)} Σ_{k=0}^{n} (1/k!) [a_k x^k e^{-x^2/(2a_0)} / k!] y^k,    (4-8)

i.e., the constants a_k x^k e^{-x^2/(2a_0)} / k!, k = 0, 1, ..., n become the coefficients f_k, k = 0, 1, ..., n in the definition of f, and thus

κ(x, ·) ∈ H.    (4-9)

2. Given any f ∈ H, the inner product between the reproducing kernel and f yields the function itself,

⟨f, κ(x, ·)⟩ = Σ_{k=0}^{n} (f_k / a_k) [a_k x^k e^{-x^2/(2a_0)} / k!] = e^{-x^2/(2a_0)} Σ_{k=0}^{n} (f_k / k!) x^k = f(x).    (4-10)

This is the so-called reproducing property.
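Under the reconstruction above, the reproducing property can be verified numerically by representing each functional through its coefficient vector (a small sketch; the particular degree, weights a_k, and coefficients f_k below are arbitrary choices, with the a_k taken from the Gaussian weights of Eq. (4-12)):

```python
import math
import numpy as np

a0 = 1.0    # kernel width constant
n = 8       # polynomial degree of the space
a = np.array([math.factorial(k) / a0 ** k for k in range(n + 1)])  # weights a_k
f = np.array([0.3, -1.0, 0.5, 0.0, 2.0, -0.7, 0.1, 0.4, -0.2])     # coefficients f_k

def evaluate(coeffs, x):
    # f(x) = exp(-x^2/(2 a0)) * sum_k coeffs[k] x^k / k!   (Eq. 4-5)
    return math.exp(-x ** 2 / (2 * a0)) * sum(
        c * x ** k / math.factorial(k) for k, c in enumerate(coeffs))

def inner(f_coeffs, h_coeffs):
    # <f, h> = sum_k f_k h_k / a_k   (Eq. 4-6)
    return float(np.sum(f_coeffs * h_coeffs / a))

def kernel_coeffs(x):
    # Coefficients of kappa(x, .) as an element of H   (Eq. 4-8)
    return np.array([a[k] * x ** k * math.exp(-x ** 2 / (2 * a0)) / math.factorial(k)
                     for k in range(n + 1)])

# Reproducing property (Eq. 4-10): <f, kappa(x, .)> recovers f(x)
assert abs(inner(f, kernel_coeffs(0.7)) - evaluate(f, 0.7)) < 1e-12
```

The assertion holds because the weights a_k cancel between the inner product and the kernel coefficients, leaving exactly the evaluation formula of Eq. (4-5).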

The RKHS constructed above has the freedom to choose the degree of the functionals, i.e., the dimension n of the kernel space H. The most interesting case is that we might extend it to an infinite-dimensional RKHS, provided that the norm of the functional remains finite as n → ∞, i.e., given a sequence of positive weighting constants satisfying

Σ_{k=0}^{∞} f_k^2 / a_k < ∞.    (4-11)

Then the functionals, inner product and reproducing kernel in H are defined by Eq. (4-5), Eq. (4-6) and Eq. (4-7) with n = ∞.

In the special situation of weights

a_k = k! / a_0^k, k = 1, 2, ...    (4-12)

where a_0 is a fixed positive constant, the reproducing kernel Eq. (4-7) in the infinite-dimensional RKHS becomes

κ(x, y) = e^{-(x^2 + y^2)/(2a_0)} Σ_{k=0}^{∞} (xy)^k / (a_0^k k!) = e^{-(x^2 + y^2)/(2a_0)} e^{xy/a_0} = e^{-(x-y)^2/(2a_0)},    (4-13)

which is the Gaussian kernel widely used in machine learning, function approximation, density estimation, support vector machines, etc. The constant a_0 turns out to be the kernel width. It controls the norm length of those functionals in the RKHS, i.e., the spread of the nonlinearly mapped data samples in feature space.

Comparing this method with Mercer's theorem, we notice that there are three major differences between them.

1. First, we have given an explicit expression for the functionals in the RKHS associated with the Gaussian kernel in terms of polynomials, while Mercer's theorem never does that. We can get the exact evaluations of those functionals at each point in the RKHS. This enables us to know exactly the structure of the RKHS associated with the Gaussian kernel.

2. Second, the functionals we constructed above are not necessarily an orthonormal basis, while Mercer's theorem is realized by orthonormalizing the RKHS. This perspective also provides a general alternative for building an RKHS from known functionals besides Mercer's theorem.

3. Third, we can control the dimension of the RKHS by selecting the polynomial degree n in Eq. (4-7).

The method by which we constructed an RKHS enables us to have an explicit expression for the functionals in the RKHS associated with the Gaussian kernel. Hence we can know exactly the nonlinear mapping φ used in kernel-based learning algorithms and so operate directly on the transformed data to extend the algorithms beyond inner products. Furthermore, since we have control of the dimension of the RKHS H, this might help with the computational complexity issue in kernel-based learning algorithms through approximation of the Gaussian kernel by polynomials, as indicated in Eq. (4-7).

The previous way of explicitly constructing an RKHS by means of polynomial functions was employed by De Figueiredo [45], based on the Fock space. The idea of the Fock space was first proposed by Fock in [79] for use in quantum mechanics, where quantum states are described by passing from a single object to collections of objects. More recently, De Figueiredo introduced an appropriately weighted Fock space, which was called the generalized Fock space in [45]. The space is equipped with an appropriate weighted inner product, thus forming an RKHS. The proposed RKHS has been used in linear/nonlinear system and signal analysis, where a number of problems involve approximation and inversion of nonlinear functions/functionals and nonlinear operators. In the univariate case, a generalized Fock space F is an RKHS, where the functionals f, inner product and kernel function F are defined as follows, respectively,

f(u) = Σ_{k=0}^{n} (f_k / k!) u^k,    (4-14)

⟨f, h⟩_F = Σ_{k=0}^{n} f_k h_k / a_k,    (4-15)

F(u, v) = Σ_{k=0}^{n} a_k (uv)^k / (k!)^2,    (4-16)

where the real (n + 1)-tuple (f_0, f_1, ..., f_n) completely characterizes f, and a = (a_0, a_1, ..., a_n) is a set of positive weighting constants chosen a priori according to the problems under consideration. It can be shown that this generalized Fock space is an RKHS. Similar to the RKHS H we constructed above, the generalized Fock space F has the freedom of choosing the space dimension. The interesting case is when the space becomes infinite dimensional while the norm of the functional satisfies the same condition Eq. (4-11) as n → ∞. Then the kernel function F(u, v), defined by Eq. (4-16), becomes an exponential kernel as n → ∞,

F(u, v) = e^{uv/a_0},    (4-17)

given the same weight constraint as Eq. (4-12).

It can be noticed that there are similarities and differences between the RKHS H and the generalized Fock space F. The definitions of the inner product in the two spaces are the same, while the functionals and kernel function are different. The relationship between the two spaces H and F is established by a theorem in [2], which states that if H1 and H2 are two reproducing kernel Hilbert spaces with the same definition of inner product, then there exists a positive operator with bound not greater than 1 that transforms H1 into H2 ⊂ H1. Comparing the definitions of the functionals for the two spaces, we can see that multiplication by e^{-x^2/(2a_0)} plays the role of a positive operator with bound not greater than 1, thus transforming the generalized Fock space F into H such that H ⊂ F.















Figure 4-1. Square error between a Gaussian kernel and the constructed kernel in Eq. (4-7) versus the order of polynomials.


We present a simple simulation here to show the effectiveness of the approximation of the polynomial functionals to the Gaussian kernel. In the simulation, we calculate the square error between the Gaussian kernel and the proposed kernel in Eq. (4-7) of order n. The kernel width is chosen to be 1, and the range of the calculated data is from -5 to 5. Fig. 4-1 plots the square error versus order n. The line becomes flat beyond order 11 due to the computational precision of MATLAB. The figure suggests that the effective dimension of the RKHS with Gaussian kernel is relatively small. With only order 11, we can effectively approximate the Gaussian kernel by polynomials. This also indicates that for practical purposes it is sufficient to work with a much smaller dimensional space instead of an infinite dimensional space for a Gaussian kernel in kernel-based learning algorithms.
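The simulation can be reproduced in outline as follows (a sketch in Python rather than MATLAB; the grid range of [-2, 2] and the orders shown are illustrative choices — the order needed for a given accuracy grows with the data range and with shrinking kernel width):

```python
import math
import numpy as np

def truncated_kernel(x, y, order, a0=1.0):
    # Eq. (4-7) with the Gaussian weights of Eq. (4-12):
    # kappa_n(x, y) = exp(-(x^2 + y^2)/(2 a0)) * sum_{k<=n} (xy)^k / (a0^k k!)
    series = sum((x * y / a0) ** k / math.factorial(k) for k in range(order + 1))
    return np.exp(-(x ** 2 + y ** 2) / (2 * a0)) * series

def gaussian_kernel(x, y, a0=1.0):
    return np.exp(-(x - y) ** 2 / (2 * a0))

grid = np.linspace(-2.0, 2.0, 81)
X, Y = np.meshgrid(grid, grid)
errors = [np.sum((gaussian_kernel(X, Y) - truncated_kernel(X, Y, n)) ** 2)
          for n in range(16)]
# errors[n] falls off steeply with the polynomial order n, mirroring Fig. 4-1
```

On this grid the total square error drops by many orders of magnitude between low and moderate polynomial orders, consistent with the small effective dimension observed in the figure.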

4.2 RKHS Induced by Correntropy and Centered Correntropy Functions

We have seen that the kernel function nested inside the correntropy and centered correntropy functions uniquely induces a reproducing kernel Hilbert space. The nonlinear transformation φ maps the original random processes into the high-dimensional RKHS, where the transformed random processes appear on a sphere if an isotropic kernel function is used. Correntropy and centered correntropy can be carried out by operating on those transformed random processes. However, the correntropy and centered correntropy functions are themselves symmetric non-negative definite, hence they determine two reproducing kernel Hilbert spaces as well. This perspective is in the same spirit as RKHS in statistical signal processing, where the autocorrelation and covariance functions are symmetric non-negative definite and thus induce reproducing kernel Hilbert spaces (refer to the review in chapter 1.2). All linear statistical signal processing using autocorrelation and covariance functions can also be treated as functional operations in RKHS. But the RKHS induced by the correntropy and centered correntropy functions is different from the one induced by the conventional autocorrelation and covariance functions. The RKHS by conventional autocorrelation and covariance functions is based on second order random processes, hence suitable for Gaussian processes, while the RKHS by correntropy and centered correntropy functions includes the higher order statistical information of random processes, hence goes beyond Gaussian processes. It is conjectured that the RKHS by correntropy and centered correntropy functions might encompass a larger class of processes. From the other perspective, the RKHS by correntropy and centered correntropy is also different from the one by kernel functions. One obvious characteristic is that the RKHS by correntropy and centered correntropy is data dependent, while the RKHS by kernel functions is data independent. The inclusion of the kernel function inside the expectation operator makes the correntropy and centered correntropy functions unique. It departs from the conventional RKHS induced by the autocorrelation and covariance functions used in statistical signal processing and also from the data independent kernel functions used in kernel machine learning. In this section, we analyze the geometry of the RKHS by correntropy and centered correntropy functions.









4.2.1 Geometry of Nonlinearly Transformed Random Processes

Given a random process {x_t : t ∈ T}, we can construct another nonlinearly transformed random process {φ(x_t) − E[φ(x_t)] : t ∈ T}, where φ is the nonlinear mapping associated with the kernel function Eq. (4-1). The reason to work with the zero-mean nonlinearly transformed random process will become clear shortly. Let us define a linear manifold spanned by the random process {φ(x_t) − E[φ(x_t)] : t ∈ T}, denoted as L(φ(x_t) − E[φ(x_t)] : t ∈ T), to be the set of all random variables ζ that can be written in the form

ζ = Σ_{i=1}^{n} c_i (φ(x_{t_i}) − E[φ(x_{t_i})]),    (4-18)

for some integer n, real constants c_1, ..., c_n, and t_1, ..., t_n ∈ T. In other words, L(φ(x_t) − E[φ(x_t)] : t ∈ T) contains all finite linear combinations of the random variables {φ(x_t) − E[φ(x_t)] : t ∈ T}. Close the set in Eq. (4-18) topologically according to convergence in the mean using the norm

||ζ|| = (E[||ζ||^2])^{1/2},    (4-19)

and denote the set of all linear combinations of random variables and its limit points by L_2(φ(x_t) − E[φ(x_t)] : t ∈ T). In other words, L_2(φ(x_t) − E[φ(x_t)] : t ∈ T) consists of all random variables in the linear manifold L(φ(x_t) − E[φ(x_t)] : t ∈ T), together with all random variables ζ such that there exists a sequence of random variables ζ_n in L(φ(x_t) − E[φ(x_t)] : t ∈ T) converging to ζ, in the sense that ||ζ − ζ_n||^2 = E[||ζ − ζ_n||^2] → 0 as n → ∞.

It is well known from the theory of quadratically integrable functions that L_2(φ(x_t) − E[φ(x_t)] : t ∈ T) forms a Hilbert space if the inner product is the one corresponding to the norm in Eq. (4-19), namely

⟨ζ_1, ζ_2⟩ = E[⟨ζ_1, ζ_2⟩].    (4-20)

















Figure 4-2. Two vectors in the subspace S.

Then by the definition of cross correntropy Eq. (3-8), we have

⟨ζ_1, ζ_2⟩ = E[⟨ζ_1, ζ_2⟩]
          = E[⟨φ(x) − E[φ(x)], φ(y) − E[φ(y)]⟩]
          = E[κ(x, y)] − E_x E_y[κ(x, y)] = U(x, y).    (4-21)

In other words, the inner product in L_2(φ(x_t) − E[φ(x_t)] : t ∈ T) is the centered correntropy function of the original random variables. This definition will bridge the space L_2(φ(x_t) − E[φ(x_t)] : t ∈ T) and the RKHS induced by the centered correntropy function.

Now let us define a generalized deviation of any random variable ζ in L_2(φ(x_t) − E[φ(x_t)] : t ∈ T) as

σ(ζ) = (E[||ζ||^2])^{1/2} = (E[κ(x, x)] − E_x E_x[κ(x, x)])^{1/2} = U(x, x)^{1/2}.    (4-22)









Consider the subspace S spanned by two random variables x_1 and x_2 (we drop the time index here for simplicity without any ambiguity):

S = {c_1(φ(x_1) − E[φ(x_1)]) + c_2(φ(x_2) − E[φ(x_2)])}.    (4-23)

Then it makes sense to talk about the angle between two vectors ζ_1 and ζ_2 in S (see Fig. 4-2). By plane trigonometry, we have

σ^2(ζ_1 − ζ_2) = σ^2(ζ_1) + σ^2(ζ_2) − 2σ(ζ_1)σ(ζ_2) cos θ,    (4-24)

which is the same as

||ζ_1 − ζ_2||^2 = ||ζ_1||^2 + ||ζ_2||^2 − 2 ||ζ_1|| ||ζ_2|| cos θ.    (4-25)

Consequently, we have

cos θ = ⟨ζ_1, ζ_2⟩ / (||ζ_1|| ||ζ_2||).    (4-26)

The quantity measures the angle between two "vectors" in the subspace S of L_2(φ(x_t) − E[φ(x_t)] : t ∈ T). In particular, if two vectors are orthogonal, then θ is 90°. If two vectors are in the same direction, then θ is 0°. Now we are in a position to define a generalized correlation coefficient for any two random variables.

Definition: Given any two random variables x and y, we define a generalized correlation coefficient, named the correntropy coefficient, as

η = U(x, y) / (U(x, x) U(y, y))^{1/2},    (4-27)

where U(x, y) is the centered correntropy function for x and y, and U(x, x) and U(y, y) are the centered correntropy functions of the variables with themselves.

This definition of the correntropy coefficient is in the same spirit as the traditional correlation coefficient, where the standard covariance function has been replaced by the generalized covariance function, the centered correntropy Eq. (3-8), and the standard deviation function has been replaced by the generalized deviation function, the square root of the centered correntropy of a variable with itself, Eq. (4-22). However, the difference is striking. The conventional correlation coefficient only measures the second order similarity between two original random variables, hence it only requires second order statistics in order to attain 0, i.e., for the two random variables to be uncorrelated. The correntropy coefficient, in contrast, measures higher order similarity between the two original random variables, or "second order similarity" between the two nonlinearly transformed random variables in L_2(φ(x_t) − E[φ(x_t)] : t ∈ T); therefore it needs higher order statistics to attain 0. In fact, by property 4 in chapter 2.1, the value of the correntropy coefficient is between -1 and 1. If the two original random variables are independent, i.e., the two random variables nonlinearly transformed by φ are "uncorrelated" in the RKHS induced by the kernel function, then the correntropy coefficient reduces to 0, which means the two vectors in L_2 are orthogonal. If the two random variables are the same, the correntropy coefficient is 1, which means the two vectors are in the same direction. This also explains why we use centered correntropy functions, not correntropy functions, in the definition of the correntropy coefficient. The correntropy coefficient would never attain 0 if correntropy functions were used, because V(x, y) is always greater than 0. The pre-processing of making random processes zero mean is vital to much of conventional statistical signal processing. In our context, since we do not have explicit knowledge of the mean of the nonlinearly transformed random processes, we have to rely on centered correntropy functions.

Substituting the definition of the centered correntropy Eq. (3-8) and approximating the ensemble average by the sample mean, we can obtain an estimate of the correntropy coefficient directly from data,

η̂ = [ (1/N) Σ_{i=1}^{N} κ(x_i, y_i) − (1/N^2) Σ_{i=1}^{N} Σ_{j=1}^{N} κ(x_i, y_j) ] / { [ (1/N) Σ_{i=1}^{N} κ(x_i, x_i) − (1/N^2) Σ_{i=1}^{N} Σ_{j=1}^{N} κ(x_i, x_j) ] [ (1/N) Σ_{i=1}^{N} κ(y_i, y_i) − (1/N^2) Σ_{i=1}^{N} Σ_{j=1}^{N} κ(y_i, y_j) ] }^{1/2},    (4-28)

where (1/N^2) Σ_{i=1}^{N} Σ_{j=1}^{N} κ(x_i, y_j) is called the cross information potential between x and y [55], and (1/N^2) Σ_{i=1}^{N} Σ_{j=1}^{N} κ(x_i, x_j) and (1/N^2) Σ_{i=1}^{N} Σ_{j=1}^{N} κ(y_i, y_j) are again the information potentials for x and y respectively. This observation suggests a connection between the correntropy coefficient and the Cauchy-Schwartz independence measure in Eq. (1-34), because both of them use information theoretic learning concepts to measure the independence of two random variables. Moreover, since the Cauchy-Schwartz independence measure is an approximation to the Kullback-Leibler divergence [55], this also suggests that the correntropy coefficient is related to mutual information. These connections need further investigation in future work.
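Eq. (4-28) translates directly into a few lines of code (a minimal sketch, assuming a Gaussian kernel of width 1; the sample size and test signals are arbitrary choices):

```python
import numpy as np

def gauss(u, v, sigma=1.0):
    return np.exp(-(u - v) ** 2 / (2 * sigma ** 2))

def centered_correntropy(x, y, sigma=1.0):
    # U(x, y) ~ (1/N) sum_i k(x_i, y_i) - (1/N^2) sum_ij k(x_i, y_j)
    gram = gauss(x[:, None], y[None, :], sigma)
    return np.diag(gram).mean() - gram.mean()   # correntropy minus cross IP

def correntropy_coefficient(x, y, sigma=1.0):
    # Sample version of the correntropy coefficient, Eq. (4-28)
    return centered_correntropy(x, y, sigma) / np.sqrt(
        centered_correntropy(x, x, sigma) * centered_correntropy(y, y, sigma))

rng = np.random.default_rng(1)
x = rng.normal(size=2000)
noise = rng.normal(size=2000)

eta_same = correntropy_coefficient(x, x)               # equals 1: identical variables
eta_indep = correntropy_coefficient(x, noise)          # near 0: independent variables
eta_dep = correntropy_coefficient(x, x + 0.1 * noise)  # near 1: strongly dependent
```

For identical inputs the numerator and denominator coincide, so the coefficient is 1 up to rounding; for independent inputs it fluctuates around zero at a scale set by the sample size.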

4.2.2 Representation of RKHS by Centered Correntropy Function

We have analyzed the geometry of L_2(φ(x_t) − E[φ(x_t)] : t ∈ T), where the inner product of any two vectors is defined in Eq. (4-21), which turns out to be the centered correntropy function of the original random processes. In this section, we proceed to prove that there exists a congruence map between L_2(φ(x_t) − E[φ(x_t)] : t ∈ T) and the RKHS induced by the centered correntropy function. Therefore it is sufficient to study the Hilbert space L_2(φ(x_t) − E[φ(x_t)] : t ∈ T) in this perspective.

First, because the centered correntropy function U(t, s) is non-negative definite, it uniquely determines a reproducing kernel Hilbert space by the Moore-Aronszajn theorem. We can apply Mercer's theorem to obtain an eigen-decomposition of the centered correntropy function as

U(t, s) = Σ_{k=0}^{∞} θ_k ψ_k(t) ψ_k(s),    (4-29)

where the θ_k and ψ_k are the eigenvalues and eigenfunctions of the centered correntropy function respectively.
Then we can define a function g on T in the form

g(t) = Σ_{k=0}^{∞} a_k θ_k ψ_k(t),    (4-30)

where the sequence {a_k, k = 1, 2, ...} satisfies the condition

Σ_{k=0}^{∞} a_k^2 θ_k < ∞.    (4-31)

Let H(U) be the set of functions g(·) which can be represented in the form of Eq. (4-30) in terms of the eigenfunctions ψ_k and eigenvalues θ_k of the centered correntropy function U(t, s). Furthermore we may define an inner product of two functions in H(U) as

⟨g_1, g_2⟩ = Σ_{k=0}^{∞} θ_k a_k b_k,    (4-32)

where g_1 and g_2 are of the form Eq. (4-30) and the a_k, b_k satisfy the property Eq. (4-31). One can also show that H(U) is complete. Let g_n(t) = Σ_{k=0}^{∞} a_k^{(n)} θ_k ψ_k(t) be a Cauchy sequence in H(U) such that each sequence {a_k^{(n)}, n = 1, 2, ...} converges to a limit point a_k. Hence the Cauchy sequence converges to g(t) = Σ_{k=0}^{∞} a_k θ_k ψ_k(t), which belongs to H(U). Therefore H(U) is a Hilbert space. H(U) has two important properties which make it a reproducing kernel Hilbert space. First, let U(t, ·) be the function on T with value at s in T equal to U(t, s); then by the Mercer's theorem eigen-expansion of the covariance function Eq. (4-29), we have

U(t, s) = Σ_{k=0}^{∞} θ_k ψ_k(t) ψ_k(s).    (4-33)

Therefore, U(t, ·) ∈ H(U) for each t in T. Second, for every function g(·) ∈ H(U) of the form given by Eq. (4-30) and every t in T,

⟨g, U(t, ·)⟩ = Σ_{k=0}^{∞} θ_k a_k ψ_k(t) = g(t).    (4-34)









By the Moore-Aronszajn theorem, H(U) is a reproducing kernel Hilbert space with U(t, s) as the reproducing kernel. It follows that

⟨U(t, ·), U(s, ·)⟩ = U(t, s).    (4-35)

Thus H(U) is a representation of the random process {x_t : t ∈ T} with centered correntropy function U(t, s). One may define a congruence Ψ from H(U) onto L_2(φ(x_t) − E[φ(x_t)] : t ∈ T) such that

Ψ(U(t, ·)) = φ(x_t) − E[φ(x_t)].    (4-36)

The congruence Ψ can be explicitly represented as

Ψ(g) = Σ_{k=0}^{∞} a_k θ_k ξ_k,    (4-37)

where the {ξ_k} are orthogonal random variables belonging to L_2(Ω, F, P) and g is any element of H(U) of the form Eq. (4-30).

This way of proceeding from L_2(φ(x_t) − E[φ(x_t)] : t ∈ T) to the RKHS induced by the centered correntropy function is in the same spirit as the method of building the RKHS induced by the traditional covariance function from L_2(x_t : t ∈ T) [20, 25]. However, we now deal with the random processes nonlinearly transformed via the mapping φ.

4.3 Relation Between Two RKHS

There are three spaces we have analyzed so far in this chapter. The first one is the reproducing kernel Hilbert space induced by the data independent kernel function κ(·, ·), where the nonlinearly transformed random processes reside on the sphere if isotropic kernel functions are used. The inner product in this RKHS is defined by the kernel function κ(·, ·). The second space is L_2(φ(x_t) − E[φ(x_t)] : t ∈ T), which is built upon the RKHS induced by kernel functions by including all linear combinations of the zero-mean nonlinearly transformed random process {φ(x_t) − E[φ(x_t)] : t ∈ T} and its limit points. An inner product is defined in this space in the form of Eq. (4-21), which turns out to be the centered correntropy function of the original random processes. Consequently, L_2(φ(x_t) − E[φ(x_t)] : t ∈ T) is complete and thus forms a Hilbert space. The last space is the reproducing kernel Hilbert space induced by the centered correntropy function directly. The inner product is defined by the centered correntropy function, the Mercer kernel. It is proved that there exists an isometric isomorphism (congruence) map between the RKHS induced by the centered correntropy function and the Hilbert space L_2(φ(x_t) − E[φ(x_t)] : t ∈ T). By isometry, it is meant that the inner product definitions of the two spaces are the same. By isomorphism, it is meant that there exists a one-to-one mapping between the two spaces. It is sufficient to study the RKHS determined by the centered correntropy function by considering the Hilbert space L_2(φ(x_t) − E[φ(x_t)] : t ∈ T). Then the relation among these three spaces becomes clear. The kernel operator inside the correntropy function induces a reproducing kernel Hilbert space. The expectation operator induces another RKHS, which is built upon the first RKHS through the Hilbert space L_2, since the RKHS by the centered correntropy function is congruent to L_2.

This interpretation of the three spaces involved in centered correntropy functions offers insights into the geometric relationship among the different spaces. Certainly, there are other ways to tackle the task and further investigations are required. First, we applied Mercer's theorem to the non-negative definite centered correntropy function to come up with an eigen-decomposition in the previous section. It would further our understanding if we could gain knowledge of the basis. So it is desirable that we might also apply what we have proposed in section 4.1.2, the explicit construction of an RKHS by the Gaussian kernel function, to build those bases for the centered correntropy function explicitly. This will be part of my future work. Second, what we have presented involves high dimensionality, be it in the RKHS by kernel function or in the RKHS by centered correntropy function. The high dimensionality has its own attractiveness but also poses challenges for data manipulation. So another research direction is to propose a single dimensional nonlinear transformation function such that the correlation function of those nonlinearly transformed functions is the centered correntropy function, namely to seek a nonlinear function f such that for a given random process {x_t : t ∈ T}

E[f(x_t) f(x_s)] = U(t, s) = E[κ(x_t, x_s)] − E_{x_t} E_{x_s}[κ(x_t, x_s)].    (4-38)

It is conjectured that this f function will implicitly embed the data distribution. As has been pointed out in the theory of random processes, any given non-negative definite function determines a Gaussian process such that the correlation function of the Gaussian process equals that non-negative definite function [74]. This also suggests that the f function might be related to the Gaussianization technique.










CHAPTER 5
CORRENTROPY DEPENDENCE MEASURE

One of the most important tasks in statistical analysis is to measure the degree of dependence between random variables. There are numerous measures of dependence in the literature, mostly based on the distance between the joint probability distribution of the data and the product of the marginal probability distributions [80]. Some of the most commonly used measures of bivariate dependence include, but are not limited to, the correlation coefficient, Kendall's τ, Spearman's ρ, the maximal correlation coefficient, the monotone correlation coefficient, and others. The correlation based dependence measures only characterize the linear relationship between two random variables. [81] proposed a nonlinear canonical correlation using symmetric nondecreasing functions to quantify nonlinear dependence without any assumption on the distribution of the random variables.

The dependence measure is also closely related to the measure of the amount of information that one random variable contains about the other, because the more dependent the random variables are, the more information about one ought to be given by the other, and vice versa. Several measures based on information theory have been proposed in the literature. For example, the mutual information is one well-known measure, which can also be interpreted as the Kullback-Leibler divergence between the joint probability density function and the product of the marginal densities [82, 83]. It turns out that the Kullback-Leibler divergence or mutual information is a special case of the φ-divergence measure when a specific convex function φ is chosen [84]. On the other hand, [58] generalized Shannon's mutual information as a special norm in the probability density function space. Silvey's generalized coefficient [85] uses the Radon-Nikodym derivative of the joint distribution of the random variables with respect to the product of their marginal distributions. Other dependence measures based on information theory include the relative entropy measure proposed by [86] and others. However, Joe's relative entropy dependence measure, and almost all other entropies, fail to be "metric" since they violate the triangle inequality [87]. In order to overcome this limitation, [87] proposed a metric measure of dependence to specify the degree of dependence in time series data.

Recently there has been considerable work on using functions in reproducing kernel Hilbert space (RKHS) to quantify dependence. [52] introduced kernel dependence functionals in the context of independent component analysis (ICA). The kernelized canonical correlation is a kernelization of canonical correlation analysis with regularization on the functionals. It can also be explained as a particular form of nonlinear canonical correlation where the nonlinear functions have been chosen as functional evaluations in an RKHS [88], based on the representer theorem [49]. The kernel generalized variance is an extension of the kernelized canonical correlation, estimating the spectral norm of the correlation operator between reproducing kernel Hilbert spaces over the entire spectrum [52]. Instead of using the correlation operator in RKHS, [89] proposed the kernel constrained covariance and kernel mutual information based on the covariance operator. The kernel constrained covariance estimates the spectral norm of the covariance operator between RKHSs. It has been proved that the kernel constrained covariance is zero if and only if the original random variables are independent, provided that the kernel is universal. The kernel constrained covariance can also be viewed as a maximal correlation coefficient where the function space has been confined to the reproducing kernel Hilbert space. The kernel mutual information incorporates the entire spectrum of the covariance operator between RKHSs and becomes an upper bound of the mutual information estimated with a Parzen window [89]. These dependence measures based on kernel methods have enjoyed much success in independent component analysis [52, 88, 89], quantification of generalized synchronization between chaotic signals [90], and other machine learning application areas.

With the abundance of dependence measures in the literature, one would naturally ask which measure is best at characterizing the dependence information of the data. Rényi proposed a set of postulates which should be fulfilled by a suitable dependence measure [91]. The axiomatic framework by Rényi has drawn much attention ever since. Many modifications, extensions and enrichments of these postulates have appeared in the literature. One of the major criticisms was that these axioms are too strict for most dependence measures to fulfill. It has been shown by [91] that out of various dependence measures only the maximal correlation coefficient satisfies all these properties. The kernel constrained covariance omits the deterministic dependence and upper bound conditions in the axiomatic framework in order to efficiently estimate the quantity and apply it to independent component analysis [89].

In this chapter, we define a new parametric correntropy function as a novel

dependence measure.

5.1 Parametric Correntropy Function

An important observation about the definition of the centered correntropy Eq. (3-8) is that when two random variables are independent (f_{X,Y}(x, y) = f_X(x) f_Y(y)) the quantity becomes zero, but not vice versa. In order to obtain a suitable dependence measure, we modify the definitions of the centered correntropy and correntropy coefficient so that they attain zero if and only if the two random variables are independent.

Definition (Parametric Centered Correntropy): Given two random variables x and y, the parametric centered correntropy is defined as

U_{a,b}(x, y) = E_{xy}[κ(ax, b + y)] − E_x E_y[κ(ax, b + y)],

where f_{X,Y}(x, y) is the joint probability density function (PDF) of the random variables x and y, f_X(x) and f_Y(y) are the respective marginal PDFs over which the expectations are taken, and a and b are parameters in R.
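As an illustrative sketch (not part of the original derivation), the parametric centered correntropy can be estimated from samples by replacing the joint expectation with an average over paired samples and the product of marginal expectations with an average over all sample pairs; the Gaussian kernel choice and the function names below are assumptions.

```python
import numpy as np

def gaussian_kernel(u, sigma=1.0):
    # Shift-invariant Gaussian kernel evaluated at the difference u = s - t.
    return np.exp(-u**2 / (2 * sigma**2))

def parametric_centered_correntropy(x, y, a, b, sigma=1.0):
    # Sample estimate of U_{a,b}(x, y) = E_{xy}[k(ax, b+y)] - E_x E_y[k(ax, b+y)].
    ax = a * np.asarray(x, dtype=float)
    by = b + np.asarray(y, dtype=float)
    joint = gaussian_kernel(ax - by, sigma).mean()                       # E_{xy}: paired samples
    marginal = gaussian_kernel(ax[:, None] - by[None, :], sigma).mean()  # E_x E_y: all pairs
    return joint - marginal
```

For independent samples the estimate fluctuates around zero; for strongly dependent samples (e.g. y = x) it stays clearly positive.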

We present a lemma stating that the parametric centered correntropy is zero for all a and b if and only if x and y are independent. First, we employ the Fourier representation of the shift-invariant real-valued kernel function,

κ(z) = ∫ e^{iωz} dρ(ω),   (5-1)

where ρ is a positive bounded measure. For simplicity we will also assume that the measure ρ has moments of all orders, so that κ is infinitely differentiable. If the tail of ρ decays sufficiently fast, κ can be extended to the entire line.

Lemma (Independence): Suppose the measure ρ defining the kernel κ has support on the entire line. Then U_{a,b}(x, y) = 0 for all a and b if and only if x and y are independent.

Proof: The sufficiency is straightforward. Assume U_{a,b}(x, y) = 0 for all a and b, and rewrite the definition of the parametric centered correntropy as

U_{a,b}(x, y) = ∫∫ κ(ax − y − b) {dF_{X,Y}(x, y) − dF_X(x) dF_Y(y)}
             = ∫∫ κ(ax − y − b) dQ(x, y),   (5-2)

where F_{X,Y}(x, y) is the joint cumulative distribution function (CDF), and F_X(x) and F_Y(y) are the marginal CDFs of x and y respectively. We need to show that dQ(x, y) = 0. U_{a,b}(x, y) = 0 for all a and b means ∫ [∫∫ e^{iω(ax−y)} dQ(x, y)] e^{−iωb} dρ(ω) = 0 for all a and b, in particular for all b. Hence ∫∫ e^{iω(ax−y)} dQ(x, y) = 0 for ρ-almost all ω. Since supp ρ = R, this holds for all ω and a. This can be rewritten as ∫∫ e^{i(αx+βy)} dQ(x, y) = 0 for all α and β. We conclude that dQ = 0, i.e., x and y are independent.

We can also parameterize the correntropy coefficient.








Definition (Parametric Correntropy Coefficient): Given two random variables x and y, the parametric correntropy coefficient is defined as

η_{a,b}(x, y) = ( E_{xy}[κ(ax, b + y)] − E_x E_y[κ(ax, b + y)] )
  / sqrt( (κ(0) − ∫∫ κ(ax, au) f_X(x) f_X(u) dx du) (κ(0) − ∫∫ κ(b + y, b + v) f_Y(y) f_Y(v) dy dv) )

= ∫∫ κ(ax, b + y) (f_{X,Y}(x, y) − f_X(x) f_Y(y)) dx dy
  / sqrt( (κ(0) − ∫∫ κ(ax, au) f_X(x) f_X(u) dx du) (κ(0) − ∫∫ κ(b + y, b + v) f_Y(y) f_Y(v) dy dv) ),   (5-3)

where κ(0) is the value of the kernel function when its argument is zero. The absolute value of the parametric correntropy coefficient is bounded by 1.
Proof (Boundedness of the parametric correntropy coefficient): By the eigen-decomposition of the kernel function Eq. (4-1), we have

|E_{xy}[κ(ax, b + y)] − E_x E_y[κ(ax, b + y)]|
  = |Σ_k λ_k E_{xy}[φ_k(ax) φ_k(b + y)] − Σ_k λ_k E_x[φ_k(ax)] E_y[φ_k(b + y)]|
  = |Σ_k λ_k E{(φ_k(ax) − E[φ_k(ax)]) (φ_k(b + y) − E[φ_k(b + y)])}|
  ≤ Σ_k λ_k sqrt(Var(φ_k(ax))) sqrt(Var(φ_k(b + y)))        (Cauchy–Schwarz inequality)
  ≤ sqrt(Σ_k λ_k Var(φ_k(ax))) sqrt(Σ_k λ_k Var(φ_k(b + y))),   (5-4)

where

Σ_k λ_k Var(φ_k(ax)) = Σ_k λ_k (E[φ_k²(ax)] − E²[φ_k(ax)]).   (5-5)

The first term of Eq. (5-5) is

Σ_k λ_k E[φ_k²(ax)] = E[κ(ax, ax)] = κ(0),

and the second term of Eq. (5-5) is

Σ_k λ_k E²[φ_k(ax)] = ∫∫ κ(ax, au) f_X(x) f_X(u) dx du.

Therefore,

Σ_k λ_k Var(φ_k(ax)) = κ(0) − ∫∫ κ(ax, au) f_X(x) f_X(u) dx du.   (5-6)

Likewise,

Σ_k λ_k Var(φ_k(b + y)) = κ(0) − ∫∫ κ(b + y, b + v) f_Y(y) f_Y(v) dy dv.   (5-7)

Combining Eq. (5-4), Eq. (5-6) and Eq. (5-7), we obtain

|E_{xy}[κ(ax, b + y)] − E_x E_y[κ(ax, b + y)]|
  ≤ sqrt( (κ(0) − ∫∫ κ(ax, au) f_X(x) f_X(u) dx du) (κ(0) − ∫∫ κ(b + y, b + v) f_Y(y) f_Y(v) dy dv) ).

Hence the absolute value of the parametric correntropy coefficient is bounded by 1.

5.2 Correntropy Dependence Measure

Based on the parametric correntropy coefficient developed in the previous section,
we formulate a novel dependence measure for two random variables. Rényi gave a set









of fundamental principles that a measure of statistical dependence Q(x, y) between two random variables x and y should satisfy. These include:

1. Q(x, y) is well defined,

2. 0 ≤ Q(x, y) ≤ 1,

3. Q(x, y) = 0 if and only if x and y are independent,

4. Q(x, y) = 1 if and only if y = f(x) or x = g(y), where f and g are Borel measurable functions.

Rényi showed that one measure satisfying these constraints is

Q(x, y) = sup_{f,g} corr(f(x), g(y)),   (5-8)

where f(x) and g(y) must have finite positive variance, and f and g are Borel measurable.

We propose a dependence measure based on the parametric correntropy coefficient

which satisfies all the desirable properties above.

Definition (Correntropy Dependence Measure): Given two random variables x and y, the correntropy dependence measure is defined as

Γ(x, y) = sup_{a,b} |η_{a,b}(x, y)|,   (5-9)

where η_{a,b}(x, y) is the parametric correntropy coefficient Eq. (5-3).

The correntropy dependence measure is a suitable statistical measure which fulfills all the fundamental conditions listed by Rényi.

Proof: First, the measure Eq. (5-9) is well defined. Since |η_{a,b}(x, y)| is between 0 and 1 as proved above, the supremum of |η_{a,b}(x, y)| is also bounded by 0 and 1. The independence condition is also straightforward. If x and y are independent, |η_{a,b}(x, y)| = 0, and therefore sup_{a,b} |η_{a,b}(x, y)| = 0. Since

0 ≤ |η_{a,b}(x, y)| ≤ sup_{a,b} |η_{a,b}(x, y)|,









if sup_{a,b} |η_{a,b}(x, y)| = 0, then |η_{a,b}(x, y)| = 0 for all a and b; therefore x and y are independent. To check condition 4, consider the derivation in the boundedness proof Eq. (5-4). The equalities are achieved if and only if

φ_k(ax) − E[φ_k(ax)] = c (φ_k(b + y) − E[φ_k(b + y)]),  ∀k,   (5-10)

and

E[(φ_k(ax) − E[φ_k(ax)])²] = c² E[(φ_k(b + y) − E[φ_k(b + y)])²],  ∀k.   (5-11)

In fact, the two conditions Eq. (5-10) and Eq. (5-11) are equivalent. From the condition Eq. (5-10), we obtain

φ_k(ax) = B_k φ_k(b + y) + A_k,  ∀k,   (5-12)

where B_k and A_k are two constants. Let

f(x) = φ_k(ax) = B_k φ_k(b + y) + A_k = g(y),   (5-13)

then x = f^{-1}(g(y)), since the eigenfunction φ_k of the positive definite kernel is continuous and invertible. This concludes the proof.

Unlike the statistical measure proposed by Rényi Eq. (5-8), the correntropy dependence measure searches the parameter space for a supremum instead of the entire function space. Compared to the kernel constrained covariance presented in [89], our measure does not employ a regularization technique because our optimization space is the low-dimensional parameter space. The kernel constrained covariance, in contrast, operates on a high-dimensional (possibly infinite-dimensional) function space, where regularization is mandatory to make the solution achievable. By selecting different values of the parameters a and b, the correntropy dependence measure offers a different scale of dependence between the random variables x and y. The idea is similar to the concept of wavelets, which provide multi-resolution information about signal frequency.
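A minimal sketch of how Γ(x, y) could be approximated in practice: the supremum over (a, b) is replaced by a finite search grid, and Eq. (5-3) is estimated with sample averages under an assumed Gaussian kernel; all function names and the grid are illustrative.

```python
import numpy as np

def correntropy_coefficient(x, y, a, b, sigma=1.0):
    # Sample estimate of eta_{a,b}(x, y), Eq. (5-3): parametric centered
    # correntropy normalized by the centered correntropies of ax and b + y.
    ax, by = a * np.asarray(x, float), b + np.asarray(y, float)
    k = lambda u: np.exp(-u**2 / (2 * sigma**2))
    num = k(ax - by).mean() - k(ax[:, None] - by[None, :]).mean()
    den_x = k(0.0) - k(ax[:, None] - ax[None, :]).mean()
    den_y = k(0.0) - k(by[:, None] - by[None, :]).mean()
    return num / np.sqrt(den_x * den_y)

def correntropy_dependence(x, y, a_grid, b_grid, sigma=1.0):
    # Grid approximation of Gamma(x, y) = sup_{a,b} |eta_{a,b}(x, y)|, Eq. (5-9).
    return max(abs(correntropy_coefficient(x, y, a, b, sigma))
               for a in a_grid for b in b_grid)
```

When y is a deterministic linear function of x and the grid contains a matching scale, the estimate reaches 1; for independent data it stays near zero.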









CHAPTER 6
CORRENTROPY PRINCIPAL COMPONENT ANALYSIS

Principal component analysis (also known as the Karhunen-Loève transformation in

communication theory) is a powerful tool for feature extraction and data dimensionality

reduction in statistical pattern recognition and signal processing. It can be easily

performed by eigen-decomposition of the standard covariance matrix or by adaptive

algorithms that estimate principal components [92]. Principal component analysis or PCA

is really an affine transformation of the coordinate system such that the rate of decrease of

data variance is maximized. The projections of the data onto the new coordinate system

are called principal components. These projections represent the data optimally in a

least-square sense. In feature extraction, PCA transforms the data in such a way that a

small number of principal components can represent the data while retaining most of the

intrinsic variance of the data. These are sometimes called factors or latent variables of the

data [93].

While PCA yields a smaller dimensional linear subspace that best represents the full

data according to a minimum-square-error criterion, it might be a poor representation if

the data structure is non-Gaussian. Hence nonlinear component analysis may be needed.

There have been numerous attempts to define nonlinear component analysis in recent decades. Nonlinear PCA is generally seen as a nonlinear generalization of standard PCA

[92, 93]. The principal component is generalized from straight lines to curves. Principal

curves were proposed by Hastie [94] to define local directions that pass through the

high density parts of the data set. The principal curves are found through an iterative

algorithm that minimizes the conditional expectation of projections on the curves.

Kramer presented a nonlinear PCA based on auto-associative neural networks. The auto-associative network performs an identity mapping from the input data to the output by minimizing the square error [95]. Recently, Schölkopf et al. used kernel methods to obtain a nonlinear form of PCA [50]. This so-called Kernel PCA is one of the kernel-based










learning algorithms reviewed in Chapter 1.3. The Kernel PCA nonlinearly maps the

original input data into an infinite dimensional reproducing kernel Hilbert space by the

data independent kernel and solves the eigen-decomposition of the Gram matrix of the

input data in a high-dimensional feature space. The Gram matrix has a dimension given

by the number of samples N. The data projections onto the principal directions of the

Gram matrix, i.e. the inner product in feature space, are carried out by means of kernel

functions in the input space. While the utilization of Mercer kernels provides a tractable

way to compute principal components in the high-dimensional feature space, there are

still problems of interpretation and computation of the large dimensional Gram matrix.

Indeed, the number of eigenfunctions of the Gram matrix depends on the number of data samples N, not the size of the data space L. Moreover, computing Gram matrices for millions of samples in a small, let us say, two-dimensional space becomes wasteful.

In this chapter, we propose a new nonlinear PCA technique based on the correntropy

function, called correntropy PCA. The correntropy function quantifies the similarity

between the L different components of the L dimensional input data vector (or the

time structure in a time series) using the statistical data distribution. The correntropy

also utilizes a kernel methodology, but in a different form: by applying the kernel to

pairs of data vector components, a random vector (or stochastic process) is nonlinearly

transformed into a high dimensional function space where the similarity between the

components of the transformed random variables (or stochastic process) can be measured

by the conventional covariance function. The eigen-decomposition of the covariance

of the transformed data yields the principal directions of the nonlinearly transformed

data. These linear principal directions in feature space correspond to nonlinear principal

directions in the input space. These projections can be efficiently computed by utilizing

the correntropy function. That means, if one has one million samples in a two dimensional

space, it is only necessary to solve a two dimensional eigenvector problem on a matrix










whose entries are computed from one million samples. In many applications this is a

tremendous computational saving.
Given a set of zero-mean vector observations x_j, j = 1, ..., N (so that Σ_{j=1}^{N} x_j = 0), correntropy PCA seeks a direction in the feature space such that the variance of the data projected onto this direction is maximized. Unlike the kernel method, which transforms data into a feature space sample by sample, correntropy PCA maps data component-wise into a feature space, i.e., the RKHS associated with the centered correntropy function. By the equation Eq. (4-29), the mapping is

x_i → Φ(x_i),  i = 1, ..., L,

where x_i denotes the ith component of the original input data sample x. This nonlinear

mapping transforms the component-wise data into a high-dimensional RKHS which is associated with the centered correntropy function. By the definition of the centered correntropy function, we have

⟨Φ(x_i), Φ(x_j)⟩ = U(x_i, x_j) = E_{x_i x_j}[κ(x_i, x_j)] − E_{x_i} E_{x_j}[κ(x_i, x_j)]   (6-1)
  ≈ (1/N) Σ_{k=1}^{N} κ(x_{ik}, x_{jk}) − (1/N²) Σ_{k=1}^{N} Σ_{m=1}^{N} κ(x_{ik}, x_{jm}),  ∀ i, j = 1, ..., L,

where x_{ik} is the ith component of the kth input data sample. The expectation runs over all the data samples.

Then the covariance matrix of the transformed data in the feature space is given by

C = (1/L) Σ_{i=1}^{L} Φ(x_i) Φ(x_i)ᵀ.

We now have to find the eigenvalues λ ≥ 0 and non-zero eigenvectors q satisfying

Cq = λq.










All the solutions q must lie in the span of Φ(x_1), ..., Φ(x_L), i.e., q has the form of a linear combination of Φ(x_1), ..., Φ(x_L),

q = Σ_{i=1}^{L} p_i Φ(x_i),   (6-2)

and we may instead consider the set of equations

⟨Φ(x_k), Cq⟩ = λ ⟨Φ(x_k), q⟩,  ∀ k = 1, ..., L.   (6-3)


Combining equations Eq. (6-2) and Eq. (6-3), we get

(1/L) Σ_{j=1}^{L} Σ_{i=1}^{L} p_i U(x_k, x_j) U(x_j, x_i) = λ Σ_{i=1}^{L} p_i U(x_k, x_i),  ∀ k = 1, ..., L.   (6-4)

By equation Eq. (6-1), we can define an L × L centered correntropy matrix U by

U_{ij} = E_{x_i x_j}[κ(x_i, x_j)] − E_{x_i} E_{x_j}[κ(x_i, x_j)]   (6-5)
  ≈ (1/N) Σ_{k=1}^{N} κ(x_{ik}, x_{jk}) − (1/N²) Σ_{k=1}^{N} Σ_{m=1}^{N} κ(x_{ik}, x_{jm}).

Letting k in Eq. (6-4) run from 1 to L and writing the result in matrix form, we get

U²p = LλUp,   (6-6)

where p denotes the column vector with entries p_1, ..., p_L. It can be shown that the solutions of equation Eq. (6-6) are equivalent to the solutions of the following eigenvalue problem,

Up = Lλp,   (6-7)

for nonzero eigenvalues.









For the purpose of principal component extraction, we need to compute the projections onto the eigenvectors q in the feature space. Let x be a test point; the projection of x onto the principal direction mapped back to input space is given by

⟨q, Φ(x)⟩ = Σ_{i=1}^{L} p_i U(x_i, x),   (6-8)

which is the so-called nonlinear principal component.

In summary, we need to take the following steps to compute the nonlinear principal components: (1) compute the correntropy matrix U by equation Eq. (6-5), where the expected value is substituted by the sample average, (2) compute its eigenvectors and eigenvalues through SVD, and (3) compute the projections of a test point onto the eigenvectors by Eq. (6-8).
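The three steps above can be sketched as follows, with the expectations of Eq. (6-5) replaced by sample averages and a Gaussian kernel assumed; the returned eigenvalues are those of U itself (i.e. Lλ in the notation of Eq. (6-7)), and `numpy.linalg.eigh` stands in for SVD, which is equivalent for a symmetric matrix. All names are illustrative.

```python
import numpy as np

def centered_correntropy_matrix(X, sigma=1.0):
    # Estimate of the L x L centered correntropy matrix U of Eq. (6-5).
    # X has shape (N, L): N samples of an L-dimensional random vector.
    N, L = X.shape
    k = lambda u: np.exp(-u**2 / (2 * sigma**2))
    U = np.empty((L, L))
    for i in range(L):
        for j in range(L):
            joint = k(X[:, i] - X[:, j]).mean()                    # paired samples
            cross = k(X[:, i][:, None] - X[:, j][None, :]).mean()  # all N^2 pairs
            U[i, j] = joint - cross
    return U

def correntropy_pca(X, sigma=1.0):
    # Steps (1)-(2): build U and eigen-decompose it, largest eigenvalue first.
    U = centered_correntropy_matrix(X, sigma)
    w, V = np.linalg.eigh(U)          # eigh: U is symmetric
    order = np.argsort(w)[::-1]
    return w[order], V[:, order]
```

Step (3), the projection of a test point by Eq. (6-8), then needs only the eigenvector coefficients and L further correntropy evaluations, independently of the number of samples N.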

We will present two experimental results to show the effectiveness of correntropy PCA in finding nonlinear principal directions. The first experiment compares standard linear PCA and correntropy PCA in extracting features from a two-dimensional mixture of Gaussian distributed data. Specifically, the probability density function is a mixture of Gaussian modes with the following form,

f(x) = (1/2) (N(m_1, Σ_1) + N(m_2, Σ_2)),

where N(m_1, Σ_1) and N(m_2, Σ_2) are two Gaussian distributions with mean vectors and covariance matrices given by

m_1 = [-1, -1]ᵀ,  Σ_1 = [[1, 0], [0, 0.1]],
m_2 = [1, 1]ᵀ,   Σ_2 = [[0.1, 0], [0, 1]].

In Fig. 6-1, we plot the contours of the data and of the largest-eigenvalue directions produced by linear PCA and correntropy PCA respectively. 200 samples are used and









Figure 6-1. Linear PCA versus correntropy PCA for a two-dimensional mixture of Gaussian distributed data.


the kernel size is chosen to be 2. The result confirms that linear PCA only provides the linear direction that maximizes the variance. Since the underlying data is a mixture of two Gaussian modes, linear PCA fails to consider the directions of the individual modes and only averages these directions. On the contrary, correntropy PCA is more tuned to the underlying structure of the data in the input space: correntropy PCA generates a nonlinear principal direction that locally follows the directions of the individual modes so that the variance of the principal component projected onto this nonlinear curve is maximized. The experiment shows that correntropy PCA is superior in describing the underlying structure of the data when compared to the linear PCA method.

Our second experiment compared kernel PCA, proposed by Schölkopf et al. in [50], with correntropy PCA. We use the same experimental setup as in [50] in order to illustrate the performance of correntropy PCA. The data is two-dimensional with three clusters (Gaussian distributed with standard deviation 0.1). The number of data samples and the kernel size are chosen to be 90 and 0.1 respectively. Since the number of principal











Elgenvalue=22 558
15


Elgenvalue=20 936
15


S05 05


-0
-0 5~ 0 05 1 -0 1 -0 5 0 05

Elgenvalue=1 794 Elgenvalue=0 206









-0 5 -
-05 0 05 1 -1 -0 5 0 05

Figure 6-2. Kernel PCA versus correntropy- PCA for a two-dimensional mixture of
Gaussian distributed data



components for kernel PCA depends on the number of data samples, there are many eigen-directions in feature space that are difficult to identify in the input space, so we plot the two principal components with the largest eigenvalues from kernel PCA. However, the number of principal components for correntropy PCA is equal to the dimension of the input space, so there is no ambiguity. Fig. 6-2 shows that both kernel PCA and correntropy PCA can extract nonlinear principal components from the data. While kernel PCA tends to find the local structure of a given data set, as the contours circle around the different data clusters, correntropy PCA seeks the underlying global structure of the data set. The contour in the bottom left plot shows that correntropy PCA can be tuned to the data structure by changing the kernel size in the Gaussian kernel, and locate the principal direction.

In experiments comparing the performance of correntropy PCA with standard linear

PCA and kernel PCA for nonlinear feature extraction, we found two advantages of our

method. First, correntropy PCA can be more tuned to the underlying data structure










than linear PCA, so that it can extract the nonlinear principal components from the data, very much like principal curves. There is no ambiguity, since the number of nonlinear principal components is the same as the dimensionality of the input space. In kernel PCA it is very difficult to choose the eigen-directions if we cannot visualize the data, since the eigenvectors project locally to the input space. Therefore, it is not easy to separate major and minor components. Second, correntropy PCA has a tremendous computational complexity advantage over kernel PCA. For example, in the second simulation, we only need to compute an eigen-decomposition of a 2 × 2 matrix using correntropy PCA, while we have to do an eigen-decomposition of a 90 × 90 matrix using kernel PCA. As the training set increases, the computational complexity of kernel PCA increases dramatically, but the size of the correntropy matrix remains the same.

In this chapter we applied correntropy concepts to principal component analysis. The approach is based on finding the eigenvectors of the centered correntropy matrix (whose dimension is the same as that of the data), unlike the Gram matrix used by other kernel methods, where the dimension of the Gram matrix depends on the number of data samples. Yet the final principal curves we obtain using this method adequately cover the data in the direction of maximum spread (variance in the feature space). Since we are dealing with a finite-dimensional matrix, we get a number of principal curves equal to the dimension of the data space, and at the same time the computational complexity is drastically reduced compared to the kernel methods. In general this approach offers a new method of analyzing data. The study also suggests that the concept of correntropy can be used for de-correlating the data in the feature space (whitening), which can be applied in the context of independent component analysis. Future research will apply correntropy PCA to real data problems and also compare it with other nonlinear principal component analysis methods.










CHAPTER 7
CORRENTROPY PITCH DETERMINATION ALGORITHM

7.1 Introduction

Pitch, or the fundamental frequency F0, is an important parameter of speech signals. Accurate determination of pitch plays a vital role in acoustical signal processing and has a wide range of applications in related areas such as coding, synthesis, speech recognition and others. Numerous pitch determination algorithms (PDAs) have been proposed in the literature [96]. In general, they can be categorized into three classes: time-domain, frequency-domain, and time-frequency domain algorithms.

Time-domain PDAs operate directly on the signal temporal structure. These include

but are not limited to zero-crossing rate, peak and valley positions, and autocorrelation.

The autocorrelation model appears to be one of the most popular PDAs for its simplicity,

explanatory power, and physiological plausibility. For a given signal x_n with N samples, the autocorrelation function R(τ) is defined as

R(τ) = Σ_{n=0}^{N-1} x_n x_{n+τ},   (7-1)

where τ is the delay parameter. For dynamical signals with changing periodicities, a short-time window can be included to compute the periodicities of the signal within the window ending at time t as

R(t, τ) = Σ_{n=0}^{N-1} x_n x_{n+τ} w_{t-n},

where w is an arbitrary causal window that confines the autocorrelation function to a neighborhood of the current time. Other similar models can be obtained by replacing the multiplication by subtraction (or excitatory by inhibitory neural interaction) in the autocorrelation function, such as the average magnitude difference function (AMDF) [97]. The squared difference function (SDF) was proposed in [98] as

SDF(τ) = Σ_{n=0}^{N-1} (x_n − x_{n+τ})².   (7-2)










The weighted autocorrelation uses an autocorrelation function weighted by the inverse of an AMDF to extract pitch from noisy speech [99]. All these PDAs based on the autocorrelation function suffer from at least one unsatisfactory fact: the peak corresponding to the period of a pure tone is rather wide [100, 101]. This imposes a greater challenge for multiple F0 estimation, since mutual overlap between voices weakens their pitch cues, and these cues further compete with the cues of other voices. The low resolution in pitch estimation results from the fundamental time-frequency uncertainty principle [102]. To overcome this drawback, Brown et al. presented a narrowed autocorrelation function to improve the resolution of the autocorrelation function for musical pitch extraction [103]. The narrowed autocorrelation function includes terms corresponding to delays at 2τ, 3τ, etc., in addition to the usual term with delay τ, as

R_L(τ) = Σ_{n=0}^{N-1} Σ_{l=1}^{L} x_n x_{n+lτ}.   (7-3)

However, it requires an increase in the length of the signal and offers less precision in time. It also requires the a priori selection of the number of delay terms L.

Frequency-domain PDAs estimate pitch by using the harmonic structure of the short-time spectrum. Frequency-domain methodologies include component frequency ratios, filter-based methods, cepstrum analysis and multi-resolution methods. Pitch determination algorithms such as the harmonic sieve [104], harmonic product spectrum [105], sub-harmonic summation [106], and subharmonic-to-harmonic ratio [107] fall into this category. Most frequency-domain pitch determination methods apply pattern matching [108]. Others use nonlinear or filtering preprocessing to generate or improve interpartial spacing and fundamental component cues. The frequency-domain PDAs have the advantage of efficient implementation with the fast Fourier transform and the theoretical strength of Fourier analysis. However, one weakness is that they rely on the shape and size of the analysis window. Selection and adjustment of the analysis window remain a problem in estimation.










The time-frequency approach splits the signal over a filter bank, applies time-domain methods to each channel waveform, and aggregates the results over channels. The summary, or "pooled," autocorrelation function across all channels provides the pitch information of the signal. Licklider first presented this idea as a pitch perception model [109]. Later, Lyon and Slaney further developed the methodology and called it the correlogram [110, 111]. The correlogram is the first-stage processor in a computational auditory scene analysis (CASA) system [112]. It has also been incorporated into a neural oscillator to segregate double vowels and in multipitch tracking [101, 113]. The strength of the correlogram in pitch estimation is that different frequency channels corresponding to different signal sources with different pitches can be separated, which makes it useful in multipitch estimation [101, 114]. Also, individual channel weighting can be adapted to compensate for amplitude mismatches between spectral regions [115].

On the other hand, the autocorrelation and power spectrum based pitch determination algorithms mentioned above only characterize second-order statistics. In many applications where non-Gaussianities and nonlinearities are present, these second-order statistical methodologies might fail to provide all the information about the signals under study. Higher-order statistics have been used in pitch determination. Moreno et al. applied higher-order statistics to extract pitch from noisy speech [116]. But only diagonal third-order cumulants were used, for simplicity and computational efficiency, given by

c₃(k) = Σ_{n=0}^{N-1} x_n² x_{n+k},  k = 0, ..., N − 1,

and pitch is found by applying the autocorrelation function to the cumulants c₃(k),

G(τ) = Σ_{k=0}^{N-1} c₃(k) c₃(k + τ).   (7-4)
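As an illustrative sketch, the diagonal third-order cumulant and the correlation of Eq. (7-4) can be computed with finite sums truncated at the signal edge (an implementation choice not specified above); the function names are assumptions.

```python
import numpy as np

def diagonal_third_cumulant(x, max_lag):
    # c3(k) = sum_n x_n^2 x_{n+k} for a zero-mean signal x.
    x = np.asarray(x, dtype=float)
    return np.array([np.sum(x[:len(x) - k]**2 * x[k:]) for k in range(max_lag)])

def cumulant_pitch_correlation(x, max_lag):
    # G(tau) of Eq. (7-4): autocorrelation of the cumulant sequence c3(k).
    c = diagonal_third_cumulant(x, max_lag)
    return np.array([np.sum(c[:max_lag - t] * c[t:]) for t in range(max_lag)])
```

For a periodic signal with an asymmetric waveform (a pure sinusoid has vanishing third-order statistics), G(τ) peaks at multiples of the period.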










In this chapter, we propose a new pitch determination algorithm based on the correntropy function defined in the previous chapters,

V(τ) = (1/N) Σ_{n=0}^{N-1} κ(x_n, x_{n+τ}).   (7-5)

The proposed pitch determination method is applied after the acoustical signal is processed by an equivalent rectangular bandwidth (ERB) filter bank in the time domain. The ERB filter bank acts as a cochlear model to transform a one-dimensional acoustical signal into a two-dimensional map of neural firing rate as a function of time and place [111]. The correntropy function for each channel is calculated, and the summation across all the channels provides the pitch information. As a novel "self-similarity" measure, correntropy is able to offer much better resolution than the conventional autocorrelation function in pitch estimation. Moreover, our pitch determination algorithm can segregate double vowels without applying any complex model such as a neural oscillator [101].

7.2 Pitch Determination based on Correntropy

The structure of the correntropy definition seems very appropriate to quantify similarity in time, in a manner that is biologically plausible, that is, it can be implemented at the neuronal level. Its argument is sensitive to differences in time instances, as correlation is, but instead of being linear across differences, there is an intermediate nonlinear transformation that gives more emphasis to values that are closer together. Neurons are known to be very sensitive to time differences, but their highly nonlinear response may also emphasize similarity at close range. In the context of pitch determination, the correntropy function might as well estimate the pitch information of the signal similarly to the autocorrelation function. However, compared to the autocorrelation function model, our pitch determination algorithm based on the correntropy function offers much better resolution and enhances the capacity for estimating multiple pitches. Since correntropy creates many different harmonics of each resonance present in the original time series due to the nonlinearity of the kernel function, it may also be useful for perceptual pitch



















Figure 7-1. Autocorrelation, narrowed autocorrelation with L = 10 and correntropy
functions of a sinusoid signal.


determination. To illustrate this, we compare the conventional autocorrelation Eq. (7-1), narrowed autocorrelation Eq. (7-3) and correntropy Eq. (7-5) functions for a simple sinusoid signal. Fig. 7-1 plots the three functions with respect to the delay lag in the time domain. All three functions peak at the same delay lag, corresponding to the period of the sinusoid signal. However, it is evident that the peaks obtained from the correntropy function are much narrower and sharper than those from the conventional autocorrelation and narrowed autocorrelation functions. In Fig. 7-2, we present the Fourier transform of each function. The ordinary autocorrelation function only exhibits one harmonic, and the narrowed autocorrelation produces 10 harmonics, which is equal to the number of terms L used in Eq. (7-3). The correntropy function places even more energy at higher harmonics in frequency. The narrowness of the correntropy function in the time domain implies the rich harmonics present in the frequency domain. This is due to the nonlinear exponential function included in the definition of the correntropy function.

It should also be noticed that there is a connection between the correntropy function Eq. (7-5) and the squared difference function Eq. (7-2). The correntropy function also uses an inhibitory neural interaction model instead of an excitatory one, with a Gaussian kernel function, but it nonlinearly transforms the subtraction of the signals by the exponential
























Figure 7-2. Fourier transform of autocorrelation, narrowed autocorrelation with L = 10
and correntropy functions of a sinusoid signal.


function. From another perspective, the correntropy function includes the scaled squared difference function as an individual term, for k = 1, in the summation of Eq. (3-11), but it contains more information through the other higher-order moment terms.
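The contrast drawn above can be reproduced with a short sketch of Eq. (7-1) and Eq. (7-5), assuming a Gaussian kernel; the kernel size σ below is an illustrative choice (a small σ sharpens the correntropy peak), and the function names are assumptions.

```python
import numpy as np

def autocorrelation(x, max_lag):
    # R(tau) = sum_n x_n x_{n+tau}, Eq. (7-1), normalized by R(0).
    r = np.array([np.dot(x[:len(x) - t], x[t:]) for t in range(max_lag)])
    return r / r[0]

def correntropy(x, max_lag, sigma):
    # V(tau) = (1/N) sum_n k(x_n, x_{n+tau}) with a Gaussian kernel, Eq. (7-5).
    k = lambda u: np.exp(-u**2 / (2 * sigma**2))
    return np.array([k(x[:len(x) - t] - x[t:]).mean() for t in range(max_lag)])
```

For a sinusoid of period P samples both curves peak at lag P, but with a small σ the correntropy peak is markedly narrower, mirroring Fig. 7-1.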

Our pitch determination algorithm first uses cochlear filtering to peripherally process the speech signal. This is achieved by a bank of 64 gammatone filters which are distributed in frequency according to their bandwidths [117]. The impulse response of a gammatone filter is defined as

g(t) = t^{n-1} e^{-2πat} cos(2πf₀t + φ),

where n is the filter order, f₀ is the center frequency in Hz, φ is the phase, and a is the bandwidth parameter. The bandwidth increases quasi-logarithmically with respect to the center frequency. The center frequencies of the filters are equally spaced on the equivalent rectangular bandwidth scale between 80 Hz and 4000 Hz [118]. This creates a cochleagram, which is a function of time lag along the horizontal axis and cochlear place, or frequency, along the vertical axis. The cochleagram separates a sound into broad frequency channels while still retaining the time structure of the original sound. It has served as a peripheral



































Figure 7-3. Correlogram (top) and summary (bottom) for the vowel /a/.


pre-processor in the computational auditory scene analysis (CASA) model [112], and has been used extensively in pitch determination [101, 119].
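A sketch of one gammatone channel, assuming the common Glasberg-Moore approximation ERB(f) = 24.7 + 0.108 f and the bandwidth choice 1.019·ERB(f_c); the phase term is omitted, and all function and parameter names are illustrative.

```python
import numpy as np

def gammatone_impulse_response(fc, fs, order=4, duration=0.05):
    # g(t) = t^(n-1) exp(-2*pi*a*t) cos(2*pi*fc*t), normalized to unit peak.
    # The bandwidth parameter a is tied to the channel's equivalent
    # rectangular bandwidth (ERB).
    t = np.arange(int(duration * fs)) / fs
    erb = 24.7 + 0.108 * fc          # Glasberg-Moore ERB in Hz
    a = 1.019 * erb                  # bandwidth parameter
    g = t**(order - 1) * np.exp(-2 * np.pi * a * t) * np.cos(2 * np.pi * fc * t)
    return g / np.max(np.abs(g))
```

A 64-channel bank is then obtained by repeating this for center frequencies equally spaced on the ERB scale between 80 Hz and 4000 Hz, as described above.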

The periodicity analysis is done by computing the correntropy function at the output of each cochlear frequency channel,

V_i(τ) = (1/N) Σ_{n=0}^{N-1} κ(x_{i,n}, x_{i,n+τ}),

where i stands for the channel number and x_i is the cochlear output of channel i. The kernel bandwidth is determined using Silverman's rule [120]. The time lag τ is chosen long enough to include the lowest expected pitch; generally it is set to at least 10 ms throughout this chapter. In this way, a picture is formed with the horizontal axis as correntropy lags and the vertical axis as cochlear frequency. We name it the correntropy-gram, which literally means "pictures of correntropy." If a signal is periodic, strong vertical lines at certain correntropy lags appear in the correntropy-gram, indicating times when a large number of cochlear



































Figure 7-4. Autocorrelation (top) and summary (bottom) of third order cumulants for the
vowel /a/.


channels are firing synchronously. While the horizontal bands signify different amounts

of energy across frequency regions. The correntropy-gram is similar to the correlogram in

structure but different in content. In order to reduce the dynamic range for display in the

correntropy-gram, the correntropy functions should be normalized such that the zero lag

value is one as given by the following formula,

           V(τ) - (1/N²) Σ_{n=0}^{N-1} Σ_{m=0}^{N-1} κ(x(n), x(m))
C(τ) = -----------------------------------------------------------,    (7-7)
           V(0) - (1/N²) Σ_{n=0}^{N-1} Σ_{m=0}^{N-1} κ(x(n), x(m))

where V(0) is the value of correntropy at lag τ = 0. The numerator is called the centered correntropy, which takes out the mean value of the transformed signal in the RKHS. C(τ) is also called the correntropy coefficient, and has been applied to detect nonlinear

dependence among multichannel biomedical signals [121].
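To make Eqs. (7-6) and (7-7) concrete, the sample estimators can be written directly in code. The sketch below is ours and not part of any toolbox used in this work; the function names and the choice of NumPy are illustrative.

```python
import numpy as np

def correntropy(x, max_lag, sigma):
    """Sample correntropy V(tau) of one channel output, Eq. (7-6)."""
    N = len(x)
    V = np.empty(max_lag + 1)
    for tau in range(max_lag + 1):
        d = x[tau:] - x[:N - tau]                    # x(n) - x(n - tau)
        V[tau] = np.mean(np.exp(-d ** 2 / (2 * sigma ** 2)))
    return V

def normalized_correntropy(x, max_lag, sigma):
    """Correntropy coefficient C(tau), Eq. (7-7): centered by the mean
    kernel value over all sample pairs and scaled so that C(0) = 1."""
    V = correntropy(x, max_lag, sigma)
    gram = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * sigma ** 2))
    m = gram.mean()                                  # (1/N^2) double sum
    return (V - m) / (V[0] - m)
```

For a perfectly periodic channel output, C(τ) returns to one at the period, which is what produces the narrow vertical stripes in the correntropy-gram.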






Figure 7-5. Narrowed autocorrelation (top) and summary (bottom) for the vowel /a/.


In order to emphasize pitch related structure in the correntropy-gram, the correntropy

functions are summed up across all the channels to form a "pooled" or "summary"
correntropy-gram,

S(τ) = Σ_i V_i(τ).




The summary correntropy-gram measures how likely a pitch would be perceived at a

certain time lag. The pitch frequency can be obtained by inverting the time lag. In

our experiment, the summary of correntropy functions is first normalized by subtracting

the mean and dividing by the maximum absolute value. The position of the pitch can be picked by various peak-picking algorithms that identify local maxima above a pre-defined threshold. Here we calculate the first derivative and mark the position where the value

changes from positive to negative as a local maximum.
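The summary step and the derivative-based peak picker just described can be sketched as follows (our illustrative names; `corrgram` stands for an array of per-channel correntropy functions, channels by lags):

```python
import numpy as np

def summary_correntropy(corrgram):
    """Sum the correntropy functions across cochlear channels, then
    subtract the mean and divide by the maximum absolute value."""
    s = corrgram.sum(axis=0)
    s = s - s.mean()
    return s / np.max(np.abs(s))

def pick_peaks(s, threshold=0.3):
    """Local maxima: lags where the first derivative changes from
    positive to negative and the summary exceeds the threshold."""
    ds = np.diff(s)
    return [i for i in range(1, len(ds))
            if ds[i - 1] > 0 and ds[i] <= 0 and s[i] > threshold]
```

Inverting the lag (in seconds) of each detected peak gives the pitch candidate in Hz.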

Compared to the conventional correlogram model [101, 119], [122], our pitch detector

is able to locate the same period information as the correlogram, but has much narrower


































Figure 7-6. Correntropy-gram (top) and summary (bottom) for the vowel /a/.


peaks. Hence the proposed method enhances the resolution of pitch determination.

Furthermore, since the correlogram estimates the likelihood that a pitch exists at a certain time delay, the summary correlogram may generate other "erroneous" peaks besides the one corresponding to the pitch period [111], while the summary correntropy-gram suppresses values that are dissimilar at all other time delays by the exponential decay of the Gaussian function and only peaks at the one corresponding to the pitch period. For mixtures of concurrent sound sources with different fundamental frequencies, the summary correlogram usually fails to detect multiple pitches without further nonlinear post-processing. But the summary correntropy-gram is able to show peaks at the different periods of each source. These characteristics of the proposed method suggest the superiority of the correntropy function over the autocorrelation function in pitch determination. Moreover, the computational complexity of our method, whether the correntropy function Eq. (7-6) or the correntropy coefficient Eq. (7-7), remains similar to the




































Figure 7-7. Correlogram (top) and summary (bottom) for a mixture of vowels /a/ and
/u/.


correlogram. Although there are double summations in the correntropy coefficient, the computational complexity can be reduced to O(N log N) using the fast Gauss transform [123]. However, the "narrowed" autocorrelation function increases the computational complexity by including more delay terms.

7.3 Experiments

In this section, we present three experiments to validate our method. In the first

two simulations, we compare our method with the conventional autocorrelation function

[119], the third order cumulants function [116], and the narrowed autocorrelation function

[103] in determining pitches for a single speaker and two combined speakers uttering

different vowels. The synthetic vowels are produced by Slaney's Auditory Toolbox [124].

For a fair comparison, we did not apply any post-processing on the correlogram as was

used in [119]. The conventional autocorrelation function Eq. (7-1), autocorrelation of

third order cumulants functions Eq. (7-4), narrowed autocorrelation functions Eq. (7-3)

































Figure 7-8. Third order cumulants (top) and summary (bottom) for a mixture of vowels
/a/ and /u/.


and correntropy functions Eq. (7-6) are presented after the same cochlear model. In the

third experiment, the proposed method is tested using Bagshaw's database which is a

benchmark for testing PDAs [125].

7.3.1 Single Pitch Determination

Fig. 7-3 to Fig. 7-6 present the pitch determination results for a single synthetic

vowel /a/ with fundamental frequency at 100 Hz. The upper plots are the images of the correlation functions, autocorrelations of third order cumulants, narrowed autocorrelations, and correntropy functions after the same cochlear model, respectively. The bottom figures are the summaries of those four images. The kernel size σ in the Gaussian kernel has been chosen to be 0.01 (we will discuss kernel size selection further in Sec. 8.3) and L = 10 in the narrowed autocorrelation function Eq. (7-3). The conventional autocorrelation, third order cumulants and narrowed autocorrelation are all able to produce peaks at 10 ms corresponding to the pitch of the vowel. But they also generate






Figure 7-9. Narrowed autocorrelations (top) and summary (bottom) for a mixture of vowels /a/ and /u/.


other erroneous peaks which might confuse pitch determination. On the contrary, the

summary of the correntropy-gram provides only a single narrow peak at 10 ms, which is the pitch period of the vowel, and the peak is much narrower than those obtained

from other methods. The correntropy-gram clearly shows a single narrow stripe across all

the frequency channels which concentrates most of the energy. It is the indication of the

fundamental frequency.

The fine structure of hyperbolic contours can also be clearly seen in the correntropy-gram.

Its power spectrum energy spreads equally at all the harmonics. The rows of white spots

across the correntropy-gram reflect the periodic structure. Particularly, the second

harmonic shows two peaks during the time interval when the fundamental frequency

exhibits one. This fine structure is the result of correntropy's ability to contain all the

harmonics of the signal. However, the correlogram yields a much wider spread of energy across

the frequency channels. The image with the autocorrelations of the third order cumulants

































Figure 7-10. Correntropy-gram (top) and summary (bottom) for a mixture of vowels /a/
and /u/.


fail to present such structures. Although, the image of the narrowed autocorrelation

functions is able to show some hyperbolic contours, the white vertical stripe at the

fundamental frequency is much wider than that of the correntropy-gram and there are

other white spots in the high frequency channels. These result in the wide peak at 10ms

and other erroneous peaks in the summary of the narrowed autocorrelation functions.

Our proposed method clearly outperforms the conventional autocorrelation function,

third order cumulants method and narrowed autocorrelation function in single pitch

determination case.

7.3.2 Double Pitches Determination

In this example, we consider pitch determination for a mixture of two concurrent

synthetic vowels, /a/ (F0 = 100 Hz) and /u/ (F0 = 126 Hz), which are separated by four semitones. We compare the same four methods as in the experiment above to





















Figure 7-11. The ROC curves for the four PDAs based on correntropy-gram,
autocorrelation, narrowed autocorrelation (L = 15), and autocorrelation
of 3rd order cumulants in double vowels segregation experiment.


demonstrate that the correntropy function is able to determine two pitches presented in

the mixture of two vowels.

Fig. 7-7 to Fig. 7-10 present the simulation results. The correlogram method result

shown in Fig. 7-7 only shows one peak corresponding to the pitch of the vowel /a/ while

no indication of the other vowel /u/ at time of 7.9ms is provided. The summary of

correlogram resembles that of the single vowel case in Fig. 7-3. The third order cumulants

method in Fig. 7-8 fails to detect two pitches in the mixture signal. Although there

are two small peaks at 10ms and 7.9ms which correspond to the two pitch periods

respectively, their amplitudes are not large enough to be reliably detected. In Fig. 7-9,

the summary of narrowed autocorrelation functions with L = 15 is able to produce only

one peak at 10ms corresponding to the pitch period of vowel /a/, but there is no peak

at 7.9ms. There are white spots in the low frequency channels in the image of narrowed

autocorrelation functions which are the indications of the second vowel /u/. However,

the amplitude is too small compared with that of vowel /a/ and the information is lost

in the summary plot. A complex neural network oscillator has been used to separate the
















Figure 7-12. The percentage performance of correctly determining pitches for both vowels
    for the proposed PDA based on the correntropy function and a CASA model, as a
    function of F0 difference (semitones).


channels dominated by different voices, and the summaries of individual channels are able

to produce peaks corresponding to different vowels [101].

On the other hand, our method is able to detect two pitches from the mixture of two

vowels. The kernel size σ is set to 0.07 in this experiment. The correntropy-gram in Fig.

7-10 shows a white narrow stripe across high frequency channels at 10ms corresponding

to the pitch period of the vowel /a/. These channels have center frequencies close to

the three formant frequencies of vowel /a/ (F1 = 730 Hz, F2 = 1090 Hz, F3 = 2440 Hz).

The hyperbolic structure can still be seen in the high frequency channels, but the lower

frequency channels have been altered by the presence of vowel /u/. Three high-energy white spots appear along the frequency channels centered at 300 Hz, which is the first formant of vowel /u/. The second white spot, located at 7.9 ms, matches the pitch period of

vowel /u/. In the summary of correntropy-gram, the first peak at 10ms corresponds to the

pitch period of vowel /a/. It is as narrow as the one in the single vowel case in Fig. 7-6.

The second peak appears at 8.2ms which is only 4Hz off the true pitch frequency (126Hz).

It is much less than the 20% gross error pitch determination evaluation criterion [126] or the 10 Hz gross error [99]. The second peak is also much wider than the one at 10 ms. The

























Figure 7-13. Summary of correntropy functions with different kernel sizes for a single
vowel /a/.


amplitude of the peak at 8.2 ms is also smaller than that of the peak at 10 ms since the energy

ratio is 5.2 times higher for vowel /a/. The pitch shift and peak broadening phenomenon

is due to the fact that vowel /a/ dominates the mixture signal and it generates spurious

peaks which blur that of vowel /u/. However, it is remarkable that our method, with

the proper kernel size, is able to detect two pitches while all other algorithms fail in this

experiment. This simulation clearly demonstrates the superiority of our method over the

conventional correlogram, third order cumulants and narrowed correlation approaches for

multipitch determination.

7.3.3 Double Vowels Segregation

To further investigate the performance of the proposed PDA, we generate a set

of three vowels: /a/, /u/ and /i/ using Slaney's Auditory Toolbox. Each vowel is

synthesized on 5 pitches corresponding to differences of 0.25, 0.5, 1, 2 and 4 semitones

from 100 Hz, and the duration is 1 s each. For every mixture of double vowels, one always has the fundamental frequency at 100 Hz, and the other constituent can be any vowel at any pitch value. In total, we have 45 mixtures of different combinations of vowels with different pitch values (3 vowels × 3 vowels × 5 pitches). The detection functions from each of the four methods above have been normalized to between 0 and 1. A threshold varies










between 0 and 1 to decide the peaks. If the difference between the detected pitch and

reference is within a certain tolerance, the right pitch is detected. Since the minimum

distance in this experiment is 0.25 semitone from 100Hz, which is 1.45 Hz, we select the

tolerance to be 0.725Hz. Fig. 7-11 plots the ROC curves for the four pitch determination

algorithms based on correntropy function, autocorrelation, narrowed autocorrelation and

autocorrelation of third order cumulants. It clearly shows that our method outperforms

the other three in double-vowel pitch detection. However, none is able to achieve 100% detection. Notice that the ROC curve for the correntropy function contains many points of zero probability of false alarm, up to 45% correct detection. This is due to the fact that the correntropy function is able to suppress other erroneous peaks which are away from the pitch positions and to concentrate energy around the fundamental frequencies. The performance of autocorrelation and third order cumulants is below a 50% detection rate, irrespective of the number of false alarms generated, which means that most often the second largest peak is a harmonic of the highest pitch. This is not surprising since both functions fail to present two peaks for most mixtures in this experiment.

We also present the vowel identification performance to examine the discriminating

ability of correntropy function at different semitones for the mixture of double vowels.

In the experiment, the threshold is chosen such that the first two peaks are detected.

We compare our results with a computational auditory scene analysis (CASA) model

with a network of neural oscillators [101] in Fig. 7-12. The CASA model outperforms

our method at 0.25, 0.5 and 0.5 semitones of FO differences since it uses a sophisticated

network of neural oscillators to assign different channels from ERB filter bank outputs

to different vowels. Our method is just based on the simple summary function of

correntropy-gram. The closer the two fundamental frequencies of the two vowels become, the harder it is for the correntropy function to produce two distinct peaks corresponding to different

pitches. However, our method obtains results comparable to the CASA model at 2 and 4 semitones of F0 differences. This suggests that our simple model is able to produce similar



















Figure 7-14. Summary of correntropy functions with different kernel sizes for a mixture of
vowels /a/ and /u/.


results for double vowel segregation at 2 and 4 semitones of F0 differences compared to

the sophisticated CASA model. This certainly shows our technique is very promising.

7.3.4 Benchmark Database Test

We test our pitch determination algorithm with Bagshaw's database [125]. It contains 7298 male and 16948 female speech samples. The groundtruth pitch is estimated at

reference points based on laryngograph data. These estimates are assumed to be equal

to the perceived pitch. The signal is segmented into frames of 38.4 ms duration centered at the

reference points in order to make the comparisons between different PDAs fair. The

sampling frequency is 20kHz. The kernel size is selected according to Silverman's rule for

different segments. We use Eq. (7-7) to calculate the normalized correntropy functions such that the summary correntropy function at zero lag is unity. Since the pitch range for male speakers is 50-250 Hz and 120-400 Hz for female speakers, the PDA searches for local maxima at lags of 2.5 ms and above in the summary correntropy function. We set the

threshold to be 0.3 by trial and error so that every local maximum which exceeds 0.3 will

be detected as a pitch candidate.

Table 7-1 summarizes the performance of various PDAs which are taken from [126],

[127]. The performance criterion is the relative number of gross error. A gross error occurs










Table 7-1. Gross error percentage of PDAs evaluation
                   Male               Female
PDA           High (%)  Low (%)  High (%)  Low (%)  Weighted Mean (%)
HPS             5.34     28.20     0.46     1.61        11.54
SRPD            0.62      2.01     0.39     5.56         4.95
CPD             4.09      0.64     0.61     3.97         4.63
FBPT            1.27      0.64     0.60     3.35         3.48
IPTA            1.40      0.83     0.53     3.12         3.22
PP              0.22      1.74     0.26     3.20         3.01
SHR             1.29      0.78     0.75     1.69         2.33
SHAPE           0.95      0.48     1.14     0.47         1.55
eSRPD           0.90      0.56     0.43     0.23         0.90
Correntropy     0.71      0.42     0.35     0.18         0.71


when the estimated fundamental frequency is more than 20% off the true pitch value.

The percent gross errors by gender and by lower or higher pitch estimates with respect

to the reference are given in Table 7-1. The weighted gross error is calculated by taking

into account the number of pitch samples for each gender. It clearly shows that for this

particular database correntropy based PDA outperforms others.

7.4 Discussions

In all the previous experiments, a specific kernel size σ in the Gaussian kernel Eq. (3-3) needs to be selected by the user (a free parameter). The kernel size plays an important role in the performance of our method, since it determines the scale at which the similarity is measured. It has been shown that the kernel size controls the metric of the transformed signal in the RKHS [65]. If the kernel size is set too large, the correntropy function approaches the conventional correlation function and fails to detect any nonlinearity and higher order statistics intrinsic to the data; on the other hand, if the kernel size is too small, the correntropy function loses its discrimination ability. One practical way to select the kernel size is given by Silverman's rule [120]


σ = 0.9 A N^(-1/5),










where A is the smaller of the standard deviation of the data samples and the data
interquartile range scaled by 1.34, and N is the number of data samples.
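A direct transcription of this rule (our sketch; `x` holds the samples of one analysis segment):

```python
import numpy as np

def silverman_bandwidth(x):
    """Silverman's rule of thumb: sigma = 0.9 * A * N**(-1/5), with A the
    smaller of the sample standard deviation and the interquartile
    range divided by 1.34."""
    x = np.asarray(x, dtype=float)
    iqr = np.subtract(*np.percentile(x, [75, 25]))   # q75 - q25
    A = min(x.std(ddof=1), iqr / 1.34)
    return 0.9 * A * len(x) ** (-1 / 5)
```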

To illustrate the effect of different kernel sizes, we simulate the summary of

correntropy functions for the same experimental setup as in Sec. 8.2 with different kernel sizes in Fig. 7-13 and Fig. 7-14. It can be seen that if the kernel size is large, σ = 1 here, the summaries of the correntropy functions approach those of the correlation functions shown in Fig. 7-3 and Fig. 7-7. As the kernel size approaches the value given by Silverman's rule, σ = 0.01 for the single vowel /a/ case and σ = 0.07 for the mixture of /a/ and /u/, the summary of the correntropy functions starts to present a large, narrow peak corresponding to the pitch of vowel /a/ and to show the other vowel /u/. If the kernel size is too small, σ = 0.001 here, the summary of the correntropy functions loses its ability to present the two vowels. This is shown in the bottom plot of Fig. 7-14.

7.5 Conclusion

A novel pitch determination algorithm is proposed based on the correntropy function. The pitch estimator computes the correntropy functions for each channel of an ERB filter bank, and adds them across all the channels. Simulations on single and double vowel cases show that the proposed method exhibits much better resolution than the conventional correlation function, the third order cumulants method and the narrowed correlation function in single and double pitch determination. This suggests that correntropy can better discriminate pitch when two different speakers speak into the same microphone, which is essential in computational auditory scene analysis (CASA). Although these results are preliminary and much further work is needed to evaluate the methods, this technique seems promising for CASA. The automatic selection of the kernel size, or a multiple-kernel-size analysis, needs to be further investigated to automate the pitch determination algorithm. Future work also includes incorporating correntropy-gram channel selection to enhance the discriminating ability of the proposed method in multiple pitch determination. A benchmark database test of various PDAs shows that the proposed PDA outperforms some others










with available results for this dataset. Since correntropy creates many different harmonics

of each resonance present in the original time series due to the nonlinearity, it may also be

useful for perceptual pitch determination.










CHAPTER 8
CORRENTROPY COEFFICIENT AS A NOVEL SIMILARITY MEASURE

In this chapter, we apply the proposed correntropy coefficient (4-27) to characterize

the similarity between multi-channel signals. Preliminary experiments with simulated data

and multichannel electroencephalogram (EEG) signals during behavior studies elucidate

the performance of the new measure versus the well established correlation coefficient.

8.1 Introduction

Quantification of dynamical interdependence in multi-dimensional complex systems

with spatial extent provides a very useful insight into their spatio-temporal organization.

In practice, the underlying system dynamics are not accessible directly. Only the

observed time series can help decide whether two time series collected from the system

are statistically independent or not and further elucidate any hidden relationship

between them. Extracting such information becomes more difficult if the underlying

dynamical system is nonlinear or the couplings among the subsystems are nonlinear and

non-stationary.

There has been extensive research aimed at detecting the underlying relationships

in multi-dimensional dynamical systems. The classical methodology employs a linear

approach, in particular, the cross correlation and coherence analysis [128]. Cross

correlation measures the linear correlation between two signals in the time domain,

while the coherence function specifies the linear associations in the frequency domain by the ratio of the squared cross-spectral density to the product of the two auto-spectra.

There have been several extensions of correlation to more than two pairs of time series

such as directed coherence, directed transfer functions and partial directed coherence [129].

Unfortunately, linear methods only capture linear relationships between the time series,

and might fail to detect nonlinear interdependencies between the underlying dynamical

subsystems.










Nonlinear measures include mutual information and state-space methods. One

technique is the generalized mutual information function [130]. However, a large quantity

of noise-free stationary data is required to estimate these measures based on information

theory, which restricts their applications in practice. Another method is the phase

synchronization where the instantaneous phase using Hilbert transforms is computed

and interdependence is specified in terms of time-dependent phase locking [131]. The

state-space methodologies include similarity-index and synchronization likelihood. The

similarity-index technique and its modifications compute the ratio of average distances

between index points, their nearest neighbors and their mutual nearest ones [132, 133].

Stam et al. proposed the synchronization likelihood to offer a straightforward normalized

estimate of the dynamical coupling between interacting systems [134]. There are several

drawbacks associated with these techniques based on state space embedding. Estimating

the embedding dimension of time series corrupted by measurement noise for a valid

reconstruction, searching a suitable neighborhood size and finding a constant number

of nearest neighbors are a few of many constraints that severely affect the estimation

accuracy.

In this chapter, we use the correntropy coefficient Eq. (4-27) as a novel similarity

measure to quantify the inter-dependencies among multi-channel signals. In practice, the

estimate of the correntropy coefficient Eq. (4-28) is calculated between the two time series of interest to measure their similarity.
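In code, a sample estimator of this coefficient can be sketched as follows (our minimal NumPy sketch with a Gaussian kernel; the names are illustrative and not taken from the implementation used in this work):

```python
import numpy as np

def centered_cross_correntropy(x, y, sigma):
    """Sample centered cross-correntropy: mean kernel value over paired
    samples minus the mean kernel value over all sample pairs."""
    k = lambda a, b: np.exp(-(a - b) ** 2 / (2 * sigma ** 2))
    return np.mean(k(x, y)) - np.mean(k(x[:, None], y[None, :]))

def correntropy_coef(x, y, sigma):
    """Correntropy coefficient: centered cross-correntropy normalized by
    the centered auto-terms, so that correntropy_coef(x, x) = 1."""
    return (centered_cross_correntropy(x, y, sigma)
            / np.sqrt(centered_cross_correntropy(x, x, sigma)
                      * centered_cross_correntropy(y, y, sigma)))
```

Identical series give exactly one, while independent series give values near zero that shrink with the sample size.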

8.2 Experiments

We test the correntropy coefficient on a simulated data set and real-world EEG signals

in the experiments.

8.2.1 Two Unidirectionally Coupled Hénon Maps

First, we test two unidirectionally coupled Hénon maps, which have been extensively used to validate many synchronization measures in the literature [133, 134]. We apply the correntropy coefficient to detect the nonlinear interdependence of two unidirectionally










coupled Hénon maps, defined as

x1(n + 1) = 1.4 - x1²(n) + b_x x2(n)    (8-1)
x2(n + 1) = x1(n)                       (8-2)

for the driver, represented as system X, and

y1(n + 1) = 1.4 - [C x1(n) + (1 - C) y1(n)] y1(n) + b_y y2(n)    (8-3)
y2(n + 1) = y1(n)                       (8-4)

for the response, represented as system Y. System X drives system Y with nonlinear coupling strength C, which ranges from 0 to 1, with 0 being no coupling and 1 being complete coupling. The parameters b_x and b_y are both set to 0.3, the canonical value for the Hénon map, when analyzing identical systems, and to 0.3 and 0.1 respectively for nonidentical ones. For each coupling strength, we discard the first 10000 iterates of the time series as transient and use the next 500 data points for the experiments. The correntropy coefficient η is calculated between the first component of system X, x1, and the first component of system Y, y1.
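This simulation setup can be reproduced with a short script (our sketch, implementing Eqs. (8-1)-(8-4) directly; the initial conditions and seed are illustrative):

```python
import numpy as np

def coupled_henon(C, bx=0.3, by=0.3, n_transient=10000, n_keep=500, seed=0):
    """Unidirectionally coupled Henon maps: X drives Y with strength C.
    Returns the first components x1 and y1 after discarding the transient."""
    rng = np.random.default_rng(seed)
    x1, x2 = rng.uniform(0, 1, 2)
    y1, y2 = rng.uniform(0, 1, 2)
    xs, ys = [], []
    for n in range(n_transient + n_keep):
        x1_next = 1.4 - x1 ** 2 + bx * x2                       # Eq. (8-1)
        y1_next = 1.4 - (C * x1 + (1 - C) * y1) * y1 + by * y2  # Eq. (8-3)
        x1, x2 = x1_next, x1                                    # Eq. (8-2)
        y1, y2 = y1_next, y1                                    # Eq. (8-4)
        if n >= n_transient:
            xs.append(x1)
            ys.append(y1)
    return np.array(xs), np.array(ys)
```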

In the following simulations, we aim to address these questions: (1) Does the correntropy coefficient increase when the coupling strength C between X and Y increases, for both identical and nonidentical systems? (2) How robust is the correntropy coefficient to different levels of measurement noise in the driver, the response, or both systems? (3) How sensitive is the correntropy coefficient to a time-dependent sudden change in the dynamics of the interacting systems due to the coupling strength? (4) Can the correntropy coefficient detect nonlinear coupling between driver and response? (5) How is the correntropy coefficient affected by the kernel size and the data length?

8.2.1.1 Variation of Correntropy Coefficient with Coupling Strength

First, in Fig. 8-1, we plot the averaged correntropy coefficient η as a function of the coupling strength C for the identical map (b_x = b_y = 0.3) and the nonidentical map (b_x = 0.3












Figure 8-1. Averaged correntropy coefficient for unidirectionally coupled identical (a) and
    nonidentical (b) Hénon maps.


and b_y = 0.1) over 10 realizations with different initial conditions. The error bars denote the standard deviation over the different realizations. Fig. 8-1(a) shows the identical map, where the kernel size used in the Gaussian kernel has been chosen to be 0.001 according to Silverman's rule Eq. (8-5). For identical chaotic systems, perfect synchronization can be generated with a sufficient degree of coupling [135]. This can be seen from the fact that the correntropy coefficient η = 1 for C > 0.7 in Fig. 8-1(a), indicating that perfect synchronization occurs between the two coupled systems. The critical threshold C = 0.7 corresponds to the point where the maximum Lyapunov exponent of the response system becomes negative and identical synchronization between the systems takes place [133]. On the other hand, the correntropy coefficient η = 0 for C < 0.7, suggesting no synchronization even though the two systems are weakly coupled. Similar results have been reported using other nonlinear interdependence measures in [133, 136, 137].

Fig. 8-1(b) shows the result for unidirectionally coupled nonidentical systems (b_x = 0.3, b_y = 0.1). The kernel size is set to 0.4. In this case, identical synchronization is not possible and the driver has a higher dimension than the response. The sharp increase of the correntropy coefficient at the point C = 0.7 seen in the identical synchronization situation cannot be observed here. But the correntropy coefficient shows a consistent monotonic












Figure 8-2. Influence of different noise levels on the correntropy coefficient for unidirectionally coupled identical Hénon maps.


increase with respect to coupling strength except for the region 0.1 < C < 0.3. The local

hump in this region can also be seen in [133, 134, 137]. It has been suggested that the cause is most likely the local minimum of the maximum Lyapunov exponent [133, 138] in

this region.

8.2.1.2 Robustness Against Measurement Noise

A good interdependence measure should be robust against noise. Next, we analyze

the robustness of correntropy coefficient when time series are contaminated with noise.

There are two types of noise in general: intrinsic noise and additive measurement noise.

We only consider measurement noise here which does not perturb the inherent dynamics

of systems. Independent realizations of white noise are added to driver, response and both

systems separately. The signal-to-noise (SNR) ratio is set to be 10dB and 1dB respectively

to test the performance of correntropy coefficient at different noise intensity. 500 data

samples are used to calculated the correntropy coefficient, averaged over 20 realizations.

Fig. 8-2 plots the correntropy coefficient for unidirectionally coupled identical Hiinon

map (b, = b, = 0.3) with white noise in response, driver and both systems. Kernel size

is chosen to be 0.04 for SNR = 10dB and 0.08 for SNR = IdB respectively. Notice that

the correntropy coefficient curves with noise become less smooth than that of noise-free











Coupling Strength C


Figure 8-3. Influence of different noise levels on the correntropy coefficient for unidirectionally coupled nonidentical Hénon maps.



one, but the sharp increase at C = 0.7 is still obvious for both noise intensities. When the noise level is high (SNR = 1 dB), the correntropy coefficient curve is more jagged than in the 10 dB case; however, it can still detect increases in the coupling strength. The figure also suggests that whether the noise is added to the driver, the response, or both systems, the performance of the correntropy coefficient is very similar. Fig. 8-3 presents the results for the nonidentical Hénon map (b_x = 0.3 and b_y = 0.1) with white noise in the response, the driver, and both systems. The kernel size is set to 0.05 for SNR = 10 dB and 0.2 for SNR = 1 dB respectively. The values of the correntropy coefficients at different coupling strengths are averaged over 20 independent realizations. For both noise levels, the correntropy coefficients consistently increase with respect to the coupling strength. Also, noise in the response, the driver, or both systems does not make a big difference. Notice that the local hump around the region 0.2 < C < 0.4 is still observable in all cases. These results show

that the correntropy coefficient is fairly robust even in the case of considerably noisy data.

8.2.1.3 Sensitivity to Time-dependent Dynamical Changes

Next we test how sensitive the correntropy coefficient is to a time-dependent sudden change in the dynamics of the interacting systems due to the coupling strength. In the experiment, a change in the coupling strength can cause a sudden change in the dynamics of the interacting












Figure 8-4. Time dependence of the correntropy coefficient for identical (a) and nonidentical (b) coupled Hénon maps.


systems, which basically generates non-stationarity in the time series. To study such transient dynamical phenomena, both identical (b_x = b_y = 0.3) and nonidentical (b_x = 0.3 and b_y = 0.1) Hénon maps are considered here. The dynamical systems are coupled only during a single epoch and otherwise uncoupled in both cases [134, 137]. We set the coupling strength C = 0 for n < 10150 and n > 10250, and C = 0.8 for 10150 ≤ n ≤ 10250. Only 400 data samples are plotted after the first 10000 data points are discarded as transient. The sliding window used to compute the correntropy coefficient is chosen to contain 8 data samples. The kernel size is set to 0.2 for the identical map and 0.3 for the nonidentical map. The results are averaged over 20 independent realizations with different initial conditions ranging from 0 to 1. Fig. 8-4 plots the correntropy coefficient curves for the identical and nonidentical maps. In the uncoupled regions, η fluctuates around a 0.01 baseline for the identical map and 0.001 for the nonidentical map. A sharp and clear increase occurs at t = 150 when the 0.8 coupling strength between systems X and Y is introduced, and there is a sharp and clear decrease in η, falling back to the baseline level, when the coupling strength between the two systems reduces to zero at t = 250. The interval where η is noticeably higher than the baseline level matches the coupling interval nicely. This phenomenon is observed in both the identical and nonidentical Hénon maps. Therefore, the correntropy coefficient
















Figure 8-5. Effect of different kernel widths on the correntropy coefficient for unidirectionally coupled identical Hénon maps, plotted versus kernel width σ and coupling strength C.


is potentially able to detect sudden changes in the coupling between two interacting

dynamical systems with a high temporal resolution, which makes this measure suitable for

non-stationary data sets.
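The sliding-window procedure just described can be sketched as follows, using the sample correntropy coefficient (the centered cross-correntropy normalized by the two centered auto-correntropies, with a Gaussian kernel). The estimator form and the helper names are assumptions for illustration, not the dissertation's exact code:

```python
import numpy as np

def corr_coef_eta(x, y, sigma):
    """Sample correntropy coefficient with a Gaussian kernel: centered
    cross-correntropy normalized by the centered auto-correntropies."""
    k = lambda d: np.exp(-d ** 2 / (2 * sigma ** 2))
    def centered(a, b):
        # mean kernel over paired samples minus the mean over all cross pairs
        return k(a - b).mean() - k(a[:, None] - b[None, :]).mean()
    return centered(x, y) / np.sqrt(centered(x, x) * centered(y, y))

def sliding_eta(x, y, sigma, win=8):
    """Correntropy coefficient over a sliding window, as used above to
    track a time-dependent change in the coupling strength."""
    return np.array([corr_coef_eta(x[i:i + win], y[i:i + win], sigma)
                     for i in range(len(x) - win + 1)])
```

With a window of 8 samples, as in the text, the estimate tracks coupling changes at high temporal resolution at the cost of larger estimator variance.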

8.2.1.4 Effect of Kernel Width

In the previous section we discussed the importance of the kernel width for the performance of the correntropy coefficient, since it is a parametric measure. Here we demonstrate this on unidirectionally coupled identical and non-identical Hénon maps. Fig. 8-5 shows the correntropy coefficient curves for different kernel widths for the unidirectionally coupled identical Hénon map. When the kernel width is chosen too large (σ = 0.1, 0.5, 1 in this case), the correntropy coefficient produces erroneous results in the unsynchronized region 0 < C < 0.7. The results for the non-identical Hénon map are presented in Fig. 8-6. It can be seen that if the kernel width is too small, the increase of the correntropy coefficient with respect to the coupling strength is not as pronounced as for a suitable kernel width (σ = 0.4 here). When the kernel width is too large, the results of the correntropy coefficient approach those of the conventional correlation coefficient. In both figures, we see that the correntropy coefficient can either increase or decrease as the kernel width increases. These observations are consistent with our theoretical analysis in the previous section.















Figure 8-6. Effect of different kernel widths on the correntropy coefficient for unidirectionally coupled non-identical Hénon maps.


8.2.1.5 Ability to Quantify Nonlinear Coupling

To demonstrate that the correntropy coefficient η is able to detect the nonlinear coupling between systems X and Y, we first compare our measure with the conventional correlation coefficient and one of the well-known nonlinear synchronization measures, the similarity index [133], on identical Hénon maps. Fig. 8-7 shows the correlation coefficient, the correntropy coefficient, and the similarity index as functions of coupling strength C. The correntropy coefficient generates exactly the same result as the similarity index. On the other hand, the conventional correlation coefficient performs erratically in the unsynchronized region C < 0.7. This clearly demonstrates that the correntropy coefficient outperforms the correlation coefficient in the characterization of nonlinear coupling between two dynamical systems. Compared to the similarity index, the correntropy coefficient has the advantage of avoiding the estimation of the embedding dimension, the choice of nearest neighbors, and other problems associated with state-space embedding methods [133, 139]. The computational complexity of the correntropy coefficient is still manageable, and the kernel size is easy to estimate.

We also use multivariate surrogate data to further investigate whether the correntropy coefficient is sensitive to nonlinear coupling. Prichard et al. introduced the surrogate method











Figure 8-7. Comparison of the correlation coefficient, the correntropy coefficient, and the similarity index as functions of coupling strength C.



in [140]. This method has been applied to detect nonlinear structure in time series. Basically, to generate multivariate surrogate data, the Fourier transform is first applied to each of the time series, then a common random number is added to each of the phases, and an inverse Fourier transform is applied. The resulting time series have the same power spectra and cross power spectra as the original time series, but any nonlinear coupling among the time series has been destroyed. In the simulation, we use the TISEAN package [141] to generate 19 realizations of the surrogate data for the time series x1(t) in Eq. (8-1) and y1(t) in Eq. (8-3) at each coupling strength for the unidirectionally coupled non-identical Hénon map. We then compute the correntropy coefficient for both the original and the surrogate data with respect to the different coupling strengths. Fig. 8-8 plots the correntropy coefficient curve for the original data and the mean value of the 19 correntropy coefficients for the surrogate data, with the corresponding maximal and minimal values as error bars.

To quantify the significance level, we calculate the Z-score as Z = |v_orig − μ_surr| / σ_surr, where v_orig is the correntropy coefficient value for the original data, and μ_surr and σ_surr are the mean and the standard deviation for the surrogate data, respectively. Table 8-1 presents the

Z-score values for different coupling strengths. With the exception of C = 0.2 and 0.4, the Z-score values are significantly larger than 1.96, which means the nonlinear coupling










Figure 8-8. Comparison of the correntropy coefficient for the original data and the surrogate data for the unidirectionally coupled non-identical Hénon map, plotted versus coupling strength C.


has been detected with a probability p < 0.05. These results clearly demonstrate that

the correntropy coefficient is sensitive to the nonlinearity of the dependence between two

coupled systems.
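The surrogate test above can be sketched as follows. The text used the TISEAN package; the phase randomization below is a minimal stand-in (the same random phases are applied to both spectra, preserving power and cross-power spectra), and `stat` stands for any dependence measure such as the correntropy coefficient. All names here are illustrative assumptions:

```python
import numpy as np

def multivariate_surrogate(x, y, rng):
    """Phase-randomized surrogates: the SAME random phase sequence is
    applied to both spectra, so power and cross-power spectra are
    preserved while any nonlinear coupling is destroyed."""
    assert len(x) == len(y)
    Xf, Yf = np.fft.rfft(x), np.fft.rfft(y)
    ph = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, len(Xf)))
    ph[0] = ph[-1] = 1.0  # keep DC (and Nyquist) components real
    return np.fft.irfft(Xf * ph, n=len(x)), np.fft.irfft(Yf * ph, n=len(y))

def z_score(stat, x, y, n_surr=19, seed=0):
    """Z = |v_orig - mean_surr| / std_surr for a dependence statistic."""
    rng = np.random.default_rng(seed)
    v_orig = stat(x, y)
    v_surr = np.array([stat(*multivariate_surrogate(x, y, rng))
                       for _ in range(n_surr)])
    return abs(v_orig - v_surr.mean()) / v_surr.std()
```

A Z-score well above 1.96 then rejects, at roughly the p < 0.05 level, the null hypothesis that the measured dependence is explained by linear structure alone.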

8.2.2 EEG Signals

In the second experiment, we applied the correntropy coefficient to real EEG signals. The electrical potentials on the surface of the scalp of a human subject were measured and recorded with the NeuroScan EEG system (NeuroScan Inc., Compumedics, Abbotsford, Australia). A 64-channel cap was used with electrode locations according to the extended international 10/20 system and with a linked-earlobe reference. Horizontal and vertical electrooculogram (HEOG and VEOG) signals were also recorded for artifact rejection using two sets of bipolar electrodes. The data sampling rate was fixed at 1000Hz and the online band-pass filter range was set to be maximally wide, between 0.05Hz and 200Hz. Subjects were presented repeatedly (200 times) with uni-modal auditory and visual stimuli delivered in the central visual and auditory spaces simultaneously and with the same strength to the left/right eyes and ears, as well as with simultaneous cross-modal combinations. For the purpose of this study, only the uni-modal data was used. The visual










Table 8-1. Z-score for the surrogate data

C        0      0.1    0.2    0.3    0.4    0.5    0.6    0.7     0.8     0.9    1
Z-score  6.943  4.472  1.622  4.585  0.773  7.658  9.908  16.699  12.268  22.0   19.895


stimuli consisted of 5x5 black and white checkerboards presented for 10ms, while the auditory stimuli were 2000Hz tones with durations of 30ms. The time interval between the stimuli in any of the experimental conditions was random between 1500ms and 2000ms.

Following standard eye-movement artifact rejection procedures and segmentation into single epochs with alignment at the onset of the stimuli, all artifact-free epochs were averaged, normalized to zero mean and unit variance, and low-pass filtered at 0-40Hz for further analysis. We then applied the correntropy coefficient to the averaged data to quantify the bilateral synchronization, or couplings, among the corresponding sensory areas of the brain. In order to test whether the correntropy coefficient was able to detect any nonlinear couplings in the EEG signals, the results were compared to the conventional correlation coefficient. A window of 20ms of data was used to calculate both measures, corresponding to the duration of a single dipole activation in the cortex [142]. The kernel width σ of the Gaussian kernel used in the correntropy coefficient was chosen to be 0.4.

Fig. 8-9 (a) and (b) show plots of the correlation and correntropy coefficients for the auditory areas of the brain as a function of time after the subject was exposed only to the audio stimuli. Several bilaterally-symmetrical pairs of electrodes were selected in the vicinity of the auditory cortex, so that both measures were computed for the pairs FC5-FC6, FC3-FC4, C5-C6, C3-C4, CP5-CP6, CP3-CP4. As shown in Fig. 8-9 (a) and (b), there are two distinct time intervals, 0-270ms and 270-450ms, in the auditory response. Both the correlation and correntropy coefficients drop at 270ms. This suggests that both measures are able to detect the changes in inter-hemispheric synchronization of the auditory regions. However, as the electrodes are chosen in different locations away from the auditory cortex, it is expected that during the synchronization phase (0-270ms) the synchronization measures for different pairs should be different. Fig. 8-9 (a) shows that
















Figure 8-9. Comparison of the correlation coefficient and the correntropy coefficient in synchronization detection over the auditory cortex for the auditory-stimulus EEG signal (electrode pairs FC5-FC6, FC3-FC4, C5-C6, C3-C4, CP5-CP6, CP3-CP4; time in ms).


the correlation coefficients for all 6 pairs are grouped together and are unable to detect the difference in activation, while Fig. 8-9 (b) shows that the correntropy coefficient can successfully differentiate the synchronization strength among different areas of the cortex above the left and right auditory regions. Notably, as expected from previous studies, pairs FC5-FC6 and FC3-FC4 exhibit stronger synchronization strength than the others, while the more posterior pairs CP5-CP6 and C5-C6 have weaker synchronization strength. Also, the synchronization patterns reveal lateral similarity in time for the pairs FC5-FC6 and FC3-FC4, for CP5-CP6 and C5-C6, and for CP3-CP4 and C3-C4. Furthermore, the correntropy coefficients for pairs C5-C6, C3-C4 and CP3-CP4 peak simultaneously at 90ms, which corresponds to the first mean global field power (MGFP) peak of the EEG signal.

















Figure 8-10. Comparison of the correlation coefficient and the correntropy coefficient in the characterization of synchronization over the occipital cortex for the visual-stimulus EEG signal (electrode pairs O1-O2, PO7-PO8, PO5-PO6, P7-P8, P5-P6, P3-P4; time in ms).







These differences indicate that the correntropy coefficient is more sensitive and is able to




extract more information as a synchronization measure than the conventional correlation

coefficient.

We also compared both measures when applied to the visual cortical areas. The

measures are presented in Fig. 8-10 as a function of time when the subject is exposed

only to visual stimuli. Again, a window size of 20ms data is used to compute both

the correlation and the correntropy coefficients, and the kernel width σ was again set to 0.4 as in the previous case. We also chose the bilaterally symmetrical pairs of electrodes O1-O2, PO7-PO8, PO5-PO6, P7-P8, P5-P6 and P3-P4. In Fig. 8-10 (b) the correntropy coefficients for all pairs except O1-O2 show similar synchronization patterns. The










correntropy coefficient increases at first, then reaches a peak around 275ms, after which

it drops to lower levels. The maximum values of the correntropy coefficients around

275ms correspond to the peak P1 in the visual evoked potential [143]. As expected, the synchronization between occipital channels O1 and O2 has the maximum strength and stays high until it decreases around 350ms. Thus the correntropy coefficient shows that the extra-striate visual networks become increasingly recruited and synchronized until about 275ms after the stimulus onset, while the primary visual cortex is highly synchronous for a longer period of time, until about 350ms after onset. The channel pair P7-P8 exhibits the weakest synchronization strength since it is located the farthest away from the primary visual cortex compared to the other electrode pairs. On the other hand, the correlation coefficients for most channel pairs group together and display the same level of synchronization until a sharp decrease at around 500ms (except for P7-P8).

The synchronization between P7 and P8 has irregular patterns with a local minimum

around 200ms. This comparison clearly demonstrates that, in this case too, the correntropy coefficient outperforms the correlation coefficient in the quantification of the EEG signal coupling between the bilateral occipital regions of the brain in response to visual stimuli.

8.3 Discussions

In this section, we explore the correntropy coefficient in more detail, both in theoretical analysis and in practical implementation: the geometrical interpretation, connections to information theory, the important role of the kernel size, and the scaling of the time series.

8.3.1 Kernel Width

The width of the Gaussian kernel function is normally called the kernel width or kernel size. The kernel size should be considered a scale parameter controlling the metric in the projected space. From the geometrical perspective, the kernel size decides the length of each of the nonlinearly transformed vectors and the angle between them in the RKHS, because ||Ψ(x)|| = √κ(0), which for the Gaussian kernel equals (2πσ²)^(-1/4). It can easily be seen that the vector length monotonically decreases as the kernel width increases, while the centered crosscorrentropy exhibits a more complex pattern since it also depends on the nature of the relationship between the two random variables. The kernel width controls the ability of the centered correntropy to capture the nonlinear couplings intrinsic to the signals. If the kernel width is too large, the correntropy coefficient loses the ability to detect the nonlinearity and approaches the conventional correlation coefficient; when the kernel width is too small, the nonlinear transformations Φ in Eq. (4-3) and Ψ in Eq. (4-29) cannot interpolate between data points. This can also be verified by applying the Taylor series expansion to the Gaussian kernel, where the kernel width appears as the weighting parameter on both the second and the higher order moments. The effect of the kernel size on the moment of order 2k is scaled by the power 2k, where k is the moment order. When the kernel size is too large, the contribution of the higher order statistics decays rapidly and the centered crosscorrentropy approaches the conventional crosscovariance function; on the other hand, when the kernel size is too small, the effect of the higher order moments overweighs the second order one. An appropriate kernel size should maintain the balance of second order and higher order statistics of the signal.
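The Taylor-series argument can be made explicit. Assuming the normalized Gaussian kernel κ_σ(u) = (2πσ²)^(-1/2) exp(−u²/(2σ²)) (the convention assumed here), the expected correntropy expands as

```latex
\mathbb{E}\!\left[\kappa_\sigma(X-Y)\right]
  = \frac{1}{\sqrt{2\pi}\,\sigma}
    \sum_{k=0}^{\infty} \frac{(-1)^k}{2^k\,k!\;\sigma^{2k}}\,
    \mathbb{E}\!\left[(X-Y)^{2k}\right],
```

so the moment of order 2k is weighted by σ^(-2k): a large σ suppresses the higher order terms and leaves essentially second order statistics, while a small σ inflates them, which matches the balance described above.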

Therefore a good choice of the kernel parameter is crucial for obtaining good performance with the proposed method. There are two ways of handling the selection of the kernel size. One is to seek an optimal kernel size. Cross-validation has been one of the most widely used methods in the machine learning field for choosing an appropriate kernel width. Other approaches include Silverman's rule of thumb, which is given by [120]:

σ = 0.9 A N^(-1/5),    (8-5)

where A is the smaller of the standard deviation of the data samples and the data interquartile range scaled by 1.34, and N is the number of data samples. Silverman's rule is an easy way to choose a good kernel size, hence we will set the kernel width










according to Eq. (8-5) throughout the paper. Alternatively, the kernel size can be thought of as a scale parameter that provides different looks at the dependence among the variables. Just as with wavelets, the kernel size makes it possible to analyze the dependencies at different resolutions. Since many real world signals are very complex, this multi-resolution analysis may better elucidate the relationships.
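Eq. (8-5) is straightforward to implement; a minimal sketch (the function name is illustrative):

```python
import numpy as np

def silverman_bandwidth(x):
    """Silverman's rule of thumb, Eq. (8-5): sigma = 0.9 * A * N**(-1/5),
    where A = min(sample std, interquartile range / 1.34)."""
    x = np.asarray(x, dtype=float)
    iqr = np.subtract(*np.percentile(x, [75, 25]))  # p75 - p25
    A = min(x.std(ddof=1), iqr / 1.34)
    return 0.9 * A * len(x) ** (-0.2)
```

For roughly Gaussian data the two candidates for A nearly coincide; the IQR term protects the estimate against heavy tails and outliers.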

8.3.2 Scaling Effect

Because the transformations Φ in Eq. (4-3) and Ψ in Eq. (4-29) are nonlinear, any scaling of the original random variables results in a different performance of the correntropy coefficient. Unlike the conventional correlation coefficient, which is insensitive to the amplitude scaling of the signals and only measures the similarity of the signals through time, the correntropy coefficient measures both the time and amplitude similarities between two signals. Therefore, in certain applications, it is vital to normalize both signals before applying the correntropy coefficient. For example, the amplitudes of EEG signals are highly dependent on the different electrode impedances, so it is important to normalize all channels of EEG signals to the same dynamic range.
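As one reasonable normalization consistent with this remark (the text does not prescribe a specific scheme), each channel can be z-scored so that all channels share the same dynamic range:

```python
import numpy as np

def normalize_channels(eeg):
    """Z-score each channel (one row per channel) to zero mean and unit
    variance, equalizing the dynamic range across channels before the
    correntropy coefficient is applied."""
    eeg = np.asarray(eeg, dtype=float)
    mu = eeg.mean(axis=1, keepdims=True)
    sd = eeg.std(axis=1, keepdims=True)
    return (eeg - mu) / sd
```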

8.4 Conclusion

In this chapter, we applied the correntropy coefficient as a novel nonlinear interdependence measure. Through a positive definite kernel function, the correntropy coefficient implicitly maps the original random variables or time series into an infinite dimensional reproducing kernel Hilbert space, uniquely induced by the centered crosscorrentropy function, and essentially computes the cosine of the angle between the two transformed vectors. Orthogonality in this RKHS corresponds to independence between the original random variables. Comparisons between the correntropy coefficient and the conventional correlation coefficient, on two simulated unidirectionally coupled Hénon map time series and on EEG signals collected from sensory tasks, clearly illustrate that the correntropy coefficient is able to extract more information than the correlation coefficient in the quantification of synchronization between interacting dynamical systems.










CHAPTER 9
CONCLUSIONS AND FUTURE WORK

9.1 Conclusions

In this dissertation, we have analyzed the recently proposed correntropy function [65] and presented a new centered correntropy function from time-domain and frequency-domain approaches. It is demonstrated that correntropy and centered correntropy functions not only capture the time and space structures of signals, but also partially characterize the higher order statistical information and nonlinearity intrinsic to signals. Correntropy and centered correntropy functions have rich geometrical structures. Correntropy is positive definite and centered correntropy is non-negative definite, hence by the Moore-Aronszajn theorem they uniquely induce reproducing kernel Hilbert spaces. Correntropy and centered correntropy functions combine the data dependent expectation operator and data independent kernels to form another data dependent operator. One perspective on correntropy and centered correntropy functions is to treat them as "generalized" correlation and covariance functions on nonlinearly transformed random signals via the data independent kernel functions. Those nonlinearly transformed signals lie on a sphere in the RKHS induced by the kernel functions if isotropic kernel functions are used. The other perspective is to work directly with the RKHS induced by the correntropy and centered correntropy functions. The nonlinearly transformed signals in this RKHS are no longer stochastic but rather deterministic. The RKHS induced by the correntropy and centered correntropy functions includes the expectation operator as embedded vectors. The two views further our understanding of correntropy and centered correntropy functions from a geometrical perspective. The two reproducing kernel Hilbert spaces, induced by kernel functions and correntropy functions respectively, represent stochastic and deterministic functional analysis.

The correntropy dependence measure is proposed based on the correntropy coefficient

as a novel statistical dependence measure. The new measure satisfies all the fundamental










desirable properties postulated by Rényi. We applied the correntropy concept to pitch determination and nonlinear component analysis. The correntropy coefficient is also employed as a novel similarity measure to quantify the inter-dependencies of multi-channel signals.

9.2 Future work

As we have provided a new methodology to explicitly construct the reproducing kernel Hilbert space induced by data independent kernel functions, we would like to apply the same methodology to construct the RKHS induced by the data dependent correntropy function directly. This would allow us to analyze the functional basis that constitutes the RKHS. Moreover, if we can obtain an explicit expression for those functionals, then we can develop algorithms that are not restricted to the inner product. We have only tested our correntropy PCA algorithms on artificial data sets; it is desirable to apply the algorithms to real data sets. Future work might include

1. applying the parametric correntropy coefficient to detecting nonlinear coupling in EEG signals,

2. investigating the relationship between higher order statistics and correntropy,

3. applying the correntropy pitch determination algorithm to multiple pitch tracking.










LIST OF REFERENCES

[1] N. Aronszajn, "The theory of reproducing kernels and their applications," Cambridge Philosophical Society Proceedings, vol. 39, pp. 133-153, 1943.

[2] N. Aronszajn, "Theory of reproducing kernels," Transactions of the American Mathematical Society, vol. 68, no. 3, pp. 337-404, 1950.

[3] A. Povzner, "On a class of Hilbert function spaces," Dokl. Akad. Nauk SSSR, vol. 68, pp. 817-820, 1949.

[4] A. Povzner, "On some applications of a class of Hilbert function spaces," Dokl. Akad. Nauk SSSR, vol. 74, pp. 13-16, 1950.

[5] M. G. Krein, "Hermitian-positive kernels on homogeneous spaces," American Mathematical Society Translations, vol. 2, no. 34, pp. 69-164, 1963.

[6] E. Hille, "Introduction to general theory of reproducing kernels," Rocky Mountain Journal of Mathematics, vol. 2, pp. 321-368, 1972.

[7] H. Meschkowski, Hilbert Spaces with Kernel Function, Springer-Verlag, Berlin, 1962.

[8] H. S. Shapiro, Topics in Approximation Theory, Springer-Verlag, Berlin, 1971.

[9] S. Saitoh, Theory of Reproducing Kernels and its Applications, Pitman Research Notes in Mathematics Series, Longman Scientific & Technical, Essex, UK, 1988.

[10] P. J. Davis, Interpolation and Approximation, Dover, New York, 1975.

[11] S. Bergman, The Kernel Function and Conformal Mapping, American Mathematical Society, New York, 1950.

[12] L. Schwartz, "Hilbert subspaces of topological vector spaces and associated kernels," Journal d'Analyse Mathématique, vol. 13, pp. 115-256, 1964.

[13] J. Mercer, "Functions of positive and negative type, and their connection with the theory of integral equations," Philosophical Transactions of the Royal Society of London, vol. 209, pp. 415-446, 1909.

[14] E. H. Moore, "On properly positive hermitian matrices," Bulletin of the American Mathematical Society, vol. 23, no. 59, pp. 66-67, 1916.

[15] S. Bochner, "Hilbert distances and positive definite functions," The Annals of Mathematics, vol. 42, no. 3, pp. 647-656, July 1941.

[16] I. J. Schoenberg, "Metric spaces and positive definite functions," Transactions of the American Mathematical Society, vol. 44, pp. 522-536, 1938.

[17] I. J. Schoenberg, "Positive definite functions on spheres," Duke Math. J., vol. 9, pp. 96-108, 1942.










[18] J. Stewart, "Positive definite functions and generalizations, an historical survey," Rocky Mountain Journal of Mathematics, vol. 6, no. 3, pp. 409-434, September 1976.

[19] D. Alpay, Ed., Reproducing Kernel Spaces and Applications, Birkhäuser Verlag, Germany, 2003.

[20] U. Grenander, Abstract Inference, John Wiley & Sons, New York, 1981.

[21] E. Kreyszig, Introductory Functional Analysis with Applications, John Wiley & Sons, New York, 1978.

[22] A. N. Kolmogorov, "Stationary sequences in Hilbert space," Bull. Math. Univ. Moscow, vol. 2, no. 6, 1941.

[23] M. Loève, "Stochastic processes and Brownian motion," in Second Order Random Functions, P. Lévy, Ed., p. 365, Gauthier-Villars, Paris, 1948.

[24] M. Loève, Probability Theory, Springer-Verlag, Berlin, 4th edition, 1978.

[25] E. Parzen, "Statistical inference on time series by Hilbert space methods," Technical Report 23, Statistics Department, Stanford University, 1959.

[26] E. Parzen, "Statistical inference on time series by RKHS methods," in Proc. 12th Biennial Seminar of the Canadian Mathematical Congress, R. Pyke, Ed., Montreal, Canada, 1970, pp. 1-37.

[27] E. Parzen, "An approach to time series analysis," The Annals of Mathematical Statistics, vol. 32, no. 4, pp. 951-989, December 1961.

[28] E. Parzen, "Extraction and detection problems and reproducing kernel Hilbert spaces," SIAM Journal on Control, vol. 1, pp. 35-62, 1962.

[29] E. Parzen, Time Series Analysis Papers, Holden-Day, San Francisco, CA, 1967.

[30] J. Hájek, "On linear statistical problems in stochastic processes," Czechoslovak Mathematical Journal, vol. 12, pp. 404-444, 1962.

[31] T. Kailath, "RKHS approach to detection and estimation problems-part I: Deterministic signals in Gaussian noise," IEEE Transactions on Information Theory, vol. IT-17, no. 5, pp. 530-549, September 1971.

[32] T. Kailath and H. Weinert, "An RKHS approach to detection and estimation problems-part II: Gaussian signal detection," IEEE Transactions on Information Theory, vol. IT-21, no. 1, pp. 15-23, January 1975.

[33] T. Kailath and D. Duttweiler, "An RKHS approach to detection and estimation problems-part III: Generalized innovations representations and a likelihood-ratio formula," IEEE Transactions on Information Theory, vol. IT-18, no. 6, pp. 730-745, November 1972.










[34] D. Duttweiler and T. Kailath, "RKHS approach to detection and estimation problems-part IV: Non-Gaussian detection," IEEE Transactions on Information Theory, vol. IT-19, no. 1, pp. 19-28, January 1973.

[35] D. Duttweiler and T. Kailath, "RKHS approach to detection and estimation problems-part V: Parameter estimation," IEEE Transactions on Information Theory, vol. IT-19, no. 1, pp. 29-37, January 1973.

[36] T. Hida and N. Ikeda, "Analysis on Hilbert space with reproducing kernel arising from multiple Wiener integral," in Proc. 5th Berkeley Symp. on Mathematical Statistics and Probability, L. LeCam and J. Neyman, Eds., 1967, vol. 2, pp. 117-143.

[37] G. Kallianpur, "The role of reproducing kernel Hilbert spaces in the study of Gaussian processes," in Advances in Probability and Related Topics, P. Ney, Ed., vol. 2, pp. 49-83, Marcel Dekker, New York, 1970.

[38] T. Hida, "Canonical representations of Gaussian processes and their applications," Kyoto Univ. Coll. Sci. Mem., vol. A33, pp. 109-155, 1960.

[39] G. M. Molchan, "On some problems concerning Brownian motion in Lévy's sense," Theory of Probability and Its Applications, vol. 12, pp. 682-690, 1967.

[40] L. Pitt, "A Markov property for Gaussian processes with a multidimensional parameter," Archive for Rational Mechanics and Analysis, vol. 43, pp. 367-391, 1971.

[41] H. L. Weinert, "Statistical methods in optimal curve fitting," Communications in Statistics, vol. B7, no. 4, pp. 417-435, 1978.

[42] C. De Boor and R. E. Lynch, "On splines and their minimum properties," Journal of Mathematics and Mechanics, vol. 15, pp. 953-969, 1966.

[43] L. L. Schumaker, "Fitting surfaces to scattered data," in Approximation Theory II, G. G. Lorentz, C. K. Chui, and L. L. Schumaker, Eds., pp. 203-268, Academic Press, New York, 1976.

[44] G. Wahba, Spline Models for Observational Data, vol. 59, SIAM, Philadelphia, PA, 1990.

[45] R. J. P. DeFigueiredo, "A generalized Fock space framework for nonlinear system and signal analysis," IEEE Transactions on Circuits and Systems, vol. CAS-30, no. 9, pp. 637-647, September 1983.

[46] V. N. Vapnik, The Nature of Statistical Learning Theory, Springer, New York, 1999.

[47] V. N. Vapnik, Statistical Learning Theory, John Wiley & Sons, New York, 1998.

[48] F. Rosenblatt, "The perceptron: A probabilistic model for information storage and organization in the brain," Psychological Review, vol. 65, no. 6, pp. 386-408, 1958.










[49] B. Schölkopf and A. Smola, Learning with Kernels, MIT Press, Cambridge, MA, 2002.

[50] B. Schölkopf, A. Smola, and K.-R. Müller, "Nonlinear component analysis as a kernel eigenvalue problem," Neural Computation, vol. 10, pp. 1299-1319, 1998.

[51] S. Mika, G. Rätsch, J. Weston, B. Schölkopf, and K.-R. Müller, "Fisher discriminant analysis with kernels," in Proc. Neural Networks for Signal Processing IX, New Jersey, 1999, vol. 2, pp. 41-48.

[52] F. R. Bach and M. I. Jordan, "Kernel independent component analysis," Journal of Machine Learning Research, vol. 3, pp. 1-48, 2002.

[53] T. M. Cover, "Geometrical and statistical properties of systems of linear inequalities with applications in pattern recognition," IEEE Transactions on Electronic Computers, vol. EC-14, no. 3, pp. 326-334, June 1965.

[54] T. Evgeniou, T. Poggio, M. Pontil, and A. Verri, "Regularization and statistical learning theory for data analysis," Computational Statistics and Data Analysis, vol. 38, pp. 421-432, 2002.

[55] J. C. Principe, D. Xu, and J. W. Fisher, "Information theoretic learning," in Unsupervised Adaptive Filtering, S. Haykin, Ed., pp. 265-319, John Wiley & Sons, 2000.

[56] K. E. Hild, D. Erdogmus, and J. C. Principe, "Blind source separation using Rényi's mutual information," IEEE Signal Processing Letters, vol. 8, no. 6, pp. 174-176, 2001.

[57] D. Erdogmus and J. C. Principe, "Generalized information potential criterion for adaptive system training," IEEE Transactions on Neural Networks, vol. 13, no. 5, pp. 1035-1044, 2002.

[58] A. Rényi, "On measures of entropy and information," in Selected Papers of A. Rényi, vol. 2, pp. 565-580, Akadémiai Kiadó, Budapest, Hungary, 1976.

[59] D. Erdogmus and J. C. Principe, "An error-entropy minimization algorithm for supervised training of nonlinear adaptive systems," IEEE Transactions on Signal Processing, vol. 50, no. 7, pp. 1780-1786, July 2002.

[60] E. Parzen, "On estimation of a probability density function and mode," The Annals of Mathematical Statistics, vol. 33, no. 3, pp. 1065-1076, September 1962.

[61] M. G. Genton, "Classes of kernels for machine learning: A statistics perspective," Journal of Machine Learning Research, vol. 2, pp. 299-312, 2001.

[62] S. Kullback and R. A. Leibler, "On information and sufficiency," The Annals of Mathematical Statistics, vol. 22, no. 1, pp. 79-86, 1951.










[63] D. Xu, J. C. Principe, J. W. Fisher, and H. C. Wu, "A novel measure for
independent component analysis," in Proceedings of the IEEE International
Conference on Acoustics, Speech, and Signal Processing (ICASSP), 1998, vol. 2, pp.
12-15.

[64] R. Jenssen, D. Erdogmus, J. C. Principe, and T. Eltoft, "The Laplacian PDF
distance: A cost function for clustering in a kernel feature space," in Advances
in Neural Information Processing Systems (NIPS), 2004, pp. 625-632, MIT Press,
Cambridge.

[65] I. Santamaria, P. Pokharel, and J. C. Principe, "Generalized correlation function:
Definition, properties, and application to blind equalization," IEEE Transactions on
Signal Processing, vol. 54, no. 6, pp. 2187-2197, 2006.

[66] P. P. Pokharel, R. Agrawal, and J. C. Principe, "Correntropy based matched
filtering," in Proc. Machine Learning for Signal Processing, Mystic, USA, 2005.

[67] K.-H. Jeong and J. C. Principe, "The correntropy MACE filter for image
recognition," in Proceedings of the International Workshop on Machine Learning
for Signal Processing (MLSP), Maynooth, Ireland, 2006, pp. 9-14.

[68] P. P. Pokharel, J. Xu, D. Erdogmus, and J. C. Principe, "A closed form solution
for a nonlinear Wiener filter," in Proceedings of the IEEE International Conference
on Acoustics, Speech, and Signal Processing (ICASSP), Toulouse, France, May 2006,
vol. 3, pp. 720-723.

[69] S.-I. Amari and H. Nagaoka, Methods of Information Geometry, AMS and Oxford
University Press, Providence, RI, 2000.

[70] G. Pistone and C. Sempi, "An infinite-dimensional geometric structure on the space
of all probability measures equivalent to a given one," The Annals of Statistics, vol.
23, no. 5, pp. 1543-1561, 1995.

[71] C. R. Rao, "Information and accuracy attainable in the estimation of statistical
parameters," Bulletin of the Calcutta Mathematical Society, vol. 37, pp. 81-91, 1945.

[72] T. Jebara, R. Kondor, and A. Howard, "Probability product kernels," Journal of
Machine Learning Research, vol. 5, pp. 819-844, 2004.

[73] T. Jebara and R. Kondor, "Bhattacharyya and expected likelihood kernels," in
Proceedings of the Annual Conference on Learning Theory, Washington, D.C., 2003.

[74] M. B. Pursley, Random Processes in Linear Systems, Prentice Hall, New Jersey,
2002.

[75] L. Ljung, System Identification: Theory for the User, Prentice-Hall, Englewood
Cliffs, NJ, 1987.










[76] R. J. P. DeFigueiredo and Y. Hu, "Nonlinear filtering of non-Gaussian processes
through Volterra series," in Volterra Equations and Applications, C. Corduneanu
and I. W. Sandberg, Eds., pp. 197-202. Gordon and Breach Science, 2002.

[77] R. F. Engle, "Autoregressive conditional heteroscedasticity with estimates of the
variance of United Kingdom inflation," Econometrica, vol. 50, pp. 987-1007, 1982.

[78] A. Rakotomamonjy and S. Canu, "Frames, reproducing kernels, regularization and
learning," Journal of Machine Learning Research, vol. 6, pp. 1485-1515, 2005.

[79] V. Fock, "Konfigurationsraum und zweite Quantelung," Z. Phys., vol. 75, pp.
622-647, 1932.

[80] D. Drouet Mari and S. Kotz, Correlation and Dependence, Imperial College Press,
London, 2001.

[81] J. Dauxois and G. M. Nkiet, "Nonlinear canonical analysis and independence tests,"
The Annals of Statistics, vol. 26, no. 4, pp. 1254-1278, 1998.

[82] C. B. Bell, "Mutual information and maximal correlation as measures of
dependence," The Annals of Mathematical Statistics, vol. 33, no. 2, pp. 587-595,
1962.

[83] T. Cover and J. Thomas, Elements of Information Theory, Wiley, New York, 1991.

[84] A. C. Micheas and K. Zografos, "Measuring stochastic dependence using
φ-divergence," Journal of Multivariate Analysis, vol. 97, pp. 765-784, 2006.

[85] S. D. Silvey, "On a measure of association," The Annals of Mathematical Statistics,
vol. 35, no. 3, pp. 1157-1166, 1964.

[86] H. Joe, "Relative entropy measures of multivariate dependence," Journal of the
American Statistical Association, vol. 84, no. 405, pp. 157-164, 1989.

[87] C. W. Granger, E. Maasoumi, and J. Racine, "A dependence metric for possibly
nonlinear processes," Journal of Time Series Analysis, vol. 25, no. 5, pp. 649-669,
2004.

[88] P. L. Lai and C. Fyfe, "Kernel and nonlinear canonical correlation analysis,"
International Journal of Neural Systems, vol. 10, no. 5, pp. 365-377, 2000.

[89] A. Gretton, R. Herbrich, A. Smola, O. Bousquet, and B. Scholkopf, "Kernel methods
for measuring independence," Journal of Machine Learning Research, vol. 6, pp.
2075-2129, 2005.

[90] H. Suetani, Y. Iba, and K. Aihara, "Detecting generalized synchronization between
chaotic signals: A kernel-based approach," Journal of Physics A: Mathematical and
General, vol. 39, pp. 10723-10742, 2006.










[91] A. Rényi, "On measures of dependence," Acta Mathematica Academiae Scientiarum
Hungaricae, vol. 10, pp. 441-451, 1959.

[92] K. I. Diamantaras and S. Y. Kung, Principal Component Neural Networks: Theory
and Applications, John Wiley & Sons, New York, 1996.

[93] I. T. Jolliffe, Principal Component Analysis, Springer, New York, 2nd edition, 2002.

[94] T. Hastie and W. Stuetzle, "Principal curves," Journal of the American Statistical
Association, vol. 84, no. 406, pp. 502-516, June 1989.

[95] M. A. Kramer, "Nonlinear principal component analysis using autoassociative neural
networks," AIChE Journal, vol. 37, pp. 233-243, 1991.

[96] W. J. Hess, Pitch Determination of Speech Signals, Springer, New York, 1993.

[97] M. J. Ross, H. L. Shaffer, A. Cohen, R. Freudberg, and H. J. Manley, "Average
magnitude difference function pitch extractor," IEEE Transactions on Acoustics,
Speech, and Signal Processing, vol. ASSP-22, no. 5, pp. 353-362, October 1974.

[98] A. de Cheveigné, "Cancellation model of pitch perception," The Journal of the
Acoustical Society of America, vol. 103, no. 3, pp. 1261-1271, March 1998.

[99] T. Shimamura and H. Kobayashi, "Weighted autocorrelation for pitch extraction of
noisy speech," IEEE Transactions on Speech and Audio Processing, vol. 9, no. 7, pp.
727-730, October 2001.

[100] A. de Cheveigné, "Pitch and the narrowed autocoincidence histogram," in Proc.
Intl. Conf. of Music Perception and Cognition, Kyoto, Japan, 1989, pp. 67-70.

[101] G. J. Brown and D. Wang, "Modelling the perceptual segregation of double vowels
with a network of neural oscillators," Neural Networks, vol. 10, no. 9, pp. 1547-1558,
1997.

[102] Leon Cohen, Time-Frequency Analysis, Prentice Hall, New Jersey, 1995.

[103] J. C. Brown and M. S. Puckette, "Calculation of a 'narrowed' autocorrelation
function," The Journal of the Acoustical Society of America, vol. 85, pp. 1595-1601,
1989.

[104] H. Duifhuis, L. Willems, and R. Sluyter, "Measurement of pitch in speech: An
implementation of Goldstein's theory of pitch perception," The Journal of the
Acoustical Society of America, vol. 71, pp. 1568-1580, 1982.

[105] M. R. Schroeder, "Period histogram and product spectrum: New methods for
fundamental frequency measurement," The Journal of the Acoustical Society of
America, vol. 43, pp. 829-834, 1968.

[106] D. J. Hermes, "Measurement of pitch by subharmonic summation," The Journal of
the Acoustical Society of America, vol. 83, no. 1, pp. 257-264, 1988.










[107] X. Sun, "A pitch determination algorithm based on subharmonic-to-harmonic
ratio," in Proc. 6th Intl. Conf. of Spoken Language Processing, Beijing, China, 2000,
vol. 4, pp. 676-679.

[108] A. de Cheveigné, "Pitch perception models," in Pitch: Neural Coding and
Perception, C. Plack, A. Oxenham, R. Fay, and A. Popper, Eds. Springer-Verlag,
New York, 2005.

[109] J. C. R. Licklider, "A duplex theory of pitch perception," Experientia, vol. 7, pp.
128-134, 1951.

[110] R. Lyon, "Computational models of neural auditory processing," in Proceedings
of the IEEE International Conference on Acoustics, Speech, and Signal Process-
ing (ICASSP), San Diego, USA, 1984, pp. 41-44.

[111] M. Slaney and R. F. Lyon, "On the importance of time - a temporal representation
of sound," in Visual Representations of Speech Signals, M. Cooke, S. Beet, and
M. Crawford, Eds., pp. 95-116. John Wiley & Sons, 1993.

[112] D. Wang and G. J. Brown, Eds., Computational Auditory Scene Analysis: Princi-
ples, Algorithms, and Applications, John Wiley & Sons, New Jersey, 2006.

[113] M. Wu, D. Wang, and G. J. Brown, "A multipitch tracking algorithm for noisy
speech," IEEE Transactions on Speech and Audio Processing, vol. 11, no. 3, pp.
229-241, 2003.

[114] R. Meddis and M. Hewitt, "Modeling the identification of concurrent vowels
with different fundamental frequencies," The Journal of the Acoustical Society of
America, vol. 91, pp. 233-245, 1992.

[115] A. de Cheveigné, "Multiple F0 estimation," in Computational Auditory Scene
Analysis: Principles, Algorithms, and Applications, D. Wang and G. J. Brown, Eds.,
pp. 45-79. John Wiley & Sons, New Jersey, 2006.

[116] A. Moreno and J. Fonollosa, "Pitch determination of noisy speech using higher
order statistics," in Proceedings of the IEEE International Conference on Acoustics,
Speech, and Signal Processing (ICASSP), San Francisco, USA, March 1992, pp.
133-136.

[117] R. D. Patterson, J. Holdsworth, I. Nimmo-Smith, and P. Rice, "SVOS final report,
part B: Implementing a gammatone filterbank," Applied Psychology Unit Report
841, 1988.

[118] B. R. Glasberg and B. C. Moore, "Derivation of auditory filter shapes from
notched-noise data," Hearing Research, vol. 47, pp. 103-138, 1990.

[119] M. Slaney and R. F. Lyon, "A perceptual pitch detector," in Proceedings of
the IEEE International Conference on Acoustics, Speech, and Signal Process-
ing (ICASSP), Albuquerque, USA, 1990, pp. 357-360.










[120] B. W. Silverman, Density Estimation for Statistics and Data Analysis, Chapman
and Hall, New York, 1986.

[121] J.-W. Xu, H. Bakardjian, A. Cichocki, and J. C. Principe, "A new nonlinear
similarity measure for multichannel biological signals," in Proceedings of the
International Joint Conference on Neural Networks (IJCNN), Orlando, FL, USA,
2007.

[122] R. Meddis and M. Hewitt, "Virtual pitch and phase sensitivity of a computer model
of the auditory periphery: I. Pitch identification," The Journal of the Acoustical
Society of America, vol. 89, pp. 2866-2882, 1991.

[123] S. Han, S. Rao, and J. C. Principe, "Estimating the information potential with the
fast Gauss transform," in Proceedings of the International Conference on Independent
Component Analysis and Blind Source Separation (ICA), 2006, LNCS 3889, pp. 82-89.

[124] M. Slaney, Malcolm Slaney's Auditory Toolbox to implement auditory models and
generate synthetic vowels. Available online at http://www.slaney.org/malcolm/
pubs.html.

[125] P. Bagshaw, Paul Bagshaw's database for evaluating pitch determination
algorithms. Available online at http://www.cstr.ed.ac.uk/research/projects/
fda.

[126] P. Bagshaw, S. Hiller, and M. Jack, "Enhanced pitch tracking and the processing of F0
contours for computer aided intonation teaching," in Proc. European Conf. on Speech
Communication, 1993, pp. 1003-1006.

[127] A. Camacho and J. Harris, "A pitch estimation algorithm based on the smooth
harmonic average peak-to-valley envelope," in Proc. Intl. Symp. on Circuits and
Systems, New Orleans, USA, May 2007.

[128] J. C. Shaw, "An introduction to the coherence function and its use in EEG signal
analysis," Journal of Medical Engineering & Technology, vol. 5, no. 6, pp. 279-288,
1981.

[129] E. Pereda, R. Quian Quiroga, and J. Bhattacharya, "Nonlinear multivariate analysis
of neurophysiological signals," Progress in Neurobiology, vol. 77, pp. 1-37, 2005.

[130] B. Pompe, "Measuring statistical dependencies in a time series," Journal of
Statistical Physics, vol. 73, pp. 587-610, 1993.

[131] M. G. Rosenblum, A. S. Pikovsky, and J. Kurths, "Phase synchronization of chaotic
oscillators," Physical Review Letters, vol. 76, no. 11, pp. 1804-1807, 1996.

[132] J. Arnhold, P. Grassberger, K. Lehnertz, and C. E. Elger, "A robust method for
detecting interdependencies: Application to intracranially recorded EEG," Physica D,
vol. 134, pp. 419-430, 1999.










[133] R. Quian Quiroga, J. Arnhold, and P. Grassberger, "Learning driver-response
relationships from synchronization patterns," Physical Review E, vol. 61, no. 5, pp.
5142-5148, May 2000.

[134] C. J. Stam and B. W. van Dijk, "Synchronization likelihood: an unbiased measure
of generalized synchronization in multivariate data sets," Physica D, vol. 163, pp.
236-251, 2002.

[135] L. M. Pecora and T. L. Carroll, "Synchronization in chaotic systems," Physical
Review Letters, vol. 64, no. 8, pp. 821-825, 1990.

[136] A. Schmitz, "Measuring statistical dependence and coupling of subsystems," Physical
Review E, vol. 62, pp. 7508-7511, 2000.

[137] J. Bhattacharya, E. Pereda, and H. Petsche, "Effective detection of coupling in short
and noisy bivariate data," IEEE Transactions on Systems, Man, and Cybernetics,
Part B, vol. 33, no. 1, pp. 85-95, February 2003.

[138] S. J. Schiff, P. So, T. Chang, R. E. Burke, and T. Sauer, "Detecting dynamical
interdependence and generalized synchrony through mutual prediction in a neural
ensemble," Physical Review E, vol. 54, pp. 6708-6724, 1996.

[139] H. Kantz and T. Schreiber, Nonlinear Time Series Analysis, Cambridge University
Press, Cambridge, UK, 1997.

[140] D. Prichard and J. Theiler, "Generating surrogate data for time series with several
simultaneously measured variables," Physical Review Letters, vol. 73, no. 7, pp.
951-954, 1994.

[141] T. Schreiber and A. Schmitz, "Surrogate time series," Physica D, vol. 142, pp.
346-382, 2000.

[142] K. Kotani, Y. Kinomoto, M. Yamada, J. Deguchi, M. Tonoike, K. Horii,
S. Miyatake, T. Kuroiwa, and T. Noguchi, "Spatiotemporal patterns of
movement-related fields in stroke patients," Neurology & Clinical Neurophysiol-
ogy, vol. 63, pp. 1-4, 2004.

[143] F. Di Russo, A. Martinez, M. I. Sereno, S. Pitzalis, and S. A. Hillyard, "Cortical
sources of the early components of the visual evoked potential," Human Brain
Mapping, vol. 15, pp. 95-111, 2001.









BIOGRAPHICAL SKETCH

Jianwu Xu was born in Wenzhou, China, on December 7, 1979. He received his

Bachelor of Engineering in Electrical Engineering from Zhejiang University, Hangzhou,

China, in June 2002. Since October 2002, he has been working towards his Ph.D. in the

Electrical and Computer Engineering Department at University of Florida, under the

supervision of Dr. Jose Principe with support from Alumni Graduate Fellowship from

University of Florida and NSF grant ECS-0601271. During the summer of 2006, he visited

RIKEN Brain Science Institute in Tokyo, Japan and worked with Dr. Andrzej Cichocki

on EEG synchronization in the Laboratory for Advanced Brain Signal Processing. His

current research interests broadly include information theoretic learning, adaptive signal

processing, control and machine learning. He is a member of IEEE, Tau Beta Pi and Eta

Kappa Nu.






ACKNOWLEDGMENTS

First and foremost, I express my sincere gratitude to my Ph.D. advisor Dr. Jose Principe for his encouraging and inspiring style that made possible the completion of this work. Without his guidance, imagination, enthusiasm, and passion, which I admire, this dissertation would not have been possible. His philosophy on autonomous thinking and the importance of asking good questions molded me from a Ph.D. student into an independent researcher.

I also thank my committee member Dr. Murali Rao for his great help and valuable discussions on reproducing kernel Hilbert space. His mathematical rigor refines this dissertation. I express my sincere appreciation to Dr. John M. Shea for serving as my committee member and taking time to criticize, proofread and improve the quality of this dissertation. I thank my committee member Dr. K. Clint Slatton for providing me valuable comments and constructive advice.

I am also grateful to Dr. Andrzej Cichocki from the Laboratory for Advanced Brain Signal Processing in RIKEN Brain Science Institute in Japan for his guidance and words of wisdom during my summer school there. The collaboration with Dr. Andrzej Cichocki, Hovagim Bakardjian and Dr. Tomasz Rutkowski made chapter 8 of this dissertation possible. I thank all of them for their great help and insightful discussions on biomedical signal processing. The hospitality in the lab made my stay in Japan a memorable and wonderful experience.

During the course of my Ph.D. research, I interacted with many CNEL colleagues and benefited from the valuable discussions on research and life at large. Especially, I thank former and current group members Dr. Deniz Erdogmus, Dr. Yadu Rao, Dr. Puskal Pokharel, Dr. Kyu-Hwa Jeong, Dr. Seungju Han, Weifeng Liu, Sudhir Rao, Il Park, Antonio Paiva and Ruijang Li, whose contributions to this research are tremendous. Certainly those sleepless nights together with Rui Yan, Mustafa Can Ozturk and Anant Hegde for homework and projects are as unforgettable as the joy and frustration


TABLE OF CONTENTS

ACKNOWLEDGMENTS
LIST OF TABLES
LIST OF FIGURES
ABSTRACT

CHAPTERS

1 INTRODUCTION
   1.1 Definition of Reproducing Kernel Hilbert Space (RKHS)
   1.2 RKHS in Statistical Signal Processing
   1.3 RKHS in Statistical Learning Theory
   1.4 A Brief Review of Information-Theoretic Learning (ITL)
   1.5 Recent Progress on Correntropy
   1.6 Study Objectives

2 AN RKHS FRAMEWORK FOR ITL
   2.1 The RKHS Based on ITL
      2.1.1 The L2 Space of PDFs
      2.1.2 RKHS H_V Based on L2(E)
      2.1.3 Congruence Map Between H_V and L2(E)
      2.1.4 Extension to Multi-dimensional PDFs
   2.2 ITL Cost Functions in RKHS Framework
   2.3 A Lower Bound for the Information Potential
   2.4 Discussions
      2.4.1 Non-parametric vs. Parametric
      2.4.2 Kernel Function as a Dependence Measure
   2.5 Conclusion

3 CORRENTROPY AND CENTERED CORRENTROPY FUNCTIONS
   3.1 Autocorrentropy and Crosscorrentropy Functions
   3.2 Frequency-Domain Analysis

4 CORRENTROPY ANALYSIS BASED ON RKHS APPROACH
   4.1 RKHS Induced by the Kernel Function
      4.1.1 Correntropy Revisited from Kernel Perspective
      4.1.2 An Explicit Construction of a Gaussian RKHS
   4.2 RKHS Induced by Correntropy and Centered Correntropy Functions
      4.2.1 Geometry of Nonlinearly Transformed Random Processes
      4.2.2 Representation of RKHS by Centered Correntropy Function


5 CORRENTROPY DEPENDENCE MEASURE
   5.1 Parametric Correntropy Function
   5.2 Correntropy Dependence Measure

6 CORRENTROPY PRINCIPAL COMPONENT ANALYSIS

7 CORRENTROPY PITCH DETERMINATION ALGORITHM
   7.1 Introduction
   7.2 Pitch Determination Based on Correntropy
   7.3 Experiments
      7.3.1 Single Pitch Determination
      7.3.2 Double Pitches Determination
      7.3.3 Double Vowels Segregation
      7.3.4 Benchmark Database Test
   7.4 Discussions
   7.5 Conclusion

8 CORRENTROPY COEFFICIENT AS A NOVEL SIMILARITY MEASURE
   8.1 Introduction
   8.2 Experiments
      8.2.1 Two Unidirectionally Coupled Henon Maps
         8.2.1.1 Variation of Correntropy Coefficient with Coupling Strength
         8.2.1.2 Robustness Against Measurement Noise
         8.2.1.3 Sensitivity to Time-dependent Dynamical Changes
         8.2.1.4 Effect of Kernel Width
         8.2.1.5 Ability to Quantify Nonlinear Coupling
      8.2.2 EEG Signals
   8.3 Discussions
      8.3.1 Kernel Width
      8.3.2 Scaling Effect
   8.4 Conclusion

9 CONCLUSIONS AND FUTURE WORK
   9.1 Conclusions
   9.2 Future Work

LIST OF REFERENCES

BIOGRAPHICAL SKETCH


LIST OF TABLES

7-1 Gross error percentage of PDAs evaluation
8-1 Z-score for the surrogate data


LIST OF FIGURES

3-1 Correntropy and centered correntropy for i.i.d. and filtered signals versus the time lag
3-2 Autocorrelation and correntropy for i.i.d. and ARCH series versus the time lag
3-3 Autocorrelation and correntropy for i.i.d. and linearly filtered signals and Lorenz dynamic system versus the time lag
3-4 Correntropy for i.i.d. signal and Lorenz time series with different kernel width
3-5 Separation coefficient versus kernel width for Gaussian kernel
3-6 Correntropy for i.i.d. signal and Lorenz time series with different kernel functions
4-1 Square error between a Gaussian kernel and the constructed kernel in Eq. (4-7) versus the order of polynomials
4-2 Two vectors in the subspace S
6-1 Linear PCA versus correntropy PCA for a two-dimensional mixture of Gaussian distributed data
6-2 Kernel PCA versus correntropy PCA for a two-dimensional mixture of Gaussian distributed data
7-1 Autocorrelation, narrowed autocorrelation with L=10 and correntropy functions of a sinusoid signal
7-2 Fourier transform of autocorrelation, narrowed autocorrelation with L=10 and correntropy functions of a sinusoid signal
7-3 Correlogram (top) and summary (bottom) for the vowel /a/
7-4 Autocorrelation (top) and summary (bottom) of third order cumulants for the vowel /a/
7-5 Narrowed autocorrelation (top) and summary (bottom) for the vowel /a/
7-6 Correntropy-gram (top) and summary (bottom) for the vowel /a/
7-7 Correlogram (top) and summary (bottom) for a mixture of vowels /a/ and /u/
7-8 Third order cumulants (top) and summary (bottom) for a mixture of vowels /a/ and /u/
7-9 Narrowed autocorrelations (top) and summary (bottom)


7-11 The ROC curves for the four PDAs based on correntropy-gram, autocorrelation, narrowed autocorrelation (L=15), and autocorrelation of 3rd order cumulants in the double vowels segregation experiment
7-12 The percentage performance of correctly determining pitches for both vowels for the proposed PDA based on correntropy function and a CASA model
7-13 Summary of correntropy functions with different kernel sizes for a single vowel /a/
7-14 Summary of correntropy functions with different kernel sizes for a mixture of vowels /a/ and /u/
8-1 Averaged correntropy coefficient for unidirectionally identical (a) and nonidentical (b) coupled Henon maps
8-2 Influence of different noise levels on correntropy coefficient
8-3 Influence of different noise levels on correntropy coefficient
8-4 Time dependence of correntropy coefficient
8-5 Effect of different kernel width on correntropy coefficient for unidirectionally coupled identical Henon maps
8-6 Effect of different kernel width on correntropy coefficient for unidirectionally coupled non-identical Henon maps
8-7 Comparison of correlation coefficient, correntropy coefficient and similarity index
8-8 Comparison of the correntropy coefficient for the original data and the surrogate data for unidirectionally coupled non-identical Henon map
8-9 Comparison of correlation coefficient and correntropy coefficient in synchronization detection among auditory cortex for audio stimuli EEG signal
8-10 Comparison of correlation coefficient and correntropy coefficient in characterization of synchronization among occipital cortex for visual stimulus EEG signal


My research analyzes the recently proposed correntropy function and presents a new centered correntropy function from time-domain and frequency-domain approaches. It demonstrates that correntropy and centered correntropy functions not only capture the time and space structures of signals, but also partially characterize the higher order statistical information and nonlinearity intrinsic to random processes.

Correntropy and centered correntropy functions have rich geometrical structures. Correntropy is positive definite and centered correntropy is non-negative definite; hence by the Moore-Aronszajn theorem they uniquely induce reproducing kernel Hilbert spaces. Correntropy and centered correntropy functions combine the data-dependent expectation operator and data-independent kernels to form another data-dependent operator. Correntropy and centered correntropy functions can be formulated as "generalized" correlation and covariance functions on nonlinearly transformed random signals via the data-independent kernel functions. Those nonlinearly transformed signals lie on a sphere in the reproducing kernel Hilbert space induced by the kernel functions if isotropic kernel functions are used. The other approach is to work directly with the reproducing kernel Hilbert space induced by the correntropy and centered correntropy functions. The nonlinearly transformed signals in this reproducing kernel Hilbert space are no longer stochastic but rather deterministic. The reproducing kernel Hilbert space induced by the correntropy and centered correntropy functions includes the expectation operator as embedded vectors. The two views further our understanding of correntropy and centered correntropy functions from a geometrical perspective. The two reproducing kernel Hilbert spaces induced by kernel functions and correntropy functions respectively represent stochastic and deterministic functional analysis.


The correntropy dependence measure is proposed based on the correntropy coefficient as a novel statistical dependence measure. The new measure satisfies all the fundamental desirable properties postulated by Rényi. We apply the correntropy concept to pitch determination and nonlinear component analysis. The correntropy coefficient is also employed as a novel similarity measure to quantify the inter-dependencies of multi-channel signals.


Any positive definite bivariate function κ(x, y) is a reproducing kernel because of the following fundamental theorem.

Theorem (Moore-Aronszajn). Given any positive definite function κ(x, y) on E x E, there exists a unique Hilbert space H of functions on E such that

(i) for every x ∈ E, κ(x, ·) ∈ H, and                                        (1-2)

(ii) for every x ∈ E and f ∈ H, f(x) = <f, κ(x, ·)>_H.                       (1-3)

Then H := H(κ) is said to be a reproducing kernel Hilbert space with reproducing kernel κ. The properties (1-2) and (1-3) are called the reproducing property of κ(x, y) in H(κ).

The reproducing kernel Hilbert space terminology has existed for a long time, since all the Green's functions of self-adjoint ordinary differential equations and some bounded Green's functions in partial differential equations belong to this type. But it was not until 1943 that N. Aronszajn [1] systematically developed the general theory of RKHS and coined the term "reproducing kernel". The expanded paper [2] on his previous work

PAGE 14

[3] and presented some examples in [4]. Meanwhile M. G. Krein also derived some RKHS properties in his study of kernels with certain invariance conditions [5]. Other works studying RKHS theory include Hille [6], Meschkowski [7], Shapiro [8], Saitoh [9] and Davis [10]. Bergman introduced reproducing kernels in one and several variables for the classes of harmonic and analytic functions [11]. He applied the kernel functions in the theory of functions of one and several complex variables, in conformal mappings, pseudo-conformal mappings, invariant Riemannian metrics and other subjects. A more abstract development of RKHS appears in a paper by Schwartz [12].

As discussed in the Moore-Aronszajn Theorem, RKHS theory and the theory of positive definite functions are two sides of the same coin. In 1909, J. Mercer examined the positive definite functions satisfying Eq. (1-1) in the theory of integral equations developed by Hilbert. Mercer proved that positive definite kernels have nice properties among all the continuous kernels of integral equations [13]. This was the celebrated Mercer's Theorem, which became the theoretical foundation of applications of RKHS in statistical signal processing and machine learning. E. H. Moore also studied those kernels in his general analysis context under the name of positive Hermitian matrix and discovered the fundamental theorem above [14]. Meanwhile, S. Bochner examined continuous functions φ(x) of a real variable x such that κ(x, y) = φ(x - y) satisfies the condition in Eq. (1-1) in his study of Fourier transformation. He named such functions positive definite functions [15]. Later A. Weil, I. Gelfand, D. Raikov and R. Godement generalized the notion in their investigations of topological groups. These functions were also applied to general metric geometry by I. Schoenberg [16, 17], J. von Neumann and others. In [18], J. Stewart provides a concise historical survey of positive definite functions and their principal generalizations as well as a useful bibliography.
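The defining properties of a reproducing kernel can be checked numerically. The following sketch (an illustration added for concreteness, not part of the original derivation; the Gaussian kernel, the sample points, and the coefficients are arbitrary choices) verifies that a Gaussian Gram matrix is positive semidefinite and that f(x) = <f, κ(x, ·)>_H holds for a function f in the span of kernel sections:

```python
# Positive definiteness and the reproducing property, tested on finite data.
import numpy as np

def kappa(x, y, sigma=1.0):
    """Gaussian kernel kappa(x, y) = exp(-(x - y)^2 / (2 sigma^2))."""
    return np.exp(-np.subtract.outer(x, y) ** 2 / (2.0 * sigma ** 2))

rng = np.random.default_rng(0)
pts = rng.normal(size=8)                 # points x_1..x_8 in E = R
K = kappa(pts, pts)                      # Gram matrix K_ij = kappa(x_i, x_j)
min_eig = np.linalg.eigvalsh(K).min()    # non-negative up to round-off

# f = sum_i alpha_i kappa(x_i, .), evaluated directly at a new point x
alpha = rng.normal(size=8)
x = 0.3
f_x = alpha @ kappa(pts, np.array([x]))[:, 0]

# <f, kappa(x, .)>_H computed via the Gram matrix on the augmented point set
z = np.append(pts, x)
Kz = kappa(z, z)
a = np.append(alpha, 0.0)                # coordinates of f
b = np.eye(9)[-1]                        # coordinates of kappa(x, .)
inner = a @ Kz @ b                       # equals f(x) by the reproducing property
```

The match between `inner` and `f_x` is exactly the content of Eq. (1-3) restricted to the span of kernel sections.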


[19] offers a review of recent advances in RKHS in many mathematical fields.

More relevant to this proposal are the RKHS methods in probability theory, random processes and statistical learning theory. I will present a brief review of these two in the following sections separately.

1.2 RKHS in Statistical Signal Processing

A detailed treatment of Hilbert space methods for random processes can be found in [20]. Given a probability space (Ω, F, P), we can define a linear space L2(Ω, F, P) to be the set consisting of all the random variables X whose second moment satisfies

    E[|X|^2] < ∞.                                                            (1-4)

Furthermore we can impose an inner product between any two random variables X and Y in L2(Ω, F, P) as

    <X, Y> = E[XY].                                                          (1-5)

Then L2(Ω, F, P) becomes an inner product space. Moreover it possesses the completeness property in the sense that every Cauchy sequence converges in the space itself [21]. Hence the inner product space L2(Ω, F, P) of all square integrable random variables on the probability space (Ω, F, P) is a Hilbert space.
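On a finite sample space the L2(Ω, F, P) geometry can be written out explicitly. The sketch below (an added illustration; the three-point space and the particular random variables are arbitrary assumptions) treats random variables as functions on Ω, uses <X, Y> = E[XY] as the inner product, and checks the Cauchy-Schwarz inequality that this geometry guarantees:

```python
# L2(Omega, F, P) on a three-point sample space.
import numpy as np

p = np.array([0.2, 0.5, 0.3])     # probabilities P(w) on Omega = {w1, w2, w3}
X = np.array([1.0, -2.0, 0.5])    # a random variable: Omega -> R
Y = np.array([0.0, 1.0, 4.0])

def inner(U, V):
    """<U, V> = E[UV] = sum_w P(w) U(w) V(w)."""
    return float(np.sum(p * U * V))

def norm(U):
    """||U|| = E[U^2]^(1/2), the norm induced by the inner product."""
    return float(np.sqrt(inner(U, U)))

xy = inner(X, Y)                          # here E[XY] = -0.4
cs_slack = norm(X) * norm(Y) - abs(xy)    # Cauchy-Schwarz: slack >= 0
```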


Consider a random process {x_t : t ∈ T} and assume

    E[|x_t|^2] < ∞  for every t ∈ T                                          (1-6)

for the second moment of all the random variables x_t. Then for each t ∈ T, the random variable x_t can be regarded as a data point in the Hilbert space L2(Ω, F, P). Hilbert space can thus be used to study random processes. In particular, we can consider constructing a Hilbert space spanned by a random process.

We define a linear manifold for a given random process {x_t : t ∈ T} to be the set of all random variables X which can be written in the form of linear combinations

    X = sum_{k=1}^{n} c_k x_{t_k}                                            (1-7)

for any n ∈ N and c_k ∈ C. Close the set in Eq. (1-7) topologically according to convergence in the mean using the norm

    ||X|| = (E[|X|^2])^{1/2}                                                 (1-8)

and denote the set of all linear combinations of random variables and its limit points by L2(x_t, t ∈ T). By the theory of quadratically integrable functions, we know that the linear space L2(x_t, t ∈ T) forms a Hilbert space if an inner product is imposed by the definition of Eq. (1-5) with corresponding norm of Eq. (1-8). Notice that L2(x_t, t ∈ T) is included in the Hilbert space of all quadratically integrable functions on (Ω, F, P), hence L2(x_t, t ∈ T) ⊆ L2(Ω, F, P).
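For a finite linear combination as in Eq. (1-7), the squared norm of Eq. (1-8) reduces to a quadratic form in the covariance: E[|X|^2] = c' R c with R(t, s) = E[x_t x_s]. The sketch below (an added illustration; the Wiener-process covariance min(t, s) and the coefficients are arbitrary choices) computes this quadratic form and confirms it is non-negative:

```python
# Squared norm of X = sum_k c_k x_{t_k} via the covariance matrix.
import numpy as np

t = np.array([0.2, 0.5, 0.9])        # time indices t_1, t_2, t_3
R = np.minimum.outer(t, t)           # R(t_i, t_j) = min(t_i, t_j), a PSD matrix
c = np.array([1.0, -2.0, 0.5])       # coefficients c_k in Eq. (1-7)
norm_sq = c @ R @ c                  # E[|X|^2]; here 0.825
```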


The idea of studying random processes through the geometry of Hilbert space dates back to [22]. But it was not until the late 1940s that Loève established the first link between random processes and reproducing kernels [23]. He pointed out that the covariance function of a second-order random process is a reproducing kernel and vice versa. Loève also presented the basic congruence (isometric isomorphism) relationship between the RKHS induced by the covariance function of a random process and the Hilbert space of linear combinations of random variables spanned by the random process [24].

Consider two abstract Hilbert spaces H1 and H2 with inner products denoted as <f1, f2>_1 and <g1, g2>_2 respectively. H1 and H2 are said to be isomorphic if there exists a one-to-one and surjective mapping ψ from H1 to H2 satisfying the following properties

    ψ(f1 + f2) = ψ(f1) + ψ(f2),   ψ(α f1) = α ψ(f1)                          (1-9)

for all functionals in H1 and any real number α. The mapping ψ is called an isomorphism between H1 and H2. The Hilbert spaces H1 and H2 are said to be isometric if there exists a mapping ψ that preserves inner products,

    <ψ(f1), ψ(f2)>_2 = <f1, f2>_1                                            (1-10)

for all functions in H1. A mapping ψ satisfying both properties Eq. (1-9) and Eq. (1-10) is said to be an isometric isomorphism or congruence. The congruence maps both linear combinations of functionals and limit points from H1 into corresponding linear combinations of functionals and limit points in H2 [20].

Given a second-order random process {x_t : t ∈ T} satisfying Eq. (1-6), we know that the mean value function μ(t) is well defined according to the Cauchy-Schwarz inequality. We can always assume that μ(·) ≡ 0; if not, we can preprocess the random process to remove the DC component. The covariance function is defined as

    R(t, s) = E[x_t x_s].                                                    (1-11)


By Mercer's theorem, the continuous covariance function R(t, s) possesses a sequence of non-negative eigenvalues λ_k and corresponding eigenfunctions φ_k satisfying

    ∫_T R(t, s) φ_k(s) ds = λ_k φ_k(t)                                       (1-12)

and

    ∫_T φ_k(t) φ_j(t) dt = δ_{kj},                                           (1-13)

where δ_{kj} is the Kronecker delta function, i.e., equal to 1 or 0 according as k = j or k ≠ j. Then

    R(t, s) = sum_k λ_k φ_k(t) φ_k(s),                                       (1-14)

where the series above converges absolutely and uniformly on T x T [13].

Then we can define a function f on T of the form

    f(t) = sum_k λ_k a_k φ_k(t),                                             (1-15)

where the sequence {a_k, k = 1, 2, ...} satisfies the following condition

    sum_k λ_k a_k^2 < ∞.                                                     (1-16)

Let H(R) be the set composed of functions f(·) which can be represented in the form Eq. (1-15) in terms of eigenfunctions φ_k and eigenvalues λ_k of the covariance function R(t, s).
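On a finite grid the Mercer expansion becomes an ordinary eigendecomposition of a symmetric PSD matrix, which can be verified directly. The sketch below (an added illustration; the grid and the min(t, s) covariance are arbitrary choices) checks the discrete analogues of the orthonormality Eq. (1-13) and the expansion Eq. (1-14):

```python
# Discrete Mercer expansion: R = sum_k lam_k phi_k phi_k'.
import numpy as np

t = np.linspace(0.1, 1.0, 6)
R = np.minimum.outer(t, t)                 # covariance "function" on the grid
lam, phi = np.linalg.eigh(R)               # eigenvalues lam_k, eigenvectors phi[:, k]
orth_err = np.abs(phi.T @ phi - np.eye(6)).max()    # Eq. (1-13) analogue
recon_err = np.abs((phi * lam) @ phi.T - R).max()   # Eq. (1-14) analogue
```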


Impose an inner product on H(R) as

    <f, g>_{H(R)} = sum_k λ_k a_k b_k,                                       (1-17)

where f and g are of form Eq. (1-15) and a_k, b_k satisfy property Eq. (1-16). One might as well show H(R) is complete. Let f_n(t) = sum_k λ_k a_k^{(n)} φ_k(t) be a Cauchy sequence in H(R) such that each sequence {a_k^{(n)}, n = 1, 2, ...} converges to a limit point a_k. Hence the Cauchy sequence converges to f(t) = sum_k λ_k a_k φ_k(t), which belongs to H(R). Therefore H(R) is a Hilbert space.

H(R) has two important properties which make it a reproducing kernel Hilbert space. First, let R(t, ·) be the function on T with value at s in T equal to R(t, s); then by the Mercer's Theorem eigen-expansion for the covariance function Eq. (1-14), we have

    R(t, ·) = sum_k λ_k φ_k(t) φ_k(·).                                       (1-18)

Therefore, R(t, ·) ∈ H(R) for each t in T. Second, for every function f(·) ∈ H(R) of form given by Eq. (1-15) and every t in T,

    <f, R(t, ·)>_{H(R)} = sum_k λ_k a_k φ_k(t) = f(t).                       (1-19)

By the Moore-Aronszajn Theorem, H(R) is a reproducing kernel Hilbert space with R(t, s) as the reproducing kernel. It follows that

    <R(t, ·), R(s, ·)>_{H(R)} = R(t, s).                                     (1-20)

Thus H(R) is a representation of the random process {x_t : t ∈ T} with covariance function R(t, s). One may define a congruence G from H(R) onto L2(x_t, t ∈ T) such that

    G(R(t, ·)) = x_t.                                                        (1-21)
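The reproducing property of H(R) can be checked in eigen-coordinates on a finite grid. The sketch below (an added illustration; the grid, covariance, and coefficients are arbitrary choices) builds f = sum_k λ_k a_k φ_k, represents the section R(t_j, ·) by its coordinates b_k = φ_k(t_j), and confirms that the inner product of Eq. (1-17) recovers f(t_j) as in Eq. (1-19):

```python
# Reproducing property of H(R) in the eigenbasis of a discrete covariance.
import numpy as np

t = np.linspace(0.1, 1.0, 5)
R = np.minimum.outer(t, t)
lam, phi = np.linalg.eigh(R)                  # strictly positive lam_k here
a = np.array([0.7, -1.2, 0.4, 2.0, -0.3])     # coefficients of f, Eq. (1-15)
f = phi @ (lam * a)                           # f(t_i) = sum_k lam_k a_k phi_k(t_i)

j = 2
b = phi[j, :]                                 # coordinates of R(t_j, .), Eq. (1-18)
inner_fR = np.sum(lam * a * b)                # <f, R(t_j, .)>, Eq. (1-17)
# inner_fR equals f[j]: the reproducing property, Eq. (1-19)
```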


Note that the congruence map G can be characterized as the unique mapping from H(R) onto L2(x_t, t ∈ T) satisfying the condition that for every functional f in H(R)

    E[G(f) x_t] = f(t).    (1-23)

It is obvious that G in Eq. (1-21) fulfills the condition Eq. (1-23). Then the congruence map can be represented explicitly as

    G(f) = Σ_{k=1}^∞ a_k ξ_k,    (1-24)

where a_k satisfies condition Eq. (1-16) and the ξ_k are the uncorrelated random variables of the expansion Eq. (1-22). To prove that the representation Eq. (1-24) is a valid and unique map, substituting Eq. (1-22) and Eq. (1-24) into Eq. (1-23), we obtain

    E[G(f) x_t] = E[(Σ_{k=1}^∞ a_k ξ_k)(Σ_{j=1}^∞ ξ_j φ_j(t))] = Σ_{k=1}^∞ λ_k a_k φ_k(t) = f(t).

Parzen applied Loève's results to statistical signal processing, particularly the estimation, regression and detection problems, in the late 1950s. Parzen clearly illustrated that the RKHS approach offers an elegant general framework for minimum variance unbiased


estimation [25], and derived the basic RKHS formula for the likelihood ratio in detection through sampling representations of the observed random process. The nonsingularity condition for the known-signal problem was also presented. In [26], Parzen provided a survey of the wide range of RKHS applications in statistical signal processing and random process theory. The structural equivalences among problems in control, estimation, and approximation are briefly discussed. These research directions have been developed further since 1970. Most of Parzen's results can be found in [25-28]. The book [29] contains some other papers published by Parzen in the 1960s.

Meanwhile, the Czech statistician Hájek established the basic congruence relation between the Hilbert space of random variables spanned by a random process and the RKHS determined by the covariance function of the random process, unaware of the work of Loève, Parzen and even Aronszajn. In a remarkable paper [30], he showed that estimation and detection problems can be approached by inverting the basic congruence map for stationary random processes with rational spectral densities. Hájek also derived the likelihood ratio using only the individual RKHS norms under a strong nonsingularity condition.

In the early 1970s, Kailath presented a series of papers on the RKHS approach to detection and estimation problems [31-35]. In [31], Kailath discusses the RKHS approach in great detail to demonstrate its superiority in computing likelihood ratios, testing for nonsingularity, bounding signal detectability, and determining detection stability. A simple but formal expression for the likelihood ratio using the RKHS norm is presented in [32], together with a test that can verify whether the likelihood ratio obtained from the formal RKHS expression is correct. The RKHS approach to detection problems is based on the fact that the statistics of a zero-mean Gaussian random process are completely characterized by its covariance function, which turns out to be a reproducing kernel. In order to extend


[34]. Paper [35] considers the variance bounds for unbiased estimates of parameters determining the mean or covariance of a Gaussian random process. An explicit formula is also provided for estimating the arrival time of a step function in white Gaussian noise.

The RKHS method has also been applied to more difficult aspects of random processes. For instance, Hida and Ikeda studied the congruence relation between the nonlinear span of an independent-increment process and the RKHS determined by its characteristic function. Orthogonal expansions of nonlinear functions of such processes can be derived based on this relation [36]. Kallianpur presents a nonlinear span expression for a Gaussian process as the direct sum of tensor products [37]. Another important RKHS application area is the canonical, or innovations, representations for Gaussian processes. Hida was the first to present the connection between RKHS and canonical representations [38]. Kailath presented the generalized innovations representations for non-white-noise innovations processes [33]. RKHS has also been applied to deal with Markovian properties of multidimensional Gaussian processes (random fields). Papers [39, 40] provide RKHS developments on multidimensional Brownian motion and the conditions for more general Gaussian fields to be Markovian.

Besides the successful applications of RKHS in estimation, detection and other statistical signal processing areas, there has been extensive research on applying RKHS to a wide variety of problems in optimal approximation, including interpolation and smoothing by spline functions in one or more dimensions (curve and surface fitting). In [41] Weinert surveys the one-dimensional case in the RKHS formulation of recursive spline algorithms and connections with least-squares estimation. Optimality properties of splines are developed and an explicit expression for the reproducing kernel in the polynomial case is proposed by de Boor in [42]. Schumaker presents a survey of applications of RKHS


[43]. Wahba presents extensive results on splines in [44]. Figueiredo took a different approach to applying RKHS in nonlinear system and signal analysis. He built the RKHS from the bottom up using arbitrarily weighted Fock spaces [45]. The spaces are composed of polynomials or power series in either a scalar variable or multi-dimensional ones. The spaces can also be extended to infinite or finite Volterra functional or operator series. The generalized Fock spaces have been applied to nonlinear system approximation, semiconductor device characteristics modeling and other problems [45].

The RKHS approach has enjoyed successful applications in a wide range of problems in statistical signal processing since the 1940s, and continues to bring new perspectives and methods to old and new problems. The essential idea behind this is that there exists a congruence map between the Hilbert space of random variables spanned by the random process and the unique RKHS determined by its covariance function. The RKHS framework provides a natural link between stochastic and deterministic functional analysis.

In statistical learning theory [46, 47], the risk minimization criterion is used to search for the


optimal function. The objective is to find the optimal function f(x; ω_o) such that the risk functional R(ω) is minimized over all possible functions when the joint probability distribution P(x, y) is fixed but unknown and the only available information is the data set.

The evolution of statistical learning theory has undergone three periods. In the 1960s, efficient linear algorithms were proposed to detect linear relations between the input and the response. One example was the perceptron algorithm, introduced in 1958 [48]. The major research challenge at that time was the problem of how to detect nonlinear relations. In the mid 1980s, the field of statistical learning underwent a nonlinear revolution with the almost simultaneous introduction of back-propagation multilayered neural networks and efficient decision-tree learning algorithms. This nonlinear revolution drastically changed the field of statistical learning, and new research directions such as bioinformatics and data mining emerged. However, these nonlinear algorithms were mainly based on gradient descent, greedy heuristics and other numerical optimization techniques, and so suffered from local minima and related problems. Because their statistical behavior was not well understood, they also experienced overfitting. A third stage in the evolution of statistical learning theory took place in the mid-1990s with the introduction of the support vector machine [47] and other kernel-based learning algorithms [49], such as kernel principal component analysis [50], kernel Fisher discriminant analysis [51] and kernel independent component analysis [52]. The new algorithms offered efficiency in analyzing nonlinear relations from the computational, statistical and conceptual points of view, and made it possible to do so in high-dimensional feature spaces without the dangers of overfitting. The problems of local minima and overfitting that were typical of neural networks and decision trees have been overcome.


It follows from the Mercer's theorem Eq. (1-14) that any symmetric positive definite function κ(x, y) can be rewritten as an inner product between two vectors in the feature space, i.e.,

    κ(x, y) = ⟨Φ(x), Φ(y)⟩,  with  Φ : x ↦ {√λ_k φ_k(x), k = 1, 2, ...}.

Kernel-based learning algorithms use this idea to map the data from the original input space to a high-dimensional, possibly infinite-dimensional, feature space. By the Moore-Aronszajn theorem of the previous section, there exists a unique RKHS corresponding to the symmetric positive definite kernel κ(x, y). Therefore the feature space where the transformed data reside is a reproducing kernel Hilbert space, where the nonlinear mapping Φ constitutes the basis. Instead of considering the given learning problems in the input space, one can deal with the transformed data {Φ_k(x), k = 1, 2, ...} in the feature space. When the learning algorithms can be expressed in terms of inner products, this nonlinear mapping becomes particularly interesting and useful, since one can employ the kernel trick to compute the inner products in the feature space via kernel functions without knowing the exact nonlinear mapping. The essence of kernel-based learning algorithms is that the inner product of the transformed data can be implicitly computed in the RKHS without explicitly using, or even knowing, the nonlinear mapping. Hence, by applying kernels one can elegantly build a nonlinear version of any linear algorithm based on inner products. One of the rationales for nonlinearly mapping the data into a high-dimensional RKHS is Cover's theorem on the separability of patterns [53].
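The kernel trick described above can be checked numerically for a kernel whose feature map is known in closed form. For the homogeneous polynomial kernel κ(x, y) = (x·y)² on R², one explicit feature map is Φ(x) = (x_1², x_2², √2 x_1 x_2); the sketch below (function and variable names are illustrative, not from the text) verifies that the kernel value equals the feature-space inner product:

```python
import numpy as np

def poly2_feature_map(x):
    # Explicit feature map for kappa(x, y) = (x . y)^2 on R^2:
    # phi(x) = (x1^2, x2^2, sqrt(2) * x1 * x2).
    x1, x2 = x
    return np.array([x1 ** 2, x2 ** 2, np.sqrt(2.0) * x1 * x2])

def poly2_kernel(x, y):
    # The kernel evaluates the feature-space inner product implicitly.
    return float(np.dot(x, y)) ** 2

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])
lhs = poly2_kernel(x, y)                                         # (1*3 + 2*(-1))^2
rhs = float(np.dot(poly2_feature_map(x), poly2_feature_map(y)))  # <phi(x), phi(y)>
```

The point is that poly2_kernel never forms Φ explicitly, yet returns exactly ⟨Φ(x), Φ(y)⟩; for kernels with infinite-dimensional feature spaces, such as the Gaussian kernel, the implicit computation is the only option.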


Research on kernel-based learning algorithms became very active after Vapnik's seminal work on support vector machines was published in the 1990s. Researchers started to kernelize most of the linear algorithms that can be expressed in terms of inner products. One of the drawbacks of kernel-based learning algorithms is computational complexity: most of them eventually result in operations on a Gram matrix whose dimension depends on the number of data points. For instance, computing the eigenvalues and eigenvectors of a Gram matrix of dimension in the thousands demands a great deal of computation. Therefore, a large number of optimization algorithms based on numerical linear algebra have been developed to address this issue. On the other hand, since kernel-based learning from data usually ends up as an ill-posed problem, regularization through nonlinear functionals becomes necessary, and cross-validation is then needed to choose an optimal regularization parameter [54].

An alternative is information-theoretic learning (ITL) [55], where kernel-based density estimators form the essence of this learning paradigm. Information-theoretic learning is a signal processing technique that combines information theory and adaptive systems to implement information filtering without requiring a model of the data distributions. ITL uses the concepts of Parzen windowing applied to Renyi's entropy definition to obtain a sample-by-sample algorithm that estimates entropy directly from pairs of sample interactions. By utilizing Renyi's measure of entropy and approximations


Information-theoretic learning has achieved excellent results in a number of learning scenarios, e.g., blind source separation [56], supervised learning [57] and others [55].

One of the most commonly used cost functions in information-theoretic learning is the quadratic Renyi's entropy [58]. Renyi's entropy is a generalization of Shannon's entropy. Given a PDF f(x) for a random variable x, the quadratic Renyi's entropy is defined as

    H_2(x) = −log ∫ f²(x) dx = −log I(x),  with  I(x) = ∫ f²(x) dx = E[f(x)],    (1-28)

where I(x) is called the information potential, so named due to its similarity with the potential energy field in physics [55]. The concept and properties of the information potential have been mathematically studied, and a new criterion based on the information potential, called MEE (minimization of error entropy), has been proposed to adapt linear and nonlinear systems [59]. MEE serves as an alternative to the conventional MSE (mean square error) in nonlinear filtering, with several advantages in performance.

A non-parametric, asymptotically unbiased and consistent estimator for a given PDF f(x) is defined as [60]

    f̂(x) = (1/N) Σ_{i=1}^N κ(x, x_i),

where κ(·, ·) is called the Parzen window, or kernel, which is the same symmetric non-negative definite function used in kernel-based learning theory, such as the Gaussian kernel, polynomial kernel and others [61]. Then, by approximating the expectation by the sample mean in Eq. (1-28), we can estimate the information potential directly from the


data as

    Î(x) = (1/N²) Σ_{i=1}^N Σ_{j=1}^N κ(x_i, x_j),

where {x_i}_{i=1}^N is the data sample and N is the total number of samples. According to the Mercer's theorem [13], any symmetric non-negative definite kernel function has an eigen-decomposition κ(x, y) = ⟨Φ(x), Φ(y)⟩_H, where Φ(x) is the nonlinearly transformed data in the RKHS H induced by the kernel function and the inner product is performed in H. Therefore, we can rewrite the estimate of the information potential as

    Î(x) = ⟨(1/N) Σ_{i=1}^N Φ(x_i), (1/N) Σ_{j=1}^N Φ(x_j)⟩_H = ||(1/N) Σ_{i=1}^N Φ(x_i)||²_H.

ITL has also been used to characterize the divergence between two random variables. In information theory, mutual information is one of the quantities that quantify the divergence between two random variables. Another well-known divergence measure is the Kullback-Leibler divergence [62]. However, the Kullback-Leibler measure is difficult to evaluate in practice without imposing simplifying assumptions about the data, and numerical methods are required to evaluate the integrals. In order to integrate the non-parametric PDF estimation via Parzen windowing and obtain an efficient estimate, two divergence measures for random variables, based on the Euclidean difference of vectors inequality and the Cauchy-Schwarz inequality respectively, have been proposed [55].
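The double-sum estimate of the information potential can be sketched directly from the formula above; the Gaussian kernel, the bandwidth σ and the synthetic data below are illustrative assumptions, not part of the original formulation:

```python
import numpy as np

def gaussian_kernel(u, sigma):
    # Normalized Gaussian Parzen window evaluated at u.
    return np.exp(-u ** 2 / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)

def information_potential(x, sigma):
    # Double-sum estimate: I_hat = (1/N^2) * sum_i sum_j kappa(x_i - x_j).
    diffs = x[:, None] - x[None, :]
    return float(np.mean(gaussian_kernel(diffs, sigma)))

rng = np.random.default_rng(0)
x = rng.normal(size=500)
ip = information_potential(x, sigma=0.5)
h2 = -np.log(ip)  # quadratic Renyi entropy estimate: -log of the information potential
```

Note that the O(N²) pairwise interaction is exactly the Gram-matrix cost discussed earlier for kernel methods.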


The divergence measure based on the Euclidean difference of vectors inequality is given by

    D_ED(f, g) = ∫ f²(x) dx − 2 ∫ f(x) g(x) dx + ∫ g²(x) dx,    (1-31)

and the divergence measure based on the Cauchy-Schwarz inequality is given by

    D_CS(f, g) = −log( (∫ f(x) g(x) dx)² / (∫ f²(x) dx ∫ g²(x) dx) ).    (1-32)

Notice that both D_ED(f, g) and D_CS(f, g) are greater than or equal to zero, and equality holds if and only if f(x) = g(x).

The Euclidean Eq. (1-31) and Cauchy-Schwarz Eq. (1-32) divergence measures can be easily extended to two-dimensional random variables. As a special case, if we substitute the marginal PDFs f and g in Eq. (1-31) and Eq. (1-32) by the joint PDF f_{1,2}(x_1, x_2) and the product of marginal PDFs f_1(x_1) f_2(x_2) respectively, the Euclidean quadratic mutual information is given by [55]

    I_ED(f_1, f_2) = ∫∫ (f_{1,2}(x_1, x_2) − f_1(x_1) f_2(x_2))² dx_1 dx_2,    (1-33)

and the Cauchy-Schwarz quadratic mutual information is defined as [55]

    I_CS(f_1, f_2) = −log( (∫∫ f_{1,2}(x_1, x_2) f_1(x_1) f_2(x_2) dx_1 dx_2)² / (∫∫ f²_{1,2}(x_1, x_2) dx_1 dx_2 ∫∫ f²_1(x_1) f²_2(x_2) dx_1 dx_2) ).    (1-34)

As can be seen from the above, I_ED(f_1, f_2) ≥ 0 and I_CS(f_1, f_2) ≥ 0, with equality if and only if the two random variables are statistically independent. Basically, the Euclidean quadratic mutual information measures the Euclidean difference between the joint PDF and the factorized marginals, and likewise for the Cauchy-Schwarz quadratic mutual information.
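Replacing each integral in Eq. (1-32) with its Parzen double-sum estimate gives a plug-in estimator of the Cauchy-Schwarz divergence from samples. A sketch under assumed bandwidth and sample sizes (all names are made up):

```python
import numpy as np

def gaussian_kernel(u, sigma=1.0):
    return np.exp(-u ** 2 / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)

def cross_information_potential(x, y, sigma=1.0):
    # Parzen double-sum estimate of integral f(t) g(t) dt from samples of f and g.
    return float(np.mean(gaussian_kernel(x[:, None] - y[None, :], sigma)))

def cs_divergence(x, y, sigma=1.0):
    # D_CS(f, g) = -log( V(f,g)^2 / (V(f,f) * V(g,g)) ) >= 0.
    vfg = cross_information_potential(x, y, sigma)
    vff = cross_information_potential(x, x, sigma)
    vgg = cross_information_potential(y, y, sigma)
    return float(-np.log(vfg ** 2 / (vff * vgg)))

rng = np.random.default_rng(1)
same = cs_divergence(rng.normal(size=400), rng.normal(size=400))
apart = cs_divergence(rng.normal(size=400), rng.normal(loc=3.0, size=400))
```

Because the same smoothing enters the numerator and the denominator, the estimate stays near zero for samples from the same density and grows as the densities separate.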


These quadratic mutual information measures have been applied to problems such as independent component analysis [63] and clustering [64].

One of the limitations of ITL is that it does not convey the time structure of signals, because it assumes i.i.d. data. However, in practice most signals in engineering have correlation in time, i.e., temporal structure. It would be helpful to incorporate the temporal structure while still capturing higher-order statistics, for instance when working with coded source signals in digital communications.


Recently a new generalized correlation function, called correntropy, has been proposed to combine these two kernels and to characterize both the temporal structure and the statistical information of random processes [65]. Correntropy has been applied to various signal processing and machine learning problems and has produced promising results. The correntropy-based matched filter outperforms the conventional matched filter in impulsive noise scenarios [66]. The correntropy MACE filter has been proposed for image recognition [67]. Since correntropy induces a new RKHS, it is able to bring nonlinearity into traditional statistical signal processing. The correntropy Wiener filter nonlinearly transforms the original random process into the high-dimensional RKHS induced by the kernel function while minimizing the mean square error between the desired and output signals. The output signal is represented by the inner product between the nonlinearly transformed input signal and the weights in the RKHS. The correntropy Wiener filter exhibits much better performance than the conventional Wiener filter and the multilayer perceptron [68]. These up-to-date developments of correntropy clearly demonstrate its promising features and applicable areas, which effectively introduce a new nonlinear signal processing framework.


In this dissertation, I analyze the correntropy function proposed in [65], and present another generalized covariance function, called centered correntropy. The correntropy and centered correntropy functions are essentially a combination of the expectation operator and a pre-designed kernel operator. It can be easily seen that the new operators are also symmetric positive definite and thus induce other reproducing kernel Hilbert spaces, which drastically change the structure of the reproducing kernel Hilbert spaces induced by the conventional autocorrelation function and the pre-designed kernel function. Although correntropy and centered correntropy have been applied to several different signal processing and machine learning problems, further theoretical analysis and experimental work are needed to fully elucidate the new concept and evaluate the associated properties. This dissertation strives to serve as one of these efforts.

This dissertation is organized as follows. In chapter 3, the definitions of the generalized correlation and covariance functions, called correntropy and centered correntropy respectively, are proposed and analyzed from the time-domain and frequency-domain points of view. Chapter 4 addresses the geometric structure of the reproducing kernel Hilbert spaces induced by the centered correntropy. A new explicit construction of the RKHS with the Gaussian kernel is presented. A parametric correntropy function is proposed in chapter 5 to quantify dependence. The application of centered correntropy to principal component analysis is presented in chapter 6. I also apply correntropy to pitch determination in chapter 7 and to nonlinear coupling measurement in chapter 8. I conclude the work and present some future directions in chapter 9.


In this chapter, we propose a reproducing kernel Hilbert space (RKHS) framework for information-theoretic learning (ITL). The RKHS is uniquely determined by the symmetric non-negative definite kernel function which is defined as the cross information potential (CIP) in ITL. The cross information potential, as an integral of the product of two probability density functions, characterizes the similarity between two random variables. We also prove the existence of a one-to-one congruence mapping between the presented RKHS and the Hilbert space spanned by probability density functions. All the cost functions in the original information-theoretic learning formulation can be rewritten as algebraic computations on functional vectors in the reproducing kernel Hilbert space. We prove a lower bound for the information potential based on the presented RKHS. The proposed RKHS framework offers an elegant and insightful geometric perspective towards information-theoretic learning.

From the definitions of the various cost functions in information-theoretic learning, we see that the most fundamental quantity is the integral of the product of two probability density functions (PDFs), ∫ f(x) g(x) dx, which is called the cross information potential (CIP) [55]. The cross information potential measures the similarity between two PDFs, while the information potential Eq. (1-28) is nothing but a measure of self-similarity. The CIP appears in both the Euclidean and Cauchy-Schwarz divergence measures. In this chapter, we shall develop the reproducing kernel Hilbert space framework for information-theoretic learning based on the cross information potential.


Let E be the set of square-integrable probability density functions, and consider linear combinations of its elements,

    Σ_{i∈I} α_i f_i(x),    (2-1)

for any index set I and α_i ∈ R. Close the set in Eq. (2-1) topologically according to convergence in the mean using the norm

    ∫ (f_i(x) − f_j(x))² dx,  ∀ i, j ∈ I,    (2-2)

and denote the set of all linear combinations of PDFs and its limit points by L2(E). L2(E) is an L2 space on PDFs. Moreover, by the theory of quadratically integrable functions, we know that the linear space L2(E) forms a Hilbert space if an inner product is imposed accordingly. Given any two PDFs f_i(x) and f_j(x) in E, we can define an inner product as

    ⟨f_i, f_j⟩_{L2(E)} = ∫ f_i(x) f_j(x) dx.    (2-3)

Notice that this inner product is exactly the cross information potential [55]. This definition of inner product has the corresponding norm of Eq. (2-2). Hence, L2(E) equipped with the inner product Eq. (2-3) is a Hilbert space. However, it is not a reproducing kernel Hilbert space, because the inner product does not satisfy the reproducing property in L2(E). Next we show that the inner product Eq. (2-3) is symmetric non-negative definite, and by the Moore-Aronszajn theorem it uniquely determines a reproducing kernel Hilbert space.
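As a quick sanity check of the inner product Eq. (2-3), the CIP of two Gaussian densities has a closed form, ∫ N(x; μ_1, σ_1²) N(x; μ_2, σ_2²) dx = N(μ_1 − μ_2; 0, σ_1² + σ_2²), which a direct numerical integration should reproduce. A sketch (grid limits and names are arbitrary choices):

```python
import numpy as np

def gauss_pdf(x, mu, sigma):
    return np.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)

def cip_numeric(mu1, s1, mu2, s2):
    # Riemann-sum approximation of integral f(x) g(x) dx on a wide grid.
    x = np.linspace(-20.0, 20.0, 400001)
    dx = x[1] - x[0]
    return float(np.sum(gauss_pdf(x, mu1, s1) * gauss_pdf(x, mu2, s2)) * dx)

def cip_closed_form(mu1, s1, mu2, s2):
    # integral N(x; mu1, s1^2) N(x; mu2, s2^2) dx = N(mu1 - mu2; 0, s1^2 + s2^2).
    return float(gauss_pdf(mu1 - mu2, 0.0, np.hypot(s1, s2)))

num = cip_numeric(0.0, 1.0, 1.5, 2.0)
ref = cip_closed_form(0.0, 1.0, 1.5, 2.0)
```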


First, we define a bivariate function on the set E as

    V(f_i, f_j) = ∫ f_i(x) f_j(x) dx.    (2-4)

This function is also the definition of the inner product Eq. (2-3), and of the cross information potential between two PDFs. This will be the kernel function in the RKHS H_V constructed below. In reproducing kernel Hilbert space theory, the kernel function is a measure of similarity between functionals. As pointed out earlier, the cross information potential is a similarity measure between two probability density functions; hence it is natural and meaningful to define the kernel function as such. Next, we show that the function Eq. (2-4) is symmetric non-negative definite as a mapping E × E → R. Symmetry is immediate from the definition, and non-negative definiteness follows since for any finite set {f_i} and real numbers {α_i},

    Σ_i Σ_j α_i α_j V(f_i, f_j) = ∫ (Σ_i α_i f_i(x))² dx ≥ 0.

Equipped with this property of Eq. (2-4), we construct the RKHS H_V from the bottom up. Since the function Eq. (2-4) is symmetric and non-negative definite, it also has an eigen-decomposition by the Mercer's theorem [13] as

    V(f_i, f_j) = Σ_{k=1}^∞ λ_k ψ_k(f_i) ψ_k(f_j),    (2-5)


where the series converges absolutely and uniformly [13].

Then we define a space H_V consisting of all functionals G(·) whose evaluation for any given PDF f_i(x) ∈ E is defined as

    G(f_i) = Σ_{k=1}^∞ λ_k a_k ψ_k(f_i),    (2-6)

where the sequence {a_k, k = 1, 2, ...} satisfies the following condition

    Σ_{k=1}^∞ λ_k a_k² < ∞.    (2-7)

Furthermore, we define an inner product of two functionals in H_V as

    ⟨G, F⟩_{H_V} = Σ_{k=1}^∞ λ_k a_k b_k,    (2-8)

where G and F are of the form Eq. (2-6), and a_k and b_k satisfy property Eq. (2-7).

It can be verified that the space H_V equipped with the kernel function Eq. (2-4) is indeed a reproducing kernel Hilbert space, and the kernel function V(f_i, ·) is a reproducing kernel, because of the following two properties:

1. For every f_i in E, V(f_i, ·) belongs to H_V, since by the eigen-expansion Eq. (2-5) it is of the form Eq. (2-6) with a_k = ψ_k(f_i).


2. Given any G ∈ H_V, the inner product between the reproducing kernel and G yields the function evaluation itself, by the definition Eq. (2-8):

    ⟨G, V(f_i, ·)⟩_{H_V} = Σ_{k=1}^∞ λ_k a_k ψ_k(f_i) = G(f_i).

Therefore, H_V is a reproducing kernel Hilbert space with the kernel function and inner product defined above. Moreover, the Mercer eigen-decomposition Eq. (2-5) lets us rewrite the kernel as an inner product of transformed vectors,

    V(f_i, f_j) = ⟨Φ_V(f_i), Φ_V(f_j)⟩_{H_V},  Φ_V : f_i ↦ {√λ_k ψ_k(f_i), k = 1, 2, ...}.    (2-9)

We emphasize here that the reproducing kernel V(f_i, f_j) is data dependent, by which we mean that the norm of the nonlinearly transformed vector in the RKHS H_V depends on the PDF of the original random variable, because

    ||Φ_V(f_i)||²_{H_V} = V(f_i, f_i) = ∫ f_i²(x) dx,

whereas κ(x, x) = κ(0) if we use a translation-invariant kernel function [61]. The value of κ(0) is a constant regardless of the original data. Consequently, the reproducing kernel Hilbert spaces H_V and H determined by V(f_i, f_j) and κ(x, y) respectively are very different in nature.


We have presented two Hilbert spaces, the Hilbert space L2(E) of PDFs and the reproducing kernel Hilbert space H_V. Even though their elements are very different, there actually exists a one-to-one congruence mapping (isometric isomorphism) Ψ from the RKHS H_V onto L2(E) such that

    Ψ(V(f_i, ·)) = f_i(x).    (2-10)

Notice that the mapping Ψ preserves isometry between H_V and L2(E), since by the definitions of the inner products Eq. (2-3) in L2(E) and Eq. (2-9) in H_V

    ⟨V(f_i, ·), V(f_j, ·)⟩_{H_V} = V(f_i, f_j) = ⟨f_i, f_j⟩_{L2(E)} = ⟨Ψ(V(f_i, ·)), Ψ(V(f_j, ·))⟩_{L2(E)}.

In order to obtain an explicit representation of Ψ, we define an orthogonal function sequence {ψ̃_m(x), m = 1, 2, ...} satisfying

    ∫ ψ̃_m(x) ψ̃_n(x) dx = λ_m δ_{m,n},    (2-11)

where λ_k and ψ_k(f_i) are the eigenvalue and eigenfunction associated with the kernel function V(f_i, f_j) by the Mercer's theorem Eq. (2-5). We achieve an orthogonal decomposition of the probability density function as

    f_i(x) = Σ_{k=1}^∞ ψ_k(f_i) ψ̃_k(x).    (2-12)

The normality condition is fulfilled by the assumption Eq. (2-11).


The congruence map Ψ can be characterized as the unique mapping from H_V onto L2(E) satisfying the condition that for every functional G in H_V

    ⟨Ψ(G), f_i⟩_{L2(E)} = G(f_i),  ∀ f_i ∈ E.    (2-13)

It is obvious that Ψ in Eq. (2-10) fulfills the condition Eq. (2-13). Then the congruence map can be represented explicitly as

    Ψ(G) = Σ_{k=1}^∞ a_k ψ̃_k(x),  ∀ G ∈ H_V,    (2-14)

where a_k satisfies condition Eq. (2-7).

To prove that the representation Eq. (2-14) is a valid and unique map, substituting Eq. (2-12) and Eq. (2-14) into Eq. (2-13), we obtain

    ⟨Ψ(G), f_i⟩_{L2(E)} = ∫ (Σ_{k=1}^∞ a_k ψ̃_k(x)) (Σ_{j=1}^∞ ψ_j(f_i) ψ̃_j(x)) dx = Σ_{k=1}^∞ λ_k a_k ψ_k(f_i) = G(f_i).


We now rewrite the cost functions of ITL in terms of the kernel Eq. (2-4). First, as the kernel function V(f_i, f_j) in H_V is defined as the cross information potential between two PDFs, we immediately have

    V(f_i, f_j) = ⟨V(f_i, ·), V(f_j, ·)⟩_{H_V}.    (2-15)

That is, the cross information potential is the inner product between two nonlinearly transformed functionals in the RKHS H_V. The inner product quantifies similarity between two functionals, which is consistent with the definition of the cross information potential. The information potential can thus be specified as the inner product of a functional with itself,

    I(f_i) = V(f_i, f_i) = ||V(f_i, ·)||²_{H_V}.    (2-16)

The information potential appears as the squared norm of a nonlinearly transformed functional in the RKHS H_V. Therefore, minimizing the error entropy in ITL turns out to be maximization of the squared norm in the RKHS H_V (because the information potential is the argument of the log in Renyi's quadratic entropy). As stated in ITL, MEE employs higher-order statistics in nonlinear adaptive system training since it is based on Renyi's quadratic entropy [59]. We observe here that the higher-order statistics in MEE become


[25]. But the RKHS H_R only takes the second-order statistics into account, i.e., the mean square error. The RKHS H_V implicitly embeds higher-order statistics. Compared to the RKHS H induced by the pre-designed kernel function used in machine learning, our framework is theoretically more elegant, because it corresponds to the definition of the information potential directly, without employing any kernel-based PDF estimator. From the computational point of view, the estimate of the information potential based on the Parzen window PDF estimator yields a direct calculation of the information quantity from data. From the theoretical perspective, however, it is more appropriate to define the RKHS framework based on the information potential itself instead of on the estimate obtained with a kernel-based PDF estimator.

Based on the reformulations of the cross information potential Eq. (2-15) and the information potential Eq. (2-16) in the RKHS H_V, we are ready to rewrite the one-dimensional Euclidean Eq. (1-31) and Cauchy-Schwarz Eq. (1-32) divergence measures in terms of operations on functionals in H_V. First,

    D_ED(f, g) = ||V(f, ·) − V(g, ·)||²_{H_V},

which measures the same squared Euclidean distance between the two functional vectors as Eq. (1-31) does. The Cauchy-Schwarz divergence measure can be phrased as

    D_CS(f, g) = −log( ⟨V(f, ·), V(g, ·)⟩²_{H_V} / (||V(f, ·)||²_{H_V} ||V(g, ·)||²_{H_V}) ).


To extend the same formulation to the Euclidean and Cauchy-Schwarz quadratic mutual information Eq. (1-33) and Eq. (1-34), consider the products of marginal PDFs f_1(x_1) f_2(x_2) as a special subset A_2 of the set E_2 of two-dimensional square-integrable PDFs, in which the joint PDF factorizes into the product of marginals, i.e., A_2 ⊂ E_2. Both measures then characterize different geometric information between the joint PDF and the factorized marginal PDFs. The Euclidean quadratic mutual information Eq. (1-33) can be expressed as

    I_ED(f_1, f_2) = ||V(f_{1,2}, ·) − V(f_1 f_2, ·)||²_{H_V(2)},

and the Cauchy-Schwarz quadratic mutual information Eq. (1-34) as

    I_CS(f_1, f_2) = −log(cos² θ).

The angle θ is the separation between the two functional vectors in H_V(2). When the two random variables are independent (f_{1,2}(x_1, x_2) = f_1(x_1) f_2(x_2) and A_2 = E_2), θ = 0 and the divergence measure I_CS(f_1, f_2) = 0, since the two sets are equal. If θ = 90°, the two vectors in H_V(2) are orthogonal and the joint PDF is singular with respect to the product of marginals. In this case, the divergence measure is infinite.

The proposed RKHS framework provides an elegant and insightful geometric perspective towards information-theoretic learning. All the cost functions in ITL can now be re-expressed in terms of algebraic operations on functionals in the RKHS H_V.


We derive a lower bound for the information potential Eq. (1-28) in this section. First we cite the projection theorem in Hilbert space that we will use in the following proof.

Projection theorem: Let M be the subspace of a Hilbert space spanned by N linearly independent vectors {u_1, ..., u_N}. The orthogonal projection of a vector s onto M is

    P(s|M) = Σ_{i=1}^N Σ_{j=1}^N ⟨s, u_i⟩ K^{-1}(i, j) u_j,    (2-18)

where K(i, j) is the N × N Gram matrix whose (i, j)-th entry is given by ⟨u_i, u_j⟩. The projected vector P(s|M) also satisfies the following conditions:

    ⟨s − P(s|M), u⟩ = 0 for every u ∈ M,  and  ||P(s|M)|| ≤ ||s||.    (2-19)

The geometrical explanation of the theorem is straightforward. Readers can refer to [21] for a thorough proof. Now we state the proposition on a lower bound for the information potential.


Proposition: For any PDF f in E and any set of PDFs {g_1, ..., g_N} in E with Gram matrix G(i, j) = V(g_i, g_j), the information potential is lower bounded as

    I(f) ≥ Σ_{i,j=1}^N V(f, g_i) G^{-1}(i, j) V(f, g_j).    (2-21)

Proof: By the projection theorem Eq. (2-18), we can find the orthogonal projection of V(f, ·) onto the subspace M spanned by {V(g_i, ·), i = 1, ..., N} as

    P(V(f, ·)|M) = Σ_{i,j=1}^N ⟨V(f, ·), V(g_i, ·)⟩_{H_V} G^{-1}(i, j) V(g_j, ·).    (2-20)

By Eq. (2-20), the squared norm of the projection is

    ||P(V(f, ·)|M)||²_{H_V} = Σ_{i,j=1}^N V(f, g_i) G^{-1}(i, j) V(f, g_j),

and by the projection theorem Eq. (2-19) it satisfies

    ||P(V(f, ·)|M)||²_{H_V} ≤ ||V(f, ·)||²_{H_V} = I(f).

Combining the two relations above, we come to the conclusion of our proposition Eq. (2-21). Eq. (2-21) offers a theoretical lower bound for the minimization of the information potential.
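The projection-theorem argument can be illustrated with a finite-dimensional stand-in for H_V (ordinary Euclidean vectors; every name below is made up): the squared norm of the orthogonal projection of s onto span{u_1, ..., u_N}, computed through the Gram matrix as in Eq. (2-18), never exceeds ||s||², which is exactly the structure of the lower bound:

```python
import numpy as np

rng = np.random.default_rng(2)
U = rng.normal(size=(5, 8))  # rows u_1..u_5 span the subspace M (finite-dim stand-in)
s = rng.normal(size=8)       # vector to be projected

K = U @ U.T                  # Gram matrix K(i, j) = <u_i, u_j>
b = U @ s                    # inner products <s, u_i>
# Squared norm of the orthogonal projection of s onto M, via the Gram matrix:
# ||P(s|M)||^2 = b^T K^{-1} b.
proj_sq = float(b @ np.linalg.solve(K, b))
norm_sq = float(s @ s)
# Projection theorem: proj_sq <= norm_sq, so the quadratic form lower-bounds ||s||^2.
```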


[69]. Extension to infinite-dimensional, non-parametric sub-manifolds has been advanced [70]. For finite-dimensional, parametric families of PDFs, the only invariant metric on the tangent space is the Riemannian structure defined by the Fisher information [69, 71]. Our kernel function Eq. (2-4) is defined directly in the PDF space; the kernel function in information geometry, however, is defined in the parameter space, since it aims at estimating model parameters from data. Hence, the two methodologies define non-parametric and parametric reproducing kernel Hilbert spaces respectively, to tackle problems of interest from different perspectives.

Eq. (2-17) was applied to independent component analysis based on this interpretation [63]. Using probability distributions to measure similarity is nothing new. One customary quantity is the Kullback-Leibler divergence. However, it is neither positive definite nor symmetric, and hence does not have an associated reproducing kernel Hilbert space.


Recently, several probability product kernels have been proposed in the machine learning field to use ensembles instead of individual data to capture dependence between generative models [72]. It is shown that the Bhattacharyya coefficient, defined by

    ∫ √(f(x) g(x)) dx,

is a symmetric positive definite kernel [73]. The expected likelihood kernel in [73] is exactly the cross information potential. But as the probability product kernel was proposed purely from a machine learning point of view, it failed to relate to a broader information-theoretic framework. Our contribution in this chapter is to independently propose a reproducing kernel Hilbert space framework for information-theoretic learning, construct the RKHS from the bottom up, and prove its validity mathematically. Therefore the kernel function has a rich information-theoretic interpretation. Moreover, as ITL is mainly applied to adaptive signal processing, we employ a non-parametric method to compute the cross information potential kernel and other quantities without an explicit probability density function estimation; in contrast, a parametric generative model is assumed in order to calculate the kernel in their approach [72, 73].




[74]. But when the application has to deal with nonlinear systems or non-Gaussian signals, second-order statistics are not sufficient, because they fail to capture the nonlinearity and higher-order statistics intrinsic to the problems. There have been many attempts to tackle nonlinear and non-Gaussian signal processing. For instance, the Wiener and Hammerstein models are early methods proposed to implement optimal nonlinear system identification [75]. In these models, a static nonlinearity chosen a priori is placed in front of (or after) a linear time-invariant system, and the optimal solution is obtained via the Wiener-Hopf equation [75]. Other approaches include Volterra series [76]. Recently, Principe et al. applied information-theoretic quantities, such as entropy and mutual information, to adaptive signal processing, an approach named information-theoretic learning (ITL) [55]. But ITL methods lack the time-structure information of the random processes. To incorporate both the time structure and the higher-order statistics, a new generalized correlation function, called correntropy, was recently proposed by Santamaria et al. in [65]. This generalized correlation function can also be viewed as a correlation function for the transformed random process in the feature space.


The generalized covariance function, named centered autocorrentropy, is defined as

    U(t, s) = E[κ(x_t, x_s)] − E_{x_t} E_{x_s}[κ(x_t, x_s)]

for each t and s in T, where E denotes the statistical expectation operator and κ(·, ·) is a symmetric positive definite kernel function. Notice that the correntropy is the joint expectation of κ(x_t, x_s), while the centered correntropy is the difference between the joint expectation and the marginal expectations of κ(x_t, x_s).

In the literature, several symmetric positive definite kernel functions have been proposed for machine learning, function approximation, density estimation, support vector machines, etc., for example the sigmoidal kernel, Gaussian kernel, polynomial kernel and spline kernel, just to name a few. The most used kernel function is the Gaussian kernel, which is given by

    κ(x_t, x_s) = (1/(√(2π) σ)) exp(−(x_t − x_s)² / (2σ²)).

By the Mercer's theorem of chapter 1, Eq. (1-14), any symmetric positive definite function κ(x_t, x_s) can be rewritten as an inner product between two vectors in the feature


space, i.e.,

    κ(x_t, x_s) = ⟨Φ(x_t), Φ(x_s)⟩,  with  Φ : x_t ↦ {√λ_k φ_k(x_t), k = 1, 2, ...},

where Φ is the nonlinear mapping of Eq. (1-27). Therefore we can rewrite the autocorrentropy in terms of the nonlinear mapping as

    V(t, s) = E[⟨Φ(x_t), Φ(x_s)⟩].

Likewise, the centered autocorrentropy can also be expressed as

    U(t, s) = E[⟨Φ(x_t) − E[Φ(x_t)], Φ(x_s) − E[Φ(x_s)]⟩].    (3-6)

It can be seen that the correntropy function is a "conventional" correlation function for the transformed random process in the high-dimensional RKHS, while the centered correntropy is nothing but the correntropy for the zero-mean (centered) random process {Φ(x_t) − E[Φ(x_t)] : t ∈ T}. This way of defining the generalized correlation and covariance functions is in the same spirit as the standard correlation and covariance functions.

Correntropy can be applied either to a single time series, called the autocorrentropy as we have defined above, or to a pair of multidimensional random variables, called the cross correntropy. The definitions of the cross correntropy and centered cross correntropy functions are straightforward. They compute the generalized correlation across the space structure instead of the time structure.
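For a finite record {x_1, ..., x_N}, the expectation in the autocorrentropy can be replaced by a sample average over pairs at each lag, V̂(m) = (1/(N−m)) Σ_n κ(x_n, x_{n−m}). A minimal sketch with a normalized Gaussian kernel (bandwidth, data and names are illustrative assumptions):

```python
import numpy as np

def gaussian_kernel(u, sigma=1.0):
    return np.exp(-u ** 2 / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)

def correntropy(x, max_lag, sigma=1.0):
    # Sample autocorrentropy V_hat(m): average of kappa(x_n - x_{n-m}) over n,
    # for lags m = 0 .. max_lag.
    n = len(x)
    return np.array([np.mean(gaussian_kernel(x[m:] - x[:n - m], sigma))
                     for m in range(max_lag + 1)])

rng = np.random.default_rng(3)
x = rng.normal(size=2000)
v = correntropy(x, max_lag=5)
# v[0] equals kappa(0), the maximum; for m > 0 the values settle near the
# (kernel-smoothed) information potential of the process, cf. Eq. (3-10).
```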


For a pair of random variables x and y, the cross correntropy is defined as V(x, y) = E[κ(x, y)] and the centered cross correntropy as U(x, y) = E[κ(x, y)] − E_x E_y[κ(x, y)], where E denotes the statistical expectation operator, κ(·, ·) is a symmetric positive definite kernel function, and Φ is the nonlinear transformation associated with the kernel κ(·, ·), Eq. (1-27). Notice that the cross correntropy is the joint expectation of κ(x, y), while the centered cross correntropy is the difference between the joint expectation and the marginal expectations of κ(x, y).

The correntropy and centered correntropy functions for random variables share the same properties as the ones for random processes, since they can also be viewed as one instant of the correntropy and centered correntropy functions for random processes. Without ambiguity, we will call both the autocorrentropy and the cross correntropy simply correntropy functions in the following.

If the random process {x_t : t ∈ T} is pair-wise independent, in other words, if

    f_{X_t, X_s}(x_t, x_s) = f_{X_t}(x_t) f_{X_s}(x_s),

where f_{X_t, X_s}(x_t, x_s) is the joint probability density function (PDF) and f_{X_t}(x_t), f_{X_s}(x_s) are the marginal PDFs, then the correntropy at t ≠ s becomes

    V(t, s) = ∫∫ κ(x_t, x_s) f_{X_t}(x_t) f_{X_s}(x_s) dx_t dx_s.    (3-10)


The quantity Eq. (3-10) is called the information potential and corresponds to the argument of the logarithm of the quadratic Renyi's entropy when a Parzen window estimator is used [55]; hence the generalized correlation function is called correntropy. Under this pair-wise independence condition, the centered correntropy reduces to zero, since it is defined as the difference between the correntropy and the information potential. For conventional second-order random processes, only pair-wise uncorrelatedness is required to zero the covariance function. The condition of pair-wise independence is much stronger than pair-wise uncorrelatedness; this shows that the centered correntropy requires higher-order statistical information in order to attain zero. This can also be seen from the following observation.

Applying a Taylor series expansion to the Gaussian kernel, we can rewrite the correntropy function as

    V(t, s) = (1/(√(2π) σ)) Σ_{n=0}^∞ ((−1)ⁿ / (2ⁿ σ²ⁿ n!)) E[(x_t − x_s)²ⁿ],    (3-11)

which contains all the even-order moments of the random variable (x_t − x_s). Obviously, different kernel functions would yield different expansions, but all the kernel functions mentioned above involve higher-order statistical information about the random process. Therefore the correntropy and centered correntropy partially characterize the higher-order statistics of random processes. The correntropy and centered correntropy satisfy the following properties.


Therefore the correntropy is positive definite. Similarly, for the centered correntropy,

    0 ≤ E[ || Σ_{i=1}^n α_i (Φ(x_{t_i}) − E[Φ(x_{t_i})]) ||² ]
      = E[ ⟨ Σ_{i=1}^n α_i (Φ(x_{t_i}) − E[Φ(x_{t_i})]), Σ_{j=1}^n α_j (Φ(x_{t_j}) − E[Φ(x_{t_j})]) ⟩ ]
      = Σ_{i=1}^n Σ_{j=1}^n α_i α_j E[ ⟨ Φ(x_{t_i}) − E[Φ(x_{t_i})], Φ(x_{t_j}) − E[Φ(x_{t_j})] ⟩ ]
      = Σ_{i=1}^n Σ_{j=1}^n α_i α_j U(t_i, t_j).    (3-13)

Hence the centered correntropy is positive semi-definite.

The positive definiteness properties of the correntropy and centered correntropy are the most fundamental properties of these two functions: all the following properties can be deduced from positive definiteness and, more importantly, it guarantees that the correntropy and centered correntropy uniquely determine two reproducing kernel Hilbert spaces. This positive definiteness property opens to these two generalized functions a wide range of potential applications in statistical signal processing.


Since κ(x_t, x_t) > 0 by the positive definiteness of the kernel function, accordingly V(t, t) > 0. For U(t, t), let α_1 = 1 and n = 1 in Eq. (3-13), and the result follows.

Let n = 2 in Eq. (3-12) and Eq. (3-13); the two expressions reduce to

    α_1² V(t, t) + 2 α_1 α_2 V(t, s) + α_2² V(s, s) ≥ 0,    (3-14)
    α_1² U(t, t) + 2 α_1 α_2 U(t, s) + α_2² U(s, s) ≥ 0.    (3-15)

We can substitute

    α_1² = V(s, s) / (2 √(V(t, t) V(s, s))),   α_2² = V(t, t) / (2 √(V(t, t) V(s, s)))

(and the analogous expressions with U in place of V) into Eq. (3-14) and Eq. (3-15) respectively, for both signs of α_1 α_2, to obtain the bounds |V(t, s)| ≤ √(V(t, t) V(s, s)) and |U(t, s)| ≤ √(U(t, t) U(s, s)).

A random process {x_t : t ∈ T} is said to be strictly stationary if the finite-dimensional joint probability density function is shift-invariant for each n and each choice of t_1, ..., t_n in T,

    f_{x_{t_1}, ..., x_{t_n}} = f_{x_{t_1+τ}, ..., x_{t_n+τ}}    (3-16)

for every shift τ.


and for all t and τ in T. We denote the mean E[Φ(x_t)], V(t+τ, t) and U(t+τ, t) by μ_Φ, V(τ) and U(τ) respectively whenever Eq. (3-17)-Eq. (3-19) hold for all t and τ. It may happen that the functions μ_Φ, V(τ) and U(τ) are shift-invariant in the sense of Eq. (3-17)-Eq. (3-19), yet the random process {x_t : t ∈ T} is not strictly stationary. Since those equations represent properties of the random process that are of interest in their own right, we can define two forms of stationarity that are, in general, much weaker than the strict stationarity defined in Eq. (3-16), but much stronger than conventional wide-sense stationarity and covariance stationarity, since all the even-order moments in Eq. (3-11) must be time-shift invariant in order to obtain a univariate correntropy function.

A process is said to be correntropy-sense stationary if Eq. (3-17) and Eq. (3-18) hold for all t and τ; the process is said to be centered-correntropy stationary if Eq. (3-19) holds for all t and τ.

Suppose t = s + τ in Eq. (3-6); then the centered correntropy becomes a function whose right-hand side does not depend on s for a correntropy-sense stationary process. Consequently, U(s+τ, s) does not depend on s. Therefore we have the following important fact:


for all τ.

The expectations, or ensemble averages, of a random process are averages across the process. In practice, we do not have infinite realizations of random processes. Accordingly, we may use time averages to approximate the ensemble averages. For this approach to be rigorous, we have to show that time averages converge to the corresponding ensemble averages in some statistical sense. Consider a correntropy-sense stationary discrete-time random process {x_n : n ∈ Z⁺}. Though the mean of the nonlinearly transformed random process {Φ(x_n) : n ∈ Z⁺} is not required in applications, it is of interest to investigate the relationship between the ensemble average and the time average of the mean. In this regard, denote the mean of the process {Φ(x_n) : n ∈ Z⁺}, E[Φ(x_n)], by μ_Φ and define the time average of the mean as

μ̂_Φ(N) = (1/N) Σ_{n=1}^{N} Φ(x_n),   (3-24)

where N is the number of available samples used in the estimation. The estimator of Eq. (3-24) is an unbiased estimator of the ensemble average of the process since E[μ̂_Φ(N)] = μ_Φ.

Furthermore, the process is said to be mean-norm ergodic in the mean-norm-square error sense if the mean-norm-square value of the error between the ensemble average and the


time average satisfies

lim_{N→∞} E‖μ̂_Φ(N) − μ_Φ‖² = 0.   (3-26)

Substituting Eq. (3-24) into Eq. (3-26), we can rewrite the condition in terms of U(n−k), the centered correntropy at time lag n−k of the correntropy-sense stationary process (Eq. (3-27)). Letting m = n − k, the double summation in Eq. (3-27) can be simplified (Eq. (3-28)). Hence, the necessary and sufficient condition for the process {x_n : n ∈ Z⁺} to be mean-norm ergodic in the mean-norm-square-error sense is that

lim_{N→∞} (1/N) Σ_{m=−(N−1)}^{N−1} (1 − |m|/N) U(m) = 0.   (3-29)

The time average μ̂_Φ(N) of the process converges to the ensemble average in the mean-norm-square-error sense if the process {x_n : n ∈ Z⁺} is asymptotically correntropy-sense uncorrelated in the sense of Eq. (3-29).
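The lag estimators introduced next are plain time averages and are easy to sketch in Python. This is a minimal sketch, not the dissertation's code: the function names are mine, and a Gaussian kernel of unit width is assumed, as in the experiments that follow.

```python
import math
import random

def gauss_k(a, b, sigma=1.0):
    # Gaussian kernel with the 1/(sqrt(2*pi)*sigma) normalization
    return math.exp(-(a - b) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

def correntropy(x, lag, sigma=1.0):
    # Time-average estimate of V(l): mean kernel value over pairs l samples apart
    n = len(x)
    return sum(gauss_k(x[i], x[i - lag], sigma) for i in range(lag, n)) / (n - lag)

def information_potential(x, sigma=1.0):
    # (1/N^2) sum over all sample pairs: the information potential estimate
    n = len(x)
    return sum(gauss_k(a, b, sigma) for a in x for b in x) / n ** 2

def centered_correntropy(x, lag, sigma=1.0):
    # U(l) = V(l) minus the information potential
    return correntropy(x, lag, sigma) - information_potential(x, sigma)

random.seed(0)
x = [random.gauss(0.0, 1.0) for _ in range(500)]  # i.i.d. Gaussian data
u1 = centered_correntropy(x, 1)
print(round(u1, 4))  # close to zero for i.i.d. data, as the text predicts
```

For the i.i.d. case the lag-one centered correntropy is a small fluctuation around zero, while V(0) exceeds V(l) at any non-zero lag.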


The correntropy and centered correntropy at lag l can be estimated from N samples as

V̂(l) = (1/(N − l)) Σ_{n=l+1}^{N} κ(x_n, x_{n−l})   (3-30)

and

Û(l) = (1/(N − l)) Σ_{n=l+1}^{N} κ(x_n, x_{n−l}) − (1/N²) Σ_{n=1}^{N} Σ_{m=1}^{N} κ(x_n, x_m),   (3-31)

where the second term, (1/N²) Σ_n Σ_m κ(x_n, x_m), is the estimate of the information potential.

In what follows, we conduct several simulations to demonstrate some of the properties of correntropy and centered correntropy. First, we would like to illustrate the relation between the correntropy and centered-correntropy functions. Three sets of 100,000 i.i.d. data samples, Gaussian, exponential, and Gamma distributed respectively, were generated. The Gaussian source is set to zero mean and unit variance. The parameter of the exponential distribution is set to 20. The parameters of the Gamma distribution are set to 0.2 and 1 respectively. We then pass these data through an IIR filter with the transfer function of Eq. (3-32). The i.i.d. signal and the filtered signal are both normalized to zero mean and unit variance. A Gaussian kernel with unit kernel width is used to estimate the correntropy and centered correntropy. In Fig. 3-1, we plot the correntropy [65] and centered correntropy functions for the original and filtered signals respectively. As proved in Eq. (3-10), under the pair-wise independence assumption, the value of the correntropy at non-zero lags is the information potential, and the centered correntropy at non-zero lags reduces to zero since it is defined as the difference between the correntropy and the information potential. The left plot clearly illustrates the point. The estimated information potentials are 0.22803, 0.26098 and 0.31452 for the Gaussian, exponential and Gamma distributed signals respectively. Likewise,


Figure 3-1. Correntropy and centered correntropy for i.i.d. and filtered signals versus the time lag.

the right plot shows the correntropy and centered correntropy for the filtered signals. We will only plot the correntropy function in the following simulations, since the difference between correntropy and centered correntropy is only the information potential, which is a constant depending on the kernel and the signal (Eq. (1-31)).

Our second simulation demonstrates the effectiveness of correntropy and centered correntropy in partially characterizing the high-order statistics of random processes. We compare the conventional autocorrelation and correntropy functions for two data sets. The first data set is 100,000 i.i.d. samples of a Gaussian distribution with zero mean and unit variance. The second data set is generated by the ARCH (autoregressive conditional heteroskedasticity) model, which is used in econometrics to predict asset return volatility [77]. An ARCH time series is uncorrelated (in second-order statistics) but not i.i.d. The time series model is


Figure 3-2. Autocorrelation and correntropy for i.i.d. and ARCH series versus the time lag.

defined as

x_t = e_t √(α₀ + Σ_{i=1}^{r} α_i x_{t−i}²),   (3-33)

where {e_t}_{t=0}^{∞} is a white noise stochastic process and α₀ > 0, α_i ≥ 0 ∀ i = 1, ..., r. We choose {e_t}_{t=0}^{∞} to be i.i.d. Gaussian with zero mean and unit variance, r = 1, α₀ = 0.2 and α₁ = 1. A Gaussian kernel with unit kernel width is used to estimate the correntropy. In Fig. 3-2, the autocorrelation functions for the i.i.d. Gaussian data and the uncorrelated but dependent ARCH time series are given in the top two plots. As expected, the conventional autocorrelation functions for an i.i.d. signal and an uncorrelated signal are the same, since the autocorrelation function only specifies second-order statistics. The bottom two plots show the correntropy functions for the same i.i.d. and uncorrelated signals respectively. As has been pointed out previously, the correntropy


The third simulation investigates the effectiveness of correntropy in capturing the PDF and nonlinearity information of random processes. We generate three sets of 100,000 i.i.d. data samples, of Gaussian, exponential, and Gamma distributions respectively, to test correntropy's ability to characterize the PDF information of random processes. The Gaussian source is set to zero mean and unit variance. The parameter of the exponential distribution is set to 20. The parameters of the Gamma distribution are set to 0.2 and 1 respectively. The data sets are then passed through the IIR filter of Eq. (3-32). A Gaussian kernel with unit kernel width is used to estimate the correntropy. In Fig. 3-3(a-d), we plot the conventional autocorrelation function and the correntropy function for the original and filtered signals respectively. The conventional autocorrelation function only represents the time structure of random processes and contains no information about the PDF of the random variables. Consequently, the autocorrelation functions for random processes with different PDFs are the same. This is demonstrated in plots (a) and (c) for i.i.d. data of different distributions and for the filtered signals. However,


Figure 3-3. Autocorrelation and correntropy for i.i.d. and linearly filtered signals and the Lorenz dynamic system versus the time lag.
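The ARCH comparison of the previous simulation is easy to reproduce. The following is a sketch, with function names of my own choosing and the plain sample autocorrelation standing in for the full figure: the ARCH(1) series of Eq. (3-33) is (nearly) uncorrelated at the level of the samples, yet its squares are clearly correlated, so the dependence lives entirely in higher-order statistics.

```python
import math
import random

def arch1(n, a0=0.2, a1=1.0, seed=1):
    # ARCH(1) series: x_t = e_t * sqrt(a0 + a1 * x_{t-1}^2), e_t ~ N(0, 1)  (Eq. 3-33)
    rng = random.Random(seed)
    x, prev = [], 0.0
    for _ in range(n):
        prev = rng.gauss(0.0, 1.0) * math.sqrt(a0 + a1 * prev * prev)
        x.append(prev)
    return x

def autocorr(x, lag):
    # Plain sample autocorrelation at one lag
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x) / n
    cov = sum((x[i] - m) * (x[i - lag] - m) for i in range(lag, n)) / (n - lag)
    return cov / var

x = arch1(20000)
ac = autocorr(x, 1)                        # near zero: the series is uncorrelated
ac_sq = autocorr([v * v for v in x], 1)    # positive: the squares are dependent
print(round(ac, 3), round(ac_sq, 3))
```

This is exactly why the autocorrelation plots of Fig. 3-2 cannot distinguish the ARCH series from i.i.d. data while correntropy can.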


(Eq. (1-31)), which are certainly different for different PDFs. Correntropy also contains the time structure of signals. Comparing plots (c) and (d), we can see that the shape of the correntropy is very similar to that of the autocorrelation function. Moreover, the correntropy functions are different for the three different sources. The reason that the separation of the correntropy functions for the three signals is not as obvious as in the i.i.d. case (plot (b)) is that the distributions of the three filtered signals become more similar. In plots (e) and (f), we demonstrate the difference between the conventional correlation function and correntropy in capturing the nonlinearity intrinsic to systems. The Lorenz dynamic system is used here for illustration. The system function of the Lorenz time series is given by

ẋ = σ(y − x),  ẏ = −y − xz + Rx,  ż = xy − bz,

where R = 28, σ = 10 and b = 8/3. This set of parameters makes the Lorenz dynamic system exhibit chaotic behavior. 10,000 samples of the Lorenz dynamic system are generated by solving the equations with the 4th-order Runge-Kutta method with integration step 0.01. Afterwards, the data are normalized to zero mean and unit variance. In Fig. 3-3(e-f), we plot the conventional autocorrelation function and correntropy for the three components of the dynamic system. Observe that the correntropy functions for the three state variables follow the same trend. They peak and drop around the same time lags. The dynamic equations show that x is nonlinearly coupled into y and z. The periodic similarities in one state variable affect the other states. This is clearly demonstrated in the correntropy


In the previous simulations, we used the Gaussian kernel with the same kernel width. In fact, the kernel function and kernel width play a crucial role in the correntropy and centered correntropy functions. Obviously, different kernel widths result in different estimates of the information potential. Moreover, the kernel width controls the ability of correntropy to capture the nonlinearity embedded in systems. From the RKHS perspective, different kernel functions determine different reproducing kernel Hilbert spaces, and the kernel width defines the norm and inner product of the RKHS. We will explore this issue further in the next chapter. Here we first present an experiment on different kernel widths for the Gaussian kernel function. The same data sets of i.i.d. signals and the Lorenz time series from the previous example are used here. In Fig. 3-4, we plot the correntropy functions for the i.i.d. and Lorenz time series using a Gaussian kernel with kernel widths ranging from 0.01 to 15. It can be seen from the top two plots that correntropy loses the ability to differentiate time series of different PDFs and to detect the nonlinearity embedded in the Lorenz time series when the kernel width is too small (σ = 0.01 and 0.1 in this experiment). If the kernel width is too big (σ = 5 and 15 in this experiment), the correntropy functions approach the conventional autocorrelation. The shapes of the correntropy in the bottom two plots are very similar to those of Fig. 3-3(a) and (e). This can also be verified by the Taylor series expansion of the correntropy using the Gaussian kernel in Eq. (3-11): if the kernel width is too large, the values of the high-order terms decay rapidly, and therefore correntropy approaches the conventional autocorrelation. The middle two plots give the correntropy functions for an appropriate kernel width. It can be seen that correntropy can nicely separate the i.i.d. signals of different PDFs and detect the embedded nonlinear coupling in the Lorenz time series.
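The Taylor-series argument can be checked numerically: for a large kernel width, the Gaussian-kernel correntropy is matched almost exactly by its two-term expansion, which is an affine function of the mean-square difference (i.e., of the autocorrelation), while for a unit width the higher-order terms matter. A sketch, with an AR(1) test series and function names of my own:

```python
import math
import random

def gauss_k(a, b, s):
    return math.exp(-(a - b) ** 2 / (2 * s * s)) / (math.sqrt(2 * math.pi) * s)

random.seed(3)
# AR(1) series so that neighboring samples are dependent
x, prev = [], 0.0
for _ in range(2000):
    prev = 0.9 * prev + random.gauss(0.0, 1.0)
    x.append(prev)

def correntropy(x, lag, s):
    n = len(x)
    return sum(gauss_k(x[i], x[i - lag], s) for i in range(lag, n)) / (n - lag)

def quad_approx(x, lag, s):
    # First two Taylor terms of the Gaussian kernel (cf. Eq. 3-11):
    # kernel ~ (1 - d^2 / (2 s^2)) / (sqrt(2 pi) s), with d = x_t - x_s
    n = len(x)
    msd = sum((x[i] - x[i - lag]) ** 2 for i in range(lag, n)) / (n - lag)
    return (1 - msd / (2 * s * s)) / (math.sqrt(2 * math.pi) * s)

for s in (1.0, 15.0):
    v, q = correntropy(x, 5, s), quad_approx(x, 5, s)
    print(s, round(v, 5), round(q, 5))
```

At σ = 15 the two numbers agree to a fraction of a percent, so correntropy carries essentially only second-order information; at σ = 1 they differ substantially, which is where the higher-order moments enter.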


Figure 3-4. Correntropy for the i.i.d. signal and the Lorenz time series with different kernel widths.

To further quantify the relationship between the kernel width and the ability of correntropy to separate i.i.d. signals of different PDFs, we define a separation coefficient S(σ) as a function of the kernel width σ among the signals of different PDFs by

S(σ) = (Σ_{i≠j} |Î(i) − Î(j)|) / |V(0) − min_i Î(i)|,   (3-35)

where L is the number of i.i.d. signals, Î(i) is the estimate of the information potential of the i-th signal (Eq. (1-31)), and V(0) is the correntropy at zero lag. The correntropy values at zero lag for the different i.i.d. signals are the same since the Gaussian kernel is isotropic. The separation coefficient basically computes the sum of normalized distances between the different i.i.d. signals. Fig. 3-5 plots the separation coefficient as a function of the kernel width. It can easily be seen that the best separation is achieved at σ = 1.


Figure 3-5. Separation coefficient versus kernel width for the Gaussian kernel.

Next, we investigate the effect of different kernel functions used in the correntropy function. The same data sets are used again, and we compare several different kernel functions: Gaussian, polynomial, sigmoid, wave, exponential and inverse multiquadrics. For a complete description, the six kernel function expressions are given here:

1. Gaussian kernel: κ(x_t, x_s) = (1/(√(2π)σ)) exp(−(x_t − x_s)²/(2σ²))   (3-36)
2. Polynomial kernel: κ(x_t, x_s) = (1 + x_t x_s)^d   (3-37)
3. Sigmoid kernel: κ(x_t, x_s) = tanh(α₀ x_t x_s + α₁)   (3-38)
4. Wave kernel: κ(x_t, x_s) = (θ/(x_t − x_s)) sin((x_t − x_s)/θ)   (3-39)
5. Exponential kernel: κ(x_t, x_s) = exp(−λ|x_t − x_s|)   (3-40)
6. Inverse multiquadrics kernel: κ(x_t, x_s) = 1/√(‖x_t − x_s‖² + c²)   (3-41)

In order to have a fair comparison, we choose a suitable parameter for each kernel function for this specific data set. We select σ = 1 for the Gaussian kernel, d = 2 for the polynomial kernel, α₀ = 10 and α₁ = 0.5 for the sigmoid kernel, θ = 2.5 for the wave kernel,


Figure 3-6. Correntropy for the i.i.d. signal and the Lorenz time series with different kernel functions.
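For reference, the six kernels of Eq. (3-36)-Eq. (3-41) are straightforward to code. This is a sketch: the parameter names and defaults follow the values quoted in the text, and the sinc-type form of the wave kernel is my reading of the incompletely printed expression.

```python
import math

def gaussian(a, b, sigma=1.0):
    return math.exp(-(a - b) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

def polynomial(a, b, d=2):
    return (1 + a * b) ** d

def sigmoid(a, b, alpha0=10.0, alpha1=0.5):
    return math.tanh(alpha0 * a * b + alpha1)

def wave(a, b, theta=2.5):
    # sinc-type wave kernel; the limit at a == b is 1
    d = a - b
    return 1.0 if d == 0 else theta / d * math.sin(d / theta)

def exponential(a, b, lam=1.0):
    return math.exp(-lam * abs(a - b))

def inverse_multiquadric(a, b, c=1.0):
    return 1.0 / math.sqrt((a - b) ** 2 + c ** 2)

# Each kernel attains its extreme value on the diagonal a == b
print(gaussian(0.3, 0.3), wave(1.2, 1.2), exponential(-1.0, -1.0))
```

Swapping one of these functions into the correntropy estimator is all that is needed to reproduce the kernel comparison of Fig. 3-6.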


The results are shown in Fig. 3-6. Notice that all the kernel functions except the polynomial kernel can separate the different PDF data. The sigmoid kernel fails to detect the nonlinear coupling embedded in the Lorenz time series. The wave, exponential and inverse multiquadrics kernels exhibit reasonable performance compared to the Gaussian kernel.

The correntropy spectral density function of a correntropy-sense stationary random process is defined as

P(ω) = ∫_{−∞}^{∞} U(τ) e^{−jωτ} dτ,   (3-42)

whenever the integral exists. In other words, the correntropy spectral density function is the Fourier transform of the centered correntropy function.

The variable ω in Eq. (3-42) is usually called the radian frequency and is measured in radians per second. The frequency ω/2π, denoted f, is referred to as the "usual" frequency and is measured in hertz (Hz). To convert an expression for the correntropy spectral density as a function of ω into an expression for the same spectral density as a function of f, one needs to replace ω by 2πf. Formally, the correntropy spectral density as


Since P(ω) = F{U(τ)}, we can obviously define the inverse transform of the correntropy spectral density function, U(τ) = F⁻¹{P(ω)}, as

U(τ) = (1/2π) ∫_{−∞}^{∞} P(ω) e^{jωτ} dω = ∫_{−∞}^{∞} P(f) e^{j2πfτ} df.

In particular, when the time lag τ = 0, the correntropy spectral density relation becomes

U(0) = (1/2π) ∫_{−∞}^{∞} P(ω) dω = ∫_{−∞}^{∞} P(f) df.

This equation shows that the difference between the expectation of the norm square and the norm square of the expectation of the nonlinearly transformed random process is the total area under the correntropy spectral density function P(f) of the process. In second-order random process analysis, the power in the random process is the total area under the power spectral density function. By a similar approach, we define the difference between the expectation of the norm square and the norm square of the expectation of the nonlinearly transformed random process, in other words the centered correntropy at zero lag U(0), as the correntropy power. From a geometrical perspective, U(0) can also be considered a "generalized" variance, or "generalized" power, of the nonlinearly transformed random process in the feature space. Physically, P(ω) represents the density of correntropy power at frequency ω radians/sec, and P(f) plays the same role for frequency in Hz. The correntropy power in any frequency band is obtained by integrating the correntropy


spectral density over the band:

(1/2π) ∫_{ω₁}^{ω₂} P(ω) dω = ∫_{f₁}^{f₂} P(f) df.   (3-47)

The correntropy spectral density function for a correntropy-sense stationary random process satisfies the following properties.

Because U(−τ) = U(τ) for a correntropy-sense stationary random process, we have P(−ω) = P(ω).

[P(ω)]* denotes the complex conjugate of P(ω). To prove this property, notice that

[P(ω)]* = ∫_{−∞}^{∞} U(τ) [e^{−jωτ}]* dτ = ∫_{−∞}^{∞} U(τ) e^{+jωτ} dτ = ∫_{−∞}^{∞} U(τ) e^{−j(−ω)τ} dτ = P(−ω) = P(ω).

This property shows that the correntropy spectral density function is real.
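These properties are easy to verify in a discrete-time setting, where the correntropy spectral density becomes a sum over lags, P(ω) = Σ_m U(m) e^{−jωm}. The sketch below (function names mine, information potential subsampled to keep the cost down) checks that P is real and even, and that averaging P over one frequency period recovers the correntropy power U(0):

```python
import math
import random

def gauss_k(a, b, s=1.0):
    return math.exp(-(a - b) ** 2 / (2 * s * s)) / (math.sqrt(2 * math.pi) * s)

random.seed(7)
x, prev = [], 0.0
for _ in range(3000):
    prev = 0.8 * prev + random.gauss(0.0, 1.0)
    x.append(prev)

n = len(x)
sub = x[::15]  # subsample for the O(N^2) information potential estimate
ip = sum(gauss_k(a, b) for a in sub for b in sub) / len(sub) ** 2

def U(lag):
    # centered correntropy at a lag: correntropy minus information potential
    return sum(gauss_k(x[i], x[i - lag]) for i in range(lag, n)) / (n - lag) - ip

M = 40
u = [U(m) for m in range(M + 1)]

def csd(w):
    # P(w) = sum_m U(m) e^{-jwm}; the symmetry U(-m) = U(m) makes it real-valued
    return u[0] + 2 * sum(u[m] * math.cos(w * m) for m in range(1, M + 1))

# Discrete analogue of the inverse relation: the average of P over one period is U(0)
K = 512
avg = sum(csd(2 * math.pi * k / K) for k in range(K)) / K
print(round(avg, 6), round(u[0], 6))
```

The two printed numbers coincide, which is the discrete counterpart of U(0) being the total area under the correntropy spectral density.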


To establish nonnegativity, note that

0 ≤ E[ |∫_0^T (Φ(x_t) − E[Φ(x_t)]) e^{−jωt} dt|² ]
  = E[ ⟨ ∫_0^T (Φ(x_t) − E[Φ(x_t)]) e^{−jωt} dt, ∫_0^T (Φ(x_s) − E[Φ(x_s)]) e^{−jωs} ds ⟩ ]
  = E[ ∫_0^T ∫_0^T ⟨ Φ(x_t) − E[Φ(x_t)], Φ(x_s) − E[Φ(x_s)] ⟩ e^{−jω(t−s)} dt ds ]
  = ∫_0^T ∫_0^T U(t−s) e^{−jω(t−s)} dt ds   (3-50)

for all T ≥ 0. We make a change-of-variable transformation in Eq. (3-50) by letting t − s = τ. The transformation maps the original region of integration {(t,s): 0 ≤ t ≤ T, 0 ≤ s ≤ T} onto the region {(t,τ): 0 ≤ t ≤ T, t − T ≤ τ ≤ t}. In other words, the original square region is changed into a parallelogram, which can be decomposed into the two regions G₁ = {(t,τ): τ ≤ t ≤ T, 0 ≤ τ ≤ T} and G₂ = {(t,τ): 0 ≤ t ≤ T + τ, −T ≤ τ ≤ 0}. Then Eq. (3-50) can be rewritten as the sum of an integral over the region G₁ and an integral over the region G₂, which combine to give

(1/T) ∫_0^T ∫_0^T U(t−s) e^{−jω(t−s)} dt ds = ∫_{−T}^{T} ((T − |τ|)/T) U(τ) e^{−jωτ} dτ ≥ 0.   (3-51)


Define

Û_T(τ) = (1 − |τ|/T) U(τ) for |τ| ≤ T, and 0 otherwise.   (3-52)

Then for any fixed τ, we have

lim_{T→∞} Û_T(τ) = U(τ).   (3-53)

Therefore, the limit of Eq. (3-51) becomes

lim_{T→∞} ∫_{−T}^{T} ((T − |τ|)/T) U(τ) e^{−jωτ} dτ = ∫_{−∞}^{∞} U(τ) e^{−jωτ} dτ = P(ω).   (3-54)

The interchange of the order of the limit and the integration is validated by the dominated convergence theorem [21]. Eq. (3-51) and Eq. (3-54) establish that P(ω) is the limit of a sequence of nonnegative numbers for any ω, and the limit of a sequence of nonnegative numbers must be nonnegative. Consequently, P(ω) ≥ 0.


From the definitions of the correntropy and centered correntropy functions, we know that there are two reproducing kernel Hilbert spaces embedded in the correntropy and centered correntropy functions. The first RKHS is induced by the symmetric positive definite data-independent kernel functions, like the Gaussian kernel, the sigmoid kernel and the others in Eq. (3-36)-Eq. (3-41). These kernel functions nonlinearly transform the original random processes into a high-dimensional RKHS, which is called the feature space. Linear operations on the transformed random processes become nonlinear operations on the original random processes when mapped back by the inverse nonlinear transformation. The second RKHS is induced directly by the symmetric positive definite data-dependent correntropy and centered correntropy functions. Unlike the conventional autocorrelation and covariance functions, which operate directly on the original random processes and determine a data-dependent RKHS, correntropy and centered correntropy operate on the expectation of nonlinearly transformed random processes, hence inducing reproducing kernel Hilbert spaces that are very different from the ones determined by data-independent kernel functions and by conventional autocorrelation and covariance functions. The inclusion of data-independent kernel functions inside the expectation operation makes the correntropy and centered correntropy functions different from kernel functions and the expectation operation alone. This also changes the dynamics of the RKHS induced by the correntropy and centered correntropy and makes it unique.

In this chapter, we investigate the correntropy and centered correntropy functions from a geometrical perspective by analyzing the reproducing kernel Hilbert spaces induced by them directly and the one induced by kernel functions alone. These two approaches to analyzing the correntropy and centered correntropy will lead us to a better understanding of the geometrical structure of the RKHS induced by kernel functions, correntropy and centered correntropy respectively.


By Mercer's theorem (Eq. (1-14)), any symmetric positive definite function κ(x_t, x_s) can be rewritten as an inner product between two vectors in the feature space, i.e.,

κ(x_t, x_s) = ⟨Φ(x_t), Φ(x_s)⟩,   (4-1)

where Φ: x_t ↦ [√λ₁ φ₁(x_t), √λ₂ φ₂(x_t), ...] (Eq. (1-2)). The construction of the RKHS based upon the eigenvalues λ_k and eigenfunctions φ_k follows the same approach as Eq. (1-14) to Eq. (1-20), substituting the kernel function κ(x_t, x_s) for the autocorrelation function R(t,s). Notice that if isotropic kernel functions are used, specifically the Gaussian kernel (Eq. (3-36)), wave kernel (Eq. (3-39)), exponential kernel (Eq. (3-40)) or inverse multiquadrics kernel (Eq. (3-41)), then the norm of the transformed random process, ‖Φ(x_t)‖², is a constant equal to κ(0). This shows that the transformed random process {Φ(x_t): t ∈ T} resides on a sphere. The isotropic kernel functions transform the random processes such


The data-independent kernel functions include the Gaussian kernel, the sigmoid kernel and the others in Eq. (3-36)-Eq. (3-41). These data-independent kernel functions are embedded inside the correntropy and centered correntropy functions. An analysis of correntropy and centered correntropy from the kernel perspective offers new insight.

Using Eq. (4-1), we can rewrite the correntropy in terms of the nonlinear mapping Φ as

V(t,s) = E[⟨Φ(x_t), Φ(x_s)⟩].   (4-2)

Likewise, the centered correntropy can also be expressed as

U(t,s) = E[⟨Φ(x_t) − E[Φ(x_t)], Φ(x_s) − E[Φ(x_s)]⟩].   (4-3)

It can be seen that the correntropy function is a "conventional" correlation function for the transformed random process in the high-dimensional RKHS, while the centered correntropy is nothing but the correntropy of the zero-mean (centered) random process {Φ(x_t) − E[Φ(x_t)]: t ∈ T}. This way of defining the generalized correlation and covariance functions follows the same spirit as the standard correlation and covariance functions. The high-order statistics of any random process in the input space turn out to be "second-order" statistics in the feature space. Kernel-based learning algorithms employ the nonlinear mapping Φ to treat nonlinear algorithms in a linear way if the problems can be expressed in terms of inner products [49]. This suggests that we can deal with nonlinear systems efficiently and elegantly in a linear fashion when applying the correntropy and centered correntropy functions.


Therefore, the centered correntropy can be viewed as the expectation of the norm square of the zero-mean (centered) transformed random process, or the generalized covariance of the transformed random process. Similarly, property 4 can be cast in terms of the nonlinearly transformed random process. The inequality can be re-expressed as

|E[⟨Φ(x_t) − E[Φ(x_t)], Φ(x_s) − E[Φ(x_s)]⟩]| ≤ √(E[‖Φ(x_t) − E[Φ(x_t)]‖²] E[‖Φ(x_s) − E[Φ(x_s)]‖²]).

From chapter 3.1, we know that if the random process is pair-wise independent, then the correntropy at non-zero lags becomes the information potential and the centered correntropy at non-zero lags reduces to zero. Under this condition, the nonlinearly transformed random process becomes "uncorrelated" in the feature space. The condition of pair-wise independence in the original input space implies pair-wise uncorrelatedness in the feature space, but not vice versa. This offers correntropy and centered correntropy a wide range of potential applications in machine learning and signal processing.

The kernel function perspective on correntropy and centered correntropy suggests that we can treat the nested kernel function inside the expectation operator as an implicit nonlinear transformation that maps the original random process into the RKHS induced


by the kernel functions [78]. Any vector in that RKHS can be represented by a linear combination of the


In this section, we take the polynomial space approach to explicitly construct an RKHS associated with one of the most popular kernels, the Gaussian kernel. By transforming a generalized Fock space [45] with a positive operator, we build an RKHS associated with the Gaussian kernel. The functionals are explicitly given by polynomials. Unlike the Mercer's theorem approach, these functionals are not necessarily orthonormal. More importantly, we can gain control over the dimension of the RKHS by means of selecting the polynomial degree. The simulation suggests that the effective dimension of the RKHS with a Gaussian kernel is relatively small.

The definitions of the functionals and inner product of a general Hilbert space are given first. Then, a kernel function is imposed on this general Hilbert space to make it a reproducing kernel Hilbert space. This approach of building an RKHS with polynomials can also be found in [45], where it is called a generalized Fock space. Our contribution is that it is an RKHS associated with the Gaussian kernel that we explicitly construct by introducing new definitions of the functionals and kernel function.

First we construct an inner product space H by defining functionals and an inner product. The evaluation of a functional f at any given x is given by

f(x) = e^{−x²/(2σ₀²)} Σ_{k=0}^{n} f_k x^k,   (4-5)

where σ₀ is a constant and the (n+1)-tuple (f₀, f₁, ..., f_n) are the coefficients which uniquely characterize the polynomial f. The inner product between any two functionals f and h can be specified in the form

⟨f, h⟩ = Σ_{k=0}^{n} λ_k f_k h_k,   (4-6)


In order to make H a reproducing kernel Hilbert space, we impose a kernel function on H of the following form:

κ(x, y) = e^{−(x²+y²)/(2σ₀²)} Σ_{k=0}^{n} (1/λ_k) x^k y^k.   (4-7)

It can be verified that the Hilbert space H, equipped with such a κ, is a reproducing kernel Hilbert space and the kernel function κ(x, ·) is a reproducing kernel because of the following two properties of κ(x, y):

1. For fixed x, κ(x, ·) belongs to H, since

κ(x, y) = e^{−y²/(2σ₀²)} Σ_{k=0}^{n} ((x^k/λ_k) e^{−x²/(2σ₀²)}) y^k,

i.e., the constants (x^k/λ_k) e^{−x²/(2σ₀²)}, k = 0, 1, ..., n, become the coefficients f_k, k = 0, 1, ..., n, in the definition of f, and thus κ(x, ·) ∈ H.

2. Given any f ∈ H, the inner product between the reproducing kernel and f yields the function itself:

⟨κ(x, ·), f⟩ = Σ_{k=0}^{n} λ_k ((x^k/λ_k) e^{−x²/(2σ₀²)}) f_k = e^{−x²/(2σ₀²)} Σ_{k=0}^{n} f_k x^k = f(x).

This is the so-called reproducing property.

The RKHS constructed above has the freedom to choose the degree of the functionals, i.e., the dimension n of the kernel space H. The most interesting case is that we might


let n → ∞, provided the norm of the functional remains finite, i.e., Σ_{k=0}^{∞} λ_k f_k² < ∞ (Eq. (4-11)). Then the functionals, inner product and reproducing kernel in H are defined by Eq. (4-5), Eq. (4-6) and Eq. (4-7) with n = ∞.

In the special situation of weights

λ_k = k! σ₀^{2k},   (4-12)

where σ₀ is a fixed positive constant, the reproducing kernel of Eq. (4-7) in the infinite-dimensional RKHS becomes

κ(x, y) = e^{−(x²+y²)/(2σ₀²)} Σ_{k=0}^{∞} (1/k!) (xy/σ₀²)^k = e^{−(x²+y²)/(2σ₀²)} e^{xy/σ₀²} = e^{−(x−y)²/(2σ₀²)},

which is the Gaussian kernel used widely in machine learning, function approximation, density estimation, support vector machines, etc. The constant σ₀ turns out to be the kernel width. It controls the norm length of the functionals in the RKHS, i.e., the spread of the nonlinearly mapped data samples in the feature space.

Comparing this method with Mercer's theorem, we notice that there are three major differences between them.

1. First, we have given an explicit expression for the functionals in the RKHS associated with the Gaussian kernel in terms of polynomials, while Mercer's theorem never does that. We can get the exact evaluations of those functionals at each point.


2. Second, the functionals we constructed above are not necessarily an orthonormal basis, while Mercer's theorem is realized by orthonormalizing the RKHS. This perspective also provides a general alternative for building an RKHS from known functionals besides Mercer's theorem.

3. Third, we can control the dimension of the RKHS by means of selecting the polynomial degree n in Eq. (4-7).

The method by which we constructed the RKHS enables us to have an explicit expression for the functionals in the RKHS associated with the Gaussian kernel. Hence we can know exactly the nonlinear mapping used in a kernel-based learning algorithm and so operate directly with the transformed data to extend the algorithms beyond inner products. Furthermore, as we have control of the dimension of the RKHS H, this might help with the computational complexity issue in kernel-based learning algorithms through approximation of the Gaussian kernel by polynomials, as indicated in Eq. (4-7).

A previous way of explicitly constructing an RKHS by means of polynomial functions was employed by de Figueiredo [45], based on the Fock space. The idea of the Fock space was first proposed by Fock in [79] for use in quantum mechanics, where quantum states are described by passing from a single object to collections of objects. More recently, de Figueiredo introduced an "arbitrarily weighted Fock space", called a generalized Fock space in [45]. The space is equipped with an appropriately weighted inner product, thus forming an RKHS. The proposed RKHS has been used in linear/nonlinear system and signal analysis, where a number of problems involve approximation and inversion of nonlinear functions/functionals and nonlinear operators. In the univariate case, a generalized Fock space F_n is an RKHS where the functionals f,


take the polynomial form f(x) = Σ_{k=0}^{n} f_k x^k, where the real (n+1)-tuple (f₀, f₁, ..., f_n) completely characterizes f, and λ = (λ₀, λ₁, ..., λ_n) is a set of positive weighting constants chosen a priori according to the problem under consideration. It can be shown that this generalized Fock space is an RKHS. Similar to the RKHS H we constructed above, the generalized Fock space F_n has the freedom of choosing the space dimension. The interesting case is when the space becomes infinite dimensional while the norm of the functionals satisfies the same condition as Eq. (4-11) as n → ∞. Then the kernel function K_F(u, v), defined by Eq. (4-16), becomes an exponential kernel as n → ∞:

K_F(u, v) = e^{uv/σ₀²},

given the same weight constraint as Eq. (4-12).

It can be noticed that there are similarities and differences between the RKHS H and the generalized Fock space F_n. The definitions of the inner product inside the two spaces are the same, while the functionals and kernel functions are different. The relationship of the two spaces H and F_n is established by a theorem in [2], which states that if H₁ and H₂ are two reproducing kernel Hilbert spaces with the same definition of inner product, then there exists a positive operator with bound not greater than 1 that transforms H₁ into H₂ ⊂ H₁. Comparing the definitions of the functionals for the two spaces, we can see that e^{−x²/(2σ₀²)} plays the role of a positive operator with bound not greater than 1, thus transforming the generalized Fock space F_n into H such that H ⊂ F.


Figure 4-1. Square error between a Gaussian kernel and the constructed kernel of Eq. (4-7) versus the order of the polynomials.

We present a simple simulation here to show the effectiveness of the approximation of the polynomial functionals to the Gaussian kernel. In the simulation, we calculate the square error between the Gaussian kernel and the proposed kernel of Eq. (4-7) of order n. The kernel width is chosen to be 1, and the range of the calculated data is from -5 to 5. Fig. 4-1 plots the square error versus the order n. The line becomes flat at order 11 due to the computation precision of MATLAB. The figure suggests that the effective dimension of the RKHS with a Gaussian kernel is relatively small. With only order 11, we can effectively approximate the Gaussian kernel by polynomials. This also indicates that for practical purposes it is sufficient to work with a much smaller dimensional space instead of an infinite dimensional space for a Gaussian kernel in kernel-based learning algorithms.
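The truncated kernel of Eq. (4-7) with the weights of Eq. (4-12) is simple enough to evaluate directly. The following sketch repeats the experiment in miniature on the smaller range [-2, 2] (my choice, to keep the series short); the function names are mine. The maximum square error over a grid drops rapidly with the polynomial order, in line with the figure's conclusion.

```python
import math

def truncated_gaussian_kernel(x, y, n, s0=1.0):
    # e^{-(x^2+y^2)/(2 s0^2)} * sum_{k<=n} (x y / s0^2)^k / k!   (Eq. 4-7 with Eq. 4-12)
    series = sum((x * y / s0 ** 2) ** k / math.factorial(k) for k in range(n + 1))
    return math.exp(-(x * x + y * y) / (2 * s0 ** 2)) * series

def max_sq_error(n, lo=-2.0, hi=2.0, steps=41):
    # Maximum squared deviation from the exact Gaussian kernel over a grid
    grid = [lo + (hi - lo) * i / (steps - 1) for i in range(steps)]
    return max((truncated_gaussian_kernel(x, y, n)
                - math.exp(-(x - y) ** 2 / 2)) ** 2 for x in grid for y in grid)

errors = [max_sq_error(n) for n in range(0, 21, 4)]
print(["%.1e" % e for e in errors])
```

A handful of polynomial terms already pushes the square error to numerical-noise level on this range, which is the "small effective dimension" observation made above.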


(section 1.2). All linear statistical signal processing using autocorrelation and covariance functions can also be treated as functional operations in an RKHS. But the RKHS induced by the correntropy and centered correntropy functions is different from the one induced by the conventional autocorrelation and covariance functions. The RKHS induced by the conventional autocorrelation and covariance functions is based on second-order random processes, hence suitable for Gaussian processes, while the RKHS induced by the correntropy and centered correntropy functions includes the high-order statistical information of random processes, hence goes beyond Gaussian processes. It is conjectured that the RKHS induced by the correntropy and centered correntropy functions might encompass a larger class of processes. From the other perspective, the RKHS induced by correntropy and centered correntropy is also different from the one induced by kernel functions. One obvious characteristic is that the RKHS induced by correntropy and centered correntropy is data dependent, while the RKHS induced by a kernel function is data independent. The inclusion of the kernel function inside the expectation operator makes the correntropy and centered correntropy functions unique. It departs from the conventional RKHS induced by the autocorrelation and covariance functions used in statistical signal processing and also from the data-independent kernel functions used in kernel machine learning. In this section, we analyze the geometry of the RKHS induced by the correntropy and centered correntropy functions.


(4-1). The reason for working with the zero-mean nonlinearly transformed random process shall become clear shortly. Let us define a linear manifold spanned by the random process {Φ(x_t) − E[Φ(x_t)]: t ∈ T}, denoted L(Φ(x_t) − E[Φ(x_t)]: t ∈ T), to be the set of all random variables that can be written in the form

ξ = Σ_{j=1}^{n} c_j (Φ(x_{t_j}) − E[Φ(x_{t_j})])   (4-18)

for some integer n, real constants c₁, ..., c_n, and t₁, ..., t_n ∈ T. In other words, L(Φ(x_t) − E[Φ(x_t)]: t ∈ T) contains all finite linear combinations of the random variables {Φ(x_t) − E[Φ(x_t)]: t ∈ T}. Close the set of Eq. (4-18) topologically according to convergence in the mean using the norm

‖ξ‖ = (E[⟨ξ, ξ⟩])^{1/2},   (4-19)

and denote the set of all linear combinations of random variables and its limit points by L₂(Φ(x_t) − E[Φ(x_t)]: t ∈ T). In other words, L₂(Φ(x_t) − E[Φ(x_t)]: t ∈ T) consists of all random variables in the linear manifold L(Φ(x_t) − E[Φ(x_t)]: t ∈ T), together with all random variables ξ such that there exists a sequence of random variables ξ_n in L(Φ(x_t) − E[Φ(x_t)]: t ∈ T) converging to ξ, in the sense that ‖ξ_n − ξ‖ → 0 as n → ∞.

It is well known from the theory of quadratically integrable functions that L₂(Φ(x_t) − E[Φ(x_t)]: t ∈ T) forms a Hilbert space if the inner product is the one corresponding to the norm of Eq. (4-19), namely


⟨ξ₁, ξ₂⟩_{L₂} = E[⟨ξ₁, ξ₂⟩].   (4-21)

Consider two vectors in the subspace S, ξ₁ = Φ(x₁) − E[Φ(x₁)] and ξ₂ = Φ(x₂) − E[Φ(x₂)]. Then, by the definition of cross-correntropy (Eq. (3-8)), we have

⟨ξ₁, ξ₂⟩_{L₂} = E[⟨Φ(x₁) − E[Φ(x₁)], Φ(x₂) − E[Φ(x₂)]⟩] = U(x₁, x₂).

In other words, the inner product in L₂(Φ(x_t) − E[Φ(x_t)]: t ∈ T) is the centered correntropy function of the original random variables. This definition will bridge the space L₂(Φ(x_t) − E[Φ(x_t)]: t ∈ T) and the RKHS induced by the centered correntropy function.

Now let us define a "generalized" deviation of any random variable in L₂(Φ(x_t) − E[Φ(x_t)]: t ∈ T) as the square root of its centered correntropy with respect to itself, √⟨ξ, ξ⟩_{L₂} (Eq. (4-22)).


It then makes sense to talk about the angle θ between two vectors ξ₁ and ξ₂ in S (see Fig. 4-2). By plane trigonometry, we have

‖ξ₁ − ξ₂‖² = ‖ξ₁‖² + ‖ξ₂‖² − 2‖ξ₁‖‖ξ₂‖ cos θ,

which is the same as

⟨ξ₁, ξ₂⟩_{L₂} = ‖ξ₁‖‖ξ₂‖ cos θ.

Consequently, we have

cos θ = ⟨ξ₁, ξ₂⟩_{L₂} / (‖ξ₁‖‖ξ₂‖) = U(x₁, x₂) / √(U(x₁, x₁) U(x₂, x₂)).

The quantity θ measures the angle between two "vectors" in the subspace S of L₂(Φ(x_t) − E[Φ(x_t)]: t ∈ T). In particular, if the two vectors are orthogonal, then θ is 90 degrees. If the two vectors point in the same direction, then θ is 0. Now we are in a position to define the "generalized" correlation coefficient for any two random variables:

η = U(x, y) / √(U(x, x) U(y, y)),

where U(x, y) is the centered cross-correntropy function of x and y, and U(x, x) and U(y, y) are the centered correntropy functions of the variables with themselves.

This definition of the correntropy coefficient is in the same spirit as the traditional correlation coefficient, where the standard covariance function has been replaced by the "generalized"


cross-correntropy (Eq. (3-8)), and the standard deviation function has been replaced by the "generalized" deviation function, the square root of the centered correntropy of a variable with respect to itself (Eq. (4-22)). However, the difference is striking. The conventional correlation coefficient only measures the second-order similarity between the two original random variables, hence it only requires second-order statistics to attain 0, i.e., for the two random variables to be uncorrelated. The correntropy coefficient, by contrast, measures high-order similarity between the two original random variables, or "second-order similarity" between the two nonlinearly transformed random variables in L₂(Φ(x_t) − E[Φ(x_t)]: t ∈ T); therefore it needs high-order statistics to attain 0. In fact, by property 4 in chapter 2.1, the value of the correntropy coefficient is between -1 and 1. If the two original random variables are independent, i.e., the two nonlinearly transformed random variables under Φ are "uncorrelated" in the RKHS induced by the kernel function, then the correntropy coefficient reduces to 0, which means the two vectors in L₂ are orthogonal. If the two random variables are the same, the correntropy coefficient is 1, which means the two vectors point in the same direction. This also explains why we use centered correntropy functions, not correntropy functions, in the definition of the correntropy coefficient. The correntropy coefficient would never attain 0 if correntropy functions were used, because V(x, y) is always greater than 0. The pre-processing step of making random processes zero mean is vital to much conventional statistical signal processing. In our context, since we do not have explicit knowledge of the mean of the nonlinearly transformed random processes, we have to rely on centered correntropy functions.

Substituting the definition of centered correntropy (Eq. (3-8)) and approximating the ensemble average by the sample mean, we can obtain an estimate of the correntropy coefficient


η̂ = Û(x, y) / √(Û(x, x) Û(y, y)), where Û(x, y) = (1/N) Σ_{i=1}^{N} κ(x_i, y_i) − (1/N²) Σ_{i=1}^{N} Σ_{j=1}^{N} κ(x_i, y_j)

and the double sum, (1/N²) Σ_i Σ_j κ(x_i, y_j), estimates the cross-information potential [55]. The correntropy coefficient is related to the Cauchy-Schwartz independence measure (Eq. (1-34)) because both of them use information theoretic learning concepts to measure the independence of two random variables. Moreover, the Cauchy-Schwartz independence measure is an approximation to the Kullback-Leibler divergence [55]; this also suggests that the correntropy coefficient is related to mutual information. These connections need further investigation in future work.

The previous section defined the inner product in L₂(Φ(x_t) − E[Φ(x_t)]: t ∈ T) (Eq. (4-21)), which turns out to be the centered correntropy function of the original random processes. In this section, we proceed to prove that there exists a congruence map between L₂(Φ(x_t) − E[Φ(x_t)]: t ∈ T) and the RKHS induced by the centered correntropy function. Therefore it is sufficient to study the Hilbert space L₂(Φ(x_t) − E[Φ(x_t)]: t ∈ T) in this perspective.

First, because the centered correntropy function U(t, s) is non-negative definite, it uniquely determines a reproducing kernel Hilbert space by the Moore-Aronszajn theorem. We can apply Mercer's theorem to obtain an eigen-decomposition of the centered correntropy function as


U(t, s) = Σ_{k=0}^{∞} λ_k φ_k(t) φ_k(s).   (4-29)

Then we can define a function g on T of the form

g(t) = Σ_{k=0}^{∞} λ_k a_k φ_k(t),   (4-30)

where the sequence {a_k, k = 0, 1, 2, ...} satisfies the condition

Σ_{k=0}^{∞} λ_k a_k² < ∞.   (4-31)

Let H(U) be the set composed of functions g(·) which can be represented in the form of Eq. (4-30) in terms of the eigenfunctions φ_k and eigenvalues λ_k of the centered correntropy function U(t, s). Furthermore, we may define an inner product of two functions in H(U) as

⟨g₁, g₂⟩ = Σ_{k=0}^{∞} λ_k a_k b_k,

where g₁ and g₂ are of the form of Eq. (4-30) and a_k, b_k satisfy the property of Eq. (4-31). One might as well show that H(U) is complete. Let g_n(t) = Σ_{k=0}^{∞} λ_k a_k^{(n)} φ_k(t) be a Cauchy sequence in H(U) such that each sequence {a_k^{(n)}, n = 1, 2, ...} converges to a limit point a_k. Hence the Cauchy sequence converges to g(t) = Σ_{k=0}^{∞} λ_k a_k φ_k(t), which belongs to H(U). Therefore H(U) is a Hilbert space. H(U) has two important properties which make it a reproducing kernel Hilbert space. First, let U(t, ·) be the function on T with value at s in T equal to U(t, s); then, by the Mercer's theorem eigen-expansion of the covariance function (Eq. (4-29)), U(t, ·) is of the form of Eq. (4-30) with a_k = φ_k(t). Therefore, U(t, ·) ∈ H(U) for each t in T. Second, for every function g(·) ∈ H(U) of the form given by Eq. (4-30) and every t in T,


Thus H(U) is a representation of the random process {xₜ : t ∈ T} with centered correntropy function U(t, s). One may define a congruence G from H(U) onto L²(Φ(xₜ) − E[Φ(xₜ)] : t ∈ T) such that G maps U(t, ·) to Φ(xₜ) − E[Φ(xₜ)]. The congruence G can be explicitly represented as

G(g) = Σₖ aₖ ξₖ,

where {ξₖ} is a set of orthogonal random variables belonging to L²(Ω, F, P) and g is any element of H(U) in the form of Eq. (4-30).

This way of proceeding from L²(Φ(xₜ) − E[Φ(xₜ)] : t ∈ T) to the RKHS induced by the centered correntropy function is in the same spirit as the method of building the RKHS induced by the traditional covariance function from L²(xₜ : t ∈ T) [20, 25]. However, we now deal with the nonlinearly transformed random processes, obtained via the mapping Φ, with inner product given by Eq. (4-21).


This interpretation of the three spaces involved in the centered correntropy function offers insight into the geometric relationships among the different spaces. Certainly, there are other ways to tackle the task, and further investigations are required. First, we applied Mercer's theorem to the non-negative definite centered correntropy function to come up with an eigen-decomposition in the previous section. It would further our understanding if we could gain explicit knowledge of the basis. So it is desirable to also apply what we have proposed in Section 3.1.2, the explicit construction of the RKHS induced by the Gaussian kernel function, to build those bases for the centered correntropy function explicitly. This will be part of my future work. Second, what we have presented involves high dimensionality, be it in the RKHS induced by the kernel function or in the RKHS induced by the centered correntropy function. The high dimensionality has its own attractiveness but also poses challenges for data manipulation. So another research direction is to propose a single-dimensional nonlinear transformation function f such that the correlation function of the nonlinearly transformed signals f(xₜ) equals the centered correntropy function of the original random process.


It is conjectured that this f function will implicitly embed the data distribution. As has been pointed out in the theory of random processes, any given non-negative definite function determines a Gaussian process such that the correlation function of that Gaussian process equals the given non-negative definite function [74]. This also suggests that the f function might be related to the Gaussianization technique.


One of the most important tasks in statistical analysis is to measure the degree of dependence between random variables. There are numerous measures of dependence in the literature, mostly based on the distance between the joint probability distribution of the data and the product of the marginal probability distributions [80]. Some of the most commonly used measures of bivariate dependence include, but are not limited to, the correlation coefficient, Kendall's τ, Spearman's ρ, the maximal correlation coefficient, the monotone correlation coefficient, and others. The correlation-based dependence measures only characterize linear relationships between two random variables. [81] proposed a nonlinear canonical correlation using symmetric nondecreasing functions to quantify nonlinear dependence without any assumption on the distribution of the random variables.

A dependence measure is also closely related to a measure of the amount of information that one random variable contains about the other, because the more dependent the random variables are, the more information about one ought to be given by the other, and vice versa. Several measures based on information theory have been proposed in the literature. For example, mutual information is one well-known measure, which can also be interpreted as the Kullback-Leibler divergence between the joint probability density function and the product of the marginal densities [82, 83]. It turns out that the Kullback-Leibler divergence, or mutual information, is a special case of the φ-divergence measure when a specific convex function φ is chosen [84]. On the other hand, [58] generalized Shannon's mutual information as a special norm in the probability density function space. Silvey's generalized coefficient [85] uses the Radon-Nikodym derivative of the joint distribution of the random variables with respect to the product of their marginal distributions. Other dependence measures based on information theory include the relative entropy measure proposed by [86] and others. However, Joe's relative entropy dependence measure, and almost all other entropies, fail to be "metric" since they violate


the metric axioms [87]. In order to overcome this limitation, [87] proposed a metric measure of dependence to specify the degree of dependence in time series data.

Recently there has been considerable work on using functions in reproducing kernel Hilbert space (RKHS) to quantify dependence. [52] introduced kernel dependence functionals in the context of independent component analysis (ICA). The kernelized canonical correlation is a kernelization of canonical correlation analysis with regularization on the functionals. It can also be explained as a particular form of nonlinear canonical correlation where the nonlinear functions have been chosen as functional evaluations in an RKHS [88], based on the representer theorem [49]. The kernel generalized variance is an extension of the kernelized canonical correlation obtained by estimating the spectral norm of the correlation operator between reproducing kernel Hilbert spaces over the entire spectrum [52]. Instead of using the correlation operator in RKHS, [89] proposed the kernel constrained covariance and kernel mutual information based on the covariance operator. The kernel constrained covariance estimates the spectral norm of the covariance operator between RKHSs. It has been proved that the kernel constrained covariance is zero if and only if the original random variables are independent, provided that the kernel is universal. The kernel constrained covariance can also be viewed as a maximal correlation coefficient where the function space has been confined to the reproducing kernel Hilbert space. The kernel mutual information incorporates the entire spectrum of the covariance operator between RKHSs and becomes an upper bound on the mutual information estimated with a Parzen window [89]. These dependence measures based on kernel methods have enjoyed much success in independent component analysis [52, 88, 89], in the quantification of generalized synchronization between chaotic signals [90], and in other machine learning application areas.

With the abundance of dependence measures in the literature, one would naturally ask which measure is best at characterizing the dependence information of the data. Renyi proposed a set of postulates which should be fulfilled by a suitable dependence measure [91]. The axiomatic framework by Renyi has drawn much attention ever since.


It was shown in [91] that, out of the various dependence measures, only the maximal correlation coefficient satisfied all these properties. The kernel constrained covariance omits the deterministic dependence and upper bound conditions of the axiomatic framework in order to efficiently estimate the quantity and apply it to independent component analysis [89].

In this chapter, we define a new parametric correntropy function as a novel dependence measure:

U_{a,b}(x, y) = E[κ(ax + b − y)] − Eₓ E_y[κ(ax + b − y)].

A limitation of the centered correntropy of Eq. (3-8) is that when two random variables are independent (f_{X,Y}(x, y) = f_X(x) f_Y(y)) the quantity becomes zero, but not vice versa. In order to make it a suitable dependence measure, we modify the definitions of the centered correntropy and correntropy coefficient so that they satisfy the condition of attaining zero if and only if the two random variables are independent.

We present a lemma here stating that the parametric centered correntropy is zero for all a and b if and only if x and y are independent. First, we employ the Fourier transform


where Λ is a positive bounded measure. For simplicity we will also assume that the probability measure has moments of all orders, so that Λ is infinitely differentiable. If the tail of Λ decays sufficiently fast, the argument can be extended to the entire line.

Define dQ(x, y) = d[F_{X,Y}(x, y) − F_X(x) F_Y(y)], where F_{X,Y}(x, y) is the joint cumulative distribution function (CDF), and F_X(x) and F_Y(y) are the marginal CDFs of x and y respectively. We need to show that dQ(x, y) = 0. U_{a,b}(x, y) = 0 for all a and b means that ∫∫ e^{iλ(ax−y)} dQ(x, y) e^{iλb} dΛ(λ) = 0 for all a and b, and in particular for all b. Hence ∫∫ e^{iλ(ax−y)} dQ(x, y) = 0 for Λ-almost all λ. Since the support of Λ is ℝ, this holds for all λ and all a. This is easily rewritten as ∫∫ e^{i(λx+μy)} dQ(x, y) = 0 for all λ and μ. We conclude that dQ = 0, i.e., x and y are independent.
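The lemma lends itself to a quick numerical check. Below is a minimal sketch (not from the dissertation; the Gaussian kernel choice and function names are illustrative): it estimates the parametric centered correntropy U_{a,b}(x, y) from samples, which should hover near zero for every (a, b) when x and y are independent and be clearly nonzero for some (a, b) when they are dependent.

```python
import numpy as np

def gaussian_kernel(u, sigma=1.0):
    """Gaussian kernel evaluated elementwise on u."""
    return np.exp(-u**2 / (2 * sigma**2))

def parametric_centered_correntropy(x, y, a, b, sigma=1.0):
    """Sample estimate of U_{a,b}(x, y) = E[k(ax + b - y)] - E_x E_y[k(ax + b - y)].

    The first term averages the kernel over paired samples (the joint
    distribution); the second averages over all sample pairs (the product
    of the marginals).
    """
    z = a * np.asarray(x, float) + b
    y = np.asarray(y, float)
    joint = gaussian_kernel(z - y, sigma).mean()                    # E[k(ax + b - y)]
    cross = gaussian_kernel(z[:, None] - y[None, :], sigma).mean()  # product of marginals
    return joint - cross
```

For independent x and y the estimate fluctuates around zero at the O(1/√N) level for every choice of (a, b), in line with the lemma; for dependent pairs some (a, b) yields a clearly nonzero value.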


where κ(0) is the value of the kernel function when its argument is zero. The absolute value of the parametric correntropy coefficient is bounded by 1. By Eq. (4-1), we have the decomposition of Eq. (5-4), where


the first and second terms of Eq. (5-5) are each bounded, and likewise for the remaining term. Combining Eq. (5-4), Eq. (5-6) and Eq. (5-7) together, we obtain the stated bound |ρ_{a,b}(x, y)| ≤ 1.


1. Q(x, y) is well defined for any pair of random variables x and y,
2. 0 ≤ Q(x, y) ≤ 1,
3. Q(x, y) = 0 if and only if x and y are independent,
4. Q(x, y) = 1 if there is strict functional dependence between x and y.

Renyi showed that one measure satisfying these constraints is the maximal correlation coefficient

S(x, y) = sup_{f,g} ρ(f(x), g(y)),  (5-8)

where f(x) and g(y) must have finite positive variance, and f, g are Borel measurable.

We propose a dependence measure based on the parametric correntropy coefficient which satisfies all the desirable properties above:

Γ(x, y) = sup_{a,b} |ρ_{a,b}(x, y)|,  (5-9)

where ρ_{a,b}(x, y) is the parametric correntropy coefficient of Eq. (5-3).

The correntropy dependence measure is a suitable statistical measure which fulfills all the fundamental conditions listed by Renyi. First, Eq. (5-9) is well defined. Since |ρ_{a,b}(x, y)| lies between 0 and 1, as proved above, the supremum of |ρ_{a,b}(x, y)| is also bounded by 0 and 1. The independence condition is also immediate: if x and y are independent, |ρ_{a,b}(x, y)| = 0 for all a and b, and therefore sup_{a,b} |ρ_{a,b}(x, y)| = 0. Since

0 ≤ |ρ_{a,b}(x, y)| ≤ sup_{a,b} |ρ_{a,b}(x, y)|,


the equalities in Eq. (5-4) are achieved if and only if the conditions of Eq. (5-10) and Eq. (5-11) hold. In fact, the two conditions Eq. (5-10) and Eq. (5-11) are equivalent. From the condition of Eq. (5-10), we obtain a relation involving two constants Bₖ and Aₖ. Defining f and g accordingly, we get x = f⁻¹(g(y)), since the functional φₖ is continuous and invertible, being an eigenfunction of the positive definite kernel. Therefore we conclude the proof.

Compared to the maximal correlation of Eq. (5-8), the correntropy dependence measure searches the parameter space for a supremum instead of searching the entire function space. Compared to the kernel constrained covariance presented in [89], our measure does not employ a regularization technique, because our optimization space is the low-dimensional parameter space of the scalars a and b. The kernel constrained covariance, however, operates in a high-dimensional (possibly infinite-dimensional) functional space, where regularization is mandatory in order to make the solution achievable. By selecting different values of the parameters a and b, the correntropy dependence measure offers a different scale view of the dependence between the random variables x and y. The idea is similar to the concept of wavelets, which provide multi-resolution information about signal frequency.
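As a rough numerical sketch of Γ(x, y) = sup_{a,b} |ρ_{a,b}(x, y)|, the supremum can be approximated by a coarse grid search over the two parameters. The grid, the Gaussian kernel, and the normalization of ρ_{a,b} by the centered correntropies of ax + b and of y are illustrative assumptions, not the dissertation's exact implementation:

```python
import numpy as np

def gaussian_kernel(u, sigma=1.0):
    return np.exp(-u**2 / (2 * sigma**2))

def centered_correntropy(a, b, sigma=1.0):
    """Sample estimate of U(a, b) = E[k(a - b)] - E_a E_b[k(a - b)]."""
    a = np.asarray(a, float)
    b = np.asarray(b, float)
    joint = gaussian_kernel(a - b, sigma).mean()
    cross = gaussian_kernel(a[:, None] - b[None, :], sigma).mean()
    return joint - cross

def correntropy_dependence(x, y, a_grid, b_grid, sigma=1.0):
    """Approximate Gamma(x, y) = sup_{a,b} |rho_{a,b}(x, y)| on a finite grid.

    rho_{a,b} normalizes U_{a,b}(x, y) by the 'generalized' deviations of
    a*x + b and of y (an assumed form; the additive shift b cancels in
    the self-term U(z, z)).
    """
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    u_yy = centered_correntropy(y, y, sigma)
    best = 0.0
    for a in a_grid:
        u_zz = centered_correntropy(a * x, a * x, sigma)  # b cancels here
        if u_zz <= 0 or u_yy <= 0:
            continue  # degenerate case (e.g. a = 0): skip this grid point
        for b in b_grid:
            u_ab = centered_correntropy(a * x + b, y, sigma)
            best = max(best, abs(u_ab) / np.sqrt(u_zz * u_yy))
    return best
```

When the grid happens to contain the exact affine relation between x and y (here a = 2, b = 1 for y = 2x + 1), the measure reaches 1, while independent variables give a small value; a finer grid, or a proper optimizer over (a, b), tightens the approximation of the supremum.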


Principal component analysis (also known as the Karhunen-Loeve transformation in communication theory) is a powerful tool for feature extraction and data dimensionality reduction in statistical pattern recognition and signal processing. It can be easily performed by eigen-decomposition of the standard covariance matrix or by adaptive algorithms that estimate principal components [92]. Principal component analysis, or PCA, is really an affine transformation of the coordinate system such that the rate of decrease of the data variance is maximized. The projections of the data onto the new coordinate system are called principal components. These projections represent the data optimally in a least-squares sense. In feature extraction, PCA transforms the data in such a way that a small number of principal components can represent the data while retaining most of its intrinsic variance. These are sometimes called factors or latent variables of the data [93].

While PCA yields a smaller-dimensional linear subspace that best represents the full data according to a minimum-square-error criterion, it might be a poor representation if the data structure is non-Gaussian. Hence nonlinear component analysis may be needed. There have been numerous attempts to define nonlinear component analysis in the last decades. Nonlinear PCA is generally seen as a nonlinear generalization of standard PCA [92, 93]. The principal component is generalized from straight lines to curves. Principal curves were proposed by Hastie [94] to define local directions that pass through the high-density parts of the data set. The principal curves are found through an iterative algorithm that minimizes the conditional expectation of projections on the curves. Kramer presented a nonlinear PCA based on auto-associative neural networks: the auto-associative network performs an identity mapping from the input data to the output by minimizing the square error [95]. Recently, Scholkopf et al. applied kernel methodology to obtain a nonlinear form of PCA [50]. This so-called Kernel PCA is one of the kernel-based


methods introduced in Section 1.3. Kernel PCA nonlinearly maps the original input data into an infinite-dimensional reproducing kernel Hilbert space via the data-independent kernel and solves the eigen-decomposition of the Gram matrix of the input data in a high-dimensional feature space. The Gram matrix has a dimension given by the number of samples N. The data projections onto the principal directions of the Gram matrix, i.e., the inner products in feature space, are carried out by means of kernel functions in the input space. While the utilization of Mercer kernels provides a tractable way to compute principal components in the high-dimensional feature space, there are still problems of interpretation and of computation of the large-dimensional Gram matrix. Indeed, the number of eigenfunctions of the Gram matrix depends on the number of data samples N, not on the size of the data space L. Moreover, computing Gram matrices for millions of samples in a small, let us say two-dimensional, space becomes wasteful.

In this chapter, we propose a new nonlinear PCA technique based on the correntropy function, called correntropy PCA. The correntropy function quantifies the similarity between the L different components of the L-dimensional input data vector (or the time structure in a time series) using the statistical data distribution. Correntropy also utilizes a kernel methodology, but in a different form: by applying the kernel to pairs of data vector components, a random vector (or stochastic process) is nonlinearly transformed into a high-dimensional function space where the similarity between the components of the transformed random variables (or stochastic process) can be measured by the conventional covariance function. The eigen-decomposition of the covariance of the transformed data yields the principal directions of the nonlinearly transformed data. These linear principal directions in feature space correspond to nonlinear principal directions in the input space. These projections can be efficiently computed by utilizing the correntropy function. That means, if one has one million samples in a two-dimensional space, it is only necessary to solve a two-dimensional eigenvector problem on a 2×2 matrix.


Given a set of zero-mean vector observations xⱼ, j = 1, ..., N, xⱼ ∈ ℝᴸ, Σⱼ xⱼ = 0, correntropy PCA seeks a direction in the feature space such that the variance of the data projected onto this direction is maximized. Unlike the kernel method, which transforms the data into a feature space sample by sample, correntropy PCA maps the data component-wise into a feature space, i.e., the RKHS associated with the centered correntropy function. By Eq. (4-29), the feature map is

Φ: ℝᴸ → F, x ↦ [Φ(x₁), Φ(x₂), ..., Φ(x_L)].  (6-1)

Then the covariance matrix of the transformed data in the feature space is given by Eq. (6-2). We now have to find the eigenvalues λ ≥ 0 and non-zero eigenvectors v satisfying


the eigenvalue equation in the feature space. We may instead consider the equivalent set of equations, Eq. (6-3). Combining Eq. (6-2) and Eq. (6-3), we get Eq. (6-4). By Eq. (6-1), we can define an L×L centered correntropy matrix U by Eq. (6-5). If the index in Eq. (6-4) runs from 1 to L, and we write the result in matrix form, we get Eq. (6-6), where α collects the expansion coefficients. The solutions of Eq. (6-6) are equivalent to the solutions of the following eigenvalue problem,

U α = Lλ α,  (6-7)

for nonzero eigenvalues.
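This construction can be sketched as follows, assuming a Gaussian kernel and sample averages in place of expectations (the names are illustrative): build the L×L centered correntropy matrix U and eigen-decompose it. The factor L in Eq. (6-7) only rescales the eigenvalues and does not change the eigenvectors.

```python
import numpy as np

def gaussian_kernel(u, sigma=1.0):
    return np.exp(-u**2 / (2 * sigma**2))

def centered_correntropy_matrix(x, sigma=1.0):
    """L x L matrix U: U[i, j] is the sample centered correntropy between
    components i and j of the data matrix x (shape (N, L))."""
    n, L = x.shape
    u = np.zeros((L, L))
    for i in range(L):
        for j in range(L):
            joint = gaussian_kernel(x[:, i] - x[:, j], sigma).mean()
            cross = gaussian_kernel(x[:, i][:, None] - x[:, j][None, :], sigma).mean()
            u[i, j] = joint - cross
    return u

def correntropy_pca(x, sigma=1.0):
    """Eigen-decompose U; the eigenvector columns, sorted by decreasing
    eigenvalue, give the principal directions in the feature space
    (eigenvalues here differ from lambda in Eq. (6-7) by the factor L)."""
    u = centered_correntropy_matrix(x, sigma)
    evals, evecs = np.linalg.eigh(u)
    order = np.argsort(evals)[::-1]
    return evals[order], evecs[:, order]
```

Note the contrast with kernel PCA: the matrix here is L×L regardless of N, so millions of samples in a two-dimensional space still lead to a 2×2 eigenproblem.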


This is the so-called nonlinear principal component.

In summary, we need to take the following steps to compute the nonlinear principal components: (1) compute the centered correntropy matrix U by Eq. (6-5), where the expected value is substituted by the sample average; (2) compute its eigenvectors and eigenvalues through SVD; and (3) compute the projections of a test point onto the eigenvectors by Eq. (6-8).

We present two experimental results to show the effectiveness of correntropy PCA in finding nonlinear principal directions. The first experiment compares standard linear PCA and correntropy PCA in extracting features from a two-dimensional mixture of Gaussian-distributed data. Specifically, the probability density function is a mixture of two Gaussian modes. In Fig. 6-1, we plot the contours of the data and of the largest-eigenvalue directions produced by linear PCA and correntropy PCA respectively. 200 samples are used, and


Figure 6-1. Linear PCA versus correntropy PCA for a two-dimensional mixture of Gaussian distributed data.

the kernel size is chosen to be 2. The result confirms that linear PCA only provides the linear directions that maximize the variance. But since the underlying data is a mixture of two Gaussian modes, linear PCA fails to consider the directions of the individual modes and only averages over them. On the contrary, correntropy PCA is more attuned to the underlying structure of the data in the input space: it generates a nonlinear principal direction that locally follows the directions of the individual modes, so that the variance of the principal component projected onto this nonlinear curve is maximized. The experiment shows that correntropy PCA is superior in describing the underlying structure of the data when compared to the linear PCA method.

Our second experiment compares kernel PCA, proposed by Scholkopf et al. in [50], with correntropy PCA. We use the same experimental setup as in [50] in order to illustrate the performance of correntropy PCA. The data is two-dimensional with three clusters (Gaussian distributions with standard deviation 0.1). The number of data samples and the kernel size are chosen to be 90 and 0.1 respectively. Since the number of principal


Figure 6-2. Kernel PCA versus correntropy PCA for a two-dimensional mixture of Gaussian distributed data.

components for kernel PCA depends on the number of data samples, there are many eigen-directions in feature space that are difficult to identify in the input space, so we plot the two principal components with the largest eigenvalues from kernel PCA. The number of principal components for correntropy PCA, however, is equal to the dimension of the input space, so there is no ambiguity. Fig. 6-2 shows that both kernel PCA and correntropy PCA can extract the nonlinear principal components from the data. While kernel PCA tends to find the local structure of a given data set, as the contours circling around the different data clusters suggest, correntropy PCA seeks the underlying global structure of the data set. The contour in the bottom-left plot shows that correntropy PCA can be tuned to the data structure by changing the kernel size in the Gaussian kernel, and thereby locate the principal direction.

In experiments comparing the performance of correntropy PCA with standard linear PCA and kernel PCA for nonlinear feature extraction, we found two advantages of our method. First, correntropy PCA can be more attuned to the underlying data structure.


In this chapter we applied correntropy concepts to principal component analysis. The approach is based on finding the eigenvectors of the centered correntropy matrix, whose dimension equals that of the data, unlike the Gram matrix used by other kernel methods, whose dimension depends on the number of data samples. Yet the final principal curves we obtain using this method adequately cover the data in the direction of maximum spread (variance in the feature space). Since we are dealing with a finite-dimensional matrix, we get a number of principal curves equal to the dimension of the data space, and at the same time the computational complexity is drastically reduced compared to the kernel methods. In general this approach offers a new way of analyzing data. The study also suggests that the concept of correntropy can be used for de-correlating the data in the feature space (whitening), which can be applied in the context of independent component analysis. Future research will apply correntropy PCA to real data problems and also compare it with other nonlinear principal component analysis methods.


[96]. In general, they can be categorized into three classes: time-domain, frequency-domain, and time-frequency-domain algorithms.

Time-domain PDAs operate directly on the signal's temporal structure. These include, but are not limited to, zero-crossing rate, peak and valley positions, and autocorrelation. The autocorrelation model appears to be one of the most popular PDAs for its simplicity, explanatory power, and physiological plausibility. For a given signal xₙ with N samples, the autocorrelation function R(τ) is defined as

R(τ) = (1/N) Σₙ xₙ xₙ₊τ,  (7-1)

where τ is the delay parameter. For dynamical signals with changing periodicities, a short-time window can be included to compute the periodicities of the signal within the window ending at time t [97]. Cheveigne proposed the squared difference function (SDF) in [98] as

SDF(τ) = (1/N) Σₙ (xₙ − xₙ₊τ)².  (7-2)


[99]. All these PDAs based on the autocorrelation function suffer from at least one unsatisfactory fact: the peak corresponding to the period of a pure tone is rather wide [100, 101]. This imposes a greater challenge for multiple-F0 estimation, since mutual overlap between voices weakens their pitch cues, and the cues further compete with the cues of other voices. The low resolution in pitch estimation results from the fundamental time-frequency uncertainty principle [102]. To overcome this drawback, Brown et al. presented a "narrowed" autocorrelation function to improve the resolution of the autocorrelation function for musical pitch extraction [103]. The "narrowed" autocorrelation function includes terms corresponding to delays at 2τ, 3τ, etc., in addition to the usual term with delay τ:

R_L(τ) = Σ_{k=1}^{L} R(kτ).  (7-3)

However, it requires an increase in the length of the signal and gives less precision in time. It also requires the a priori selection of the number of delay terms L.

Frequency-domain PDAs estimate pitch by using the harmonic structure of the short-time spectrum. Frequency-domain methodologies include component frequency ratios, filter-based methods, cepstrum analysis, and multi-resolution methods. Pitch determination algorithms such as the harmonic sieve [104], harmonic product spectrum [105], sub-harmonic summation [106], and subharmonic-to-harmonic ratio [107] fall into this category. Most frequency-domain pitch determination methods apply pattern matching [108]; others use nonlinear or filtering preprocessing to generate or improve inter-partial spacing and fundamental-component cues. The frequency-domain PDAs have the advantage of efficient implementation with the fast Fourier transform and the theoretical strength of Fourier analysis. One weakness, however, is that they rely on the shape and size of the analysis window; selection and adjustment of the analysis window remain a problem in estimation.
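For reference, the three time-domain measures discussed above, the autocorrelation of Eq. (7-1), the squared difference function of Eq. (7-2), and the "narrowed" autocorrelation of Eq. (7-3), can be sketched as follows (the exact normalizations and edge handling are assumptions):

```python
import numpy as np

def autocorrelation(x, max_lag):
    """Biased autocorrelation, Eq. (7-1) style: R(tau) = (1/N) sum_n x_n x_{n+tau}."""
    n = len(x)
    return np.array([np.dot(x[: n - t], x[t:]) / n for t in range(max_lag)])

def squared_difference(x, max_lag):
    """Squared difference function, Eq. (7-2) style: dips at the period."""
    n = len(x)
    return np.array([np.mean((x[: n - t] - x[t:]) ** 2) for t in range(max_lag)])

def narrowed_autocorrelation(x, max_lag, L=10):
    """'Narrowed' autocorrelation, Eq. (7-3) style: add the autocorrelation
    terms at delays 2*tau, 3*tau, ..., L*tau to the usual term at tau."""
    n = len(x)
    out = np.zeros(max_lag)
    for t in range(max_lag):
        for k in range(1, L + 1):
            if k * t < n:
                out[t] += np.dot(x[: n - k * t], x[k * t:]) / n
    return out
```

On a sinusoid of period 20 samples, all three locate the period: the autocorrelation and narrowed autocorrelation peak at lag 20, and the SDF reaches its minimum there, illustrating why the SDF is used with inhibitory (dip-seeking) rather than excitatory logic.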


[109]. Later, Lyon and Slaney further developed the methodology and called it the correlogram [110, 111]. The correlogram is the first-stage processor in a computational auditory scene analysis (CASA) system [112]. It has also been incorporated into a neural oscillator to segregate double vowels and for multipitch tracking [101, 113]. The strength of the correlogram in pitch estimation is that different frequency channels, corresponding to different signal sources with different pitches, can be separated, which makes it useful in multipitch estimation [101, 114]. Also, individual channel weightings can be adapted to compensate for amplitude mismatches between spectral regions [115].

On the other hand, the autocorrelation and power-spectrum based pitch determination algorithms mentioned above only characterize second-order statistics. In many applications where non-Gaussianity and nonlinearity are present, these second-order statistical methodologies might fail to provide all the information about the signals under study. Higher-order statistics have also been used in pitch determination. Moreno et al. applied higher-order statistics to extract pitch from noisy speech [116], but only diagonal third-order cumulants c(k) were used, for simplicity and computational efficiency, with the pitch obtained from their autocorrelation

R₃(τ) = (1/2N) Σ_{k=−(N−1)}^{N−1} c(k) c(k+τ).  (7-4)
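A sketch of this cumulant-based approach, under the assumption that c(k) is the diagonal third-order cumulant slice of a zero-mean signal (the estimator below and its normalizations are illustrative):

```python
import numpy as np

def diagonal_third_cumulant(x, max_k):
    """Diagonal slice c(k) = (1/N) sum_n x_n^2 x_{n+k} for a zero-mean signal."""
    n = len(x)
    return np.array([np.dot(x[: n - k] ** 2, x[k:]) / n for k in range(max_k)])

def cumulant_autocorrelation(x, max_lag, max_k=None):
    """Eq. (7-4) style: autocorrelation of the diagonal third-order cumulant.

    Peaks of this function over the lag tau indicate candidate pitch periods,
    using third-order rather than second-order statistics of the signal.
    """
    if max_k is None:
        max_k = 2 * max_lag
    c = diagonal_third_cumulant(x, max_k)
    n = len(c)
    return np.array([np.dot(c[: n - t], c[t:]) / (2 * n) for t in range(max_lag)])
```

A signal whose third-order statistics are non-vanishing (e.g. a harmonic signal with an asymmetric waveform) yields a cumulant autocorrelation that peaks at the period, which is the property the PDA exploits.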


The proposed pitch determination method is applied after the acoustical signal is processed by an equivalent rectangular bandwidth (ERB) filter bank in the time domain. The ERB filter bank acts as a cochlear model to transform a one-dimensional acoustical signal into a two-dimensional map of neural firing rate as a function of time and place [111]. The correntropy function for each channel is calculated, and the summation across all the channels provides the pitch information. As a novel "self-similarity" measure, correntropy is able to offer much better resolution than the conventional autocorrelation function in pitch estimation. Moreover, our pitch determination algorithm can segregate double vowels without applying any complex model such as a neural oscillator [101].
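The pipeline just described, gammatone (ERB) filtering followed by a per-channel correntropy function and a summary across channels, can be sketched as follows (the Gaussian kernel choice, the Glasberg-Moore ERB formula, and the normalizations are assumptions, not the dissertation's exact settings):

```python
import numpy as np

def gammatone_impulse_response(fc, fs, order=4, duration=0.05):
    """Gammatone impulse response t^(n-1) exp(-2 pi b t) cos(2 pi fc t),
    with the bandwidth b tied to the ERB of the center frequency fc
    (Glasberg-Moore ERB formula assumed); peak-normalized."""
    t = np.arange(int(duration * fs)) / fs
    erb = 24.7 * (4.37 * fc / 1000.0 + 1.0)   # ERB in Hz
    b = 1.019 * erb                            # common bandwidth scaling
    g = t ** (order - 1) * np.exp(-2 * np.pi * b * t) * np.cos(2 * np.pi * fc * t)
    return g / np.max(np.abs(g))

def correntropy_function(z, max_lag, sigma):
    """Per-channel correntropy: V(tau) = mean of k(z_n - z_{n+tau})."""
    n = len(z)
    return np.array([
        np.mean(np.exp(-(z[: n - t] - z[t:]) ** 2 / (2 * sigma**2)))
        for t in range(max_lag)
    ])

def summary_correntropy_gram(channels, max_lag, sigma):
    """Stack per-channel correntropy functions into an image (rows are
    channels, columns are lags) and sum across channels."""
    gram = np.vstack([correntropy_function(z, max_lag, sigma) for z in channels])
    return gram, gram.sum(axis=0)
```

On a periodic channel output, the correntropy function returns to its zero-lag value exactly at the period, which is what makes the vertical stripes of the correntropy-gram so narrow.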


Figure 7-1. Autocorrelation, narrowed autocorrelation with L = 10, and correntropy functions of a sinusoidal signal.

The correntropy function offers sharper peaks than autocorrelation in pitch determination. To illustrate this, we compare the conventional autocorrelation Eq. (7-1), the narrowed autocorrelation Eq. (7-3), and the correntropy function Eq. (7-5) for a simple sinusoidal signal. Fig. 7-1 plots the three functions against the delay lag in the time domain. All three functions peak at the delay lag corresponding to the period of the sinusoid. However, it is evident that the peaks obtained from the correntropy function are much narrower and sharper than those from the conventional and narrowed autocorrelation functions. In Fig. 7-2 we present the Fourier transform of each function. The ordinary autocorrelation function only exhibits one harmonic, and the narrowed autocorrelation produces 10 harmonics, equal to the number of terms L used in Eq. (7-3). The correntropy function places even more energy at higher harmonics in frequency. The narrowness of the correntropy function in the time domain implies the rich harmonics present in the frequency domain. This is due to the nonlinear exponential function included in the definition of the correntropy function.

It should also be noticed that there is a connection between the correntropy function Eq. (7-5) and the squared difference function Eq. (7-2). The correntropy function also uses an inhibitory neural interaction model, instead of an excitatory one, with a Gaussian kernel function, but it nonlinearly transforms the difference of the signals by the exponential


Figure 7-2. Fourier transform of the autocorrelation, narrowed autocorrelation with L = 10, and correntropy functions of a sinusoidal signal.

function. From another perspective, the correntropy function includes the scaled squared difference function as an individual term, for k = 1, in the summation of Eq. (3-11); but it contains more information through the other higher-order moment terms.

Our pitch determination algorithm first uses cochlear filtering to peripherally process the speech signal. This is achieved by a bank of 64 gammatone filters, which are distributed in frequency according to their bandwidths [117]. The impulse response of a gammatone filter is defined as

g(t) = tⁿ⁻¹ e^{−2πbt} cos(2πf_c t + φ), t ≥ 0,

where f_c is the center frequency and b the bandwidth [118]. This creates a cochleagram, which is a function of time along the horizontal axis and cochlear place, or frequency, along the vertical axis. The cochlea separates a sound into broad frequency channels while still retaining the time structure of the original sound. It has served as a peripheral


Figure 7-3. Correlogram (top) and summary (bottom) for the vowel /a/.

pre-processing stage in the computational auditory scene analysis (CASA) model [112], and has been used extensively in pitch determination [101, 119].

The periodicity analysis is done by computing the correntropy function at the output of each cochlear frequency channel, Eq. (7-6), where i stands for the channel number and zₙ is the cochlear output. The kernel bandwidth is determined using Silverman's rule [120]. The time lag is chosen long enough to include the lowest expected pitch; generally it is set to at least 10 ms throughout the paper. In this way, a picture is formed with the horizontal axis as correntropy lag and the vertical axis as cochlear frequency. We name it the correntropy-gram, which literally means "pictures of correntropy". If a signal is periodic, strong vertical lines at certain correntropy lags appear in the correntropy-gram, indicating times when a large number of cochlear


Figure 7-4. Autocorrelation (top) and summary (bottom) of third-order cumulants for the vowel /a/.

channels are firing synchronously, while horizontal bands signify different amounts of energy across frequency regions. The correntropy-gram is similar to the correlogram in structure but different in content. In order to reduce the dynamic range for display, the correntropy functions in the correntropy-gram should be normalized such that the zero-lag value is one, as given by the formula of Eq. (7-7) [121].


Figure 7-5. Narrowed autocorrelation (top) and summary (bottom) for the vowel /a/.

In order to emphasize pitch-related structure in the correntropy-gram, the correntropy functions are summed across all the channels to form a "pooled" or "summary" correntropy-gram,

S(τ) = Σᵢ Vᵢ(τ).

Compared to the conventional correlogram model [101, 119, 122], our pitch detector is able to locate the same period information as the correlogram, but with much narrower


Figure 7-6. Correntropy-gram (top) and summary (bottom) for the vowel /a/.

peaks. Hence the proposed method enhances the resolution of pitch determination. Furthermore, since the correlogram estimates the likelihood that a pitch exists at a certain time delay, the summary correlogram may generate other "erroneous" peaks besides the one corresponding to the pitch period [111], while the summary correntropy-gram suppresses values that are dissimilar at all other time delays, through the exponential decay of the Gaussian function, and peaks only at the lag corresponding to the pitch period. For mixtures of concurrent sound sources with different fundamental frequencies, the summary correlogram usually fails to detect multiple pitches without further nonlinear post-processing, but the summary correntropy-gram is able to show peaks at the different periods of each source. These characteristics suggest a superiority of the correntropy function over the autocorrelation function in pitch determination.

Moreover, the computational complexity of our method, whether the correntropy function Eq. (7-6) or the correntropy coefficient Eq. (7-7), remains similar to the


Figure 7-7. Correlogram (top) and summary (bottom) for a mixture of vowels /a/ and /u/.

correlogram. Although there are double summations in the correntropy coefficient, the computational complexity can be reduced to O(N log N) using the fast Gauss transform [123]. The "narrowed" autocorrelation function, however, increases the computational complexity by including more delay terms.

We compare the proposed method with the correlogram [119], the third-order cumulants function [116], and the narrowed autocorrelation function [103] in determining pitches for a single speaker and for two combined speakers uttering different vowels. The synthetic vowels are produced by Slaney's Auditory Toolbox [124]. For a fair comparison, we did not apply any post-processing on the correlogram as was used in [119]. The conventional autocorrelation function Eq. (7-1), the autocorrelation of third-order cumulants Eq. (7-4), the narrowed autocorrelation function Eq. (7-3),


Figure 7-8. Third-order cumulants (top) and summary (bottom) for a mixture of vowels /a/ and /u/.

and the correntropy function Eq. (7-6) are presented after the same cochlear model. In a third experiment, the proposed method is tested using Bagshaw's database, which is a benchmark for testing PDAs [125].

Fig. 7-3 to Fig. 7-6 present the pitch determination results for a single synthetic vowel /a/ with fundamental frequency at 100 Hz. The upper plots are the images of the correlation functions, the autocorrelations of third-order cumulants, the narrowed autocorrelations, and the correntropy functions after the same cochlear model, respectively. The bottom figures are the summaries of those four images. The kernel size in the Gaussian kernel has been chosen to be 0.01 (we discuss kernel size selection further in Sec. 8.3), and L = 10 in the narrowed autocorrelation function Eq. (7-3). The conventional autocorrelation, third-order cumulants, and narrowed autocorrelation are all able to produce peaks at 10 ms, corresponding to the pitch of the vowel. But they also generate


Figure 7-9. Narrowed autocorrelations (top) and summary (bottom).

other erroneous peaks which might confuse pitch determination. On the contrary, the summary of the correntropy-gram provides only one single, narrow peak at 10 ms, which is the pitch period of the vowel, and the peak is much narrower than those obtained from the other methods. The correntropy-gram clearly shows a single narrow stripe across all the frequency channels, which concentrates most of the energy; it is the indication of the fundamental frequency.

The fine structure of hyperbolic contours can also be clearly seen in the correntropy-gram. Its power-spectrum energy spreads equally over all the harmonics. The rows of white spots across the correntropy-gram reflect the periodic structure. In particular, the second harmonic shows two peaks during the time interval in which the fundamental frequency exhibits one. This fine structure is the result of correntropy's ability to retain all the harmonics of the signal. The correlogram, however, yields a much wider spread of energy across the frequency channels. The image with the autocorrelations of the third-order cumulants


Figure 7-10. Correntropy-gram (top) and summary (bottom) for a mixture of vowels /a/ and /u/.

fails to present such structures. Although the image of the narrowed autocorrelation functions is able to show some hyperbolic contours, the white vertical stripe at the fundamental frequency is much wider than that of the correntropy-gram, and there are other white spots in the high-frequency channels. These result in the wide peak at 10 ms and in other erroneous peaks in the summary of the narrowed autocorrelation functions. Our proposed method clearly outperforms the conventional autocorrelation function, the third-order cumulants method, and the narrowed autocorrelation function in the single-pitch determination case.


Figure 7-11. ROC curves for the four PDAs based on the correntropy-gram, autocorrelation, narrowed autocorrelation (L = 15), and autocorrelation of 3rd-order cumulants in the double-vowel segregation experiment.

The second experiment demonstrates that the correntropy function is able to determine the two pitches present in a mixture of two vowels.

Fig. 7-7 to Fig. 7-10 present the simulation results. The correlogram result shown in Fig. 7-7 only shows one peak, corresponding to the pitch of the vowel /a/, while no indication of the other vowel /u/ at the lag of 7.9 ms is provided. The summary of the correlogram resembles that of the single-vowel case in Fig. 7-3. The third-order cumulants method in Fig. 7-8 fails to detect two pitches in the mixture signal: although there are two small peaks at 10 ms and 7.9 ms, corresponding to the two pitch periods respectively, their amplitudes are not large enough to be reliably detected. In Fig. 7-9, the summary of the narrowed autocorrelation functions with L = 15 produces only one peak at 10 ms, corresponding to the pitch period of vowel /a/, but there is no peak at 7.9 ms. There are white spots in the low-frequency channels of the image of the narrowed autocorrelation functions which are indications of the second vowel /u/; however, their amplitude is too small compared with that of vowel /a/, and the information is lost in the summary plot. A complex neural-network oscillator has been used to separate the


Figure 7-12. Percentage of correctly determined pitches for both vowels, for the proposed PDA based on the correntropy function and for a CASA model.

channels dominated by different voices, and the summaries of the individual channels are then able to produce peaks corresponding to the different vowels [101].

Our method, on the other hand, is able to detect two pitches from the mixture of two vowels. The kernel size is set to 0.07 in this experiment. The correntropy-gram in Fig. 7-10 shows a narrow white stripe across the high-frequency channels at 10 ms, corresponding to the pitch period of the vowel /a/. These channels have center frequencies close to the three formant frequencies of vowel /a/ (F1 = 730 Hz, F2 = 1090 Hz, F3 = 2440 Hz). The hyperbolic structure can still be seen in the high-frequency channels, but the lower-frequency channels have been altered by the presence of vowel /u/. Three high-energy white spots appear along the frequency channels centered at 300 Hz, which is the first formant of vowel /u/; the second white spot, located at 7.9 ms, matches the pitch period of vowel /u/. In the summary of the correntropy-gram, the first peak at 10 ms corresponds to the pitch period of vowel /a/. It is as narrow as the one in the single-vowel case in Fig. 7-6. The second peak appears at 8.2 ms, which is only 4 Hz off the true pitch frequency (126 Hz). This is much less than the 20% gross-error pitch determination evaluation criterion [126] or the 10 Hz gross error of [99]. The second peak is also much wider than the one at 10 ms. The
[Figure: Summary of correntropy functions with different kernel sizes for a single vowel /a/.]

amplitude of the peak at 8.2 ms is also smaller than that of the peak at 10 ms, since the energy of vowel /a/ is 5.2 times higher. The pitch shift and peak broadening are due to the fact that vowel /a/ dominates the mixture signal and generates spurious peaks which blur those of vowel /u/. However, it is remarkable that our method, with a proper kernel size, is able to detect two pitches while all the other algorithms fail in this experiment. This simulation clearly demonstrates the superiority of our method over the conventional correlogram, third-order cumulants, and narrowed correlation approaches for multipitch determination.
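Both the peak-picking decision and the gross-error criterion used in the evaluations later in this chapter reduce to a few lines of code. The following is a minimal sketch, not the code used for the experiments: the function names and array layout are illustrative, while the 0.3 threshold, the 2.5 ms minimum lag, and the 20% gross-error tolerance follow the values stated in the text.

```python
import numpy as np

def pick_pitch_candidates(summary, fs, threshold=0.3, min_lag_ms=2.5):
    """Local maxima of a normalized summary function that exceed `threshold`,
    searched only above `min_lag_ms`; returns candidate lags in ms."""
    start = max(int(min_lag_ms * 1e-3 * fs), 1)
    lags_ms = []
    for i in range(start, len(summary) - 1):
        if summary[i] > threshold and summary[i - 1] < summary[i] >= summary[i + 1]:
            lags_ms.append(i / fs * 1e3)
    return lags_ms

def gross_error_rate(f0_est, f0_ref, tol=0.2):
    """Percentage of frames whose pitch estimate deviates from the reference
    by more than `tol` (20%): the gross-error criterion."""
    f0_est, f0_ref = np.asarray(f0_est, float), np.asarray(f0_ref, float)
    return 100.0 * np.mean(np.abs(f0_est - f0_ref) > tol * f0_ref)
```

A summary with a strong maximum at the 10 ms lag would yield a single candidate at 10.0 ms, matching the pitch period of vowel /a/ in the experiments above.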
Fig. 7-11 plots the ROC curves for the four pitch determination algorithms based on the correntropy function, autocorrelation, narrowed autocorrelation, and autocorrelation of third-order cumulants. It clearly shows that our method outperforms the other three in double-vowel pitch detection, although none achieves 100% detection. Notice that the ROC curve for the correntropy function contains many points of zero probability of false alarm, up to 45% correct detection. This is because the correntropy function suppresses erroneous peaks away from the pitch positions and concentrates energy around the fundamental frequencies. The performance of autocorrelation and third-order cumulants stays below a 50% detection rate irrespective of the number of false alarms generated, which means that most often the second largest peak is a harmonic of the highest pitch. This is not surprising, since both functions fail to present two peaks for most mixtures in this experiment.

We also present the vowel identification performance to examine the discriminating ability of the correntropy function at different semitone separations for mixtures of double vowels. In this experiment, the threshold is chosen such that the first two peaks are detected. We compare our results with a computational auditory scene analysis (CASA) model built on a network of neural oscillators [101] in Fig. 7-12. The CASA model outperforms our method at 0.25, 0.5, and 1 semitones of F0 difference, since it uses a sophisticated network of neural oscillators to assign different channels of the ERB filterbank outputs to different vowels, whereas our method is based only on the simple summary of the correntropy-gram. The closer the two fundamental frequencies of the vowels become, the harder it is for the correntropy function to produce two distinct peaks corresponding to the different pitches. However, our method obtains results comparable to the CASA model at 2 and 4 semitones of F0 difference. It suggests that our simple model is able to produce similar
[Figure: Summary of correntropy functions with different kernel sizes for a mixture of vowels /a/ and /u/.]

results for double-vowel segregation at 2 and 4 semitones of F0 difference compared to the sophisticated CASA model. This certainly shows that our technique is very promising.

The evaluation database [125] contains 7298 male and 16948 female speech samples. The ground truth pitch is estimated at reference points based on laryngograph data; these estimates are assumed to be equal to the perceived pitch. The signal is segmented into 38.4 ms segments centered at the reference points in order to make the comparisons between different PDAs fair. The sampling frequency is 20 kHz. The kernel size is selected according to Silverman's rule for each segment. We use Eq. (7-7) to calculate the normalized correntropy functions such that the summary correntropy function at zero lag is unity. Since the pitch range is 50-250 Hz for male speakers and 120-400 Hz for female speakers, the PDA searches for local maxima from 2.5 ms upward in the summary correntropy function. We set the threshold to 0.3 by trial and error, so that every local maximum exceeding 0.3 is detected as a pitch candidate.

Table 7-1 summarizes the performance of various PDAs, taken from [126], [127]. The performance criterion is the relative number of gross errors. A gross error occurs
Table 7-1. Gross error percentage of PDA evaluation.

                  Male               Female
PDA         Low (%)  High (%)   Low (%)  High (%)   Weighted Mean (%)
HPS          5.34     28.20      0.46     1.61       11.54
SRPD         0.62      2.01      0.39     5.56        4.95
CPD          4.09      0.64      0.61     3.97        4.63
FBPT         1.27      0.64      0.60     3.35        3.48
IPTA         1.40      0.83      0.53     3.12        3.22
PP           0.22      1.74      0.26     3.20        3.01
SHR          1.29      0.78      0.75     1.69        2.33
SHAPE        0.95      0.48      1.14     0.47        1.55
eSRPD        0.90      0.56      0.43     0.23        0.90
Correntropy  0.71      0.42      0.35     0.18        0.71

when the estimated fundamental frequency is more than 20% off the true pitch value. The percent gross errors by gender and by lower or higher pitch estimates with respect to the reference are given in Table 7-1. The weighted gross error is calculated by taking into account the number of pitch samples for each gender. The table clearly shows that for this particular database the correntropy-based PDA outperforms the others.

The kernel size in Eq. (3-3) needs to be selected by the user (it is a free parameter). The kernel size plays an important role in the performance of our method, since it determines the scale at which similarity is measured. It has been shown that the kernel size controls the metric of the transformed signal in the RKHS [65]. If the kernel size is set too large, the correntropy function approaches the conventional correlation function and fails to detect any nonlinearity and higher-order statistics intrinsic to the data; on the other hand, if the kernel size is too small, the correntropy function loses its discrimination ability. One practical way to select the kernel size is given by Silverman's rule [120].
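Silverman's rule just mentioned is easy to state in code. A minimal sketch, assuming the usual form of the rule (the 0.9 constant and the N^(-1/5) exponent are the standard choices from [120]; the function name is illustrative):

```python
import numpy as np

def silverman_bandwidth(x):
    """Silverman's rule of thumb: A is the smaller of the sample standard
    deviation and the interquartile range scaled by 1.34, and the kernel
    width is 0.9 * A * N**(-1/5)."""
    x = np.asarray(x, float)
    n = x.size
    iqr = np.subtract(*np.percentile(x, [75, 25]))
    a = min(x.std(ddof=1), iqr / 1.34)
    return 0.9 * a * n ** (-1 / 5)
```

In the experiments of this chapter the rule is applied per segment, so the kernel size adapts to the dynamic range of each analysis window.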
To illustrate the effect of different kernel sizes, we simulate the summary of correntropy functions for the same experimental setups as in Sec. 8.2 with different kernel sizes in Fig. 7-13 and Fig. 7-14. It can be seen that if the kernel size is large (σ = 1 here), the summaries of the correntropy functions approach those of the correlation functions shown in Fig. 7-3 and Fig. 7-7. As the kernel size approaches the value given by Silverman's rule (σ = 0.01 for the single vowel /a/ case and σ = 0.07 for the mixture of /a/ and /u/), the summary of the correntropy functions starts to present a large, narrow peak corresponding to the pitch of vowel /a/ and to show the other vowel /u/. If the kernel size is too small (σ = 0.001 here), the summary of the correntropy functions loses its ability to present the two vowels. This is shown in the bottom plot of Fig. 7-14.
In this chapter, we apply the proposed correntropy coefficient (4-27) to characterize the similarity between multi-channel signals. Preliminary experiments with simulated data and with multichannel electroencephalogram (EEG) signals recorded during behavioral studies elucidate the performance of the new measure relative to the well-established correlation coefficient.

There has been extensive research aimed at detecting the underlying relationships in multi-dimensional dynamical systems. The classical methodology employs a linear approach, in particular cross-correlation and coherence analysis [128]. Cross-correlation measures the linear correlation between two signals in the time domain, while the coherence function specifies the linear association in the frequency domain as the ratio of the squared cross-spectral density to the product of the two auto-spectra. There have been several extensions of correlation to more than two time series, such as directed coherence, directed transfer functions, and partial directed coherence [129]. Unfortunately, linear methods only capture linear relationships between the time series, and might fail to detect nonlinear interdependencies between the underlying dynamical subsystems.
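Since every experiment in this chapter reduces to evaluating the correntropy coefficient between two time series, a minimal sample estimator is worth sketching here. This version is a sketch in the spirit of the estimator Eq. (4-28) rather than a reproduction of it: it assumes the Gaussian kernel and the centered-correntropy normalization, and the function names are illustrative.

```python
import numpy as np

def gaussian(u, sigma):
    # Gaussian kernel evaluated at the difference u
    return np.exp(-u ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)

def centered_correntropy(x, y, sigma):
    """Sample centered cross-correntropy: mean kernel value on paired
    samples minus the mean kernel value over all sample pairs."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    paired = gaussian(x - y, sigma).mean()
    allpairs = gaussian(x[:, None] - y[None, :], sigma).mean()
    return paired - allpairs

def correntropy_coefficient(x, y, sigma):
    """Centered correntropy normalized by the geometric mean of the
    centered auto-correntropies of the two series."""
    num = centered_correntropy(x, y, sigma)
    den = np.sqrt(centered_correntropy(x, x, sigma) *
                  centered_correntropy(y, y, sigma))
    return num / den
```

By construction the coefficient equals 1 when the two series are identical, while for unrelated series the paired and all-pairs averages nearly cancel and the coefficient is close to 0.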
A number of nonlinear interdependence measures are based on information theory [130]. However, a large quantity of noise-free stationary data is required to estimate these measures, which restricts their application in practice. Another method is phase synchronization, where the instantaneous phase is computed using the Hilbert transform and interdependence is specified in terms of time-dependent phase locking [131]. The state-space methodologies include the similarity index and the synchronization likelihood. The similarity-index technique and its modifications compute the ratio of average distances between index points, their nearest neighbors, and their mutual nearest neighbors [132, 133]. Stam et al. proposed the synchronization likelihood to offer a straightforward normalized estimate of the dynamical coupling between interacting systems [134]. There are several drawbacks associated with these techniques based on state-space embedding: estimating the embedding dimension of time series corrupted by measurement noise for a valid reconstruction, searching for a suitable neighborhood size, and finding a constant number of nearest neighbors are a few of the many constraints that severely affect the estimation accuracy.

In this chapter, we use the correntropy coefficient Eq. (4-27) as a novel similarity measure to quantify the interdependencies among multi-channel signals. In practice, the estimate of the correntropy coefficient Eq. (4-28) is calculated between the two time series of interest to measure their similarity. Following [133, 134], we apply the correntropy coefficient to detect the nonlinear interdependence of two unidirectionally
coupled Henon maps, given by

    x1(n+1) = 1.4 - x1(n)^2 + bx x2(n)                                   (8-1)
    x2(n+1) = x1(n)                                                      (8-2)

for the driver, represented as system X, and

    y1(n+1) = 1.4 - (C x1(n) y1(n) + (1 - C) y1(n)^2) + by y2(n)         (8-3)
    y2(n+1) = y1(n)                                                      (8-4)

for the response, represented as system Y. System X drives system Y with nonlinear coupling strength C, which ranges from 0 to 1, with 0 being no coupling and 1 complete coupling. The parameters bx and by are both set to 0.3, the canonical values for the Henon map, when analyzing identical systems, and to 0.3 and 0.1 respectively for nonidentical ones. For each coupling strength, we discard the first 10000 iterates as transient and use the next 500 data points for the experiments. The correntropy coefficient is calculated between the first component of system X, x1, and the first component of system Y, y1.

In the following simulations we address these questions: (1) Does the correntropy coefficient increase as the coupling strength C between X and Y increases, for both identical and nonidentical systems? (2) How robust is the correntropy coefficient to different levels of measurement noise in the driver, the response, and both systems? (3) How sensitive is the correntropy coefficient to sudden, time-dependent changes in the dynamics of the interacting systems due to the coupling strength? (4) Can the correntropy coefficient detect nonlinear coupling between driver and response? (5) How is the correntropy coefficient affected by kernel size and data length?

In Fig. 8-1, we plot the averaged correntropy coefficient as a function of the coupling strength C for the identical map (bx = by = 0.3) and the nonidentical map (bx = 0.3
[Figure: Averaged correntropy coefficient for unidirectionally coupled identical (a) and nonidentical (b) Henon maps.]

and by = 0.1) over 10 realizations with different initial conditions. The error bars denote the standard deviation over the realizations. Fig. 8-1(a) shows the identical map, where the kernel size of the Gaussian kernel has been chosen as 0.001 according to Silverman's rule Eq. (8-5). For identical chaotic systems, perfect synchronization can be achieved with a sufficient degree of coupling [135]. This can be seen from the fact that the correntropy coefficient equals 1 for C >= 0.7 in Fig. 8-1(a), indicating that perfect synchronization occurs between the two coupled systems. The critical threshold C = 0.7 corresponds to the point where the maximum Lyapunov exponent of the response system becomes negative and identical synchronization between the systems takes place [133]. On the other hand, the correntropy coefficient is 0 for C < 0.7, suggesting no synchronization even though the two systems are weakly coupled. Similar results have been reported with other nonlinear interdependence measures [133, 136, 137].

Fig. 8-1(b) shows the result for the unidirectionally coupled nonidentical systems (bx = 0.3, by = 0.1). The kernel size is set to 0.4. In this case, identical synchronization is not possible and the driver has higher dimension than the response. The sharp increase of the correntropy coefficient at C = 0.7 seen in the identical synchronization situation is not observed here, but the correntropy coefficient shows a consistent monotonic
[Figure: Influence of different noise levels on the correntropy coefficient.]

increase with respect to the coupling strength, except for the region 0.1 < C < 0.3.
[Figure: Influence of different noise levels on the correntropy coefficient.]

one, but the sharp increase at C = 0.7 is still obvious for both noise intensities. When the noise level is high (SNR = 1 dB), the correntropy coefficient curve is more jagged than in the 10 dB case, but it can still detect increases in the coupling strength. The figure also suggests that whether the noise is added to the driver, the response, or both systems, the performance of the correntropy coefficient is very similar. Fig. 8-3 presents the results for the non-identical Henon map (bx = 0.3 and by = 0.1) with white noise in the response, the driver, and both systems. The kernel size is set to 0.05 for SNR = 10 dB and to 0.2 for SNR = 1 dB. The values of the correntropy coefficient at each coupling strength are averaged over 20 independent realizations. At both noise levels, the correntropy coefficient increases consistently with the coupling strength, and again the effect of noise in the response, the driver, or both systems makes little difference. Notice the local hump around the region 0.2 < C < 0.4.
[Figure: Time dependence of the correntropy coefficient.]

systems, which basically generates non-stationarity in the time series. To study such transient dynamical phenomena, both the identical (bx = by = 0.3) and the non-identical (bx = 0.3 and by = 0.1) Henon maps are considered here. The dynamical systems are coupled only during a single epoch and are otherwise uncoupled in both cases [134, 137]. We set the coupling strength C = 0 for n < 10150 and n > 10250, and C = 0.8 for 10150 <= n <= 10250.
[Figure: Effect of different kernel widths on the correntropy coefficient for unidirectionally coupled identical Henon maps.]

is potentially able to detect sudden changes in the coupling between two interacting dynamical systems with high temporal resolution, which makes this measure suitable for non-stationary data sets.

Fig. 8-5 shows the correntropy coefficient curves for different kernel widths for the unidirectionally coupled identical Henon map. When the kernel width is chosen too large (σ = 0.1, 0.5, 1 in this case), the correntropy coefficient produces erroneous results in the unsynchronized region 0 < C < 0.7.
[Figure: Effect of different kernel widths on the correntropy coefficient for unidirectionally coupled non-identical Henon maps.]

[133], on identical Henon maps. Fig. 8-7 shows the correlation coefficient, the correntropy coefficient, and the similarity index as functions of the coupling strength C. The correntropy coefficient generates exactly the same result as the similarity index, while the conventional correlation coefficient behaves erratically in the unsynchronized region C < 0.7. This clearly demonstrates that the correntropy coefficient outperforms the correlation coefficient in characterizing the nonlinear coupling between two dynamical systems. Compared to the similarity index, the correntropy coefficient has the advantage of avoiding the estimation of the embedding dimension, the choice of nearest neighborhoods, and the other problems associated with state-space embedding methods [133, 139], while its computational complexity remains manageable and its kernel size is easy to estimate.

We also use multivariate surrogate data to further investigate the sensitivity of the correntropy coefficient to nonlinear coupling. Prichard et al. introduced the surrogate method
[Figure: Comparison of the correlation coefficient, the correntropy coefficient, and the similarity index.]

in [140]. This method has been applied to detect nonlinear structure in time series. To generate multivariate surrogate data, the Fourier transform is first applied to each time series, then a common random number is added to each of the phases, and an inverse Fourier transform is applied. The resulting time series have the same power spectra and cross power spectra as the original time series, but any nonlinear coupling among them has been destroyed. In the simulation, we use the TISEAN package [141] to generate 19 realizations of surrogate data for the time series x1(t) in Eq. (8-1) and y1(t) in Eq. (8-3) for each coupling strength of the unidirectionally coupled non-identical Henon map. We then compute the correntropy coefficient for both the original and the surrogate data at each coupling strength. Fig. 8-8 plots the correntropy coefficient curve for the original data and the mean of the 19 correntropy coefficients for the surrogate data, with the corresponding maximal and minimal values as error bars. To quantify the significance level, we calculate the Z-score as Z = |v_orig - mean(v_surr)| / std(v_surr), where v_orig is the correntropy coefficient of the original data and v_surr are the correntropy coefficients of the surrogate data. Table 8-1 presents the Z-score values for the different coupling strengths. With the exception of C = 0.2 and 0.4, the Z-score values are significantly larger than 1.96, which means that the nonlinear coupling
[Figure: Comparison of the correntropy coefficient for the original data and the surrogate data for the unidirectionally coupled non-identical Henon map.]

has been detected with a probability p < 0.05. These results clearly demonstrate that the correntropy coefficient is sensitive to the nonlinearity of the dependence between two coupled systems.
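The ingredients of this surrogate test, the coupled maps of Eqs. (8-1)-(8-4), phase-randomized multivariate surrogates, and the Z-score, can be sketched together. This is a simplified stand-in for the TISEAN-based procedure: the initial conditions are illustrative, the surrogate generator follows the Prichard construction described above rather than the TISEAN implementation, and the statistic v fed to the Z-score would be the correntropy coefficient estimate, whose computation is omitted here.

```python
import numpy as np

def coupled_henon(C, bx=0.3, by=0.1, n_transient=10000, n_keep=500):
    """Iterate the unidirectionally coupled Henon maps of Eqs. (8-1)-(8-4),
    discarding the first n_transient samples as in the text."""
    x1, x2, y1, y2 = 0.1, 0.2, 0.3, 0.4   # illustrative initial condition
    xs, ys = [], []
    for n in range(n_transient + n_keep):
        x1n = 1.4 - x1 ** 2 + bx * x2
        y1n = 1.4 - (C * x1 * y1 + (1 - C) * y1 ** 2) + by * y2
        x1, x2 = x1n, x1                   # x2(n+1) = x1(n)
        y1, y2 = y1n, y1                   # y2(n+1) = y1(n)
        if n >= n_transient:
            xs.append(x1)
            ys.append(y1)
    return np.array(xs), np.array(ys)

def surrogates(series, n_surr, rng):
    """Phase-randomized surrogates: the SAME random phases are added to
    every channel, preserving auto- and cross-spectra while destroying
    any nonlinear coupling."""
    series = np.asarray(series, float)     # shape (channels, N)
    n = series.shape[1]
    spec = np.fft.rfft(series, axis=1)
    out = []
    for _ in range(n_surr):
        ph = rng.uniform(0.0, 2 * np.pi, spec.shape[1])
        ph[0] = 0.0                        # keep the DC component real
        ph[-1] = 0.0                       # keep the Nyquist bin real
        out.append(np.fft.irfft(spec * np.exp(1j * ph), n=n, axis=1))
    return np.array(out)

def z_score(v_orig, v_surr):
    """Z = |v_orig - mean(v_surr)| / std(v_surr), as in the text."""
    v_surr = np.asarray(v_surr, float)
    return abs(v_orig - v_surr.mean()) / v_surr.std()
```

A Z-score above 1.96 then rejects the null hypothesis of purely linear coupling at the p < 0.05 level, as in Table 8-1.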
Table 8-1. Z-scores for the surrogate data.

C        0.1    0.2    0.3    0.4    0.5    0.6    0.7     0.8     0.9     1
Z-score  4.472  1.622  4.585  0.773  7.658  9.908  16.699  12.268  22.588  19.895

The EEG recordings and experimental protocol are described in [142]. The kernel width of the Gaussian kernel used in the correntropy coefficient was chosen to be 0.4.

Fig. 8-9(a) and (b) show plots of the correlation and correntropy coefficients for the auditory areas of the brain as functions of time after the subject was exposed only to the audio stimuli. Several bilaterally symmetrical pairs of electrodes were selected in the vicinity of the auditory cortex, so that both measures were computed for the pairs FC5-FC6, FC3-FC4, C5-C6, C3-C4, CP5-CP6, and CP3-CP4. As shown in Fig. 8-9(a) and (b), there are two distinct time intervals, 0-270 ms and 270-450 ms, in the auditory response. Both the correlation and the correntropy coefficients drop at 270 ms, which suggests that both measures are able to detect the changes in inter-hemispheric synchronization of the auditory regions. However, as the electrodes are chosen at different locations away from the auditory cortex, it is expected that during the synchronization phase (0-270 ms) the synchronization measures for the different pairs should differ. Fig. 8-9(a) shows that
[Figure: Comparison of the correlation coefficient and the correntropy coefficient in synchronization detection over the auditory cortex for the audio-stimulus EEG signal.]

the correlation coefficients for all 6 pairs are grouped together and are unable to detect the differences in activation, while Fig. 8-9(b) suggests that the correntropy coefficient can successfully differentiate the synchronization strength among the different areas of the cortex above the left and right auditory regions. Notably, as expected from previous studies, the pairs FC5-FC6 and FC3-FC4 exhibit stronger synchronization than the others, while the most posterior pairs CP5-CP6 and C5-C6 have weaker synchronization strength. The synchronization patterns also reveal lateral similarity in time for the pairs FC5-FC6 and FC3-FC4, for CP5-CP6 and C5-C6, and for CP3-CP4 and C3-C4. Furthermore, the correntropy coefficients for the pairs C5-C6, C3-C4, and CP3-CP4 peak simultaneously at 90 ms, which corresponds to the first mean global field power (MGFP) peak of the EEG signal.
[Figure: Comparison of the correlation coefficient and the correntropy coefficient in characterizing synchronization over the occipital cortex for the visual-stimulus EEG signal.]

These differences indicate that the correntropy coefficient is more sensitive, and is able to extract more information as a synchronization measure, than the conventional correlation coefficient.

We also compared both measures when applied to the visual cortical areas. The measures are presented in Fig. 8-10 as functions of time when the subject is exposed only to visual stimuli. Again, a window of 20 ms of data is used to compute both the correlation and the correntropy coefficients, and the kernel width is again set to 0.4 as in the previous case. We also chose the bilaterally symmetrical electrode pairs O1-O2, PO7-PO8, PO5-PO6, P7-P8, P5-P6, and P3-P4. In Fig. 8-10(b), the correntropy coefficients for all pairs except O1-O2 show similar synchronization patterns.
As expected from previous studies [143], the synchronization between the occipital channels O1 and O2 has the maximum strength and stays high until it decreases around 350 ms. Thus the correntropy coefficient shows that the extra-striate visual networks become increasingly recruited and synchronized until about 275 ms after the stimulus onset, while the primary visual cortex remains highly synchronous for a longer period of time, until about 350 ms after onset. The channel pair P7-P8 exhibits the weakest synchronization strength, since it is located farthest from the primary visual cortex compared to the other electrode pairs. On the other hand, the correlation coefficients for most channel pairs group together and display the same level of synchronization until a sharp decrease at around 500 ms (except for P7-P8). The synchronization between P7 and P8 has an irregular pattern with a local minimum around 200 ms. This comparison clearly demonstrates that in this case as well the correntropy coefficient outperforms the correlation coefficient in quantifying the EEG signal coupling between the bilateral occipital regions of the brain in response to visual stimuli.
If the kernel size is chosen poorly, the kernels in Eq. (4-3) and in Eq. (4-29) cannot interpolate between data points. This can also be verified by applying the Taylor series expansion to the Gaussian kernel, where the kernel width appears as a weighting parameter in both the second- and higher-order moments. The effect of the kernel size on the different moments is scaled by powers of up to 2k, where k is the moment order. When the kernel size is too large, the contribution of the higher-order statistics decays rapidly and the centered cross-correntropy approaches the conventional cross-covariance function; on the other hand, when the kernel size is too small, the effect of the higher-order moments outweighs the second-order one. An appropriate kernel size should maintain the balance between the second-order and higher-order statistics of the signal.

Therefore a good choice of the kernel parameter is crucial for obtaining good performance with the proposed method. There are two ways of handling the selection of the kernel size. One is to seek an optimal kernel size; cross-validation has been one of the most widely used methods in machine learning to choose an appropriate kernel width. Another approach is Silverman's rule of thumb, which is given by [120]

    σ = 0.9 A N^(-1/5)                                                   (8-5)

where A is the smaller of the standard deviation of the data samples and the data interquartile range scaled by 1.34, and N is the number of data samples. Silverman's rule is easy to use for choosing a good kernel size, hence we set the kernel width
according to Eq. (8-5) throughout the paper. Alternatively, the kernel size can be thought of as a scale parameter that provides different looks at the dependence among the variables. Just as in wavelets, the kernel size makes it possible to analyze the dependencies at different resolutions. Since many real-world signals are very complex, this multi-resolution analysis may better elucidate the relationships.

Since the kernels in Eq. (4-3) and in Eq. (4-29) are nonlinear, any scaling of the original random variables results in different performance of the correntropy coefficient. Unlike the conventional correlation coefficient, which is insensitive to amplitude scaling of the signals and only measures their similarity through time, the correntropy coefficient measures both the time and the amplitude similarity between two signals. Therefore, in certain applications it is vital to normalize both signals before applying the correntropy coefficient. For example, the amplitudes of EEG signals depend strongly on the different electrode impedances, so it is important to normalize all channels of the EEG signals to the same dynamic range.
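The normalization step recommended above can be as simple as z-scoring each channel. A minimal sketch (the function name and the choice of unit-variance scaling are illustrative, not taken from the text):

```python
import numpy as np

def normalize_channels(eeg):
    """Scale each channel (row) to zero mean and unit variance, so that
    amplitude differences between electrodes do not bias the
    amplitude-sensitive correntropy coefficient."""
    eeg = np.asarray(eeg, float)
    mu = eeg.mean(axis=1, keepdims=True)
    sd = eeg.std(axis=1, keepdims=True)
    return (eeg - mu) / sd
```

Applying this to all channels before the pairwise coefficient computation puts every electrode on the same dynamic range.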
This dissertation analyzed the recently proposed correntropy function [65] and presented a new centered correntropy function from time-domain and frequency-domain approaches. It was demonstrated that the correntropy and centered correntropy functions not only capture the time and space structures of signals, but also partially characterize the higher-order statistical information and nonlinearity intrinsic to signals. The correntropy and centered correntropy functions have rich geometrical structure. Correntropy is positive definite and centered correntropy is non-negative definite; hence, by the Moore-Aronszajn theorem, they uniquely induce reproducing kernel Hilbert spaces. The correntropy and centered correntropy functions combine the data-dependent expectation operator and data-independent kernels to form another data-dependent operator. One perspective is to treat them as "generalized" correlation and covariance functions of random signals nonlinearly transformed via the data-independent kernel functions; those nonlinearly transformed signals lie on a sphere in the RKHS induced by the kernel functions if isotropic kernels are used. The other perspective is to work directly with the RKHS induced by the correntropy and centered correntropy functions themselves. There, the nonlinearly transformed signals are no longer stochastic but deterministic, and the RKHS includes the expectation operator through embedded vectors. The two views further our understanding of the correntropy and centered correntropy functions from a geometrical perspective: the two reproducing kernel Hilbert spaces, induced by the kernel functions and by the correntropy functions respectively, represent stochastic and deterministic functional analysis.

The correntropy dependence measure was proposed, based on the correntropy coefficient, as a novel statistical dependence measure. The new measure satisfies all the fundamental
1. applying the parametric correntropy coefficient to detect nonlinear coupling in EEG signals,

2. investigating the relationship between higher-order statistics and correntropy,

3. applying the correntropy pitch determination algorithm to multiple-pitch tracking.
[1] N. Aronszajn, "The theory of reproducing kernels and their applications," Cambridge Philosophical Society Proceedings, vol. 39, pp. 133-153, 1943.

[2] N. Aronszajn, "Theory of reproducing kernels," Transactions of the American Mathematical Society, vol. 68, no. 3, pp. 337-404, 1950.

[3] A. Povzner, "On a class of Hilbert function spaces," Dokl. Akad. Nauk. SSSR, vol. 68, pp. 817-820, 1949.

[4] A. Povzner, "On some applications of a class of Hilbert function spaces," Dokl. Akad. Nauk. SSSR, vol. 74, pp. 13-16, 1950.

[5] M. G. Krein, "Hermitian-positive kernels on homogeneous spaces," American Mathematical Society Translations, vol. 2, no. 34, pp. 69-164, 1963.

[6] E. Hille, "Introduction to general theory of reproducing kernels," Rocky Mountain Journal of Mathematics, vol. 2, pp. 321-368, 1972.

[7] H. Meschkowski, Hilbert Spaces with Kernel Function, Springer-Verlag, Berlin, 1962.

[8] H. S. Shapiro, Topics in Approximation Theory, Springer-Verlag, Berlin, 1971.

[9] S. Saitoh, Theory of Reproducing Kernels and its Applications, Pitman Research Notes in Mathematics Series, Longman Scientific & Technical, Essex, UK, 1988.

[10] P. J. Davis, Interpolation and Approximation, Dover, New York, 1975.

[11] S. Bergman, The Kernel Function and Conformal Mapping, American Mathematical Society, New York, 1950.

[12] L. Schwartz, "Hilbert subspaces of topological vector spaces and associated kernels," Journal of Analysis Mathematics, vol. 13, pp. 115-256, 1964.

[13] J. Mercer, "Functions of positive and negative type, and their connection with the theory of integral equations," Philosophical Transactions of the Royal Society of London, vol. 209, pp. 415-446, 1909.

[14] E. H. Moore, "On properly positive Hermitian matrices," Bulletin of the American Mathematical Society, vol. 23, no. 59, pp. 66-67, 1916.

[15] S. Bochner, "Hilbert distances and positive definite functions," The Annals of Mathematics, vol. 42, no. 3, pp. 647-656, July 1941.

[16] I. J. Schoenberg, "Metric spaces and positive definite functions," Transactions of the American Mathematical Society, vol. 44, pp. 522-536, 1938.

[17] I. J. Schoenberg, "Positive definite functions on spheres," Duke Math. J., vol. 9, pp. 96-108, 1942.
[18] J. Stewart, "Positive definite functions and generalizations, an historical survey," Rocky Mountain Journal of Mathematics, vol. 6, no. 3, pp. 409-434, September 1976.

[19] D. Alpay, Ed., Reproducing Kernel Spaces and Applications, Birkhauser Verlag, Germany, 2003.

[20] U. Grenander, Abstract Inference, John Wiley & Sons, New York, 1981.

[21] E. Kreyszig, Introductory Functional Analysis with Applications, John Wiley & Sons, New York, 1978.

[22] A. N. Kolmogorov, "Stationary sequences in Hilbert space," Bull. Math. Univ. Moscow, vol. 2, no. 6, 1941.

[23] M. Loeve, "Stochastic processes and Brownian motion," in Second Order Random Functions, P. Levy, Ed., p. 365, Gauthier-Villars, Paris, 1948.

[24] M. Loeve, Probability Theory II, Springer-Verlag, Berlin, 4th edition, 1978.

[25] E. Parzen, "Statistical inference on time series by Hilbert space methods," Technical Report 23, Statistics Department, Stanford University, 1959.

[26] E. Parzen, "Statistical inference on time series by RKHS methods," in Proc. 12th Biennial Seminar Canadian Mathematical Congress, R. Pyke, Ed., Montreal, Canada, 1970, pp. 1-37.

[27] E. Parzen, "An approach to time series analysis," The Annals of Mathematical Statistics, vol. 32, no. 4, pp. 951-989, December 1961.

[28] E. Parzen, "Extraction and detection problems and reproducing kernel Hilbert spaces," SIAM Journal on Control, vol. 1, pp. 35-62, 1962.

[29] E. Parzen, Time Series Analysis Papers, Holden-Day, San Francisco, CA, 1967.

[30] J. Hajek, "On linear statistical problems in stochastic processes," Czechoslovak Mathematical Journal, vol. 12, pp. 404-444, 1962.

[31] T. Kailath, "RKHS approach to detection and estimation problems, Part I: Deterministic signals in Gaussian noise," IEEE Transactions on Information Theory, vol. IT-17, no. 5, pp. 530-549, September 1971.

[32] T. Kailath and H. Weinert, "An RKHS approach to detection and estimation problems, Part II: Gaussian signal detection," IEEE Transactions on Information Theory, vol. IT-21, no. 1, pp. 15-23, January 1975.

[33] T. Kailath and D. Duttweiler, "An RKHS approach to detection and estimation problems, Part III: Generalized innovations representations and a likelihood-ratio formula," IEEE Transactions on Information Theory, vol. IT-18, no. 6, pp. 730-745, November 1972.
[34] D. Duttweiler and T. Kailath, "RKHS approach to detection and estimation problems, Part IV: Non-Gaussian detection," IEEE Transactions on Information Theory, vol. IT-19, no. 1, pp. 19-28, January 1973.

[35] D. Duttweiler and T. Kailath, "RKHS approach to detection and estimation problems, Part V: Parameter estimation," IEEE Transactions on Information Theory, vol. IT-19, no. 1, pp. 29-37, January 1973.

[36] T. Hida and N. Ikeda, "Analysis on Hilbert space with reproducing kernel arising from multiple Wiener integral," in Proc. 5th Berkeley Symp. on Mathematical Statistics and Probability, L. LeCam and J. Neyman, Eds., 1967, vol. 2, pp. 117-143.

[37] G. Kallianpur, "The role of reproducing kernel Hilbert spaces in the study of Gaussian processes," in Advances in Probability and Related Topics, P. Ney, Ed., vol. 2, pp. 49-83, Marcel Dekker, New York, 1970.

[38] T. Hida, "Canonical representations of Gaussian processes and their applications," Kyoto Univ. Coll. Sci. Mem., vol. A33, pp. 109-155, 1960.

[39] G. M. Molchan, "On some problems concerning Brownian motion in Levy's sense," Theory of Probability and Its Applications, vol. 12, pp. 682-690, 1967.

[40] L. Pitt, "A Markov property for Gaussian processes with a multidimensional parameter," Archive for Rational Mechanics and Analysis, vol. 43, pp. 367-391, 1971.

[41] H. L. Weinert, "Statistical methods in optimal curve fitting," Communications in Statistics, vol. B7, no. 4, pp. 417-435, 1978.

[42] C. De Boor and R. E. Lynch, "On splines and their minimum properties," Journal of Mathematics and Mechanics, vol. 15, pp. 953-969, 1966.

[43] L. L. Schumaker, "Fitting surfaces to scattered data," in Approximation Theory II, G. G. Lorentz, C. K. Chui, and L. L. Schumaker, Eds., pp. 203-268, Academic Press, New York, 1976.

[44] G. Wahba, Spline Models for Observational Data, vol. 49, SIAM, Philadelphia, PA, 1990.

[45] Rui J. P. De Figueiredo, "A generalized Fock space framework for nonlinear system and signal analysis," IEEE Transactions on Circuits and Systems, vol. CAS-30, no. 9, pp. 637-647, September 1983.

[46] V. N. Vapnik, The Nature of Statistical Learning Theory, Springer, New York, 1999.

[47] V. N. Vapnik, Statistical Learning Theory, John Wiley & Sons, New York, 1998.
[48] F. Rosenblatt, "The perceptron: A probabilistic model for information storage and organization in the brain," Psychological Review, vol. 65, no. 6, pp. 386-408, 1958.
[49] B. Scholkopf and A. Smola, Learning with Kernels, MIT Press, Cambridge, MA, 2002.

[50] B. Scholkopf, A. Smola, and K.-R. Muller, "Nonlinear component analysis as a kernel eigenvalue problem," Neural Computation, vol. 10, pp. 1299-1319, 1998.

[51] S. Mika, G. Ratsch, J. Weston, B. Scholkopf, and K.-R. Muller, "Fisher discriminant analysis with kernels," in Proc. Neural Networks for Signal Processing IX, New Jersey, 1999, vol. 2, pp. 41-48.

[52] F. R. Bach and M. I. Jordan, "Kernel independent component analysis," Journal of Machine Learning Research, vol. 3, pp. 1-48, 2002.

[53] T. M. Cover, "Geometrical and statistical properties of systems of linear inequalities with applications in pattern recognition," IEEE Transactions on Electronic Computers, vol. EC-14, no. 3, pp. 326-334, June 1965.

[54] T. Evgeniou, T. Poggio, M. Pontil, and A. Verri, "Regularization and statistical learning theory for data analysis," Computational Statistics and Data Analysis, vol. 38, pp. 421-432, 2002.

[55] J. C. Principe, D. Xu, and J. W. Fisher, "Information theoretic learning," in Unsupervised Adaptive Filtering, S. Haykin, Ed., pp. 265-319, John Wiley & Sons, 2000.

[56] K. E. Hild, D. Erdogmus, and J. C. Principe, "Blind source separation using Renyi's mutual information," IEEE Signal Processing Letters, vol. 8, no. 6, pp. 174-176, 2001.

[57] D. Erdogmus and J. C. Principe, "Generalized information potential criterion for adaptive system training," IEEE Transactions on Neural Networks, vol. 13, no. 5, pp. 1035-1044, 2002.

[58] A. Renyi, "On measures of entropy and information," in Selected Papers of A. Renyi, vol. 2, pp. 565-580, Akademiai Kiado, Budapest, Hungary, 1976.

[59] D. Erdogmus and J. C. Principe, "An error-entropy minimization algorithm for supervised training of nonlinear adaptive systems," IEEE Transactions on Signal Processing, vol. 50, no. 7, pp. 1780-1786, July 2002.

[60] E. Parzen, "On estimation of a probability density function and mode," The Annals of Mathematical Statistics, vol. 33, no. 3, pp. 1065-1076, September 1962.

[61] M. G. Genton, "Classes of kernels for machine learning: A statistics perspective," Journal of Machine Learning Research, vol. 2, pp. 299-312, 2001.
[62] S. Kullback and R. A. Leibler, "On information and sufficiency," The Annals of Mathematical Statistics, vol. 22, no. 1, pp. 79-86, 1951.
[63] D. Xu, J. C. Principe, J. W. Fisher, and H. C. Wu, "A novel measure for independent component analysis," in Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 1998, vol. 2, pp. 12-15.

[64] R. Jenssen, D. Erdogmus, J. C. Principe, and T. Eltoft, "The Laplacian PDF distance: A cost function for clustering in a kernel feature space," in Advances in Neural Information Processing Systems (NIPS), 2004, pp. 625-632, MIT Press, Cambridge.

[65] I. Santamaria, P. Pokharel, and J. C. Principe, "Generalized correlation function: Definition, properties, and application to blind equalization," IEEE Transactions on Signal Processing, vol. 54, no. 6, pp. 2187-2197, 2006.

[66] P. P. Pokharel, R. Agrawal, and J. C. Principe, "Correntropy based matched filtering," in Proc. Machine Learning for Signal Processing, Mystic, USA, 2005.

[67] K.-H. Jeong and J. C. Principe, "The correntropy MACE filter for image recognition," in Proceedings of the International Workshop on Machine Learning for Signal Processing (MLSP), Maynooth, Ireland, 2006, pp. 9-14.

[68] P. P. Pokharel, J. Xu, D. Erdogmus, and J. C. Principe, "A closed form solution for a nonlinear Wiener filter," in Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Toulouse, France, May 2006, vol. 3, pp. 720-723.

[69] S.-I. Amari and H. Nagaoka, Methods of Information Geometry, AMS and Oxford University Press, Providence, RI, 2000.

[70] B. Pistone and C. Sempi, "An infinite-dimensional geometric structure on the space of all probability measures equivalent to a given one," The Annals of Statistics, vol. 23, no. 5, pp. 1543-1561, 1995.

[71] C. R. Rao, "Information and accuracy attainable in the estimation of statistical parameters," Bulletin of the Calcutta Mathematical Society, vol. 37, pp. 81-91, 1945.

[72] T. Jebara, R. Kondor, and A. Howard, "Probability product kernels," Journal of Machine Learning Research, vol. 5, pp. 819-844, 2004.

[73] T. Jebara and R. Kondor, "Bhattacharyya and expected likelihood kernels," in Proceedings of the Annual Conference on Learning Theory, Washington D.C., 2003.

[74] M. B. Pursley, Random Processes in Linear Systems, Prentice Hall, New Jersey, 2002.
[75] L. Ljung, System Identification: Theory for the User, Prentice-Hall, Englewood Cliffs, NJ, 1987.


[76] R. J. P. De Figueiredo and Y. Hu, "On nonlinear filtering of non-Gaussian processes through Volterra series," in Volterra Equations and Applications, C. Corduneanu and I. W. Sandberg, Eds., pp. 197–202, Gordon and Breach Science, 2002.

[77] R. F. Engle, "Autoregressive conditional heteroscedasticity with estimates of the variance of United Kingdom inflation," Econometrica, vol. 50, pp. 987–1007, 1982.

[78] A. Rakotomamonjy and S. Canu, "Frames, reproducing kernels, regularization and learning," Journal of Machine Learning Research, vol. 6, pp. 1485–1515, 2005.

[79] V. Fock, "Konfigurationsraum und zweite Quantelung," Zeitschrift für Physik, vol. 75, pp. 622–647, 1932.

[80] D. Drouet Mari and S. Kotz, Correlation and Dependence, Imperial College Press, London, 2001.

[81] J. Dauxois and G. M. Nkiet, "Nonlinear canonical analysis and independence tests," The Annals of Statistics, vol. 26, no. 4, pp. 1254–1278, 1998.

[82] C. B. Bell, "Mutual information and maximal correlation as measures of dependence," The Annals of Mathematical Statistics, vol. 33, no. 2, pp. 587–595, 1962.

[83] T. Cover and J. Thomas, Elements of Information Theory, Wiley, New York, 1991.

[84] A. C. Micheas and K. Zografos, "Measuring stochastic dependence using φ-divergence," Journal of Multivariate Analysis, vol. 97, pp. 765–784, 2006.

[85] S. D. Silvey, "On a measure of association," The Annals of Mathematical Statistics, vol. 35, no. 3, pp. 1157–1166, 1964.

[86] H. Joe, "Relative entropy measures of multivariate dependence," Journal of the American Statistical Association, vol. 84, no. 405, pp. 157–164, 1989.

[87] C. W. Granger, E. Maasoumi, and J. Racine, "A dependence metric for possibly nonlinear processes," Journal of Time Series Analysis, vol. 25, no. 5, pp. 649–669, 2004.

[88] P. L. Lai and C. Fyfe, "Kernel and nonlinear canonical correlation analysis," International Journal of Neural Systems, vol. 10, no. 5, pp. 365–377, 2000.

[89] A. Gretton, R. Herbrich, A. Smola, O. Bousquet, and B. Scholkopf, "Kernel methods for measuring independence," Journal of Machine Learning Research, vol. 6, pp. 2075–2129, 2005.

[90] H. Suetani, Y. Iba, and K. Aihara, "Detecting generalized synchronization between chaotic signals: A kernel-based approach," Journal of Physics A: Mathematical and General, vol. 39, pp. 10723–10742, 2006.


[91] A. Renyi, "On measures of dependence," Acta Mathematica Academiae Scientiarum Hungaricae, vol. 10, pp. 441–451, 1959.

[92] K. I. Diamantaras and S. Y. Kung, Principal Component Neural Networks: Theory and Applications, John Wiley & Sons, New York, 1996.

[93] I. T. Jolliffe, Principal Component Analysis, Springer, New York, 2nd edition, 2002.

[94] T. Hastie and W. Stuetzle, "Principal curves," Journal of the American Statistical Association, vol. 84, no. 406, pp. 502–516, June 1989.

[95] M. A. Kramer, "Nonlinear principal component analysis using autoassociative neural networks," AIChE Journal, vol. 37, pp. 233–243, 1991.

[96] W. J. Hess, Pitch Determination of Speech Signals, Springer, New York, 1993.

[97] M. J. Ross, H. L. Shaffer, A. Cohen, R. Freudberg, and H. J. Manley, "Average magnitude difference function pitch extractor," IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. ASSP-22, no. 5, pp. 353–362, October 1974.

[98] A. de Cheveigne, "Cancellation model of pitch perception," The Journal of the Acoustical Society of America, vol. 103, no. 3, pp. 1261–1271, March 1998.

[99] T. Shimamura and H. Kobayashi, "Weighted autocorrelation for pitch extraction of noisy speech," IEEE Transactions on Speech and Audio Processing, vol. 9, no. 7, pp. 727–730, October 2001.

[100] A. de Cheveigne, "Pitch and the narrowed autocoincidence histogram," in Proc. Intl. Conf. of Music Perception and Cognition, Kyoto, Japan, 1989, pp. 67–70.

[101] G. J. Brown and D. Wang, "Modelling the perceptual segregation of double vowels with a network of neural oscillators," Neural Networks, vol. 10, no. 9, pp. 1547–1558, 1997.

[102] L. Cohen, Time-Frequency Analysis, Prentice Hall, New Jersey, 1995.

[103] J. C. Brown and M. S. Puckette, "Calculation of a 'narrowed' autocorrelation function," The Journal of the Acoustical Society of America, vol. 85, pp. 1595–1601, 1989.

[104] H. Duifhuis, L. Willems, and R. Sluyter, "Measurement of pitch in speech: An implementation of Goldstein's theory of pitch perception," The Journal of the Acoustical Society of America, vol. 71, pp. 1568–1580, 1982.

[105] M. R. Schroeder, "Period histogram and product spectrum: New methods for fundamental frequency measurement," The Journal of the Acoustical Society of America, vol. 43, pp. 829–834, 1968.
[106] D. J. Hermes, "Measurement of pitch by subharmonic summation," The Journal of the Acoustical Society of America, vol. 83, no. 1, pp. 257–264, 1988.


[107] X. Sun, "A pitch determination algorithm based on subharmonic-to-harmonic ratio," in Proc. 6th Intl. Conf. of Spoken Language Processing, Beijing, China, 2000, vol. 4, pp. 676–679.

[108] A. de Cheveigne, "Pitch perception models," in Pitch: Neural Coding and Perception, C. Plack, A. Oxenham, R. Fay, and A. Popper, Eds., Springer-Verlag, New York, 2005.

[109] J. C. R. Licklider, "A duplex theory of pitch perception," Experientia, vol. 7, pp. 128–134, 1951.

[110] R. Lyon, "Computational models of neural auditory processing," in Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), San Diego, USA, 1984, pp. 41–44.

[111] M. Slaney and R. F. Lyon, "On the importance of time - a temporal representation of sound," in Visual Representations of Speech Signals, M. Cooke, S. Beet, and M. Crawford, Eds., pp. 95–116, John Wiley & Sons, 1993.

[112] D. Wang and G. J. Brown, Eds., Computational Auditory Scene Analysis: Principles, Algorithms, and Applications, John Wiley & Sons, New Jersey, 2006.

[113] M. Wu, D. Wang, and G. J. Brown, "A multipitch tracking algorithm for noisy speech," IEEE Transactions on Speech and Audio Processing, vol. 11, no. 3, pp. 229–241, 2003.

[114] R. Meddis and M. Hewitt, "Modeling the identification of concurrent vowels with different fundamental frequencies," The Journal of the Acoustical Society of America, vol. 91, pp. 233–245, 1992.

[115] A. de Cheveigne, "Multiple F0 estimation," in Computational Auditory Scene Analysis: Principles, Algorithms, and Applications, D. Wang and G. J. Brown, Eds., pp. 45–79, John Wiley & Sons, New Jersey, 2006.

[116] A. Moreno and J. Fonollosa, "Pitch determination of noisy speech using higher order statistics," in Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), San Francisco, USA, March 1992, pp. 133–136.

[117] R. D. Patterson, J. Holdsworth, I. Nimmo-Smith, and P. Rice, "SVOS final report, part B: Implementing a gammatone filter bank," Applied Psychology Unit Report 2341, 1988.

[118] B. R. Glasberg and B. C. Moore, "Derivation of auditory filter shapes from notched-noise data," Hearing Research, no. 47, pp. 103–138, 1990.
[119] M. Slaney and R. F. Lyon, "A perceptual pitch detector," in Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Albuquerque, USA, 1990, pp. 357–360.


[120] B. W. Silverman, Density Estimation for Statistics and Data Analysis, Chapman and Hall, New York, 1986.

[121] J.-W. Xu, H. Bakardjian, A. Cichocki, and J. C. Principe, "A new nonlinear similarity measure for multichannel biological signals," in Proceedings of the International Joint Conference on Neural Networks (IJCNN), Orlando, FL, USA, 2007.

[122] R. Meddis and M. Hewitt, "Virtual pitch and phase sensitivity of a computer model of the auditory periphery: I. Pitch identification," The Journal of the Acoustical Society of America, vol. 89, pp. 2866–2882, 1991.

[123] S. Han, S. Rao, and J. C. Principe, "Estimating the information potential with the fast Gauss transform," in Proceedings of the International Conference on Independent Component Analysis and Blind Source Separation (ICA), Charleston, SC, USA, 2006, LNCS 3889, pp. 82–89.

[124] M. Slaney, Malcolm Slaney's Auditory Toolbox to implement auditory models and generate synthetic vowels. Available online at

[125] P. Bagshaw, Paul Bagshaw's database for evaluating pitch determination algorithms. Available online at

[126] P. Bagshaw, S. Hiller, and M. Jack, "Enhanced pitch tracking and the processing of F0 contours for computer aided intonation teaching," in Proc. European Conf. on Speech Comm., 1993, pp. 1003–1006.

[127] A. Camacho and J. Harris, "A pitch estimation algorithm based on the smooth harmonic average peak-to-valley envelope," in Proc. Intl. Symp. on Circuits and Systems, New Orleans, USA, May 2007.

[128] J. C. Shaw, "An introduction to the coherence function and its use in EEG signal analysis," Journal of Medical Engineering & Technology, vol. 5, no. 6, pp. 279–288, 1981.

[129] E. Pereda, R. Quian Quiroga, and J. Bhattacharya, "Nonlinear multivariate analysis of neurophysiological signals," Progress in Neurobiology, vol. 77, pp. 1–37, 2005.

[130] B. Pompe, "Measuring statistical dependencies in a time series," Journal of Statistical Physics, vol. 73, pp. 587–610, 1993.

[131] M. G. Rosenblum, A. S. Pikovsky, and J. Kurths, "Phase synchronization of chaotic oscillators," Physical Review Letters, vol. 76, no. 11, pp. 1804–1807, 1996.
[132] J. Arnhold, P. Grassberger, K. Lehnertz, and C. E. Elger, "A robust method for detecting interdependencies: Application to intracranially recorded EEG," Physica D, vol. 134, pp. 419–430, 1999.


[133] R. Quian Quiroga, J. Arnhold, and P. Grassberger, "Learning driver-response relationships from synchronization patterns," Physical Review E, vol. 61, no. 5, pp. 5142–5148, May 2000.

[134] C. J. Stam and B. W. van Dijk, "Synchronization likelihood: An unbiased measure of generalized synchronization in multivariate data sets," Physica D, vol. 163, pp. 236–251, 2002.

[135] L. M. Pecora and T. L. Carroll, "Synchronization in chaotic systems," Physical Review Letters, vol. 64, no. 8, pp. 821–825, 1990.

[136] A. Schmitz, "Measuring statistical dependence and coupling of subsystems," Physical Review E, vol. 62, pp. 7508–7511, 2000.

[137] J. Bhattacharya, E. Pereda, and H. Petsche, "Effective detection of coupling in short and noisy bivariate data," IEEE Transactions on Systems, Man and Cybernetics B, no. 1, pp. 85–95, February 2003.

[138] S. J. Schiff, P. So, T. Chang, R. E. Burke, and T. Sauer, "Detecting dynamical interdependence and generalized synchrony through mutual prediction in a neural ensemble," Physical Review E, vol. 54, pp. 6708–6724, 1996.

[139] H. Kantz and T. Schreiber, Nonlinear Time Series Analysis, Cambridge University Press, Cambridge, UK, 1997.

[140] D. Prichard and J. Theiler, "Generating surrogate data for time series with several simultaneously measured variables," Physical Review Letters, vol. 73, no. 7, pp. 951–954, 1994.

[141] T. Schreiber and A. Schmitz, "Surrogate time series," Physica D, vol. 142, pp. 346–382, 2000.

[142] K. Kotani, Y. Kinomoto, M. Yamada, J. Deguchi, M. Tonoike, K. Horii, S. Miyatake, T. Kuroiwa, and T. Noguchi, "Spatiotemporal patterns of movement-related fields in stroke patients," Neurology & Clinical Neurophysiology, vol. 63, pp. 1–4, 2004.

[143] F. Di Russo, A. Martinez, M. I. Sereno, S. Pitzalis, and S. A. Hillyard, "Cortical sources of the early components of the visual evoked potential," Human Brain Mapping, vol. 15, pp. 95–111, 2001.


Jianwu Xu was born in Wenzhou, China, on December 7, 1979. He received his Bachelor of Engineering in electrical engineering from Zhejiang University, Hangzhou, China, in June 2002. Since October 2002, he has been working towards his Ph.D. in the Electrical and Computer Engineering Department at the University of Florida under the supervision of Dr. Jose Principe, with support from an Alumni Graduate Fellowship from the University of Florida and NSF grant ECS-0601271. During the summer of 2006, he visited the RIKEN Brain Science Institute in Tokyo, Japan, and worked with Dr. Andrzej Cichocki on EEG synchronization in the Laboratory for Advanced Brain Signal Processing. His current research interests broadly include information theoretic learning, adaptive signal processing, control, and machine learning. He is a member of IEEE, Tau Beta Pi, and Eta Kappa Nu.