
Time Series Analysis with Information Theoretic Learning and Kernel Methods

Permanent Link: http://ufdc.ufl.edu/UFE0021482/00001

Material Information

Title: Time Series Analysis with Information Theoretic Learning and Kernel Methods
Physical Description: 1 online resource (119 p.)
Language: english
Creator: Pokharel, Puskal P
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2007

Subjects

Subjects / Keywords: Electrical and Computer Engineering -- Dissertations, Academic -- UF
Genre: Electrical and Computer Engineering thesis, Ph.D.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, territorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

Notes

Abstract: The major goal of our research is to develop simple and effective nonlinear versions of some basic time series tools for signal detection, optimal filtering, and on-line adaptive filtering. These extensions are based on concepts from information theoretic learning (ITL) and kernel methods. In general, all ITL algorithms can be interpreted in terms of kernel methods, because ITL extracts higher-order information (beyond the second-order statistics given by the autocorrelation function) directly from the data samples through nonparametric density estimation with translation-invariant kernel functions. ITL still lacks tools that exploit the time structure of the data, because it usually requires the assumption of independently distributed samples. Kernel methods provide an elegant means of obtaining nonlinear versions of linear algorithms expressed in terms of inner products, using the so-called kernel trick and Mercer's theorem. This has given rise to a variety of machine learning algorithms, but most are computationally expensive because they operate on a large Gram matrix whose dimension equals the number of data points. Since these large matrices are usually ill-conditioned, they also require an additional regularization step in methods such as kernel regression. Our goal is to design basic signal analysis tools for time signals that extract higher-order information directly from the data, as ITL does, while avoiding the complexities of many kernel methods. We present new methods for time series analysis (matched filtering and optimal adaptive filtering) based on correntropy, a recently introduced ITL concept, together with kernel methods. Correntropy induces an RKHS (reproducing kernel Hilbert space) that has the same dimensionality as the input space but is nonlinearly related to it, and it differs from conventional kernel methods in both scope and detail. This allows us to derive elegant nonlinear versions of tools that form the basic building blocks of signal processing: the matched filter (correlation receiver), the Wiener filter, and the least mean square (LMS) filter.
General Note: In the series University of Florida Digital Collections.
General Note: Includes vita.
Bibliography: Includes bibliographical references.
Source of Description: Description based on online resource; title from PDF title page.
Source of Description: This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Statement of Responsibility: by Puskal P Pokharel.
Thesis: Thesis (Ph.D.)--University of Florida, 2007.
Local: Adviser: Principe, Jose C.

Record Information

Source Institution: UFRGP
Rights Management: Applicable rights reserved.
Classification: lcc - LD1780 2007
System ID: UFE0021482:00001
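
The central quantity in the abstract, correntropy, admits a simple sample estimator: for two random variables X and Y, V_sigma(X, Y) = E[k_sigma(X - Y)], estimated by averaging a translation-invariant kernel over paired samples. The Python sketch below is illustrative only and is not code from the dissertation; the function names and the choice of a Gaussian kernel with bandwidth sigma are assumptions made for this example.

    import numpy as np

    def gaussian_kernel(t, sigma):
        # Translation-invariant Gaussian kernel, the usual choice in ITL.
        return np.exp(-t ** 2 / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)

    def correntropy(x, y, sigma=1.0):
        # Sample estimate of V_sigma(X, Y) = E[k_sigma(X - Y)].
        # Expanding the kernel shows V depends on all even-order moments
        # of X - Y, not just the second-order moment used by correlation.
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        return gaussian_kernel(x - y, sigma).mean()

    def autocorrentropy(x, lag, sigma=1.0):
        # Correntropy between a series and a lagged copy of itself,
        # the nonlinear analogue of the autocorrelation function.
        x = np.asarray(x, dtype=float)
        if lag == 0:
            return correntropy(x, x, sigma)
        return correntropy(x[lag:], x[:-lag], sigma)

Each evaluation averages N kernel values, so it runs in O(N) time and never forms the N-by-N Gram matrix that the abstract identifies as the computational bottleneck (and ill-conditioning risk) of many kernel methods.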


This item has the following downloads:


Full Text
35353 F20101206_AAAXDQ pokharel_p_Page_053.jpg
bca31587838be42a60b7887c1ebc5ff1
ffb89bd508ceab36b9f76b5c45ab58a42a011c99
17723 F20101206_AAAYGS pokharel_p_Page_073.QC.jpg
b3cf052d035ef7c2e63eca30e5a726c2
d5bb53239f439ac41262a78f7a06e8e40beae958
65886 F20101206_AAAXEE pokharel_p_Page_048.jpg
de94e67cc21da20ee13aea1b4e878e4b
d184ce68b2e404f49ac61d06926f00a1a6f71602
27516 F20101206_AAAYHG pokharel_p_Page_082.QC.jpg
b84dd43ce4c6d84c6a8eeca9b2599e3b
c7f82a6a6b796769cc7d501ad5c5039655052028
1430 F20101206_AAAXDR pokharel_p_Page_027.txt
167c37af0f930d5d6eaec27a6fd24b04
b1c44746b1100bee6bb3d3bd2bc6f4cc7e867b15
4634 F20101206_AAAYGT pokharel_p_Page_073thm.jpg
1e0d45d6f7e9ea9a8ab544c99b48f704
d1181add0aff0e1cbf717ea5bcdc599b2c131482
23160 F20101206_AAAXEF pokharel_p_Page_040.QC.jpg
d53d435d33b3fbb2878539bf97555276
39be6b86e7318c746be42b874237f393005e26ff
7060 F20101206_AAAYHH pokharel_p_Page_082thm.jpg
f5dfa256c84ca2e2e8d1c477a4499a1b
d2e6b67ddb4770f1b96da2474d23bd3fa52961f5
61952 F20101206_AAAXDS pokharel_p_Page_095.jpg
48c72041508dd734498b8aacf29a0207
05c232e439140f89be59fe6b72804b50f9b4b3d6
26807 F20101206_AAAYGU pokharel_p_Page_074.QC.jpg
bdf8977979be07aba252d6c776ee4b9c
2283098b1adaf5c11005dcdab8ab551a0b88636c
1051956 F20101206_AAAXEG pokharel_p_Page_012.jp2
4160584f855b8cd98b3b245980ac8665
756243ff041675b2c393c12ad1cf511cae00f966
27943 F20101206_AAAYHI pokharel_p_Page_083.QC.jpg
01475185ab23d1142af0af9c24e6d86f
d0784ddd04aadc4f53171caeb965e8c3f41373d5
F20101206_AAAYHJ pokharel_p_Page_083thm.jpg
6882949caaa1dea6937ac28f8a868c33
dca661cc2fb6195a84d2eaf0b730850de26ab8b3
36236 F20101206_AAAXDT pokharel_p_Page_088.pro
52dd24b56161ee1941a874bdc1e111a1
1fb3343ab3549dc57c8d39451ec22a3170ad34cf
5403 F20101206_AAAYGV pokharel_p_Page_075thm.jpg
6976781d6b173b93ff5b89a55a01562c
2c7011b01d7024c13e4c0453743341a535ce28fb
130863 F20101206_AAAXEH pokharel_p_Page_116.jp2
15219a447e32aecb220322851507e7b0
3013ca3b2778d9d0a5b123de6274a27d2e87b828
15017 F20101206_AAAYHK pokharel_p_Page_085.QC.jpg
4f0d1d2961dcd47aa6733559a3aed5e4
8bbc1a50d25f5de04ceeaa5e66238ece93aff7fb
6458 F20101206_AAAXDU pokharel_p_Page_092thm.jpg
b34fb0f58e7536b0dcaaab35bb886e8b
bd78be94ed33a5dedac705a9db8807421b5b00d6
20570 F20101206_AAAYGW pokharel_p_Page_076.QC.jpg
153302b6ac849c9a5660f669db9b56ad
525882078f45664f070fc0a1a933fe49f4d8d6f6
47125 F20101206_AAAXEI pokharel_p_Page_032.pro
a5249594c2eb2727346e19068c429f08
aa70865e56af74ad1abd40783b80f79870e0b993
4645 F20101206_AAAYHL pokharel_p_Page_085thm.jpg
490e91e68f34bda96e1fd6895d4b86bf
ad8e41a3da346facf9769b7f7cf7c61eaa68a3be
55651 F20101206_AAAXDV pokharel_p_Page_005.pro
66c9fcd3b0bbb62587c9eb280569e233
f0fe61334944f952278e0e741000e050a4e6455a
17815 F20101206_AAAYGX pokharel_p_Page_077.QC.jpg
dfb8142e0ee259749022d7715d0c4329
4801b3a9784406c429d76ca44b382606095f5376
F20101206_AAAXEJ pokharel_p_Page_056.tif
0d19164dcd7205356215ea90b0b6b5a1
8af61edc6e229f8d69d05570c8aa06c22dcdfa13
6200 F20101206_AAAYIA pokharel_p_Page_096thm.jpg
2fefbdf5aeb45435ca353cdf1aff1d44
5d9489597581c325c468158e21e7acc0b775111c
20726 F20101206_AAAYHM pokharel_p_Page_086.QC.jpg
6f901300a7a12d34a1c387fc11a702b9
eba5f9914692f0b95e4471052e6d0f0b88e1095a
54616 F20101206_AAAXDW pokharel_p_Page_044.pro
044b788e3b8eabe448a1aebaab7f2e16
a341d3cdc6ecd60ae21117f29b8201698bd3fb09
4919 F20101206_AAAYGY pokharel_p_Page_077thm.jpg
f93944153453e513f3e44f31e03587a8
61c5827748c31881391cfd1d6dd9b9492283ecc8
803584 F20101206_AAAXEK pokharel_p_Page_064.jp2
dbac6d62a838f3a1f204833b9294b92b
5f8f7c72ee668599c992506fe1b775496c73b69e
3123 F20101206_AAAYIB pokharel_p_Page_097thm.jpg
9d88b704e49c6d8e5a5eb338a6c3bbfd
b5652d84f8113928dfb6e34c5ebef752b994d793
5612 F20101206_AAAYHN pokharel_p_Page_087thm.jpg
4ab5006d98a9b934b73fd88679b83b0d
2fc14b787ae06d1934ed07e0bf1b79a8e9188514
F20101206_AAAXEL pokharel_p_Page_061.tif
9858a9702c9de6000049fabc0a962b9b
3c09f051198ecd9a93be046ca1d5e0ee20b0ec24
26153 F20101206_AAAYIC pokharel_p_Page_098.QC.jpg
03d16e6bd88a59efbe99f3e3d507a9ed
a0e487bcb34845bb3eea9e96bed8759af363b11a
16991 F20101206_AAAYHO pokharel_p_Page_088.QC.jpg
a2a95cb0a679ca555a57991d53460dee
0f5ceabef6ac7f5fdb8ae546160152de53790249
37103 F20101206_AAAXDX pokharel_p_Page_067.pro
24ee1c331818416727f8a2dbdebbd1cd
257eb1cc2a4557fb2d5157117fe306ec526da695
5087 F20101206_AAAYGZ pokharel_p_Page_078thm.jpg
0d44d7ebc0f39d84740fdad1cb62ffe2
d86093d7cdf5cf0ed276e3954243f4b8e656cc68
76718 F20101206_AAAXFA pokharel_p_Page_068.jpg
1506fb0f5c2fd1c0480621a220397d0f
5a2cfefa368f30d7ffa18feff0480247b3f05f12
55072 F20101206_AAAXEM pokharel_p_Page_049.jpg
57fea42d4063572ce39d49ca6e4b0b33
b44c974a18673d2515824124e820e56a6ae08e24
6505 F20101206_AAAYID pokharel_p_Page_098thm.jpg
abb60bd6d18c11d38f4d122458e879e0
67049d72463919aef941d1f754cc59db5d806368
19746 F20101206_AAAYHP pokharel_p_Page_089.QC.jpg
66d738b8f87a30b78d312887db874aec
441bded8ee91637f3bdf1436849f1724a894e443
5503 F20101206_AAAXDY pokharel_p_Page_099thm.jpg
1f0197e03cdc05045a838176e2d6a7cb
b681f7a971862d82f0216201add089a78d215b09
57608 F20101206_AAAXFB pokharel_p_Page_065.jpg
238982fa511534a409e4a3b7ca24dbef
7d6985f38b059383388bc42b9c1365f7c0087707
230 F20101206_AAAXEN pokharel_p_Page_007.txt
8672dbd2285ed4a3d30e66851bc8aae7
85dd25e4dc92becf90e1be2a853240c7b1f8db87
20716 F20101206_AAAYIE pokharel_p_Page_099.QC.jpg
d9e7d971161bafb6d3ed20daa239858d
8f693e6367fe3836d067cb218eb18345580f5d87
5683 F20101206_AAAYHQ pokharel_p_Page_089thm.jpg
c47be22bab1a07d922b7e5034cc442f3
8c604b93e6fb1444c571d4a72bc3dc39ca86c730
57975 F20101206_AAAXDZ pokharel_p_Page_026.jpg
17c84903b6439ac2baedff5d959d8ffa
78f732fc8b35c9ce53884dc8d150fc7f53cb9ee8
423 F20101206_AAAXFC pokharel_p_Page_104.txt
78f05b91f9a7993d66771a20c0235dc8
951f2309df2aaa3883f08bea3a2416441701173f
23330 F20101206_AAAXEO pokharel_p_Page_019.QC.jpg
115d160da0532d38761e6ac78f014f99
f7effa15d9086fc2c2f85d16ec22f4ead1ad8090
19677 F20101206_AAAYIF pokharel_p_Page_100.QC.jpg
0080d15935f7a61a91213094f71b16ee
1ed149b29eb53568ed8e7b8ba62fa201fc212a9f
18073 F20101206_AAAYHR pokharel_p_Page_090.QC.jpg
9be05b230884b1bf88bd859beb8cd05c
90258d3f316d52932cb5363a3e1ace0859b5e73c
2443 F20101206_AAAXFD pokharel_p_Page_034.txt
ad36369c007efd44b4b2256ec3b01bfe
0adf0b51a5f1505211c5d122f3434ef7db291dd9
5504 F20101206_AAAXEP pokharel_p_Page_054.pro
a692a04d0c2ab3a28e0754d6e7656c9c
b7718a8445ee5daf3dc1c509f5a52916faa5088b
5540 F20101206_AAAYIG pokharel_p_Page_100thm.jpg
55a65efe030916284ebeb9d119af85e3
8fa23663d5291cf61b036d6e992bb14762140b04
5171 F20101206_AAAYHS pokharel_p_Page_090thm.jpg
4179a08690eedb38f6f2d794186d5498
1c16fcdc394604f13d0ce4c2d26b6b9f611256ac
93614 F20101206_AAAXFE pokharel_p_Page_034.jpg
c6870a5cc23bd20b7e6ba482a6dbc96d
fe5f4f67726e2a37428c24c974e58bbb7018f7f7
27782 F20101206_AAAXEQ pokharel_p_Page_061.pro
2af4930132d61343c11c5dba3d3fa302
7da0890fbf4f259af25484b9115f31a273ca8ca8
27786 F20101206_AAAYIH pokharel_p_Page_101.QC.jpg
4bf11e1e5ab301da8ecc34d61f0a9dd4
6eb660f5b917c1a3c2a01012afdaa4ccfcab2e8e
23970 F20101206_AAAYHT pokharel_p_Page_093.QC.jpg
b0d380ce6839f0ef1e2a489033d3c0e3
d75c1609975ecf2d85e32ed0043067eacfc91b09
767130 F20101206_AAAXFF pokharel_p_Page_026.jp2
ca45342c9e2601d06a5470fb475f8fa0
05b473b86f211bef941be4a02f4ea8d8e812d574
115662 F20101206_AAAXER pokharel_p_Page_110.jp2
c0c964f87510b09b9f144b4878d82696
82e8a7242556bb412070b432641c69091beb3bdc
6700 F20101206_AAAYII pokharel_p_Page_101thm.jpg
10c2cd1412d9f9135bd8e5394657043f
67a2dfc26ed7e267f70ade95bff3ce2f763f2e5a
6242 F20101206_AAAYHU pokharel_p_Page_093thm.jpg
9298168b3e90b0ba635ebf353ae34fba
f99be7bf141e83f66898d662ef1cf4f14bc9c438
F20101206_AAAXFG pokharel_p_Page_077.tif
033856a2e0a3f6a12d9e4788f2580c87
72cb5179e5bf9fad9f4b289a19effa27e6d4480a
46513 F20101206_AAAXES pokharel_p_Page_058.pro
446c07af166dfeefa1be0c827acaefdb
746081fb812b1e337f3b42e8f5a04ba11a232e0d
18835 F20101206_AAAYIJ pokharel_p_Page_102.QC.jpg
eeaea997789ed60a8264a4a931906add
89219bea06b8eaac2b2d5b112184850a16d3ea4f
17384 F20101206_AAAYHV pokharel_p_Page_094.QC.jpg
abbb6ceef8471bd1f72b9e50254b9e62
dff5be93d592a5550ab4a733aab64746e24a3ec8
1060215 F20101206_AAAXFH pokharel_p.pdf
7b033449de0c0cee67a9d957032b75e3
a7d649b8ebccab2b2cfe7c1b784928a8f26ea6aa
740368 F20101206_AAAXET pokharel_p_Page_027.jp2
338bcf17305e47d9e9ddb1b9fc209c2f
bd17c870e62e012b7910f97408ace1194d67c7da
5357 F20101206_AAAYIK pokharel_p_Page_102thm.jpg
ee39079635e6f6ce4fcd67bf42046fb6
2f43aee3e278ee6fc13ce460d89965d4ef47055c
4662 F20101206_AAAYHW pokharel_p_Page_094thm.jpg
9e043e9620b9b3876550b79ec4173fce
39f66ef5dcce17c46ebb8f7e1541abe07d66abf2
2215 F20101206_AAAXFI pokharel_p_Page_001thm.jpg
423825ee0a5c525b1c8347dede11cf38
5fb77d930e641fb3d3aa4779e527b8e20a870256
50974 F20101206_AAAXEU pokharel_p_Page_010.pro
9ed049368fe5c88473c43db74947353f
e425bb60661b507785d53c5e88a5c8498c6573fa
8331 F20101206_AAAYIL pokharel_p_Page_104.QC.jpg
dabf0892ee8be17caa37e84bd4783b3b
1f9f27e8c9e9825fa459036426d2a9358ea8459c
19499 F20101206_AAAYHX pokharel_p_Page_095.QC.jpg
80de8a74b45e510c8d67844042f65557
d13557819f0e0798d3c60c2d9d38aeeadee3bc3d
528461 F20101206_AAAXFJ pokharel_p_Page_084.jp2
6dd06f354e65856c58ab8d02108bd816
3efb6a29db8d86162a1e4d5bbbef2825286554e9
52852 F20101206_AAAXEV pokharel_p_Page_029.jpg
475a58c51384615bd8cbe265312f2cef
3ae21d0ddbeb3291b657720ac44cdf5acd30fd15
6235 F20101206_AAAYJA pokharel_p_Page_114thm.jpg
92f7d4186095492957c89aa51774868f
b0af5bb0866286117575adfa194873223e328a6c
2936 F20101206_AAAYIM pokharel_p_Page_104thm.jpg
9839c3dbf00b661078f6155bba6e9f4c
552d00b63e8db39389a377dc4f2137cfdb926106
5362 F20101206_AAAYHY pokharel_p_Page_095thm.jpg
0808374dd834b6edc5a948441a2c79ed
2772fb8c36fed832159049e0b6a6f0179f0b4567
647 F20101206_AAAXFK pokharel_p_Page_011.txt
ee00d0a40eb455678206775887a0cd2e
f41222d14bfe085d0754e94385d85b2f281eeb91
5325 F20101206_AAAXEW pokharel_p_Page_005thm.jpg
7910f896c3ecb79e0eb05d0f5c37289d
23776d4985bd632bcb1c8a00f6d47a5d67d8a01e
23385 F20101206_AAAYJB pokharel_p_Page_115.QC.jpg
5fe454b7f2d656ccff9896658a853939
64eae3a75efab3c5d58d64df71c89ffa6b8691c7
5822 F20101206_AAAYIN pokharel_p_Page_105thm.jpg
515d2efbad2d348c77c6e9dc62cf803d
fa21e5e09101b3b44d24ae76633386ce66a39395
25163 F20101206_AAAYHZ pokharel_p_Page_096.QC.jpg
9e69366549f54cd7efb4aa525c22778c
f76a1802240637cb735e44f77ac829f98662a62a
842742 F20101206_AAAXFL pokharel_p_Page_095.jp2
b7b4095e6b94aab009df4e292dc53b33
dc0dfd224527414784ba578c98a272b78458c871
66999 F20101206_AAAXEX pokharel_p_Page_004.jp2
3c975d9ad6bcabc41860f6df74c327f4
cc744eb6785affedda6a302a750149a40ed43790
24525 F20101206_AAAYJC pokharel_p_Page_116.QC.jpg
e04077d7fc73172f661bb811220117b6
8fdff3cd1cc76a0244cb56fc634da11c9b286e98
24504 F20101206_AAAYIO pokharel_p_Page_106.QC.jpg
7b5e3bff67ba18f511bbe86109649f3b
7c39da8657c50d0dca36bea61f33ac76935a2bdb
30033 F20101206_AAAXGA pokharel_p_Page_024.pro
d0fc8994c4588ffbe035c73741e71486
e669950c88da2c3007f317dc9c77dc66f6ac1794
5794 F20101206_AAAXFM pokharel_p_Page_038thm.jpg
4ee3195f333d3a0ea732554314bd6011
c40143debbcbc2a97f90b79ad6e96322eb21cf95
6417 F20101206_AAAYJD pokharel_p_Page_116thm.jpg
65d071bb88ac049c12fa31d925d47a52
00bfb48807bba5092c2b2e4c3b5f0597ee930009
6211 F20101206_AAAYIP pokharel_p_Page_106thm.jpg
26f5cb37398eb9dd79d94c62342294a9
565668ae95b24325b0322090857371d4ae2101a6
59328 F20101206_AAAXGB pokharel_p_Page_015.pro
1ba2936373004464e5e2f553ba5ad7da
7e5f7610f704369c8fdcacf7a4fa5435cef4b96f
30867 F20101206_AAAXFN pokharel_p_Page_081.pro
ed685d94cccdce469ba73363788a34ae
db615f155c23f430c48d1e8a18b191f7fc9055e9
19297 F20101206_AAAXEY pokharel_p_Page_067.QC.jpg
784a301f482e60904b775cabcaa93af7
f2cf27514ecc9ae0fa1327efa8842f4d80ce1bfc
24529 F20101206_AAAYJE pokharel_p_Page_117.QC.jpg
58a8eee65f2edfb679a1608104e5cf77
02d37b4753939cbe7759a623bca6fbd0b87f645d
26212 F20101206_AAAYIQ pokharel_p_Page_107.QC.jpg
d204f10d5480011f8b19d1fae302c1eb
775773b86d2961fc9a0d850c3831bcf464407758
4432 F20101206_AAAXGC pokharel_p_Page_084thm.jpg
a2ea93a65a5bffc29303fcf19217aea4
e5b38924173fef70be47b20a9b81de1af214ed43
9559 F20101206_AAAXFO pokharel_p_Page_053.pro
f8b36d3f9fa0a932c8b69b30934a8654
67df470913d34900c32f6709c69914832e5f4f3f
2469 F20101206_AAAXEZ pokharel_p_Page_112.txt
ee47b5761c2675ab49a47d185f907d8b
1fcbc13baf6d277d621939a6e2fcf4a4e48048f3
5007 F20101206_AAAYJF pokharel_p_Page_118.QC.jpg
12598fa29be94eb030f040977e46ff8e
31d32fc751b14119bbd7ecc5ea265c2292e7f6e5
6310 F20101206_AAAYIR pokharel_p_Page_107thm.jpg
a2f0a7c2b74ed3457b604c6bbeb1cb19
7868e1dbad1fe68edad91b4aed6cf45cf2f98e08
6359 F20101206_AAAXGD pokharel_p_Page_108thm.jpg
c357bff4dafca638d497bae45af691b3
756b399d71ec3462e7a3d14dd2b533f1596f4b4c
6835 F20101206_AAAXFP pokharel_p_Page_020thm.jpg
1f196c82602877b6ccb8c31a08821d90
8e6a29b891a1f4792935ebd929d454ba59cdbabe
1720 F20101206_AAAYJG pokharel_p_Page_118thm.jpg
6ce4d9820ebe4d1642b65805fdce654f
89582edb2f22b5a6b19aadce0c681d8a9f64b0ec
25627 F20101206_AAAYIS pokharel_p_Page_108.QC.jpg
c07731132cf10ecf48d7b870e2812ac5
f2656112bb1b83e87b3b5bebb103180caecffb41
56737 F20101206_AAAXGE pokharel_p_Page_078.jpg
f9c28857f368c24a511e699ba62486a8
f5b82603e41042362f2958ed2f600d6164fd6c4c
78334 F20101206_AAAXFQ pokharel_p_Page_096.jpg
639e9fb3c58b122cf568069c76c383ce
458775b665d8b360ed62687d3353b7c0131208d1
10316 F20101206_AAAYJH pokharel_p_Page_119.QC.jpg
3632673d7cd75a3bfe2fd48cdda38edd
e89347f962ba30620f5bcd3655470d673a5bf280
23697 F20101206_AAAYIT pokharel_p_Page_109.QC.jpg
42fb6518affe40e8918669341a4454eb
26bdcc0f8fb958e4f8a408468c5e7eb752225b01
7049 F20101206_AAAXGF pokharel_p_Page_016thm.jpg
648db5fc75947bf565f05e329b9b992b
b2ede383a630b096baa7715b8e71de3a7a9157b4
15299 F20101206_AAAXFR pokharel_p_Page_118.jp2
2712877b385fa499fa16cac10486139f
22e42b5f7b460711a70632e776f15e34e4a02839
2965 F20101206_AAAYJI pokharel_p_Page_119thm.jpg
f73644bba0491757bf9b820e920ef8ad
16931ef81a6a4084f607e8eae7a19b046b14b229
6032 F20101206_AAAYIU pokharel_p_Page_109thm.jpg
45df8f8e4f459634dd644d621dcbe5bc
5efdec43964350ded0e59113ff769142ed0095cf
21434 F20101206_AAAXGG pokharel_p_Page_038.QC.jpg
dfebe429c3aba12b8eef8832fc4b089b
78fde669a858c971512ba97f2a5bfb07c335171b
133986 F20101206_AAAXFS pokharel_p_Page_117.jp2
9644f2bed125b543df319edec1ab8613
0f45469642e89b8c53d0092fc6f46d6aefc3fa8b
23955 F20101206_AAAYIV pokharel_p_Page_110.QC.jpg
35c506bebdc319cdedf1ada0fd91ca1d
64152ee4fe18b3f49aa0836a29a8c5b7b53c9b6e
1984 F20101206_AAAXGH pokharel_p_Page_021.txt
6c39910db9dc45c09dd49a64e471cd1f
d54fad167aabbd6a1270ec80675909c1cb36909a
2349 F20101206_AAAXFT pokharel_p_Page_083.txt
eccc13b4e16e44a2410f8f7fb82a10ce
9ef4507654454f6da78690bf50c1c74b52877098
10339 F20101206_AAAYIW pokharel_p_Page_111.QC.jpg
8a9ad970470906ab1267c42bad135527
7b6db245eef6fcdfda56dcf6c7b7d8f5d0327bb0
6334 F20101206_AAAXFU pokharel_p_Page_112thm.jpg
97bd128fbdc02091ceedb150168b8925
d619941ccf82c9ee60a7394fa157ce39004ea9d5
F20101206_AAAXGI pokharel_p_Page_050.tif
cfc5fdc5ca2e0c1b29f019283b343dbd
9325ab3755eb5848fb36127d9b12a5f3f8bce523
2917 F20101206_AAAYIX pokharel_p_Page_111thm.jpg
afd1310efc21051ef8b5409be568f2ed
8bb3fd7853e373bcd652f28b8199be560de57f94
6213 F20101206_AAAXFV pokharel_p_Page_110thm.jpg
a3d8fab386d6cd758c1292b181537574
cd7016778fbb4ff8126026597756c3f9dcf7a1e8
15260 F20101206_AAAXGJ pokharel_p_Page_103.QC.jpg
50007db98db9e405aac3852fdf129ab9
030f0ad711ec91bd6f00d066b9a5cd58f6341c49
24468 F20101206_AAAYIY pokharel_p_Page_113.QC.jpg
8c4a2ac9e4ccff43b87ba55e65d266b5
b6c5b9af80565ee96d22a7285d062a544386e8f7
63416 F20101206_AAAXFW pokharel_p_Page_089.jpg
88654b6e43f9e124362a6b60cc60dd4a
7c4fde18e87c24df91b1ef15d14a38cc20ba4462
824986 F20101206_AAAXGK pokharel_p_Page_024.jp2
ffa0e7f4731895e4dc3cfe0a1f2672d3
b97984de59bb23cda80780038f3f5399a1020f88
6414 F20101206_AAAYIZ pokharel_p_Page_113thm.jpg
b92493c1b2c86fae2a15cee22d7ad900
f608e4ed55dbfb622e859d516cdc1a9aecd8691e
5251 F20101206_AAAXFX pokharel_p_Page_065thm.jpg
04e704eb11b2aae518fc610e498d8549
9a950e257295542519bdd31261b1e2dab8661eec
F20101206_AAAXGL pokharel_p_Page_020.txt
22ee6d28b999a6893698a8951359fee2
35f660ad5a1abded7ca4fb28ae1c7fd78a9a90cc
1416 F20101206_AAAXFY pokharel_p_Page_102.txt
f18a1abe956dd8566dcebf49bca12a0b
b2151550a64992780ab7f2071a46bfd69c331277
23189 F20101206_AAAXHA pokharel_p_Page_054.jpg
25174a1329ccd272bc88a47112f45a28
bacda4792a926959446e8547d37b84d33c585513
F20101206_AAAXGM pokharel_p_Page_117thm.jpg
23a64d47e3ec5a9b7e51f2f5a7f697ce
a0a3f4ab1944c052e91ef4f2ab1c8ad0a8edad30
5856 F20101206_AAAXHB pokharel_p_Page_086thm.jpg
028ddcffca4e4b8ff08f0be330fbea72
0e2c91858d5d3334f71d78174d1f80b166ce2665
42253 F20101206_AAAXGN pokharel_p_Page_087.pro
3ba151b00235127a18a85c0a85f3e253
af439b95b6f0876fa115e084c5c5cf663f91bf7d
23906 F20101206_AAAXFZ pokharel_p_Page_050.pro
2eb7ab72bb6efc8d6ff0b35e6d81f491
9749bc54f1de0adb88d1439db24cfbe1fafff0ce
7137 F20101206_AAAXHC pokharel_p_Page_034thm.jpg
231918e47867beafd7046398124122f1
28e11d6a8d971b8b2fc0af1af5120a58e8fad4fd
6914 F20101206_AAAXGO pokharel_p_Page_015thm.jpg
91b5191aa96121cc9508c90811f229bc
a3134c95b793d30fe5e1e1fa4b1e0c2f4b310fb1
5450 F20101206_AAAXHD pokharel_p_Page_002.jp2
be377eb134ae2059ec60da2cfc45d55f
1e2a8ec9d8913a8377d0136ccc0f7640ee6be4b1
63429 F20101206_AAAXGP pokharel_p_Page_067.jpg
24a75b8bb0bf98f36750b5da8ab0384b
2b2b4af57a00bf978370920ec1f33c542810fcfc
21485 F20101206_AAAXHE pokharel_p_Page_005.QC.jpg
1afcf931de29c915b60e49c980425fa5
0d92c95355946cde62526612469c46def173657e
62407 F20101206_AAAXGQ pokharel_p_Page_100.jpg
ecd280e427cbbcef82118915cbfae6f0
ddc264512038c44936964bbf3a5fce7e04efb3a9
F20101206_AAAXHF pokharel_p_Page_041.tif
031253e9423dc8df4aca6b355f2df5eb
25d1e2229d71b073bf3779451079dc32f951346f
5595 F20101206_AAAXGR pokharel_p_Page_028thm.jpg
2b424df3ac36e417be3ac2cc1ced7f90
a7b4adb3018496490a1d9c29a48c5aae8e5fac38
23593 F20101206_AAAXHG pokharel_p_Page_112.QC.jpg
b03ce7e6d51ec1271d0d7ac5e1d3b6fa
be065736f7b064eafc785cf1f87f7faa8105b0f5
11756 F20101206_AAAXGS pokharel_p_Page_071.QC.jpg
60849260b64623e592616074d3b926f7
44b285ddc797e1ef04f99fd1cff1232c08c44b50
32747 F20101206_AAAXHH pokharel_p_Page_090.pro
c4b4c4fffea9862847b493bb7d837ca8
af0584d9c086b4740bacb6ed10cfdc738982c7e6
26214 F20101206_AAAXGT pokharel_p_Page_065.pro
9c7b6fe27e588917c50648f218e29ab9
43b63a3f188cce19e310d040bb7e63b424a2ae9b
1051931 F20101206_AAAXHI pokharel_p_Page_098.jp2
21fd5d3627e4f25ee1ecc8297fe77400
39cec61cb66d19447d34261768a370fc7ceeb028
F20101206_AAAXGU pokharel_p_Page_097.tif
a66d55d6034f6fa0c344c19445d3ac92
de086a8d9601ecee5e2bfda9d3f5530fb1c8b620
457338 F20101206_AAAXHJ pokharel_p_Page_071.jp2
20bfaa1e11328ca2ceb3eed25c4c75ab
121dc14831590ab164f4cd1c7e125ad762c6f760
1051972 F20101206_AAAXGV pokharel_p_Page_082.jp2
1078b8a6e61ccbc17198de13bf276673
e2830892dce2eb9e192181b49fc2a90c1af58741
132515 F20101206_AAAXHK pokharel_p_Page_114.jp2
236d1ccc221df705627e8659b5c94ca6
15f2aa3759e226ae4ef9ba111c98c5244555c4c5
24280 F20101206_AAAXGW pokharel_p_Page_114.QC.jpg
3a799441de3053db8dcb031f0d038d1c
191f16874f4a6eebb69bf68f7700a0edbe9a8820
1051985 F20101206_AAAXHL pokharel_p_Page_005.jp2
7ebb36bf564e0aea512a7a6497074740
a5a252c9facbbdafa3797d5f71184af6092783cf
F20101206_AAAXGX pokharel_p_Page_024.tif
59db35230607fb8366c4e9ef8277555f
f49253e6ac911f12a76835cfc65f913ec86aa95c
7039 F20101206_AAAXIA pokharel_p_Page_052.pro
2a634ecf34203d054be6d09178a656a7
a3dadf65a85525c1cf8fefe546d927285bfa4be0
33782 F20101206_AAAXHM pokharel_p_Page_030.pro
56df70c01610c07d7cb1b1b117c26fd2
4611b8a7bae3569e24d3a0e20769da509d74f114
33058 F20101206_AAAXGY pokharel_p_Page_026.pro
304c4e727174661571196c08d2144793
a0ffac19f91c1cb93b42b89b832fd2e6ad950d63
39043 F20101206_AAAXIB pokharel_p_Page_023.pro
f7eaf77c6d0abb37f2b6705d614aa748
41563cb407b05aae72fb20015dc9cecb33fcdae2
5279 F20101206_AAAXHN pokharel_p_Page_024thm.jpg
52e7e4697bbe8fc4d7e247b5682aefff
0410b8b823fa3f3193122119120e09e5554c93dd
36393 F20101206_AAAXGZ pokharel_p_Page_095.pro
3a137e7cff4b88eaa4ac53622a1b203e
f11b0758cfd4ee02b1470aa7c4086dcd84a539c5
472 F20101206_AAAXIC pokharel_p_Page_001.txt
9a3c085dffed3c7e620deaf49d9f2bfc
85bded8da7e712a6ca10011a973548bb4b756037
23195 F20101206_AAAXHO pokharel_p_Page_006.QC.jpg
086a66b6a002a39308d874e033fb3454
1391d4aadee4fa5aca427df27bf749f3b5bb43da
80626 F20101206_AAAXID pokharel_p_Page_116.jpg
1daa1bc638e748e90d5165801de13cd9
3859bbff24e9f782503c4577a325845a9d16fe15
1280 F20101206_AAAXHP pokharel_p_Page_004.txt
7005d32e85e925ff6f3300e8dbc916f8
adf49759f8603fe6d3ce2da2920889cb49125b41
1503 F20101206_AAAXIE pokharel_p_Page_081.txt
8c6af771b56a7c41d1842e6babf9d2ae
a0666dd027cdc513a1334bbad3237bcc6f13a089
127979 F20101206_AAAXHQ pokharel_p_Page_107.jp2
4aa89dbfe4c49c0faf1152bd186c8b14
baf5eed016b0db0e35e60525cf88e8f309c5f347
61904 F20101206_AAAXIF pokharel_p_Page_107.pro
fb969ade28c87cde3f15eca53298add1
fb2c9e8d5cb27340a71f5819b9249f34bfb40a47
54576 F20101206_AAAXHR pokharel_p_Page_105.pro
7124c20b1561a1f9eb1e3bcc8d375cae
b4751bece5da9aca47e03f9a8fe5304d204f90c3
27486 F20101206_AAAXIG pokharel_p_Page_104.jp2
1f680ac50437cdda9043b41ad8376b4d
225b1207a7d5a75c1a9d50160ba725d7dfe4f4fe
1051984 F20101206_AAAXHS pokharel_p_Page_006.jp2
79404520f7b21ae6f6b78cf43f0804fe
06df3582022e5104b30d8d23becaba496cbe74a0
F20101206_AAAXIH pokharel_p_Page_079.tif
e2d10ed9c199b864ef8e66b2ce4fe66f
197bcb13cf470a5da27e09197bb43c866284e7f0
2259 F20101206_AAAXHT pokharel_p_Page_105.txt
f96755a8e526846ac4f96af231c56f41
52d56d7811825e774f6427e42e8c0ca8e356ee7b
1051983 F20101206_AAAXII pokharel_p_Page_079.jp2
06821ccf94b40f4e3f01a5aef5be44ca
0347bd629f530e8e0d5346d9af26a5118d0a0093
68392 F20101206_AAAXHU pokharel_p_Page_039.jpg
8192f2641c0a6073b53c4c060ae9f962
147fae6c433d0c9d8a1bf21facda8266fa9bf62c
2502 F20101206_AAAXIJ pokharel_p_Page_114.txt
b9f293a73b81c240d5c53123ec516332
1e00633dbc87592f0cb595feb05f55b28d634140
80301 F20101206_AAAXHV pokharel_p_Page_115.jpg
0e4c8c7e9f944509e39b6b2781480cdd
e0a3a4ffd709c5da51cb4b249b4d36b7609d975a
6286 F20101206_AAAXIK pokharel_p_Page_115thm.jpg
adbd9dffc8511d6efa250f034c228ecc
4505fbf34de8471b3051cc22e1ee1734e679ec3a
6772 F20101206_AAAXHW pokharel_p_Page_074thm.jpg
b64156f09d0f92c2b5ec6b3977de1ed5
061c5e9a1168dfa9c89f6dbf81022a27687a40bd
5895 F20101206_AAAXIL pokharel_p_Page_059thm.jpg
c7068541fac9c4b587165673937251e3
fddfadec5466f740a6be59f7e853c198be821b85
48738 F20101206_AAAXHX pokharel_p_Page_040.pro
5b812bdacdd6639bc2562dd8bd5e67ae
23faa47a10b45324341804108d18ca3fe28cd250
1742 F20101206_AAAXIM pokharel_p_Page_007thm.jpg
d4a4b111344aaac34947168b7ff3a567
83c6cd17cd814d321caff95a75a2f7a27771a7bb
38776 F20101206_AAAXHY pokharel_p_Page_028.pro
4bda7290fcf262544e80f7f1f8d1e4c9
12e84eb31fd5de3c8fb23736288d547d55f2bb9e
806 F20101206_AAAXJA pokharel_p_Page_111.txt
7ac26bca92a6573298e9f8e7fe2cad98
1f23ec356c26ad8e8a6516ddba10371cb48f9b06
5332 F20101206_AAAXIN pokharel_p_Page_007.pro
0fe416667dd79216b6931f9363e6470e
74779cd6054b5455f26c52d866104b05c5d2dd95
F20101206_AAAXHZ pokharel_p_Page_113.tif
9c050c077fad8dd4a15780f17223694c
9a7c8150d6de36bd90e99882c210be6616baf49b
F20101206_AAAXJB pokharel_p_Page_119.tif
99e0220567b1898d6537184b5308dc96
743a4969303014effb314e01933012d59e4776fb
F20101206_AAAXIO pokharel_p_Page_037.tif
132a26eee184806a4359e13cc0ea76c4
a063cba3785ec7e87e16f7145095109a8a73d584
F20101206_AAAXJC pokharel_p_Page_020.tif
9fdf4ffde269ba2aa23f72774e1ff33a
1d0af1bc9cbef2f35fbe9f5ccc57196d4a90941a
2584 F20101206_AAAXIP pokharel_p_Page_117.txt
a0896629db3a5a47180dd2b90028ad71
04a9ae7eb3182609cb8e6742585b3c61a4699d7b
23440 F20101206_AAAXJD pokharel_p_Page_041.QC.jpg
4a55abb9c7dc775be4b286f88efd4cb8
d6879e28aa5b1298518fd3673806d037635f1d99
6867 F20101206_AAAXIQ pokharel_p_Page_022thm.jpg
99b8fa99d5b144f1ffd1e1265465a250
c007a6f6cb39b30a64c4f4af615d863b8bc8f065
63825 F20101206_AAAXJE pokharel_p_Page_045.jpg
1ed2b67f5450adf182def5eee6fa801a
6e7bbe0790898383c66a32b39ff65ff0732752d9
65823 F20101206_AAAXIR pokharel_p_Page_042.jpg
1edf7b62420087b0236be86e5fb4d31d
597c9cadbf1bcc6473352cf330512b09ae652722
70512 F20101206_AAAXJF pokharel_p_Page_032.jpg
a56b134ae04bc03213c0791aa04fb17a
92c31ecd5b247fd7283a31849645b66d1e94265e
F20101206_AAAXIS pokharel_p_Page_030.tif
fcd5c62f1b3d20912d8f9a0ff388b0da
2e665c1373b9752672b3b8f264f1572a6dafd2b7
1063 F20101206_AAAXJG pokharel_p_Page_065.txt
31b34fac292c8a7302cf419fa1bde3d5
eb30da8adde699307ab0b910b84a0d67df2614e6
25839 F20101206_AAAXIT pokharel_p_Page_044.QC.jpg
9eb7b58afdd9f3bbd2c8aa204526e098
3d86884aa52074d9135b58f8dabec9fb2212569b
684265 F20101206_AAAXJH pokharel_p_Page_061.jp2
31b91e8505b800f98ad3e7e78a3216de
7413a807b33f2cba4d6fec8a6f6ad5b893101a70
F20101206_AAAXIU pokharel_p_Page_053.tif
9d128b0b055754b919eb360f19ee73cc
c230bce44c0fa3c8d963e22b15c481e51b10824a
1419 F20101206_AAAXJI pokharel_p_Page_045.txt
84705bd78b3f8b62ca7350a97d82cbec
9b9551e643d041f334a92cbde4144a9639b8502a
2227 F20101206_AAAXIV pokharel_p_Page_110.txt
f2c3ae442d2cd5110181df07c3d552e7
558ce9af46644bbe0a06979535041e9c6310df01
1051900 F20101206_AAAXJJ pokharel_p_Page_021.jp2
8fee9a87b30354fda3bd469db1415f32
7c2f7fc4cfc4f2db6d8f413862bf3f8dae874c05
F20101206_AAAXIW pokharel_p_Page_073.tif
14d51666cbeb71357a88adb463cad93f
1efc6c98a06d97bcc33c08b0d35f4a5bcab439c2
74522 F20101206_AAAXJK pokharel_p_Page_021.jpg
79e11a8e57beb3dccc0e7d48f27c0823
eee861ad4dd4e8e401792d0f09b1f5e210cee213
6507 F20101206_AAAXIX pokharel_p_Page_056thm.jpg
71410bef095eef0d33598212fa75032a
35f955af8d905b8ed032d9d39dc8534958a2393b
59153 F20101206_AAAXJL pokharel_p_Page_101.pro
307846ab46450f50afac8fc197c0247b
4d0f6aa351d5d665965d88a772f0e39f8067f02a
1873 F20101206_AAAXIY pokharel_p_Page_036.txt
ccc40b7fc7fb03e96c407ffbae4d7503
81233399b432db322e913e7452ccbbe398f50853
21099 F20101206_AAAXKA pokharel_p_Page_087.QC.jpg
ece70885406c5704c1cc2a613f6ce86f
c11372c1d22b851ff80127a37b4f453a2c6815f5
20437 F20101206_AAAXJM pokharel_p_Page_058.QC.jpg
75b8db578c148363f7fb2dc175e8c5ea
65efe7d54ffd80f599ec36b2733d38da78a8f5a4
61341 F20101206_AAAXIZ pokharel_p_Page_112.pro
8bde4a3583d4e6555a73cb5877cad2c9
c2938558d5eb3d0cf890d9db5a6d6803d86eaec6
5437 F20101206_AAAXKB pokharel_p_Page_017thm.jpg
886ff19b2911fd38a04ce459eb3145c5
5e99aa865a2a602d9a044aba3b8897a1a48850a2
1643 F20101206_AAAXJN pokharel_p_Page_028.txt
fbdd3ba3f4bbb3b1b0c4bc0d695901fb
461ad9c29b9bd5a061503ce7c344ef658afa919b
62428 F20101206_AAAXKC pokharel_p_Page_034.pro
2d8e7dfc88f7f4aa1e353c94d342b413
50d098b3a695e0f7e6d7ae49c2f943562857fbc1
57167 F20101206_AAAXJO pokharel_p_Page_023.jpg
10bb14b37f612f40070e37350a6fe8c3
a11e53dda2308efdea443454488f1c9c6a424dc6
F20101206_AAAXKD pokharel_p_Page_036thm.jpg
9fdf8140f752f9c5f6fcdef6e4bdff9e
4320c9c8609c55105debab15643da0a1a5fb1866
1619 F20101206_AAAXJP pokharel_p_Page_088.txt
f9159a10062e9b824d6283604c16d4fa
46c5652ca76f766c9c5f5cbdfd6e4798626a665e
2110 F20101206_AAAXKE pokharel_p_Page_013.txt
27d7b6596a372529b792214b5728c7be
31b50a44db3a101a53a1d6da7fcf81774ed17fde
F20101206_AAAXJQ pokharel_p_Page_042.txt
c379d549afaf72fc27e7170f346ad09e
80d3318af2d8d4cc0fc59b284db49b2cbb087dd3
1178 F20101206_AAAXKF pokharel_p_Page_069.txt
371744da5c3e6c1f664e347e294c79d8
333c88e0cd456f5b0e9b00e8f3629efa1657736c
44365 F20101206_AAAXJR pokharel_p_Page_084.jpg
3abd5a7b66f2ce5c860976c5d6f641b2
a5d67211096fc262537304d6bd5bdedf2b214256
F20101206_AAAXKG pokharel_p_Page_118.tif
cc6789e38b3bf39a250911bb39f1ae04
b08cb4833c7ec649b55b32b0bdc379970750030a
1315 F20101206_AAAXJS pokharel_p_Page_089.txt
d611f02a297525702c091f53b73c1c37
53d6b6b16a32c0949ce83a18e0f775dd65672523
692240 F20101206_AAAXKH pokharel_p_Page_081.jp2
6963ca7df21ffb04a7592939bfb79dfc
4c7eee596918e308c4ba75b51ca9212c6ab8b61c
41271 F20101206_AAAXJT pokharel_p_Page_086.pro
bea57c4b5248bd90e19903ce5d95666f
1d5d4b3df94ef81bd0de4375e1ce9fc9511ad209
F20101206_AAAXKI pokharel_p_Page_093.tif
87fbbacd05453e9dc7f08bfe2bc84085
8b5fabcab8c9c956cf1346cfec8c1bb168dd9c98
43009 F20101206_AAAXJU pokharel_p_Page_099.pro
338147b0ddaa5fd9e016fd030535b5df
52d8600618286ffd207be77944765efa8e936911
1300 F20101206_AAAXKJ pokharel_p_Page_003.pro
011c77031169b3b8193db2c4e1e33e37
b536d8bea84bac2fdc4f55cbae04e566b4cb0304
F20101206_AAAXJV pokharel_p_Page_027.tif
c3bf001c396f9cd6a53dd0a248ee3c17
2a745f5988fa0a55f0d121a5843277e8fe79046e
805 F20101206_AAAXKK pokharel_p_Page_119.txt
89743728d3c443760199bf8d1e676f63
38e2e97b230c070482c4f785518571b17df35210
F20101206_AAAXJW pokharel_p_Page_060.tif
4f2b2ac51bc210e2d1b226cf1d31e5f4
779d397bf7595dc4b40f938ec24a328b1d5e695e
5055 F20101206_AAAXKL pokharel_p_Page_088thm.jpg
d2fae41e8bdcfdc2788e1b9e6c618786
560e3c56c57b8306ac1981242b52e49e3ee0ea1e
20662 F20101206_AAAXJX pokharel_p_Page_046.QC.jpg
a4dc23ab5c6d027cdf7b18d3f3f8a3b1
4b9bbea20e4a507af539fb83ac0537360ec0b23e
1051986 F20101206_AAAXLA pokharel_p_Page_063.jp2
d8b5bcd6f945fed426ffa2c8a0aa0616
730368cd76b2686101ec17b30bf4cdccbe4d3e05
F20101206_AAAXKM pokharel_p_Page_057.tif
a38510839a8e4dbb5b87a38f3aa699ed
2ca471c59febeea1ae41bbafdf60da64bc7b7dd6
1051978 F20101206_AAAXJY pokharel_p_Page_015.jp2
2c82378d1786856003709994987dd1dc
b8a17fea89c1958fda75c8ee9c7f7863324ae04c
340976 F20101206_AAAXLB pokharel_p_Page_053.jp2
a68113424404e51ebc8a65b9ab8ac05b
e34a3b8ae982fd59df2f772a533dfe8ec3052eb8
5378 F20101206_AAAXKN pokharel_p_Page_049thm.jpg
af1a122f64f270ec23bc8995d1fa968e
30851bafef653f9aacea8e620bf8eaf85889ed30
F20101206_AAAXJZ pokharel_p_Page_006.tif
4c2d7da3e0d899c3b692db0687c5bf44
f9caf4ed5560fd4d121d40d44980f77f49951d42
886468 F20101206_AAAXLC pokharel_p_Page_067.jp2
85e753f02f50d25a305f6773f65d9e8b
1f260a0ed4ecb86958e779610e2e1fd0fb058e23
F20101206_AAAXKO pokharel_p_Page_102.tif
53a3a0e1a6c991e90126d67e6bb1aa5e
d7409b84eacbc7a14858b257c503399606877e2c
1827 F20101206_AAAXLD pokharel_p_Page_086.txt
7c9f7b4527f7fc0d0057a7d429d7c203
519ac51368dc737be16255c38a365593aa4ae9c5
82576 F20101206_AAAXKP pokharel_p_Page_113.jpg
a69e2b8cab4b5be44ea7c29453c7e4a7
0ff857a2583f1b537bcc600133e5015254978445
31532 F20101206_AAAXLE pokharel_p_Page_077.pro
213e3e4e8cffacc3ee65d5b45c615a90
8fc39f9bd75780bc0a9b118626d82f957197c300
2051 F20101206_AAAXKQ pokharel_p_Page_018.txt
4140f258dbed9633414e0447072d618f
7b97fe868626ec56808be0f73bccd36a876b0a78
24718 F20101206_AAAXKR pokharel_p_Page_001.jp2
f260296952edb1a88ed35612100ac69d
c630f4c4a4cacc467df87f4214b58d280276f3c5
871950 F20101206_AAAXLF pokharel_p_Page_036.jp2
0158a2c1d06922f4f1d81cbd7bcb68fd
f338c42b8d0bb5ec8503f088afc663cce3280b29
23243 F20101206_AAAXKS pokharel_p_Page_105.QC.jpg
a6c1fd9137c7077588bbfa3812a320b8
8e87e61ea37a88367463946b9ecfaecbcbc469e1
27453 F20101206_AAAXLG pokharel_p_Page_043.QC.jpg
4cb2525477723ed5d13cfa23d8a79efb
64360a077181cfb3b84fcc5717c0f1b87a042998
F20101206_AAAXKT pokharel_p_Page_001.tif
6b7189cc3bb157db3e7b84a58d9a891c
adc28beb93e3d3088f45f0a807da7d952a5213cf
83231 F20101206_AAAXLH pokharel_p_Page_073.jp2
5cc05305fb686421fd94d33093f1e770
4b5cbb2ec933d01ac814a4e2c15bee45974693a7
116472 F20101206_AAAXKU pokharel_p_Page_109.jp2
867824e0bc8e23beec99a99a74c4398a
b5c50b3955874b8b5c1c39bf3b1a1f293c45a28d
14871 F20101206_AAAXLI pokharel_p_Page_084.QC.jpg
30f4e1b60c3dbaad5eb0af0e659dc356
977abc02de1198f0a94b11946d1b0556df37ee00
6218 F20101206_AAAXKV pokharel_p_Page_118.pro
3e12d10b548f584d842fa6c9345eedd1
6058336f27f9ee952bacef313e0f3c4e8a1ff29a
56321 F20101206_AAAXLJ pokharel_p_Page_109.pro
08b3ed05e365d45b3e4510e702cb2e26
fbdc1ec67e9364277abf36ccb02d481fe61d2ed5
18788 F20101206_AAAXKW pokharel_p_Page_091.QC.jpg
a9663229d977c60e73fa26be42b74e4b
7fe8ee5c4ac51e59668674609fa8b9e9a97406dc
139254 F20101206_AAAXLK UFE0021482_00001.mets
b8a3eb29d5248bf3b80df632e51f87e3
67d01a394050b784c6a5227b63b69ce77ab44cb7
11889 F20101206_AAAXKX pokharel_p_Page_031.pro
8bcb15dfb3e8ab983a3e0cb81a7014ea
3c92fc134540404a1ad6e4f9c24f5f551949209e
2170 F20101206_AAAXKY pokharel_p_Page_044.txt
c104c5527d16dacc54a0966f2327dddb
6db8f00d42c4f6637434a9e995eb04e7ae672f6f
88871 F20101206_AAAXMA pokharel_p_Page_015.jpg
cd534545227af04ee0d9a40c29a1594a
3d9023c4fa3a43efc30a2ba672cd2c7aa3707acc
121144 F20101206_AAAXKZ pokharel_p_Page_096.jp2
8041fe37c98bca897a5a57b62ffb1980
08cd7431bf7950c2a593475323c12c656ab21690
92308 F20101206_AAAXMB pokharel_p_Page_016.jpg
0c5ba4698a2a07293444b422f383565c
fa77cde664ff8e9506d5065d0ad438d5822f6759
23654 F20101206_AAAXLN pokharel_p_Page_001.jpg
8d66dd7c30139ad769dfc2064c6474f8
9d263256eca74e3d1a2b448d93918f13ab45bac7
68948 F20101206_AAAXMC pokharel_p_Page_017.jpg
d72e19209a35499105fe154fa9ee2fae
3a8382c7ae8355d01bd498ba5877280c696bf8b1
9938 F20101206_AAAXLO pokharel_p_Page_002.jpg
5f307f4e19b82762fc19175d0f16ea11
ce939c88aa09146f83dd23ffa21568dd0a9c80df
72737 F20101206_AAAXMD pokharel_p_Page_018.jpg
ffc2165c6ca63eb9d30fc3f76651c2ea
7aba0b752c26d3f8008d0f1ebe0402ab9b976496
9946 F20101206_AAAXLP pokharel_p_Page_003.jpg
6b78dd44fe87e90c9a620cdab0eb397c
6c1c9203f31e2e3ba9ce74cb66737a087ec4e830
74911 F20101206_AAAXME pokharel_p_Page_019.jpg
211a3916585dc04e16d491c6f4ca2a9f
699147fbdca5e25cdd92e9fa404c144294f4bca9
44916 F20101206_AAAXLQ pokharel_p_Page_004.jpg
b42d5fec2a17b252778504d325b2a314
5a083367c7147cfad12b342c64869221487d933f
86057 F20101206_AAAXMF pokharel_p_Page_022.jpg
f24d8bfdc89003e3ee96102b20319a28
2d3dc7c1c2d1bac36c5fddd045f8ea8c0443d1e1
81119 F20101206_AAAXLR pokharel_p_Page_005.jpg
4f23cd5c698b52b8e9a99efc558dd1b5
fcc5375ec9bff6046d9721debf70848cb9c16dc5
61320 F20101206_AAAXMG pokharel_p_Page_024.jpg
1260d6d7bc88e3b0958bfebe31c005d2
1c9737c977579011d9fb3d47db44ccf7f8cdc86c
85149 F20101206_AAAXLS pokharel_p_Page_006.jpg
a0d50f096b574dcbcf18fa11164ce536
aeb2f890e6f5600ea005ce8c188314fd43484212
60370 F20101206_AAAXMH pokharel_p_Page_025.jpg
827c15b2e01edbe797a04e92fe7cfb71
510bc0745accca90917ea5cc96c42b75c699345c
16534 F20101206_AAAXLT pokharel_p_Page_007.jpg
0a14430e2d2b862272701be0edcdd997
6f24ffeba11d73a1d794e31a255b75c038e50a22
55301 F20101206_AAAXMI pokharel_p_Page_027.jpg
bc64aafabef36dfaca24b26dfcce4e87
1230277ac126184fe58494e3243e8aa2aa1028da
98118 F20101206_AAAXLU pokharel_p_Page_008.jpg
7a58107aed2b57c19a8a3cba20c38a55
13d77a60f2a18c822734a6e391e239d5e07ad97d
65275 F20101206_AAAXMJ pokharel_p_Page_028.jpg
4b6304861bd0eb77824000e80d3500ce
fb49a35a448f743cfec434a3c1a2f93d9c032a92
80402 F20101206_AAAXLV pokharel_p_Page_009.jpg
1757885ff60a7e9538b3f8458ff568c9
047154d33c0f013b8e157fc12dc46de0ce110234
57678 F20101206_AAAXMK pokharel_p_Page_030.jpg
045072dc11d405fa70a89e0f6f2fca8c
ad87f31d5c57ed7e6c3fd27087a23a1acdd3444f
27052 F20101206_AAAXLW pokharel_p_Page_011.jpg
60259e05b076b64849bd7f1d2cc1864a
a262902e60f8c8d3b8744abd39fcf6884839eda7
80010 F20101206_AAAXNA pokharel_p_Page_055.jpg
879dc1fa497ce2682457a61a7254f878
6b9722802615d493760ba7935df81ec547e4178f
65515 F20101206_AAAXML pokharel_p_Page_031.jpg
614c5495c2acd13cd61bc22f66dc21c2
5b9672ecb617177b00b79e7909fb5f20969bd12c
85403 F20101206_AAAXLX pokharel_p_Page_012.jpg
568ab5bf5c3687971095dd81dfd7d016
fa4b2a4729a63680a1f9fd289d11709b7d4aee84
88703 F20101206_AAAXMM pokharel_p_Page_033.jpg
a7c2a3775111ac73a9970b0632a29895
d63dff3c02dcbf6174420bfe21cbbb9cac0b53c1
79133 F20101206_AAAXLY pokharel_p_Page_013.jpg
797c3232fdee05901e590d73d0b5a622
652c7bc0b2e8ea38a0cd3b02d5d737cf404f7b8d
80062 F20101206_AAAXNB pokharel_p_Page_056.jpg
73c4801e6241a5a9a02e7244ac7c855f
25a228ca631ee1502328f031d45116cdec7c7565
31897 F20101206_AAAXMN pokharel_p_Page_035.jpg
c74fe218bddfc3f1339b534bd52860b0
289ca511473679545f857fe25ffb93ce8d3c36b1
71387 F20101206_AAAXLZ pokharel_p_Page_014.jpg
139a19a4f9e82db3946d3398d522990e
a713efd06857704b6ae114089c4d2765e6be7e0e
29523 F20101206_AAAXNC pokharel_p_Page_057.jpg
9890b859ffdd1b75b60d83efcb2b6362
b7ee2f53d31c763071f259aa1276fb2d89049f2e
66558 F20101206_AAAXMO pokharel_p_Page_036.jpg
7ba53526c6472f04ffc63767b0c6d166
d8529aff2837495519829eea434b515f0b60c85b
64232 F20101206_AAAXND pokharel_p_Page_058.jpg
e10c661d44a19a5ec02a467ac66f21ab
cc4849d5afcb89e327ae286ea0dcac997629523a
63786 F20101206_AAAXMP pokharel_p_Page_037.jpg
b2313d85ff7f5c1f04c5ce200a301ee1
2f51338045b2b12ba6c0e6826721e0b3e6e4c7ca
70952 F20101206_AAAXNE pokharel_p_Page_059.jpg
d19049948dc454ea62794f5c1f71fd81
c976e89e708d3718c16d4e4c72fb091c0bf2051f
67880 F20101206_AAAXMQ pokharel_p_Page_038.jpg
4f3bd5b38541bf36e840c3084c23d6c2
69f961c02c063d51e73f942bf7b8abefb1547acc
45189 F20101206_AAAXNF pokharel_p_Page_060.jpg
106a2342e5a1b0fbfe7093603f6dabf6
3d3bbd0a078cd38ad5666b6fc8fe78f50c74a1b2
74777 F20101206_AAAXMR pokharel_p_Page_040.jpg
c3ad57210fe21898a990e42e67f43076
9fa678e970be8ab5ee233f7a00a595ed6b89eed5
51550 F20101206_AAAXNG pokharel_p_Page_061.jpg
9078d1d35d5b05fd74bc9ecce4c6c198
8d2ed016d6f091299231a5f824ea08e27f54413b
76352 F20101206_AAAXMS pokharel_p_Page_041.jpg
b9265c55688e752bb2ddfbc72f8480dd
31064610a3612f798015c0efb695794ed16deec1
67867 F20101206_AAAXNH pokharel_p_Page_062.jpg
a4a78f2f98c65a4f2a5b2ef4aa8714c0
a6a716ca9737083ce9fc75e7e298bd6f3e1ce175
87259 F20101206_AAAXMT pokharel_p_Page_043.jpg
480b3eb3d65cc29d79c664a63c54553b
7743d40332528ec940f6b1ef0b4aa34de3a2ae0d
87630 F20101206_AAAXNI pokharel_p_Page_063.jpg
881faabd75b85d5fdf7270b4f1061b03
825da0aea2667a31a12e79a27ba16d81f29d53e0
81656 F20101206_AAAXMU pokharel_p_Page_044.jpg
a8e2e6bdc2b5161d011ae5cf910262ca
e2affba2cc2d49a037bad4d90a9711c60188f436
58404 F20101206_AAAXNJ pokharel_p_Page_064.jpg
266f2137e51175fc4f3dd51a319a4f43
e00291613204a982e3225a7803fe96754c3e259a
65149 F20101206_AAAXMV pokharel_p_Page_046.jpg
a574bc487a8cc1bcf8e6ed88b4b2452a
1a6c657091c27952d494c3302c674ed809bcfb3e
69725 F20101206_AAAXNK pokharel_p_Page_066.jpg
ade3230a140e37e48ae1f92503a222d2
feb5c0a6af78f800beaeaa0f1fc6e1f8def8ae3e
64294 F20101206_AAAXMW pokharel_p_Page_047.jpg
66a0986821201663a41996aa9b8b9875
852519498d48229b7c485191213a74273d26933d
58020 F20101206_AAAXNL pokharel_p_Page_069.jpg
705ce1fa3799d0426e9559363b0f41f1
a7bc74af6437154106f2dc29e14fa0c7f2629ec6
46164 F20101206_AAAXMX pokharel_p_Page_050.jpg
f7b2c569efd4a4d963e55a1e6ed8f8dc
bac35385d4a8ebc20fbf4089360078546df4eb46
65043 F20101206_AAAXOA pokharel_p_Page_086.jpg
f097b8b0036dc059eb7d4b106b0fe752
30638b81793c606b548f9abf7a4acd27e8d36fd7
69470 F20101206_AAAXNM pokharel_p_Page_070.jpg
66ec869fd1bdef59588b58a8f7b2a7ad
a7f88007554c811ecdc0578e18ff2bd4c3e031ac
35043 F20101206_AAAXMY pokharel_p_Page_051.jpg
ef7fd795de16c726294fabe717adddfe
0895285258011414a9069b116e4c8b2079c93322
66217 F20101206_AAAXOB pokharel_p_Page_087.jpg
a6bc4a92313058811248ce528b6bf050
929f8863d1027baaac7b407672c7524688001dc4
35518 F20101206_AAAXNN pokharel_p_Page_071.jpg
3257f9ac77cd66dc7842c04b058080a5
abf8df0ad4d337eab10a2c5ac89ec124dcbaa532
25833 F20101206_AAAXMZ pokharel_p_Page_052.jpg
4eb7af649cf038e725fb391e400e8b99
fcd4df6d99a08a0c3a0fc2016af84d33a93dc1ca
68787 F20101206_AAAXNO pokharel_p_Page_072.jpg
2efcf3848071b6622f4df15855026c32
ad243f1c5780b27924774b794a6cb40a19a83973
51596 F20101206_AAAXOC pokharel_p_Page_088.jpg
3673752b0f8f6c391bbac09b44c2865c
1b37bbf98fe0f692c9210b34e35fef233522dffa
54144 F20101206_AAAXNP pokharel_p_Page_073.jpg
da68091f20415f62ac762797e54f8b85
258206fefeef61269556ebf01bebb0db4b24a1f9
56560 F20101206_AAAXOD pokharel_p_Page_090.jpg
02c3ba047c88724d792c65b61978b42a
9f55533930577bb683406d0b4f619a2ab75b1bea
87860 F20101206_AAAXNQ pokharel_p_Page_074.jpg
9d16a03a998817e1c813fc62e2f3e220
90c83f6b2f6fd05eabd7d1a6a686ec54e6953977
58937 F20101206_AAAXOE pokharel_p_Page_091.jpg
e654e12c5e83ea504f3d082d8ef4e4fd
3c076adc510f1c169717ea722578affece27c450
61280 F20101206_AAAXNR pokharel_p_Page_075.jpg
dd68811d5d780a5aab69e6aaf1452d05
4620dc89d5833640175d94b77bd4fadb2e90f492
84518 F20101206_AAAXOF pokharel_p_Page_092.jpg
5ce01ec2620bdde578e29de7cf8c6bec
6bdc0fe590660a3d5df390af21795214526af97a
65624 F20101206_AAAXNS pokharel_p_Page_076.jpg
806d31224cfe9309582ff62b6b8578e9
5f02d03a60a333497ca24e96b6a121c7be161fc5
76084 F20101206_AAAXOG pokharel_p_Page_093.jpg
7df7b9505fb167a9543010364fb3b1b8
3ba7bace0be2dc4c6e722adbfb7770df0089ac8a
57581 F20101206_AAAXNT pokharel_p_Page_077.jpg
ab733c3f08018fb73d0ab52a0467a439
d9a9f5806461fcd588ebfcd84bdf3f33833fbb98
34653 F20101206_AAAXOH pokharel_p_Page_097.jpg
488f3a8002b088adb47108a30c697cba
6b1ad482c7500e3ed6c9a36db21aead54fee2bb9
92736 F20101206_AAAXNU pokharel_p_Page_079.jpg
07bb9ec128c752e454570f272b414988
4fc9545dc5c63f3d5848c2ced3b5ba763b92698c
85118 F20101206_AAAXOI pokharel_p_Page_098.jpg
94e3d8099f75d4c58cf7665cfdadabd9
47ef45453d6d0f2c4271bd0cc154b811770069e3
71533 F20101206_AAAXNV pokharel_p_Page_080.jpg
a2ba2073dfcf2875be503c04330f15ba
1977b26e6dbe5a002023dad9ec4b7a43510066dd
70366 F20101206_AAAXOJ pokharel_p_Page_099.jpg
c5a32abd78ed4191c9e86e910ebed644
aca38f3cf705629fc5aec87a13ebaa1dbbd40180
52233 F20101206_AAAXNW pokharel_p_Page_081.jpg
54f06ca425aaa8085470fb9f867b0484
a1ad20d7805161890e619a0505f7ee5092ab1e64
87564 F20101206_AAAXOK pokharel_p_Page_101.jpg
439bfc5d5eb353c92df25597fe8ec16c
4d827e33b7022c08aa80e0ffbfa199828b325b27
87739 F20101206_AAAXNX pokharel_p_Page_082.jpg
0fa71138bd1e76dce53638b8972af38c
7017e4a7df188fc23c39b35eb2623f09fcf4d1ba
201803 F20101206_AAAXPA pokharel_p_Page_007.jp2
ca14f1cbb727bc243f847129a70d43df
3bc40f251fb56c2184abc1a4be8642217fb0b398
61077 F20101206_AAAXOL pokharel_p_Page_102.jpg
e5d90984da83b8f594d1ddc97afea902
12d4aac461e0d0a924a12fbbbf673d2df414ec6a
90860 F20101206_AAAXNY pokharel_p_Page_083.jpg
2c86d67e9f25d31ae3375c83833cca2e
d5a09c11e429a6cf729fe4139cbcf879332b9d4e
F20101206_AAAXPB pokharel_p_Page_008.jp2
7d9661f683f70b38cd918078d0fa9e79
c5b3eeece34a721dcd7ffed501a11ba47d05491d
45840 F20101206_AAAXOM pokharel_p_Page_103.jpg
7f8263ff614d40cc57cbd6cd8d342841
0664b65559f4d97354ab8ffcf2254a6c5dcc5d56
48839 F20101206_AAAXNZ pokharel_p_Page_085.jpg
9c98a96773b4bdd31289dd07d04f229c
18e262c304d950dac2feaca30a37a8027bf3b9a7
1051981 F20101206_AAAXPC pokharel_p_Page_009.jp2
ca78c63f9ceee1fbc24bd54d2b91e9db
6d8cbd06ffd618fd093e932a3256450d7608a1d5
24785 F20101206_AAAXON pokharel_p_Page_104.jpg
66e2196da37a3e414a39549aeb9317fb
93eb5ad7a595249fab0fd4f28f442db6e7de3bf6
73152 F20101206_AAAXOO pokharel_p_Page_105.jpg
00536bc4c18bef1d561ee6125b3a6ac2
84b719384ae4979441c33f8d236fe3a940bb8897
106438 F20101206_AAAXPD pokharel_p_Page_010.jp2
388fe4c2adac00af2631dbfd8b4c4285
e846936e5464523754131af9f02a6560c7ec062a
76990 F20101206_AAAXOP pokharel_p_Page_106.jpg
768bae55b971b31ee811c3330a7da175
8e211639ac473a35ad77296ab652c1e9b33e194f
35386 F20101206_AAAXPE pokharel_p_Page_011.jp2
39d0f19a5c3b96023cf1b444c22abe8a
3801026eba176df5203d4b67a479c0b9095bed3f
81213 F20101206_AAAXOQ pokharel_p_Page_107.jpg
a65c103e6b986fe28248e1074285258e
ca0a630fa1cbb6f893e3b9ca1a7d5733cbff2328
1051908 F20101206_AAAXPF pokharel_p_Page_013.jp2
c81c01f6459f84421b1b01c15b55d20d
bf188136aabc007f61c9cf5e1bfc4f0e9dde45f5
78108 F20101206_AAAXOR pokharel_p_Page_108.jpg
8fbcbed96260b97cc3f37fd40a4fb361
9c24ffb3b3f003f340e1f513bfca68f49302228e
F20101206_AAAXPG pokharel_p_Page_014.jp2
880ce6b9216d2b66bb55c6623d040aa9
8b0f4fd01d58981a058fd230f55a9ac054829cd6
74451 F20101206_AAAXOS pokharel_p_Page_109.jpg
922087dfaf6aae58cad6ebddf224209f
d1bf816e6a361e10555172dea32af0e8fb33ab5a
F20101206_AAAXPH pokharel_p_Page_016.jp2
359983620d965fce25f7968b5e60ae13
d970ca3912066310eb401c9cb1f4fcf565202111
74190 F20101206_AAAXOT pokharel_p_Page_110.jpg
524f0318382a00509e50602dd93df1b5
6f3bd1e1633e07949b0adc95af55efb7a5a91117
1000808 F20101206_AAAXPI pokharel_p_Page_017.jp2
8cefcb4de665b1503d7e2c188b9f4833
e205c4f4c90ad69082f92ad7b15767de2fb0efbb
33083 F20101206_AAAXOU pokharel_p_Page_111.jpg
8bd721c97ef19225c1de7c3e429621a3
ce741dce43452c677cfbdccdbe59e0309a573706
1009511 F20101206_AAAXPJ pokharel_p_Page_018.jp2
1c2737adf7f0395db3e4e2ac94aba9fd
ac3982d1e54e3bcc8be9bce03bd9a338ccc756e3
81112 F20101206_AAAXOV pokharel_p_Page_112.jpg
5f83da39293a1535f91cf9e8efab371d
6b9c8ce1ee864d1eab2e4b9ae58538c6281de7f3
1023466 F20101206_AAAXPK pokharel_p_Page_019.jp2
3f200c429c96e72f734f1ecd8ea2e0b4
ed66f04fe79477d2909e249177a78333486b3b32
82713 F20101206_AAAXOW pokharel_p_Page_114.jpg
59046c83c7c8d830e3be162c356c41bf
410b037631b2d82dcb3354d858355dcbae3cc980
893464 F20101206_AAAXQA pokharel_p_Page_042.jp2
2ba494776dfd378ddc14d9afbd074718
78bb4cd77cbeb6b7fcf2506c05642edaa3bf4caf
1051952 F20101206_AAAXPL pokharel_p_Page_020.jp2
e2c33a9cac24602548ae835eec18e983
7d166a996aee74385efa325fc66aab486635b8fe
82714 F20101206_AAAXOX pokharel_p_Page_117.jpg
eec3ba98eeed2858d7816aea2725b987
4c09bd5bdcccc294ec4c8f7abfc3d049c77cc286
F20101206_AAAXQB pokharel_p_Page_043.jp2
d859deb158854713839f929e90c49cc0
cdd4948037ce66e5ef2e271beb1feadea401051d
1051954 F20101206_AAAXPM pokharel_p_Page_022.jp2
c45e406a659238aa7af37c1ffd0734fb
cc3dcbc9a947b91a4cbb73110822baf8d9062888
15483 F20101206_AAAXOY pokharel_p_Page_118.jpg
9ea14adfe27a89fa7f429bd2989082fb
cf7168ffa5bc660a43145af7c138050125f162ea
1051932 F20101206_AAAXQC pokharel_p_Page_044.jp2
4b483dbc161d6752d98e3e650b29c199
45e4a30f185d144a687cd48d6b896b9a05e5d9bb
92575 F20101206_AAAXPN pokharel_p_Page_023.jp2
a4ceb537b650d16a916fa142582aa408
7bfe3b1ea8e88500c60b46986bab66b050b4c3aa
32845 F20101206_AAAXOZ pokharel_p_Page_119.jpg
f1c7c9a3302fd685dfcaac785ad9186d
2c6b651a0914171217ceb40bf9b84dd2af5e644b
844739 F20101206_AAAXPO pokharel_p_Page_025.jp2
dd51bb42eaac80e2b6652e34f0f0f85f
642117851b47856bfe4bf15306be0587fd803dec
835621 F20101206_AAAXQD pokharel_p_Page_045.jp2
7d4f841c1b0e3e7505a2779b3ce4882d
4077a180529f9a3ebbed6a5f6ed6dfc1643cec90
898116 F20101206_AAAXPP pokharel_p_Page_028.jp2
4d7ccde16a78d63d2f5eddd4a32ce450
1ef0bc435356f017f34d9c8d3fb688ce7d09c999
735304 F20101206_AAAXPQ pokharel_p_Page_029.jp2
484d1ddad7ba572b0b3de2d70dd251d8
a803b6422efbc9a14c1edc71d9f5102b83d5acac
848554 F20101206_AAAXQE pokharel_p_Page_046.jp2
360f4a9c8580037fa5cedc1f6b7cb0e4
4291c95f5f57f79c95519e45ac78d73f5acf9c04
834935 F20101206_AAAXPR pokharel_p_Page_030.jp2
c37917e01ec411e743acf42b6c35ba10
ffcc31c8dd57cc87741f639b596257144248de59
841947 F20101206_AAAXQF pokharel_p_Page_047.jp2
085eec961b31ef3532a9e73480b996af
a69b5a92c1af392511c589fa9b87baba8006fb9d
1051982 F20101206_AAAXPS pokharel_p_Page_031.jp2
d5f34fc6738908f269b6868433826191
586721152b72531cb9a3477cddd381c9f7599366
845736 F20101206_AAAXQG pokharel_p_Page_048.jp2
53ab8283d4f7be1460a80c833c9f1e02
4d8133e1abc0b9e7468e0beb592e46c02cef1805
1042081 F20101206_AAAXPT pokharel_p_Page_032.jp2
2c83682a5aa290d43d001adec5952e87
9c0e00aeb52e31ca4e8e007271acdbd7f6901f8d
681671 F20101206_AAAXQH pokharel_p_Page_049.jp2
597b3f91a10dbd12b128dbf4ec47bd1a
d218d1c18381fd19c73a16625cc09589ba80b2d7
1051945 F20101206_AAAXPU pokharel_p_Page_033.jp2
ba2a43a1478e21d575db50aa693c7054
feebd7bf1f418673f22f8b87e22f98dc4f4e4348
582117 F20101206_AAAXQI pokharel_p_Page_050.jp2
f2d447e8ba28c98b03b0b89321b5172d
723c1844dd81cfe21c40fdbc3cde5d244a62182b
1051938 F20101206_AAAXPV pokharel_p_Page_034.jp2
0339df96e450a41ad55dd2ff4080ddb3
5f88052e1a14d2ccd6aa76a03eab94d259118fa1
371161 F20101206_AAAXQJ pokharel_p_Page_051.jp2
68449872b619629f2f55fb7658bab148
ed229508250e37cf444238ec6407849eb3f20680
44203 F20101206_AAAXPW pokharel_p_Page_035.jp2
f1e53195465f4726dda5d57f59dcaffd
e5f1f25c0b7057b9c6e929c2b0736a59cb27cfc2
26909 F20101206_AAAXQK pokharel_p_Page_052.jp2
fe45b68d2ccf9976a3c02619bbc7fe89
9255aca59e9c180db092aff37b88ff52714a3a2c
98267 F20101206_AAAXPX pokharel_p_Page_037.jp2
ec01738584e50aacb8c4a54272639d2c
6addc99133e064f70e1d608e618e71bd51c90f54
854373 F20101206_AAAXRA pokharel_p_Page_075.jp2
f6810fffa0cc7dc34e30936ec9ffe875
0cc975a137d7a31da83951ce9d5bcd79af6249db
179821 F20101206_AAAXQL pokharel_p_Page_054.jp2
77bd0708957d9b91e58ff6a4d9f61a5f
eb16c40cbc8084c1e441b303bd3df5b848c771c0
991588 F20101206_AAAXPY pokharel_p_Page_038.jp2
c4cf01535ba0714574bb5d21a48e8585
89fd779f0c15670321a0dd571b6edf218fc92aa8
909680 F20101206_AAAXRB pokharel_p_Page_076.jp2
7d85170509e15ed83f52751444a0ca18
706fe12b5939eba29fe359d63bc70b92ec167bd9
1051960 F20101206_AAAXQM pokharel_p_Page_055.jp2
e0c3e6cb4e2d82726b5fb39e5b8a48c4
d23b4d037ed966347c444d640667afd39e48bbd7
F20101206_AAAXPZ pokharel_p_Page_041.jp2
c8d0fb86a37f6533c59b3a5c0d471d4c
f091ea77eca0a449d40fbab8ce3c968f2831a25c
831583 F20101206_AAAXRC pokharel_p_Page_077.jp2
5338e24c3aa9a5ab7f84e74b803a3006
a820feee71f2daf92cb0b09f91b556b6612ee7f2
1051919 F20101206_AAAXQN pokharel_p_Page_056.jp2
e729a2b24fc5bad509de05de8b6a7a4e
f02e101ddd4c29065a4eec42c9b1353acb46d663
685574 F20101206_AAAXRD pokharel_p_Page_078.jp2
e3b0f8296899bb6c9c4723a002dbe290
a7ddf96e7bb93059b5ce1764655934b194650350
286009 F20101206_AAAXQO pokharel_p_Page_057.jp2
7e9b437024853cf0679a9cf49c390e94
ae4de41c37b82b81944b17bc059a8890cb1f72d2
993485 F20101206_AAAXRE pokharel_p_Page_080.jp2
0e4b5dbaf3fd7f6a7df0355bfb3e51c2
6e5f439c0f1fde3ac8ca308fcd4637cacd94ee67
97365 F20101206_AAAXQP pokharel_p_Page_058.jp2
e71605dc51687ee3e9779b93bd403728
4c871cfc090bbd9b02b6c57bb0e37fba8a43efb4







TIME SERIES ANALYSIS WITH INFORMATION THEORETIC LEARNING AND
KERNEL METHODS


















By
PUSKAL P. POKHAREL


A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2007



































© 2007 Puskal P. Pokharel





































To my parents, professors, and friends.










ACKNOWLEDGMENTS

With the patience, persistence, and support of many individuals, my stay for doctoral research has been very memorable and fruitful. Without the help and the active encouragement of some of these people, the amount of time and effort required for the Ph.D. degree would have made it overwhelmingly daunting.

I acknowledge and thank Dr. Jose C. Principe for his role as advisor and mentor during my stay here. Our numerous discussions and exchanges of ideas have been integral to this research. I also thank him for providing a stimulating environment in the Computational NeuroEngineering Laboratory, where I have developed and expanded a lot of my engineering and research capabilities.

I am also grateful to the members of my advisory committee, Dr. John G. Harris, Dr. K. Clint Slatton, and Dr. Murali Rao, for their time and valuable suggestions. I would also like to express my appreciation to all the CNEL members, especially those of the ITL group, for their collaboration in this and other research projects.

Finally, I thank my parents for encouraging me to pursue a Ph.D. in the first place,

and for supporting me every step of the way.











TABLE OF CONTENTS

page

ACKNOWLEDGMENTS ..... 4

LIST OF TABLES ..... 7

LIST OF FIGURES ..... 8

ABSTRACT ..... 10

CHAPTER

1 INTRODUCTION ..... 12

1.1 Signal Detection ..... 15
1.2 Optimal and Adaptive Filtering ..... 16

2 NEW SIMILARITY AND CORRELATION MEASURE ..... 18

2.1 Kernel-Based Algorithms and ITL ..... 18
2.2 Generalized Correlation Function ..... 21
2.2.1 Correntropy as Generalized Correlation ..... 21
2.2.2 Properties ..... 25
2.3 Similarity Measure ..... 28
2.4 Optimal Signal Processing Based on Correntropy ..... 32

3 CORRENTROPY BASED MATCHED FILTERING ..... 36

3.1 Detection Statistics and Template Matching ..... 36
3.1.1 Linear Matched Filter ..... 36
3.1.2 Correntropy as Decision Statistic ..... 37
3.1.3 Interpretation from Kernel Methods ..... 39
3.1.4 Impulsive Noise Distributions ..... 40
3.1.4.1 Two-term Gaussian mixture model ..... 40
3.1.4.2 Alpha-stable distribution ..... 41
3.1.4.3 Locally suboptimal receiver ..... 41
3.1.5 Selection of Kernel Size ..... 42
3.2 Experiments and Results ..... 43
3.2.1 Additive White Gaussian Noise ..... 44
3.2.2 Additive Impulsive Noise by Mixture of Gaussians ..... 45
3.2.3 Alpha-Stable Noise in Linear Channels ..... 46
3.2.4 Effect of Kernel Size ..... 47
3.2.5 Low Cost CMF Using Triangular Kernel ..... 48

4 APPLICATION TO SHAPE CLASSIFICATION OF PARTIALLY OCCLUDED OBJECTS ..... 55

4.1 Introduction ..... 55













4.2 Problem Model and Solution

5 WIENER FILTERING AND REGRESSION

5.1 Linear Wiener Filter
5.2 Correntropy Based Wiener Filter
5.3 Simple Regression Models with Kernels
5.3.1 Radial Basis Function Network
5.3.2 Nadaraya-Watson Estimator
5.3.3 … RBF Network
5.4 Parametric Improvement on Correntropy Filter
5.5 Experiments and Results
5.5.1 System Identification
5.5.2 Time Series Prediction of Mackey-Glass Time Series

6 ON-LINE FILTERS

6.1 Background ..... 74
6.2 Correntropy LMS ..... 75
6.3 Kernel LMS ..... 76
6.4 Self-Regularized Property ..... 80
6.5 Experimental Comparison between LMS and KLMS ..... 82
6.6 Kernel LMS with Restricted Growth ..... 84
6.6.1 Sparsity Condition for the Feature Vectors ..... 84
6.6.2 Algorithm ..... 87
6.7 Experimental Comparison between KLMS and KLMS with Restricted Growth ..... 89
6.8 Application to Nonlinear System Identification ..... 91
6.8.1 Motivation ..... 91
6.8.2 Identification of a Wiener System ..... 93
6.8.3 Experimental Results ..... 93
6.9 Summary

7 APPLICATION TO BLIND EQUALIZATION ..... 98

7.1 Motivation
7.2 Problem Setting
7.3 Cost Function and Iterative Algorithm
7.4 Simulation Results

8 SUMMARY

8.1 Summary
8.2 Other Applications
8.3 Future Work

LIST OF REFERENCES

BIOGRAPHICAL SKETCH ..... 119










LIST OF TABLES


Table page

3-1 Values for the statistic for the two cases using the Gaussian kernel ..... 38

4-1 Recognition and misclassification rates ..... 60










LIST OF FIGURES


Figure page

1-1 Set up of a typical time series problem. .................................... 13

1-2 General overview for solving time series problems in the feature space. ..... 14

2-1 Probabilistic interpretation of V_XY (the maximum of each curve has been normalized to 1 for visual convenience). ... 31

3-1 Receiver operating characteristic curves for synchronous detection in AWGN channel with kernel variance \sigma^2(CMF) = 15, \sigma^2(MI) = 15 (the curves for MF and CMF for 10 dB overlap). ... 45

3-2 Receiver operating characteristic curves for asynchronous detection in AWGN with kernel variance \sigma^2(CMF) = 15, \sigma^2(MI) = 15. ... 46

3-3 Receiver operating characteristic curves for synchronous detection in additive impulsive noise with kernel variance \sigma^2(CMF) = 5, \sigma^2(MI) = 2. ... 47

3-4 Receiver operating characteristic curves for asynchronous detection in additive impulsive noise with kernel variance \sigma^2(CMF) = 5, \sigma^2(MI) = 2. ... 48

3-5 Receiver operating characteristic curves for synchronous detection in additive white \alpha-stable distributed noise, kernel variance \sigma^2 = 3, \alpha = 1.1. ... 49

3-6 Receiver operating characteristic curves for synchronous detection in additive white \alpha-stable distributed noise, kernel variance \sigma^2 = 3, SNR = 15 dB; the plots for MI and CMF almost coincide for both values of \alpha. ... 50

3-7 Receiver operating characteristic curves for asynchronous detection in additive white \alpha-stable distributed noise, kernel variance \sigma^2 = 3, SNR = 15 dB; the plots for MI and CMF almost coincide for both values of \alpha. ... 51

3-8 Area under the ROC for various kernel size values for additive alpha-stable distributed noise using synchronous detection, \alpha = 1.1. ... 52

3-9 Area under the ROC for various kernel size values for additive impulsive noise with the mixture of Gaussians using synchronous detection. ... 53

3-10 Triangular function that can be used as a kernel. ... 53

3-11 Receiver operating characteristic curves for SNR of 5 dB for the various detection methods. ... 54

4-1 The fish template database. ... 57

4-2 The occluded fish. ... 60

4-3 The extracted boundary of the fish in 4-2. ... 61

5-1 The computation of the output data sample given an embedded input vector using the correntropy Wiener filter. ... 65

5-2 Time delayed neural network to be modeled. ... 69

5-3 Input and output signals generated by the TDNN (desired response), CF and WF using a time embedding of 2. ... 70

5-4 Mean square error values for WF and CF for the system modeling example. ... 71

5-5 Mean square error values for MG time series prediction for various embedding sizes. ... 71

5-6 Mean square error values for MG time series prediction for different sizes of training data. ... 72

6-1 Linear filter structure with feature vectors in the feature space. ... 77

6-2 Error samples for KLMS in predicting the Mackey-Glass time series. ... 78

6-3 Learning curves for the LMS, the KLMS and the regularized solution. ... 84

6-4 Comparison of the mean square error for the three methods with varying embedding dimension (filter order for LMS) of the input. ... 85

6-5 Learning curve of linear LMS and kernel LMS with restricted growth for three values of \delta. ... 89

6-6 Performance of kernel LMS with restricted growth for various values of \delta. ... 90

6-7 Number of kernel centers after training for various values of \delta. ... 91

6-8 A nonlinear Wiener system. ... 93

6-9 Signal in the middle of the Wiener system vs. output signal for binary input symbols and different indexes n. ... 94

6-10 Mean square error for the identification of the nonlinear Wiener system with the three methods. The values for KRLS are only shown after the first window. ... 95

7-1 Inter-symbol interference (ISI) convergence curves for correntropy and CMA under Gaussian noise. ... 102

7-2 Convergence curves of the equalizer coefficients for correntropy and CMA under Gaussian noise. The true solution is w = (0, 1, -0.5). ... 103

7-3 Inter-symbol interference (ISI) convergence curves for correntropy and CMA under impulsive noise. ... 104









Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy

TIME SERIES ANALYSIS WITH INFORMATION THEORETIC LEARNING AND
KERNEL METHODS

By

Puskal P. Pokharel

December 2007

Chair: Jose C. Principe
Major: Electrical and Computer Engineering

The major goal of our research is to develop simple and effective nonlinear versions

of some basic time series tools for signal detection, optimal filtering, and on-line adaptive filtering. These extensions shall be based on concepts being developed in information

theoretic learning (ITL) and kernel methods. In general all ITL algorithms can be

interpreted from kernel methods because ITL is based on extracting higher order

information (that beyond second order as given by the autocorrelation function) directly

from the data samples by exploiting nonparametric density estimation using translation

invariant kernel functions. ITL in general is still lacking in providing tools to better

exploit time structures in the data because the assumption of independently distributed

data samples is usually an essential requirement. Kernel methods provide an elegant

means of obtaining nonlinear versions of linear algorithms expressed in terms of inner

products by using the so called kernel trick and Mercer's theorem. This has given

rise to a variety of algorithms in the field of machine learning but most of them are

computationally very expensive using a large Gram matrix of dimension same as the

number of data points. Since these large matrices are usually ill-conditioned, they require

an additional regularization step in methods like kernel regression. Our goal is to design

basic signal analysis tools for time signals that extract higher order information from the

data directly like ITL and also avoid the complexities of many kernel methods.










We present new methods for time series analysis (matched filtering and optimal adaptive filtering) based on the newly invented concept in ITL, correntropy, and kernel methods. Correntropy induces a RKHS that has the same dimensionality as the input space but is nonlinearly related to it. It is different from the conventional kernel methods, in both scope and detail. This in effect helps us to derive some elegant versions of a few tools that form the basic building block of signal processing like the matched filter (correlation receiver), the Wiener filter, and the least mean square (LMS) filter.









CHAPTER 1
INTRODUCTION

Natural processes of interest for engineering are composed of two basic characteristics:

statistical distribution of amplitudes and time structure. Time in itself is very fundamental

and is crucial to many world problems, and the instantaneous random variables are hardly

ever independently distributed, i.e., stochastic processes possess a time structure. For

this reason there are widely used measures that quantify the time structure like the

autocorrelation function. On the other hand, there are a number of methods that are

solely based on the statistical distribution, ignoring the time structure. A single measure

that includes both of these important characteristics could greatly enhance the theory of

stochastic random processes. The fact that reproducing kernels are covariance functions as

described by Aronszajn [1] and Parzen [2] explains their early role in inference problems.

More recently, numerous algorithms using kernel methods including Support Vector

Machines (SVM) [3], kernel principal component analysis (K-PCA) [4], kernel Fisher discriminant analysis (K-FDA) [5], and kernel canonical correlation analysis (K-CCA)

[6], [7] have been proposed. Likewise, advances in information theoretic learning (ITL),

have brought out a number of applications, where entropy and divergence employ Parzen's

nonparametric estimation [8], [9], [10], [11]. Many of these algorithms have given very

elegant solutions to complicated nonlinear problems. Most of all these contemporary

algorithms are based on assumptions of independent distribution of data, which in many

cases is not realistic. Obviously, accurate description of a stochastic process requires both

the information of the distribution and that of its time structure.

We can describe time series problems seen in engineering in broadly two major

categories.

(A) Detection: This problem has a general setup given by figure 1-1. A known signal

template (usually chosen from a finite set of possibilities) passes through a channel which

modifies the signal based on an underlying system model. Usually an additive source,












Figure 1-1. Set up of a typical time series problem.


independent of the channel and of the template, is also present; it is called the noise. This

results in an observed signal that is a distorted version of the original signal. The problem

then is simply to detect which signal template was transmitted through the channel based

on the observed signal, usually without knowing or calculating the channel explicitly [12].

(B) Estimation: Estimation usually involves estimating or calculating any component

(or a part of it) of the time series problem model given in figure 1-1 based on the observed

signal. Those components can be the original input signal, channel parameters or even

certain noise characteristics. The problem may or may not have prior knowledge of the

input signal. If such information is absent then such an estimation problem is called blind.

Good examples of blind estimation are blind source separation [13], blind deconvolution

[14] and blind equalization [15]. There can he other estimation problems as well like noise

estimation (usually involves calculating certain noise parameters like covariance), system

identification [16], time series prediction [17], interference cancellation [18], etcetera.

The integral component for both these categories of problems is a correlation (or more

generally similarity) measure that extracts information about the time structure. These

ideas are summarized in figure 1-2. It is here that our proposed method of higher order correlation will make an impact for the improvement of these basic tasks in time series analysis.

Presently, the most widely used method of similarity is the correlation function

which appears naturally when the underlying assumptions are limited to linearity and

Gaussianity. The kernel methods that were mentioned earlier also employ a similarity























Figure 1-2. General overview for solving time series problems in the feature space


measure, the kernel function. But here the similarity is point-wise among pairs of data

samples. Thus to represent the data set for solving most of the problems one is required

to evaluate the kernel pairwise with all the available data samples creating a large Gram

matrix. Though these type of methods can solve certain complex problems, they are

usually quite burdensome both on time and memory. Yet many researchers are attracted

toward kernel methods because it can solve complex problems elegantly using conventional

optimal signal processing theory but in a rich kernel induced reproducing kernel Hilbert

space (RKHS). This space is usually very high dimensional, but most solutions in the RKHS can be readily calculated in the input space using the kernel function which acts

as an inner product of the feature vectors. Through our study, we shall present a new

function that is a statistical measure having the same form as the correlation function and, like the kernel, will have an associated RKHS. This provides a completely new avenue

of research, potentially solving complex problems more accurately and conveniently.

Specifically, we shall define a generalized correlation function, which due to its intriguing










relationship with Renyi's Quadratic entropy and properties similar to correlation, is

termed as correntropy.

This new function can provide novel means of performing various signal processing

tasks involving detection and estimation. The specific tasks that we discuss in this

dissertation are signal detection (correlation receiver) and optimal filtering. The integral

component for both of these problems is a similarity measure, using the concepts of

correntropy and kernel methods. Now we introduce these tasks more specifically.

1.1 Signal Detection

Detection of known signals transmitted through linear and nonlinear channels is

an important fundamental problem in signal processing theory with a wide range of

applications in communications, radar, and biomedical engineering to name just a few [12,

19]. The linear correlation filter or matched filter has been the basic building block for the

majority of these applications. The limitations of the matched filter, though, are already defined by the assumptions under which its optimality can be proved. It is well known

that for the detection of a known signal linearly added to white Gaussian noise (AWGN)

the matched filter maximizes the signal to noise ratio (SNR) among all linear filters

[20]. Theoretically, this means that the matched filter output is a maximum likelihood

statistic for hypothesis testing under the assumptions of linearity and Gaussianity. The

optimality is predicated on the sufficiency of second order statistics to characterize the

noise. Unfortunately, most real world signals are not completely characterized by their

second order statistics and sub-optimality inevitably creeps in. Optimal detection in

non-Gaussian (or nonlinear) noise environments requires the use of the characteristic

function and is much more complex [21] because it requires higher order statistics to

accurately model the noise. This motivates the recent interest in nonlinear filters (kernel

matched filters) [22, 23] or nonlinear cost functions [24], but the computational complexity

of such systems outweighs their usefulness in applications where high processing delay cannot be tolerated, such as in radar and mobile communication systems. With kernel










methods, a nonlinear version of the template matching problem is first formulated

in kernel feature space by using a nonlinear mapping and the so called kernel trick

[22] is employed to give a computationally tractable formulation. But the correlation

matrix formed in this infinite dimensional feature space is also infinitely large and the

resulting formulation is complex using a large set of training data. Alternatively it can

be formulated as a discriminant function in kernel space [23], but still suffers from the

need to train the system before hand and store the training data. The matched filter

based on quadratic mutual information is another recently introduced nonlinear filter that

maximizes the mutual information between the template and the output of the filter [24].

This method does not require an initial training step since it is non-parametric. However,

the method requires the estimation of the quadratic mutual information with kernels and

is ideally valid only for identically and independently distributed (iid) samples, which is

rarely the case in reality. Moreover, the computational load is still O(N^2) at best. The

derivation of the method introduced in this dissertation uses a recently introduced positive

definite function called correntropy [15], which quantifies higher order moments of the noise distribution and has a computational complexity of O(N), thus providing a useful

combination of good representation and less computational complexity.

1.2 Optimal and Adaptive Filtering

Due to the power of the solution and the relatively easy implementation, Wiener

filters have been extensively used in all the areas of electrical engineering. Despite this

widespread use, Wiener filters are solutions in linear vector spaces. Therefore, many

attempts have been made to create nonlinear solutions to the Wiener filter mostly based

on Volterra series [25], but unfortunately the solutions are very complex with many

coefficients. There are also two types of nonlinear models that have been commonly used:

The Hammerstein and the Wiener models. They are composed of a static nonlinearity and

a linear system, where the linear system is adapted using the Wiener solution. However,

the choice of the nonlinearity is critical for good performance, because the linear solution is










obtained in the transformed space. The recent advances of nonlinear signal processing

have used nonlinear filters, commonly known as dynamic neural networks [26] that have

been extensively used in the same basic applications of Wiener filters when the system

under study is nonlinear. However, there is no analytical solution to obtain the parameters

of multilayer neural networks. They are normally trained using the back propagation

algorithm or its modifications. In some other cases, a nonlinear transformation of the

input is first implemented and a regression is computed at the output. Good examples

of this are the radial basis function (RBF) network [27] and the kernel methods [3, 28].

The disadvantage of these alternate techniques of projection is the tremendous amount of

computation required due to the required inversion of a huge matrix and there is usually

a need for regularization. We show how to extend the analytic solution in linear vector

spaces proposed by Wiener to a nonlinear manifold that is obtained through a reproducing

kernel Hilbert space. The main idea is to transform the input data nonlinearly to be

Gaussian distributed and then apply the linear Wiener filter solution. Our method

actually encompasses and enriches the Hammerstein model by inducing nonlinearities

which may not be achieved via static nonlinearity. For viable computation, this approach

utilizes a rather simplistic approximation due to which the results are less optimal but still

performs better than the linear Wiener filter. To improve its results we propose to obtain

the least mean square error solution on-line following the approach employed in Widrow's famous LMS algorithm [29].









CHAPTER 2
NEW SIMILARITY AND CORRELATION MEASURE

2.1 Kernel-Based Algorithms and ITL

In recent years a number of kernel methods, including Support Vector Machines (SVM) [3], kernel principal component analysis (K-PCA) [4], kernel Fisher discriminant analysis (K-FDA) [5], and kernel canonical correlation analysis (K-CCA) [6], [7] have been proposed and successfully applied to important signal processing problems. The basic idea of kernel algorithms is to transform the data x_i from the input space to a high dimensional feature space of vectors \Phi(x_i), where the inner products can be computed using a positive definite kernel function satisfying Mercer's conditions [3]: \kappa(x_i, x_j) = \langle\Phi(x_i), \Phi(x_j)\rangle. This simple and elegant idea allows us to obtain nonlinear versions of any linear algorithm expressed in terms of inner products, without even knowing the exact mapping \Phi.

A particularly interesting characteristic of the feature space is that it is a reproducing kernel Hilbert space (RKHS): i.e., the span of functions \{\kappa(\cdot, x) : x \in X\} defines a unique functional Hilbert space [1], [2], [30], [31]. The crucial property of these spaces is the reproducing property of the kernel

    f(x) = \langle\kappa(\cdot, x), f\rangle \quad \forall f \in F. (2-1)

In particular, we can define our nonlinear mapping from the input space to a RKHS as \Phi(x) = \kappa(\cdot, x); then we have

    \langle\Phi(x), \Phi(y)\rangle = \langle\kappa(\cdot, x), \kappa(\cdot, y)\rangle = \kappa(x, y), (2-2)

and thus \Phi(x) = \kappa(\cdot, x) defines the Hilbert space associated with the kernel.

Without loss of generality, in this chapter we will only consider the translation-invariant Gaussian kernel, which is the most widely used Mercer kernel:

    \kappa(x - y) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{\|x - y\|^2}{2\sigma^2}\right). (2-3)









On the other hand, Information Theoretic Learning (ITL) addresses the issue of extracting information directly from data in a non-parametric manner [8]. Typically, Renyi's entropy or some approximation to the Kullback-Leibler distance have been used as ITL cost functions and they have achieved excellent results on a number of problems: e.g., time series prediction [9], blind source separation [10] or equalization [11].

It has been recently shown that ITL cost functions, when estimated using the Parzen method, can also be expressed using inner products in a kernel feature space which is defined by the Parzen kernel, thus suggesting a close relationship between ITL and kernel methods [32], [33]. For instance, if we have a data set x_1, \ldots, x_N \in R, and the corresponding set of transformed data points \Phi(x_1), \ldots, \Phi(x_N), then it turns out that the squared mean of the transformed vectors, i.e.,

    \|m_\Phi\|^2 = \frac{1}{N^2}\sum_{i=1}^{N}\sum_{j=1}^{N}\langle\Phi(x_i), \Phi(x_j)\rangle = \frac{1}{N^2}\sum_{i=1}^{N}\sum_{j=1}^{N}\kappa(x_i - x_j), (2-4)

is the information potential V(x) as defined in [8].1 Another interesting concept in

information theoretic learning is the Cauchy-Schwarz pdf distance, which has been proved

to be effective for measuring the closeness between two probability density functions and

has been used successfully for non-parametric clustering. If p_X(x) and p_Y(y) are the two pdfs, the Cauchy-Schwarz pdf distance is defined [8] by

    D(p_X, p_Y) = -\log\frac{\int p_X(x)\, p_Y(x)\, dx}{\sqrt{\int p_X^2(x)\, dx \int p_Y^2(x)\, dx}} \ge 0. (2-5)


Mutual information (MI) indicates the amount of shared information between two or more

random variables. In information theory, the MI between two random variables X and Y is



1 The quadratic Renyi's entropy is defined as H_R = -\log(V(x)).









traditionally defined by Shannon as [34]

    I_S(X; Y) = \int\int p_{XY}(x, y)\log\frac{p_{XY}(x, y)}{p_X(x)\, p_Y(y)}\, dx\, dy, (2-6)

where p_{XY}(x, y) is the joint probability density function (pdf) of X and Y, and p_X(x) and p_Y(y) are the marginal pdfs. The crucial property of mutual information for our purposes is the fact that it measures the dependency (even nonlinear) between two random variables X and Y. If X and Y are independent, MI becomes zero [34]. In a

sense, MI can be considered a generalization of correlation to nonlinear dependencies;

that is MI can be used to detect nonlinear dependencies between two random variables,

whereas the usefulness of correlation is limited to linear dependencies. However, in order

to estimate MI one has to assume that the samples are iid, which is not the case for

templates that are waveforms. Although Shannon's MI is the traditionally preferred measure of shared information, essentially it is a measure of divergence of the variables X and Y from independence. Based on this understanding, a different, but qualitatively similar measure of independence can be obtained using the Cauchy-Schwarz inequality for inner products in vector spaces: |\langle x, y\rangle| \le \|x\|\,\|y\|. The following expression is defined as the Cauchy-Schwarz Quadratic Mutual Information (CS-QMI) between X and Y [8]:

    I_{CS}(X; Y) = \frac{1}{2}\log\frac{\int\int p_{XY}^2(x, y)\, dx\, dy \int\int p_X^2(x)\, p_Y^2(y)\, dx\, dy}{\left(\int\int p_{XY}(x, y)\, p_X(x)\, p_Y(y)\, dx\, dy\right)^2}. (2-7)

With data available, I_{CS}(X; Y) can be estimated using Parzen window density estimation and can be used as a statistic for signal detection as in [24]. We shall use CS-QMI as a comparison against our proposed method as well since this is a direct template matching scheme that requires no training and shows improved performance in non-Gaussian and nonlinear situations. Similarly, the equivalence between kernel independent component analysis (K-ICA) and a Cauchy-Schwarz independence measure has been pointed out

in [35]. In fact, all learning algorithms that use nonparametric probability density

function (pdf) estimates in the input space admit an alternative formulation as kernel










methods expressed in terms of dot products. This interesting link allows us to gain some

geometrical understanding of kernel methods, as well as to determine the optimal kernel

parameters by looking at the pdf estimates in the input space.

Since the cost functions optimized by ITL algorithms (or, equivalently, by kernel

methods) involve pdf estimates, these techniques are able to extract the higher order

statistics of the data and that explains to some extent the improvement over their

linear counterparts observed in a number of problems. Despite its evident success, a

major limitation of all these techniques is that they assume independent and identically distributed (i.i.d.) input data. However, in practice most of the signals in engineering have some correlation or temporal structure. Moreover, this temporal structure can be known

in advance for some problems (for instance in digital communications working with coded

source signals). Therefore, it seems that most of the conventional ITL measures are not

using all the available information in the case of temporally correlated (non-white) input

signals. The main goal is to present a new function that unlike conventional ITL measures

effectively exploits both the statistical and the time-domain information about the input

signal. This new function, which we refer to as correntropy function, will be presented in

the next section.

2.2 Generalized Correlation Function

2.2.1 Correntropy as Generalized Correlation

A new measure of generalized correlation called correntropy was presented in [15]. The definition is as follows:

Let \{x_t, t \in T\} be a stochastic process with T an index set. Its generalized correlation function V(s, t) is defined as

    V(s, t) = E\left[\kappa(x_s, x_t)\right], (2-8)









where E[\cdot] denotes mathematical expectation. Using a series expansion for the Gaussian kernel, the correlation function can be rewritten as

    V(s, t) = \frac{1}{\sqrt{2\pi}\,\sigma}\sum_{n=0}^{\infty}\frac{(-1)^n}{2^n\,\sigma^{2n}\, n!}\, E\left[\|x_s - x_t\|^{2n}\right], (2-9)

which involves all the even-order moments of the random variable \|x_s - x_t\|. Specifically, the term corresponding to n = 1 in (2-9) is proportional to E[\|x_s\|^2] + E[\|x_t\|^2] - 2E[x_s^T x_t] = R_x(s, s) + R_x(t, t) - 2R_x(s, t). This shows that the information provided by the conventional autocorrelation function is included within the new function. From (2-9), we

can see that in order to have a univariate correlation function, all the even-order moments

must be invariant to a time shift. This is a stronger condition than wide sense stationarity,

which involves only second-order moments. More precisely, a sufficient condition to have V(t, t - \tau) = V(\tau) is that the input stochastic process must be strictly stationary on the even moments; this means that the joint pdf p(x_t, x_{t+\tau}) must be unaffected by a change of time origin. We will assume this condition in the rest of the dissertation when using V(\tau).

For a discrete-time stationary stochastic process we define the generalized correlation function as V[m] = E[\kappa(x_n, x_{n-m})], which can be easily estimated through the sample mean

    \hat{V}[m] = \frac{1}{N - m}\sum_{n=m+1}^{N}\kappa(x_n, x_{n-m}). (2-10)
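As a concrete illustration, the following is a minimal NumPy sketch of the estimator in (2-10) under the Gaussian kernel of (2-3); the function names and the test signal are our own and are not part of the derivations above.

import numpy as np

def gaussian_kernel(u, sigma):
    # kappa_sigma(u) = exp(-u^2/(2 sigma^2)) / (sqrt(2 pi) sigma), as in (2-3)
    return np.exp(-u**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

def correntropy(x, max_lag, sigma):
    # Sample-mean estimate of V[m] = E[kappa(x_n - x_{n-m})], as in (2-10)
    x = np.asarray(x, dtype=float)
    V = np.empty(max_lag + 1)
    for m in range(max_lag + 1):
        V[m] = gaussian_kernel(x[m:] - x[:len(x) - m], sigma).mean()
    return V

# For white noise, V[m] is flat for m > 0 (compare Property 7 below)
x = np.random.default_rng(0).normal(size=10_000)
print(correntropy(x, max_lag=5, sigma=1.0))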

This form of higher order correlation has been termed correntropy because of its unique capability to provide information both on the time structure (correlation) of the signal as well as on the distribution of the random variable (its average across lags or indexes is the information potential) [15]. Correntropy is positive definite and, because of this, defines a whole new reproducing kernel Hilbert space (RKHS) [1]. It also has a maximum when

the two indexes s and t are equal. These properties are also satisfied by the widely known

correlation function [2]. These facts have made it possible to explore a whole new avenue
of research which in essence combines the benefits of kernel methods and information

theoretic learning [36], [37]. In the same way a covariance version of correntropy can









be defined by employing vectors that are centered in the feature space defined by the Mercer kernel. Given any two random variables x_1 and x_2, the corresponding feature vectors \Phi(x_1) and \Phi(x_2) can be centered as

    \tilde\Phi(x_i) = \Phi(x_i) - E[\Phi(x_i)], \quad i = 1, 2. (2-11)

Then the inner product between these two centered feature vectors becomes

    \langle\tilde\Phi(x_1), \tilde\Phi(x_2)\rangle = \langle\Phi(x_1), \Phi(x_2)\rangle - \langle E[\Phi(x_1)], \Phi(x_2)\rangle - \langle\Phi(x_1), E[\Phi(x_2)]\rangle + \langle E[\Phi(x_1)], E[\Phi(x_2)]\rangle. (2-12)

Taking the statistical expectation on both sides of the above equation we get

    E\left[\langle\tilde\Phi(x_1), \tilde\Phi(x_2)\rangle\right] = E\left[\langle\Phi(x_1), \Phi(x_2)\rangle\right] - \langle E[\Phi(x_1)], E[\Phi(x_2)]\rangle = E[\kappa(x_1, x_2)] - E_{x_1}E_{x_2}[\kappa(x_1, x_2)]. (2-13)

This results in the centered correntropy between the two random variables. Now we can define the centered correntropy for random vectors:

Given a random vector x = [x_1, x_2, \ldots, x_L]^T, its centered correntropy can be defined as a matrix V such that the (i, j)th element is

    V(i, j) = E[\kappa(x_i, x_j)] - E_{x_i}E_{x_j}[\kappa(x_i, x_j)]. (2-14)
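For concreteness, the following is a small NumPy sketch of an estimator for (2-14), assuming N realizations of the L-dimensional random vector are available as the rows of a matrix; the joint term averages over paired realizations and the marginal term over all cross pairs. The function name is ours.

import numpy as np

def centered_correntropy_matrix(X, sigma):
    # X has shape (N, L): N realizations of an L-dimensional random vector
    kappa = lambda u: np.exp(-u**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)
    N, L = X.shape
    V = np.empty((L, L))
    for i in range(L):
        for j in range(L):
            joint = kappa(X[:, i] - X[:, j]).mean()                       # E[kappa(x_i, x_j)]
            marginal = kappa(X[:, i][:, None] - X[:, j][None, :]).mean()  # E_{x_i} E_{x_j}[kappa(x_i, x_j)]
            V[i, j] = joint - marginal
    return V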

The following theorem is the basis of the novel Wiener filter formulation.

Theorem 1. For any symmetric positive definite kernel (i.e., Mercer kernel) \kappa(x(i), x(j)) defined on X \times X and a random process x(u), the centered correntropy defined by V(i, j) = E[\kappa(x(i), x(j))] - E_{x(i)}E_{x(j)}[\kappa(x(i), x(j))] is a covariance function of a Gaussian random process.

Proof: It can be proved that V is non-negative definite and symmetrical. The symmetry of V is a direct consequence of the symmetry of the kernel. Non-negativity of V can be shown by using the vectors \Phi(x(i)) obtained from Mercer's theorem. Let us take any set of real numbers (\alpha_1, \alpha_2, \ldots, \alpha_L), not all zeros. Then, using (2-13),

    \sum_{i=1}^{L}\sum_{j=1}^{L}\alpha_i\alpha_j V(i, j) = \sum_{i=1}^{L}\sum_{j=1}^{L}\alpha_i\alpha_j E\left[\langle\tilde\Phi(x(i)), \tilde\Phi(x(j))\rangle\right] (2-15)

    = E\left[\left\|\sum_{i=1}^{L}\alpha_i\tilde\Phi(x(i))\right\|^2\right] \ge 0. (2-16)

Thus V is non-negative definite and symmetrical. Now, it can be proved that R is the auto-covariance function of a random process if and only if R is a symmetric non-negative kernel [2]. The theorem then follows. This means that given V(i, j) for a random process x(u), there exists a Gaussian process z(u) such that E[z(i)z(j)] - E[z(i)]E[z(j)] = V(i, j).
Theorem 2. For an independently and identically distributed random vector x, the corresponding Gaussian random vector z in the feature space defined by correntropy is uncorrelated, i.e., the covariance matrix is diagonal.

Proof: Let V be the centered correntropy matrix for x and hence the covariance matrix for z, with its (i, j)th element for i \ne j given by

    V(i, j) = E[z_i z_j] - E[z_i]E[z_j] = E[\kappa(x(i), x(j))] - E_{x(i)}E_{x(j)}[\kappa(x(i), x(j))] = 0, (2-17)

which vanishes because the independence of x(i) and x(j) factorizes the joint expectation. It is also easy to see that since x is identically distributed, all the diagonal terms are also equal.










2.2.2 Properties

Some important properties of the GCF can be listed as follows:

Property 1: For any symmetric positive definite kernel (i.e., Mercer kernel) \kappa(x_s, x_t) defined on R \times R, the generalized correlation function defined as V(s, t) = E[\kappa(x_s, x_t)] is a reproducing kernel.

Proof: Since \kappa(x_s, x_t) is symmetrical, it is obvious that V(s, t) is also symmetrical. Now, since \kappa(x_s, x_t) is positive definite, for any set of n points \{x_1, \ldots, x_n\} and any set of real numbers \{\alpha_1, \ldots, \alpha_n\}, not all zero,

    \sum_{i=1}^{n}\sum_{j=1}^{n}\alpha_i\alpha_j\kappa(x_i, x_j) > 0. (2-18)

It is also true that for any strictly positive function g(\cdot, \cdot) of two random variables x and y, E[g(x, y)] > 0. Then

    \sum_{i=1}^{n}\sum_{j=1}^{n}\alpha_i\alpha_j V(i, j) = E\left[\sum_{i=1}^{n}\sum_{j=1}^{n}\alpha_i\alpha_j\kappa(x_i, x_j)\right] > 0. (2-19)

Thus, V(s, t) is both symmetric and positive definite. Now, the Moore-Aronszajn theorem [1] proves that for every real symmetric positive definite function of two real variables, there exists a unique reproducing kernel Hilbert space (RKHS) with that function as its reproducing kernel. Hence, V(s, t) = E[\kappa(x_s, x_t)] is a reproducing kernel. This concludes the demonstration.

The following properties consider a discrete-time stochastic process; obviously, the properties are also satisfied for continuous-time processes.

Property 2: V[m] is a symmetric function: V[-m] = V[m].

Property 3: V[m] reaches its maximum at the origin, i.e., V[m] \le V[0], \forall m.

Property 4: V[m] > 0 and V[0] = \frac{1}{\sqrt{2\pi}\,\sigma}.










All these properties can be easily proved. Properties 2 and 3 are also satisfied by the

conventional autocorrelation function, whereas Property 4 is a direct consequence of the

positiveness of the Gaussian kernel.

Property 5: Let \{x_n, n = 0, \ldots, N-1\} be a set of i.i.d. data drawn according to some distribution p(x). The mean value of the GCF estimator (2-10) coincides with the estimate of the information potential obtained through Parzen windowing with Gaussian kernels.

Proof: The Parzen pdf estimate is given by \hat{p}(x) = \frac{1}{N}\sum_{n=0}^{N-1}\kappa_\sigma(x - x_n), and the estimate of the information potential is

    \hat{V}(x) = \int\hat{p}^2(x)\, dx = \frac{1}{N^2}\sum_{i=0}^{N-1}\sum_{n=0}^{N-1}\bar\kappa(x_i - x_n), (2-20)

where \bar\kappa denotes a Gaussian kernel with twice the kernel size of \kappa.

On the other hand, the GCF estimate is \hat{V}[m] = \frac{1}{N-|m|}\sum_n\kappa(x_n - x_{n-m}), for -(N-1) \le m \le (N-1), and therefore its mean value is

    \bar{V} = \frac{1}{2N-1}\sum_{m=-N+1}^{N-1}\frac{1}{N-|m|}\sum_n\kappa(x_n - x_{n-m}). (2-21)

Finally, it is trivial to check that all the terms in (2-20) are also in (2-21). This concludes the proof.

Property 5 clearly demonstrates that this generalization includes information about the pdf. On the other hand, we also showed that it also conveys information about the correlation. For these reasons, in the sequel we will refer to V[m] as correntropy.






Property 6: Given V[m] for m = 0, \ldots, P-1, the following Toeplitz correntropy matrix of dimensions P \times P,

    V = \begin{pmatrix} V[0] & V[1] & \cdots & V[P-1] \\ V[1] & V[0] & \cdots & V[P-2] \\ \vdots & \vdots & \ddots & \vdots \\ V[P-1] & V[P-2] & \cdots & V[0] \end{pmatrix}, (2-22)

is positive definite.

Proof: The matrix V can be decomposed as V = E[A_n], where A_n is given by

    A_n = \begin{pmatrix} \kappa(x_n, x_n) & \kappa(x_n, x_{n-1}) & \cdots & \kappa(x_n, x_{n-P+1}) \\ \kappa(x_{n-1}, x_n) & \kappa(x_{n-1}, x_{n-1}) & \cdots & \kappa(x_{n-1}, x_{n-P+1}) \\ \vdots & \vdots & \ddots & \vdots \\ \kappa(x_{n-P+1}, x_n) & \kappa(x_{n-P+1}, x_{n-1}) & \cdots & \kappa(x_{n-P+1}, x_{n-P+1}) \end{pmatrix}. (2-23)

If \kappa(x_i, x_j) is a kernel satisfying Mercer's conditions, then A_n is a positive definite matrix \forall n. On the other hand, the sum of positive definite matrices is also positive definite [38]; this proves that V is a positive definite matrix.

Property 7: Let \{x_n, n \in T\} be a discrete-time w.s.s. zero-mean Gaussian process with autocorrelation function r[m] = E[x_n x_{n-m}]. The correntropy function for this process is given by

    V[m] = \begin{cases} \frac{1}{\sqrt{2\pi}\,\sigma}, & m = 0 \\ \frac{1}{\sqrt{2\pi(\sigma^2 + \sigma^2[m])}}, & m \ne 0 \end{cases} (2-24)

where \sigma is the kernel size and \sigma^2[m] = 2(r[0] - r[m]).

Proof: The correntropy function is defined as V[m] = E[\kappa(x_n, x_{n-m})]. Since x_n is a zero-mean Gaussian random process, for m \ne 0, z_m = x_n - x_{n-m} is also a zero-mean Gaussian random variable with variance \sigma^2[m] = 2(r[0] - r[m]). Therefore

    V[m] = \int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{z_m^2}{2\sigma^2}\right)\frac{1}{\sqrt{2\pi}\,\sigma[m]}\exp\left(-\frac{z_m^2}{2\sigma^2[m]}\right) dz_m. (2-25)

Since we are considering a Gaussian kernel with variance \sigma^2, equation (2-25) is the convolution of two zero-mean Gaussians of variances \sigma^2 and \sigma^2[m] evaluated at the origin; this yields (2-24) immediately.

Property 7 clearly reflects that correntropy conveys information about the time structure of the process and also about its pdf via quadratic Renyi's entropy. As a consequence of Property 7, if \{x_n, n \in T\} is a white zero-mean Gaussian process with variance \sigma_x^2, we have that V[m] = 1/\sqrt{2\pi(\sigma^2 + 2\sigma_x^2)} \;\forall m \ne 0, which coincides with the mean value of the function and, of course, is the information potential of a Gaussian random variable of variance \sigma_x^2 when its pdf has been estimated via Parzen windowing with a Gaussian kernel of size \sigma^2.
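The white-noise case is easy to verify numerically. The sketch below (our own, with arbitrary parameter choices) compares the sample estimate of V[m] against the closed form above for a white zero-mean Gaussian process.

import numpy as np

rng = np.random.default_rng(0)
sigma, sigma_x = 1.0, 0.5                  # kernel size and process standard deviation
x = rng.normal(0.0, sigma_x, 100_000)      # white zero-mean Gaussian process

kappa = lambda u: np.exp(-u**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)
m = 5
V_hat = kappa(x[m:] - x[:-m]).mean()                                # sample estimate of V[m]
V_closed = 1.0 / np.sqrt(2 * np.pi * (sigma**2 + 2 * sigma_x**2))   # Property 7 for m != 0
print(V_hat, V_closed)                     # the two values agree closely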

Property 8: The correntropy estimator (2-10) is unbiased and asymptotically consistent.

The properties of the estimator can be derived following the same lines used for the conventional correlation function [39].

2.3 Similarity Measure

For template matching and detection the decision statistic is in general a similarity function. For matched filtering the cross-correlation function is used. Here we define a nonlinear similarity metric which can be explained in probabilistic terms. Inspired by the concept of correntropy, we define a means of measuring similarity between two random processes.

The cross-correntropy at times i and j for two random processes x_k and y_k can be defined by

    V_{XY}(i, j) = E[\kappa(X_i, Y_j)]. (2-26)













With the assumptions of ergodicity and stationarity, the cross-correntropy at lag m between the two random processes can be estimated by

    \hat{V}_{XY}(m) = \frac{1}{N}\sum_{k=1}^{N}\kappa(x_{k-m}, y_k). (2-27)

For detection, if we assume that we know the timing of the signal we just need to use

    \hat{V}_{XY}(0) = \frac{1}{N}\sum_{k=1}^{N}\kappa(x_k, y_k). (2-28)


Property 9: The cross-correntropy estimate defined in (2-28) for two random variables x_k and y_k, each i.i.d., approaches through Parzen density estimation a measure of probability density along a line, given by

    D_{XY} = \int\int p_{XY}(x, y)\,\delta(x - y)\, dx\, dy, (2-29)

where \delta(x - y) is the Dirac delta function. D_{XY} in (2-29) defines the integral along the line x = y of the joint probability density function (pdf) p_{XY}. Hence, this gives a direct probabilistic measure of how similar the two random variables are.

Proof: The estimate of the joint pdf using Parzen windowing with a Gaussian kernel and N available samples can be written as

    \hat{p}_{XY}(x, y) = \frac{1}{N}\sum_{k=1}^{N}\kappa_\sigma(x - x_k)\,\kappa_\sigma(y - y_k), (2-30)

where \kappa_\sigma is the Gaussian kernel with variance \sigma^2. Using the estimate of the pdf we have

    \hat{D}_{XY} = \int\int\hat{p}_{XY}(x, y)\,\delta(x - y)\, dx\, dy (2-31)

    = \frac{1}{N}\sum_{k=1}^{N}\int\int\kappa_\sigma(x - x_k)\,\kappa_\sigma(y - y_k)\,\delta(x - y)\, dx\, dy (2-32)

    = \frac{1}{N}\sum_{k=1}^{N}\int\kappa_\sigma(x - x_k)\,\kappa_\sigma(x - y_k)\, dx. (2-33)

Here we can use the convolution property of the Gaussian kernel resulting in

    \hat{D}_{XY} = \frac{1}{N}\sum_{k=1}^{N}\kappa_{\sqrt{2}\sigma}(x_k - y_k). (2-34)

This coincides with (2-28) with the kernel having variance 2\sigma^2.

This can also be inferred from (2-26), since

    V_{XY}(0) = E[\kappa_\sigma(x, y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}\kappa_\sigma(x, y)\, p_{XY}(x, y)\, dx\, dy. (2-35)

This is just the integral along a strip (defined by the kernel) on the x = y line. For \sigma \to 0, the kernel would be the Dirac delta function, coinciding with (2-29). Figure 2-1 illustrates this graphically. This means that V_{XY}(0) increases as more data points in the joint space lie closer to the line x = y, and the kernel size regulates what is considered close. This also gives the motivation for using V_{XY} as a similarity measure between X and Y.

Property 10: The cross-correntropy estimate defined in (2-28) for two random variables x_i and y_i, each i.i.d., is directly related through Parzen density estimation to the Cauchy-Schwarz pdf distance defined in (2-5).

To explore the relationship between these two quantities let us estimate the pdfs of these random variables using Parzen estimation:

    \hat{p}_X(x) = \frac{1}{N}\sum_{i=1}^{N}\kappa_\sigma(x - x_i), (2-36)

    \hat{p}_Y(y) = \frac{1}{N}\sum_{i=1}^{N}\kappa_\sigma(y - y_i). (2-37)

The numerator in (2-5) can be written as

    \int\hat{p}_X(x)\,\hat{p}_Y(x)\, dx = \frac{1}{N^2}\sum_{j=1}^{N}\sum_{i=1}^{N}\kappa_{\sqrt{2}\sigma}(x_i - y_j) \approx E_Y E_X\left[\kappa_{\sqrt{2}\sigma}(X - Y)\right] \approx \frac{1}{N}\sum_{i=1}^{N}\kappa_{\sqrt{2}\sigma}(x_i - y_i) = \hat{V}_{XY}. (2-38)

















Figure 2-1. Probabilistic interpretation of V_{XY} (the maximum of each curve has been normalized to 1 for visual convenience).

Here the approximations are for switching between statistical expectations and sample means. With these estimates, the Cauchy-Schwarz pdf distance (2-5) becomes

    \hat{D}(p_X, p_Y) = -\log\frac{\hat{V}_{XY}}{\sqrt{\hat{G}_\sigma(X)\,\hat{G}_\sigma(Y)}}, (2-39)

where \hat{G}_\sigma(X) and \hat{G}_\sigma(Y) are estimates of the information potential, as defined in (2-4), for X and Y, respectively. Note that \hat{V}_{XY} accounts for the interaction between the two random variables in the Cauchy-Schwarz pdf distance since it gives an estimate of the inner product between the two pdfs as shown in (2-38).
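To make (2-36)-(2-39) concrete, the following is a minimal sketch of the Parzen plug-in estimate of the Cauchy-Schwarz pdf distance; every pairwise term uses a kernel of size sqrt(2)*sigma, the convolution of two size-sigma Gaussian kernels. The function name is ours and this is a sketch, not code from the dissertation.

import numpy as np

def cs_pdf_distance(x, y, sigma):
    # Parzen plug-in estimate of the Cauchy-Schwarz pdf distance (2-5)
    s2 = np.sqrt(2) * sigma
    kappa = lambda u: np.exp(-u**2 / (2 * s2**2)) / (np.sqrt(2 * np.pi) * s2)
    cross = kappa(x[:, None] - y[None, :]).mean()   # inner product of the two pdfs, cf. (2-38)
    ip_x = kappa(x[:, None] - x[None, :]).mean()    # information potential of X, cf. (2-4)
    ip_y = kappa(y[:, None] - y[None, :]).mean()    # information potential of Y, cf. (2-4)
    return -np.log(cross / np.sqrt(ip_x * ip_y))

rng = np.random.default_rng(1)
print(cs_pdf_distance(rng.normal(0, 1, 500), rng.normal(2, 1, 500), sigma=0.5))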










Property 11: The cross-correntropy function between two random variables X and Y defined in (2-26) is the cross-correlation of two random variables U and Z. That is,

    \hat{V}_{XY} = \frac{1}{N}\sum_{k=1}^{N}\kappa(x_k, y_k) = \frac{1}{N}\sum_{k=1}^{N}u_k z_k = \hat{R}_{UZ}, (2-40)

where \hat{R}_{UZ} is the estimate of the cross-correlation between U and Z using the sample average.

Since the correntropy matrix for the vector [X\; Y]^T, given by

    \mathbf{V}_{XY} = \begin{pmatrix} V_{XX} & V_{XY} \\ V_{YX} & V_{YY} \end{pmatrix}, (2-41)

is positive definite, then using arguments similar to those used to prove theorems 1 and 2 [2], a random vector [U\; Z]^T will exist whose correlation matrix is given by the same matrix. Thus

    \mathbf{R}_{UZ} = \begin{pmatrix} R_{UU} & R_{UZ} \\ R_{ZU} & R_{ZZ} \end{pmatrix} = \mathbf{V}_{XY}. (2-42)

Hence two random variables U and Z will exist satisfying (2-40). This shows that using the cross-correntropy function is equivalent to simply computing the cross-correlation of two other random variables nonlinearly related to the original data through correntropy.

2.4 Optimal Signal Processing Based on Correntropy

As we have discussed earlier, given the data samples \{x_i\}_{i=1}^{N}, the correntropy kernel creates another data set \{z(i)\}_{i=1}^{N}, preserving the similarity measure as E[z(i)z(j)] = E[\kappa(x_i, x_j)] = V(i, j). In [2] Parzen has described an interesting concept of the Hilbert

space representation of a random process. For a stochastic process \{X_t, t \in T\} with T being an index set, the auto-correlation function R_X(t, s) defined as E[X_t X_s] is symmetric and positive definite, thus defining an RKHS. Parzen showed that the inner product structure of this RKHS is equivalent to the conventional theory of second order stochastic

processes. According to Parzen, a symmetric non-negative kernel is the covariance kernel

of a random function (in other words random process) and vice versa. Therefore, given a









random process \{X_t, t \in T\} there exists another random process \{z_t, t \in T\} such that

    E[z_t z_s] = V(t, s) = E[\kappa(X_t, X_s)]. (2-43)

z_t is nonlinearly related to, but of the same dimension as, X_t, while preserving the similarity measure in the sense of (2-43). Meanwhile, the linear manifold of \{z_t, t \in T\} forms a congruent RKHS with correntropy V(t, s) as the reproducing kernel. The vectors

in this space would be nonlinearly related to the input space based on the underlying

input statistics and hence any linear signal processing solution using a covariance

function (given by correntropy) in this feature space would automatically give a nonlinear

formulation with respect to the original input data. For instance, the filter obtained by optimally projecting a desired signal onto the space spanned by \{z_t\} is a linear optimal filter in the feature space, and the corresponding filter is nonlinear with respect to the original input, thus potentially modeling the nonlinear dependencies between the input and the desired signal. This concept of using the feature space given by

correntropy can be clearly extended for most supervised learning methods in general. In

the following chapters we shall present the usefulness of the correntropy function both as

a similarity and correlation measure as well as a means of optimal signal processing in the

corresponding feature space.

We conclude this chapter by pointing out the differences and similarities between the RKHS induced by correntropy (VRKHS) versus the RKHS induced by the Gaussian kernel (GRKHS) as used in kernel methods, and the old Parzen formulation of the RKHS (PRKHS) based on autocorrelation of random processes. The primary characteristic of

the GRKHS is the nonlinear transformation to a high dimensional space controlled by the

number of input data samples. This creates difficulties in understanding some solutions

in the input space (e.g. kernel PCA), and requires regularized solutions, since the number

of unknowns is equal to the number of samples. Powerful and elegant methods have been

used to solve this difficulty [3], [40], but they complicate the solution. In fact, instead of










straight least squares for kernel regression, one has to compute a constrained quadratic

optimization problem. The nonlinearity is controlled solely by the kernel utilized, and

most of the times it is unclear how to select it optimally, although some important works

have addressed the difficulty [41]. Although correntropy using the Gaussian kernel is related to the GRKHS, VRKHS is conceptually and practically much closer to PRKHS. Since correntropy is different from correlation in the sense that it involves high-order statistics of input signals, the VRKHS induced by auto-correntropy is not equivalent to statistical inference on Gaussian processes. The transformation from the input space to VRKHS is nonlinear, and the inner product structure of VRKHS provides the possibility of obtaining closed form optimal nonlinear filter solutions by utilizing high-order statistics as we shall also demonstrate. Another important difference compared with existing

machine learning methods based on the GRKHS is that the feature space has the same dimension as the input space. This has advantages because in VRKHS there is no need

to regularize the solution, when the number of samples is large compared with the

input space dimension. Further work needs to be done regarding this point, but we

hypothesize that in our methodology, regularization is automatically achieved due to the

inbuilt constraint on the dimensionality of the data. The fixed dimensionality also carries disadvantages because the user has no control of the VRKHS dimensionality. Therefore, the quality of the nonlinear solution depends solely on the nonlinear transformation between the input space and VRKHS. Another important attribute of the VRKHS is

that the nonlinear transformation is mediated by the data statistics. As it is well known

from the theory of Hammerstein-Wiener models [42] for nonlinear function approximation

(and to a certain extent in SVM theory [3]), the nonlinearity plays an important role in

performance. In other words, in the Hammerstein models, a static nonlinearity which is

independent of the data is chosen a priori whereas our approach induces the nonlinearity

implicitly by defining a generalized similarity measure which is data dependent. Our

method actually encompasses and enriches Hammerstein-Wiener models by inducing










nonlinearities which may not be achieved via static nonlinearities, and are tuned to the data (i.e., the same kernel may induce different nonlinear transformations for different data sets). The RKHS induced by correntropy is indeed different from the two most widely used RKHS studied in machine learning and statistical signal processing. Being different does not mean that it will always be better, but these results show that this provides a

promising opening for both theoretical and applied research. Still certain facets of this

research, like the exact relationship of the input data with the feature data, require

further investigation for a more concrete understanding of the impact on the betterment of

conventional signal processing.









CHAPTER 3
CORRENTROPY BASED MATCHED FILTERING

3.1 Detection Statistics and Template Matching

3.1.1 Linear Matched Filter

Detecting a known signal in noise can be statistically formulated as a hypothesis testing problem, choosing between two hypotheses, one with the signal s_{1,k} present (H_1) and the other with the signal s_{0,k} present (H_0). When the channel is additive with noise n_k, independent of the signal templates, the two hypotheses are

    H_0: r_k = n_k + s_{0,k},
    H_1: r_k = n_k + s_{1,k}. (3-1)

When both the hypotheses are governed by Gaussian probability distributions, the optimal decision statistic is given by the log likelihood [43] as

    L(r) = \frac{1}{2}\log\frac{|R_0|}{|R_1|} - \frac{1}{2}\left[(r - m_1)^T R_1^{-1}(r - m_1) - (r - m_0)^T R_0^{-1}(r - m_0)\right], (3-2)


where R_0 and R_1 are the respective covariance matrices, and m_0 and m_1 are the respective mean vectors of the hypotheses. L(r) is then compared to a threshold \eta, and if the statistic is greater, hypothesis H_1, otherwise H_0, is chosen. This is a complicated quadratic form in the data that is usually not preferred to be computed. The assumption of independently and identically distributed (iid) zero mean noise reduces this to a much simpler expression. Then, R_0 = R_1 = \sigma_n^2 I, m_0 = s_0 and m_1 = s_1, where \sigma_n^2 is the noise variance, I is the identity matrix, and s_0 and s_1 are the two possible transmitted signal vectors:

    L(r) = -\frac{1}{2\sigma_n^2}\left[(r - s_1)^T(r - s_1) - (r - s_0)^T(r - s_0)\right]
    = -\frac{1}{2\sigma_n^2}\left[r^T r - 2r^T s_1 + s_1^T s_1 - r^T r + 2r^T s_0 - s_0^T s_0\right]
    = -\frac{1}{2\sigma_n^2}\left[-2r^T s_1 + s_1^T s_1 + 2r^T s_0 - s_0^T s_0\right]. (3-3)










The terms not depending on r can be dropped and the expression can be rescaled without any effect on the final decision. These changes will be reflected in the threshold to which the statistic is compared. So, we can use the following decision statistic, which is nothing but the difference between the correlations of the received signal r with the two templates:

    L(r) = r^T(s_1 - s_0). (3-4)


This, in fact, is the output y_k of the matched filter at the time instant k = \tau, the signal length, such that

    y_k = r_k * h_k, (3-5)

where * denotes convolution and h_k = s_{1,\tau-k} - s_{0,\tau-k} is the matched filter impulse response. Thus, the filter output is composed of a signal and a noise component. The output achieves its maximum value at the time instant \tau when there is maximum correlation between the matched filter impulse response and the template, thereby maximizing the signal to noise ratio (SNR), defined as the ratio of the total energy of the signal template to the noise variance:

    SNR = \frac{1}{\sigma_n^2}\sum_{k=0}^{\tau-1}s_k^2. (3-6)
The matched filter is one of the fundamental building blocks of almost all communication

receivers, automatic target recognition systems, and many other applications where

transmitted waveforms are known. The wide applicability of the matched filter principle

is due to its simplicity and optimality under the linear additive white Gaussian noise

(AWGN) framework.

3.1.2 Correntropy as Decision Statistic

Now inspired by properties 9 and 10 in chapter 2, we shall define the decision

statistic used henceforth. The properties above demonstrate that cross-correntropy

is a probabilistic similarity between two random vectors. Let us assume that we have

a receiver and a channel with white additive noise. For simplicity we take the binary











Table 3-1. Values for the statistic for the two cases using the Gaussian kernel.

    Received signal  |  Statistic L_\sigma
    -----------------+---------------------------------------------------------------
    r_0 = n + s_0    |  \frac{1}{N\sqrt{2\pi}\,\sigma}\sum_{i=1}^{N}\left[e^{-(s_{1,i}-s_{0,i}-n_i)^2/2\sigma^2} - e^{-n_i^2/2\sigma^2}\right]
    r_1 = n + s_1    |  \frac{1}{N\sqrt{2\pi}\,\sigma}\sum_{i=1}^{N}\left[e^{-n_i^2/2\sigma^2} - e^{-(s_{1,i}-s_{0,i}+n_i)^2/2\sigma^2}\right]


detection problem where the received vector is r with two possible cases: (a) when the signal s_0 is present (hypothesis H_0, r_0 = n + s_0) and (b) when the signal s_1 is present (hypothesis H_1, r_1 = n + s_1). Now we basically want to check whether the received vector is 'closer' to s_1 (validating H_1) or to s_0 (validating H_0) based on our similarity measure (correntropy). Thus when the timing of the signals is known and they are synchronized we define the following as the correntropy matched filter statistic:

    L_\sigma(r) = \frac{1}{N}\sum_{i=1}^{N}\kappa(r_i, s_{1,i}) - \frac{1}{N}\sum_{i=1}^{N}\kappa(r_i, s_{0,i}). (3-7)

With the two cases of either transmitting r_1 (signal s_1 in noise) or r_0 (signal s_0 in noise), table 3-1 summarizes the values of the statistic for the correntropy matched filter using the Gaussian kernel. Since the pdf of the noise is considered symmetrical, n and -n are statistically the same and hence L_\sigma(r_1) = -L_\sigma(r_0).

We should also note that the correntropy matched filter given by (3-7) defaults to the linear matched filter because, according to property 11 in chapter 2, (3-7) is also measuring the difference in correlation between the random feature vectors derived from correntropy corresponding to r, s_0 and s_1. Thus the correntropy matched filter creates a linear matched filter that is nonlinearly related to the original input data. If the transmitted signal is lagged by a certain delay and this is not known a priori, the following correntropy










decision statistic should be used:

    L_\sigma(r) = \max_m\frac{1}{N}\sum_{i=1}^{N}\kappa(r_{i-m}, s_{1,i}) - \max_m\frac{1}{N}\sum_{i=1}^{N}\kappa(r_{i-m}, s_{0,i}), (3-8)

where r_{i-m} is the ith sample in the received signal vector of samples in the symbol window delayed by m.

Now depending on whether the detection scheme is synchronous or not, all that is

required is for Lo to be compared with a threshold to decide when the signal template was

transmitted.
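The statistics (3-7) and (3-8) are inexpensive to compute, with complexity O(N) per template. The sketch below, with our own function name and the Gaussian kernel of (2-3), implements the synchronized statistic and the delay-searching variant; the received vector r must contain at least len(s1) + max_delay samples.

import numpy as np

def cmf_statistic(r, s1, s0, sigma, max_delay=0):
    # Correntropy matched filter: (3-7) when max_delay == 0, (3-8) otherwise
    kappa = lambda u: np.exp(-u**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)
    def v(template):
        # largest cross-correntropy between the delayed received chip and the template
        return max(kappa(r[m:m + len(template)] - template).mean()
                   for m in range(max_delay + 1))
    return v(s1) - v(s0)

# decide H1 when cmf_statistic(...) exceeds a threshold chosen from the ROC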

3.1.3 Interpretation from Kernel Methods

So far we have introduced the correntropy matched filter from an information

theoretic learning (ITL) perspective. We can also arrive at expression (3-7) from kernel methods

with a few assumptions. We shall transform the data from the input space to the kernel

feature space and compute (3-4) in the feature space by using the kernel trick. But

instead of using the original kernel \kappa we shall use the sum of kernels \bar\kappa defined by

    \bar\kappa(r, s) = \frac{1}{N}\sum_{i=1}^{N}\kappa(r_i, s_i), (3-9)

where r = [r_1, r_2, \ldots, r_N]^T and s = [s_1, s_2, \ldots, s_N]^T are the input vectors. Note that \bar\kappa is a valid kernel by all means. It is trivial to show that \bar\kappa is symmetrical and positive definite merely from the fact that it is a sum of symmetrical and positive definite functions, implying that it is a Mercer kernel. Hence \bar\kappa can be written as

    \bar\kappa(r, s) = \Phi(r)^T\Phi(s), (3-10)


where \Phi maps the input vector to a possibly (depending on the kernel chosen) infinite dimensional feature vector. With the two possible transmitted vectors s_0 and s_1 and r as the received signal in the input space, the corresponding feature vectors will be \Phi(s_0), \Phi(s_1) and \Phi(r), respectively. Now applying the decision statistic (3-4) in the feature









space we get

    L_\kappa(r) = \left(\Phi(s_1) - \Phi(s_0)\right)^T\Phi(r) = \frac{1}{N}\sum_{i=1}^{N}\kappa(r_i, s_{1,i}) - \frac{1}{N}\sum_{i=1}^{N}\kappa(r_i, s_{0,i}). (3-11)

L_\kappa coincides with (3-7). Of course, L_\kappa is not the maximum likelihood statistic in the

feature space, since the data in the feature space is not guaranteed to be Gaussian. But

second order information in the feature space is known to extract higher order information

in the input space. For example (2-4) shows that the squared norm of the mean of the

feature vectors is the information potential which gives Renyi's quadratic entropy. We can

expect the same effect here and, as we shall see later, is also demonstrated in the results.

3.1.4 Impulsive Noise Distributions

Since we aim to show the effectiveness of the proposed method in impulsive noise

environments, we shall briefly introduce the most commonly used pdf models for

such distributions. These distributions are commonly used to model noise observed in

low-frequency atmospheric noise, fluorescent lighting systems, combustion engine ignition,

radio and underwater acoustic channels, economic stock prices, and biomedical signals

[44], [45], [46]. There are two main models used in literature that we present next. To our

knowledge, a single detection method that can be applied easily to both such models has

not been presented in the literature so far. We shall demonstrate that the proposed CMF is an exception.

3.1.4.1 Two-term Gaussian mixture model

The two-term Gaussian mixture model, which is an approximation to the more general Middleton Class A noise model [47], has been used to test various algorithms under an impulsive noise environment [23], [46], [48]. The noise is generated as a mixture of two Gaussian density functions such that the noise distribution is f_n(x) = (1 - \epsilon)N(0, \sigma_1^2) + \epsilon N(0, \sigma_2^2), where \epsilon measures the percentage of noise spikes and usually \sigma_2 \gg \sigma_1.
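Generating samples from this mixture is straightforward; a minimal sketch follows (function name and example parameters are our own).

import numpy as np

def gaussian_mixture_noise(N, eps, sigma1, sigma2, rng=None):
    # Draw from (1 - eps) N(0, sigma1^2) + eps N(0, sigma2^2), with sigma2 >> sigma1
    rng = rng or np.random.default_rng()
    spikes = rng.random(N) < eps                # samples drawn from the impulsive term
    scales = np.where(spikes, sigma2, sigma1)
    return rng.normal(0.0, 1.0, N) * scales

noise = gaussian_mixture_noise(1000, eps=0.05, sigma1=1.0, sigma2=10.0)  # 5% strong spikes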









3.1.4.2 Alpha-stable distribution

ac-stable distributions are also widely used to model impulsive noise behavior [44],

[45]. This distribution gradually deviates from Gaussianity as a~ decreases from 2 to 1

(when the distribution becomes Cauchy). This range is also appropriate because even

though the higher moments diverge, the mean is still defined. Though the pdf of an

ac-stable distribution does not have a closed form expression, it can be expressed in terms

of its characteristic function (Fourier transform of the pdf). The general form of the

characteristic function and many details on the ac-stable distribution can be found in [49].

Here we shall only consider the symmetric ac-stable noise. The characteristic function of

such noise is given by



where o- represents a scale parameter, similar to a standard deviation. Such a random

variable has no moment greater or equal to a~, except for the case a~ = 2 [49]. It is

noteworthy that the deterministic or degenerate case (a~ = 0), the Gaussian case (a~=2),

the Cauchy case (a~=1) and the Levy or Pearson distribution (in the non-symmetric

framework, for a~=0.5) are the only cases for which the pdf possesses a closed form

expression [49].
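For simulations, symmetric \alpha-stable samples can be drawn with the standard Chambers-Mallows-Stuck method, sketched below; this generator is common practice and is not taken from the dissertation.

import numpy as np

def symmetric_alpha_stable(N, alpha, scale=1.0, rng=None):
    # Chambers-Mallows-Stuck generator for symmetric alpha-stable noise (beta = 0)
    rng = rng or np.random.default_rng()
    U = rng.uniform(-np.pi / 2, np.pi / 2, N)   # uniform phase
    W = rng.exponential(1.0, N)                 # unit-mean exponential
    if alpha == 1.0:
        return scale * np.tan(U)                # Cauchy case
    X = (np.sin(alpha * U) / np.cos(U) ** (1.0 / alpha)
         * (np.cos(U - alpha * U) / W) ** ((1.0 - alpha) / alpha))
    return scale * X

noise = symmetric_alpha_stable(1000, alpha=1.1)  # heavy-tailed, as in the experiments below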

3.1.4.3 Locally suboptimal receiver

When we present the simulations in a later section, for one case using additive \alpha-stable noise, we shall compare the performance of the proposed CMF detector with the locally suboptimum (LSO) detector, which gives an impressive performance with minimum complexity [50]. So we shall briefly introduce these concepts. For details please refer to the respective citations.

The LSO detector is derived directly from the locally optimum (LO) detector [51], whose test statistic is given by

    T_{LO}(r) = \sum_{k=1}^{N}s_k\, g_{LO}(r_k), (3-13)








where the nonlinear score function is

    g_{LO}(x) = -\frac{f_n'(x)}{f_n(x)}, (3-14)

and f_n'(x) is the first derivative of f_n(x), the pdf of the additive noise. Since f_n(x) does not have a closed form expression when the noise is \alpha-stable distributed, g_{LO} cannot be found exactly. The LSO detector instead uses for the score function the approximation g_{LSO}(x) proposed in [50], built around \Lambda = 2.73\alpha - 1.75, the empirical estimate of the peak of g_{LO}. As can be easily seen, a drawback of the LSO detector is that one needs to know the value of \alpha before it can be employed. We shall assume that this information is available in the simulations that will be presented at the end of this chapter.

3.1.5 Selection of Kernel Size

The kernel size (the variance parameter in the Gaussian kernel) is a free parameter in kernel based and information theoretic methods, so it has to be chosen by the user. There are in fact numerous publications in statistics on selecting the proper kernel size in the realm of density estimation [52], [53], [54], but a systematic study on the effect of the kernel size in ITL is far from complete. For the particular case of the correntropy matched filter we can show the following interesting property.

Property 1: The decision using the correntropy statistic reduces to the optimal Gaussian ML decision in the input space given by (3-4) as the kernel size (the variance of the Gaussian kernel) increases.

Proof: Using the Taylor series expansion we get

    κ(r_n, s_n) = 1 − ||r_n − s_n||²/(2σ²) + ||r_n − s_n||⁴/(2(2σ²)²) − · · ·   (3-16)









Since the order of the terms increases by 2 with each term, as σ increases the contribution of the higher order terms becomes less significant compared to the lower order terms. Then κ(r_n, s_n) ≈ 1 − ||r_n − s_n||²/(2σ²), and it is easy to see that using the correntropy statistic is equivalent to using L, where L is given by (3-4).
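The limiting behavior in Property 1 is easy to check numerically. The following minimal Python sketch (the function and variable names are ours; the statistic is written as the sample average of the Gaussian kernel between the received chip and the template, consistent with the discussion above) shows that for a large kernel size the correntropy statistic becomes an affine function of the squared error ||r − s||², so it induces the same decisions as the Gaussian ML rule:

    import numpy as np

    def cmf_statistic(r, s, sigma):
        # Sample average of the Gaussian kernel between received chip and template.
        return np.mean(np.exp(-(r - s) ** 2 / (2 * sigma ** 2)))

    rng = np.random.default_rng(0)
    s = np.sin(2 * np.pi * np.arange(1, 65) / 10)   # template used in the experiments
    r = s + 0.5 * rng.standard_normal(64)           # one received chip in AWGN

    for sigma in [0.5, 2.0, 10.0, 100.0]:
        # From (3-16): for large sigma, the statistic ~ 1 - ||r - s||^2 / (2 sigma^2 * 64),
        # so thresholding it is equivalent to thresholding the squared error.
        approx = 1 - np.sum((r - s) ** 2) / (2 * sigma ** 2 * 64)
        print(sigma, cmf_statistic(r, s, sigma), approx)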

This means that the kernel size acts as a means of tuning the correntropy matched filter, with a larger kernel size adjusting it toward linearity and Gaussianity. For instance, in AWGN, by Property 1 the kernel size should be relatively larger than the dynamic range of the received signal so that the performance defaults to that of the linear matched filter, which is optimal for this case. In the experiments presented below, the kernel size is chosen heuristically so that the best performance is observed. In the cases where the correntropy matched filter is expected to bring benefits, like for additive impulsive noise, the kernel size should be selected according to density estimation rules to represent the template well (i.e., Silverman's rule [53], or on the order of the variance of the template signal). Silverman's rule is given by σ_opt = σ_X {4N⁻¹(2d + 1)⁻¹}^{1/(d+4)}, where σ_X is the standard deviation of the data, N is the data size and d is the dimension of the data. As shall be illustrated later, choosing the appropriate kernel size for the detection problem at hand is not very exacting (unlike in many other kernel estimation problems) since a wide interval provides optimal performance for a given noise environment.
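A small sketch of Silverman's rule as reconstructed above (the function name is ours):

    import numpy as np

    def silverman_kernel_size(x, d=1):
        # Silverman's rule as stated in the text:
        # sigma_opt = sigma_X * {4 N^-1 (2d+1)^-1}^(1/(d+4)).
        x = np.asarray(x, dtype=float)
        N = len(x)
        return x.std() * (4.0 / (N * (2 * d + 1))) ** (1.0 / (d + 4))

    # Example: kernel size suggested for the 64-sample sinusoidal template (d = 1).
    print(silverman_kernel_size(np.sin(2 * np.pi * np.arange(1, 65) / 10)))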

3.2 Experiments and Results

Receiver operating characteristic (ROC) curves [55] were used to compare the performance of the linear matched filter (MF), the matched filter based on mutual information (MI) and the proposed correntropy matched filter (CMF). ROC curves plot the probability of detection P_D against the probability of false alarm P_FA for a range [0, ∞) of threshold values γ (the highest threshold corresponds to the origin). The area under the ROC can be used as a means of measuring the overall performance of a detector. The ROC curves were plotted for various values of signal to noise ratio (SNR), defined as the ratio of the total energy of the signal template to the noise variance. For









α-stable distributions, where the variance of the noise is not defined, SNR was estimated as the ratio of signal power to the squared scale of the noise. 10,000 Monte Carlo (MC) simulations were run for each of the following cases. The template is a sinusoid signal of length 64 given by s_i = sin(2πi/10), i = 1, 2, ..., 64. Segments (chips) of length equal to the signal template, some containing the signal and others without the signal, were generated with a fixed probability of transmitting a signal. Various simulations were performed under the following situations.

1. Additive white Gaussian noise linear channel, r_k = s_k + n_k.

2. Additive white impulsive noise linear channel where the noise is generated as a mixture of two Gaussian density functions such that the noise distribution is f(n) = (1 − ε)N(0, σ₁²) + εN(0, σ₂²), where the percentage of noise spikes is ε = 0.15 and σ₂² ≫ σ₁².

3. Additive zero mean α-stable noise channel.

For each case two sets of ROC curves shall be presented: one for the case when the timing of the received symbols is known and the signal is synchronized (using the statistic (3-7)) and the other when there might be an unknown delay in the received symbols (using the statistic (3-8)). These two cases shall be referred to respectively as synchronous detection and asynchronous detection in the following sections. For the latter, the delay was simulated to be not larger than 20 samples and hence the corresponding similarity measures (correlation (MF), correntropy (CMF), mutual information (MI) and correlation with the suboptimal score (LSO)) were maximized over the lags less than 20. We have used the method given in [56] with the skewness parameter β = 0 to generate the symmetric α-stable distributed samples.
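For completeness, a minimal sketch of the symmetric case of the Chambers-Mallows-Stuck generator (assuming [56] refers to that method; the function is ours):

    import numpy as np

    def sas_noise(alpha, scale, size, seed=None):
        # Chambers-Mallows-Stuck generator for symmetric alpha-stable samples
        # (skewness beta = 0). For alpha = 1 the formula reduces to Cauchy.
        rng = np.random.default_rng(seed)
        v = rng.uniform(-np.pi / 2, np.pi / 2, size)   # V ~ U(-pi/2, pi/2)
        w = rng.exponential(1.0, size)                 # W ~ Exp(1), independent of V
        if np.isclose(alpha, 1.0):
            x = np.tan(v)
        else:
            x = (np.sin(alpha * v) / np.cos(v) ** (1.0 / alpha)
                 * (np.cos((1.0 - alpha) * v) / w) ** ((1.0 - alpha) / alpha))
        return scale * x

    noise = sas_noise(alpha=1.5, scale=1.0, size=64, seed=0)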

3.2.1 Additive White Gaussian Noise

For the AWGN case, the normal matched filter is optimal so it should outperform all the other filters under test. But the CMF, although obtained in a different way, provides almost the same performance, as can be expected from (3-16) for a large kernel size, and as observed in figure 3-1 for synchronous detection and figure 3-2 for asynchronous detection. In fact, the performance of the CMF will approach arbitrarily close to that of the MF as the kernel size increases. The MI based matched filter also approaches similar performance when the kernel size is increased to high values, but the results are a bit inferior because the computation of CS-QMI [24] disregards the ordering of the samples.





Figure 3-1. Receiver operating characteristic curves for synchronous detection in an AWGN channel with kernel variance σ²(CMF) = 15, σ²(MI) = 15 (the curves for MF and CMF for 10 dB overlap).


3.2.2 Additive Impulsive Noise by Mixture of Gaussians

When the additive noise is impulsive, the proposed CMF clearly outperforms both the MI detector and the linear MF (see figure 3-3 for synchronous detection and figure 3-4 for asynchronous detection). The increased robustness to impulsive noise can be attributed to the properties of the correntropy function using the Gaussian kernel as mentioned earlier (heavy attenuation of the large data values of the outliers).
















Figure 3-2. Receiver operating characteristic curves for asynchronous detection in AWGN with kernel variance σ²(CMF) = 15, σ²(MI) = 15.


3.2.3 Alpha-Stable Noise in Linear Channel

Now let us observe the behavior of the detection methods for α-stable distributed noise. In this case the comparison with the LSO detector is also presented, since it is close to optimal for additive α-stable distributed noise [50]. We assume that the value of α is known beforehand to use this detector. Since these distributions have second moments tending to infinity, the matched filter utterly fails. Figures 3-5 and 3-6 show these results for synchronous detection. It can be seen that the CMF and the LSO detector both give almost identical performance, but the LSO detector requires one to know the exact value of α a priori. Of course, as α increases and approaches 2, the performance of the linear MF improves since the noise then becomes Gaussian (see figure 3-6). The curves in figure 3-6 for the two values of α corresponding to each nonlinear detector (CMF, LSO and MI) almost coincide. Since the variance of the α-stable noise













Figure 3-3. Receiver operating characteristic curves for synchronous detection in additive impulsive noise with kernel variance σ²(CMF) = 5, σ²(MI) = 2.

is not well defined, SNR here implies the ratio of signal power to the squared scale of the noise.







Figure 3-7 shows the ROC plots for the case with asynchronous detection. Once again, the




performance of the CMF rivals that of the LSO detector, which is exclusively designed for the α-stable noise for a given α. These simulations demonstrate the effectiveness of the proposed CMF for this widely used impulsive noise model.

3.2.4 Effect of Kernel Size

The choice of kernel size, though important, is not as critical as in many other kernel methods and density estimation problems. For detection with the correntropy matched filter, we decided to plot the area under the ROC curve for different values of the kernel size to evaluate its effect. As can be seen in figures 3-9 and 3-8, a wide range of kernel sizes works well. For the case with impulsive noise using the two-term Gaussian model, the values given by Silverman's rule (2.2 and 4.05 for SNR values













Figure 3-4. Receiver operating characteristic curves for asynchronous detection in additive impulsive noise with kernel variance σ²(CMF) = 5, σ²(MI) = 2.


of 5 dB and 0 dB respectively) fall in the high performance region, but for the α-stable noise Silverman's rule is not computable since the variance of the noise is ill-defined. However, it is trivial to choose a value for the kernel size through a quick scan to select the best performance for the particular application.

3.2.5 Low Cost CMF Using Triangular Kernel

Though we have presented most of our arguments previously using the Gaussian kernel, these arguments hold for any valid kernel. In fact, the type of kernel is usually chosen based on the type of problem or simply based on convenience. For example, if it is known beforehand that the nonlinearity involved in a system to be estimated is polynomial, then a polynomial kernel would be used [57]. Likewise, if the problem solution is based on Parzen's pdf estimation, then a pdf kernel is used, like the Gaussian, Laplacian, etcetera. Here we shall demonstrate the use of a triangular kernel so as to simplify the























Figure 3-5. Receiver operating characteristic curves for synchronous detection in additive white α-stable distributed noise, kernel variance σ² = 3, SNR = 15 dB.


implementation of the CMF even more. The Gaussian kernel, though it gives superior performance, has to be evaluated by the machine, usually through a polynomial series expansion, and incurring this complexity may not be necessary. For instance, in our problem here, by using the triangular kernel (which is a valid Parzen kernel) we can simplify the CMF implementation. The triangular kernel evaluated between two real points x and y is given by

    κ_T(x, y) = a − |x − y| if |x − y| ≤ a, and 0 otherwise,   (3-17)

where a is a real positive number.











Figure 3-10 shows this function. All of the ideas and arguments presented in this chapter























a"0.75 _' =1.8 CL=1.5



0.65)

0.6C ** MF
---- CMF
0.55 :- M
I LSO
0.5
O 0.05 0.1 0.15 0.2 0.25 0.3 0.35 0.4 0.45 0.5
P,


Figure 3-6. Receiver operating characteristic curves for synchronous detection in additive white α-stable distributed noise, kernel variance σ² = 3, SNR = 15 dB; the plots for MI and CMF almost coincide for both values of α.


will still be valid for this kernel. Note that using this kernel, one does not require even a single multiplication operation. Figure 3-11 shows the ROC plot of the previous methods along with that of the CMF using the triangular kernel. The SNR is 5 dB with the noise generated with a mixture of Gaussian pdfs as discussed before. The triangular kernel (3-17) used a width 2a = 1.5. The parameter values for the other methods are the same as shown in figure 3-3.
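A minimal sketch of the triangular-kernel CMF statistic, under the reconstruction of (3-17) given above (only subtractions, absolute values and comparisons are needed per kernel evaluation):

    import numpy as np

    def triangular_kernel(x, y, a=0.75):
        # Triangular (Parzen) kernel of (3-17): a - |x - y| inside the support,
        # 0 outside; the support width is 2a (2a = 1.5 in the experiment).
        return np.maximum(0.0, a - np.abs(x - y))

    def cmf_triangular(r, s, a=0.75):
        # Correntropy matched filter statistic with the triangular kernel.
        return np.mean(triangular_kernel(r, s, a))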
































Figure 3-7. Receiver operating characteristic curves for asynchronous detection in additive white α-stable distributed noise, kernel variance σ² = 3, SNR = 15 dB; the plots for MI and CMF almost coincide for both values of α.









































Figure 3-8. Area under the ROC for various kernel size values for additive alpha-stable distributed noise using synchronous detection, α = 1.1.















Figure 3-9. Area under the ROC for various kernel size values for additive impulsive noise
with the mixture of Gaussians using synchronous detection.


(a")


Figure 3-10. Triangular function that can be used as a kernel.
































Figure 3-11. Receiver operating characteristic curves for SNR of 5 dB for the various detection methods.










CHAPTER 4
APPLICATION TO SHAPE CLASSIFICATION OF PARTIALLY OCCLUDED
OBJECTS

4.1 Introduction

A vast number of papers and books in the statistical and pattern analysis literature has been devoted to the problem of recognizing shapes using various landmarks [58], [59], [60]. Nevertheless, one of the major challenges yet to be solved is the automatic extraction of landmarks for the shape when the object is partially occluded [60], [61], [46]. The problem is mostly caused by misplaced landmarks corresponding to the occluded part of the object. Using the proposed robust detector, the influence of the occluded portions will automatically be less significant. The idea is that the occlusion may be treated as outliers, and hence may be modeled as some kind of heavy tailed distribution. In a previous chapter, we have already seen that the correntropy matched filter is robust against impulsive noise. With our approach there is no need to extract so called landmarks based on certain properties of the shape boundary like change in curvature or presence of corners, etc. Instead our algorithm uses the shape boundaries directly, trying to match them with one of the known shape templates. This is done by simultaneously adjusting for the possible rotations and scaling required through a fixed number of parameter updates.

4.2 Problem Model and Solution

A large class of classification problems can be solved using the classical linear model. Consider the following M-ary classification problem. For m = 0, 1, ..., M − 1, the mth hypothesis is

    Y = AX_m + E,   (4-1)

where Y is a K × N measurement matrix, AX_m is the measurement component due to the signal of interest when model or class m is in force, and E is an additive zero mean noise matrix of size K × N. A is an unknown K × P matrix and X_m is a P × N matrix. The signal component can also be called a subspace signal [46]. All these quantities may also










be complex valued. Hence, given the measurement matrix Y, the problem is to classify which of the M signal components X_m is present. If some of the model parameters are unknown, as here, a usual approach employed for detectors is the generalized likelihood ratio (GLR) principle [62]. The GLR detector replaces the unknown parameters by their maximum likelihood estimates (MLEs). However, if the pdf of the noise matrix is unknown or it is not possible to find the MLEs of those parameters, the GLR based methods are hard to implement. This is usually true when the noise model deviates from the usual assumptions of Gaussianity. It is true here in our example as well. In this chapter we shall demonstrate the proposed robust detector on shape classification of partially occluded fish silhouettes. Figure 4-1 shows the set of 16 fish silhouettes

of partially occluded fish silhouettes. Figure 4-1 shows the set of 16 fish silhouettes

(downloaded from www.1ems.brown. edu/vision/software) that constitute the templates

that were applied in the numerical example. For each silhouette the boundary was

extracted. In relation to the model given by (4-1), Xm represents the matrix whose each

column is a point in the boundary in the shape and A is a scaling and rotation matrix of

the form

A = I b(4-2)
b a

A distorted version of one of the shapes in the template database is given. That shape is then matched against each of the database elements. Hence in our case, 16 template matching operations are performed. For each case, the optimal scaling, rotation and translation parameters are found by optimizing (maximizing) the correntropy between the database shape boundary and the boundary of the given distorted shape. Thus the corresponding cost function is

    J(A, d) = Σ_{i=1}^{N} exp(−||Ax_i − d − y_i||²/(2σ²)),   (4-3)

where x_i and y_i are the ith column entries of X_m and Y respectively. This function is now maximized with respect to the scaling and rotation matrix A, and the translation vector d.







Figure 4-1. The fish template database.


Let

    a = [a, b]^T.   (4-4)

Then by differentiating (4-3) with respect to a and equating the gradient to zero, one can get the following fixed point update rule:

    a ← (Σ_{i=1}^{N} exp(−||Ax_i − d − y_i||²/(2σ²)) z_i) / (Σ_{i=1}^{N} exp(−||Ax_i − d − y_i||²/(2σ²)) ||x_i||²),   (4-5)












with

    z_i = [ x_{i1}y_{i1} + x_{i2}y_{i2}
            x_{i1}y_{i2} − x_{i2}y_{i1} ],   (4-6)

where x_{ik} and y_{ik} are the kth components of the vectors x_i and y_i respectively. Similarly, by differentiating with respect to d and equating the gradient to zero, one can get the following fixed point update rule for d:

    d ← (Σ_{i=1}^{N} exp(−||Ax_i − d − y_i||²/(2σ²)) (Ax_i − y_i)) / (Σ_{i=1}^{N} exp(−||Ax_i − d − y_i||²/(2σ²))).   (4-7)

After the optimization process is finished for each of the templates in the database, the shape corresponding to the largest value of the cost function is chosen. Thus the occluded shape is recognized.

Since there were 16 silhouettes, a database of the 16 templates is stored by extracting the boundary of each of them. Since a shape is completely described by its boundary, we can use any conventional technique to get the boundaries. For instance, in our example, for each silhouette template in the database, 100 points were evenly extracted to represent the boundary.

We also have to extract the boundary points for the occluded shape. When using the proposed robust detector, we have to extract an ordered set of points (like a time series). In addition we also have to extract a fixed number of boundary points, the same as the number of boundary points in the templates. Note that these boundary points have to be properly ordered since the kernel function is evaluated for each point in the boundary with one point in the template. An easy and apparently effective thing to do is to order the points in the following way: for k = 1, 2, ..., (number of template boundary points), pick the kth point in the boundary of the template and choose the closest point in the boundary of the occluded object to be its kth point. We shall call this the nearest point ordering (NPO). Since the perimeter given by the occluded










object might be very large, it is advisable to extract a number of boundary points for the occluded object greater than that used for the templates in the database. Here we have used twice the number of points for the occluded shape than what was used for the database templates.

Thus for each shape template there will be an ordered set of vector points, i.e., the matrix X_m, m = 1, 2, ..., M, of boundary points, and also a corresponding matrix of boundary points of the occluded object given by Y. M is the number of templates or classes (here the database size, 16). The summary of the algorithm is presented next, followed by an illustrative code sketch.


Center the boundary points of the received occluded shape by subtracting the mean vector from each point.

For template index m = 1 to M
  For fixed point update index, i = 1 to number of iterations
    Perform nearest point ordering of Y with respect to X_m to get the new Y.
    Perform an update of A, d using (4-5) and (4-7).
  End For loop, getting A_opt, d_opt.
  Compute J_opt(m) = J(A_opt, d_opt).
End For loop, getting J_opt(m), m = 1, 2, ..., M.
m_opt = arg max_m J_opt(m)
Choose the m_opt-th shape template to be the most likely shape for the received occluded object.
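One template-matching pass of this summary can be sketched as follows (Python; the ordering is done against the currently transformed template, a detail the summary leaves implicit, and the y_i in the z_i terms of (4-6) are first translated by d, consistent with our reading of the derivation):

    import numpy as np

    def rotation_scale(a):
        # A = [[a, -b], [b, a]] parameterizes scaling plus rotation, as in (4-2).
        return np.array([[a[0], -a[1]], [a[1], a[0]]])

    def match_template(x, y_all, sigma=15.0, iters=30):
        # x: (N, 2) ordered template boundary; y_all: (M, 2) occluded boundary.
        a, d = np.array([1.0, 0.0]), np.zeros(2)    # identity transform to start
        for _ in range(iters):
            A = rotation_scale(a)
            t = x @ A.T - d                         # transformed template, A x_i - d
            # Nearest point ordering: closest occluded point per template point.
            idx = ((t[:, None, :] - y_all[None, :, :]) ** 2).sum(-1).argmin(1)
            y = y_all[idx]
            w = np.exp(-((t - y) ** 2).sum(1) / (2 * sigma ** 2))  # correntropy weights
            yd = y + d
            z = np.stack([x[:, 0] * yd[:, 0] + x[:, 1] * yd[:, 1],
                          x[:, 0] * yd[:, 1] - x[:, 1] * yd[:, 0]], 1)      # (4-6)
            a = (w[:, None] * z).sum(0) / (w * (x ** 2).sum(1)).sum()       # (4-5)
            d = (w[:, None] * (x @ rotation_scale(a).T - y)).sum(0) / w.sum()  # (4-7)
        return w.sum(), rotation_scale(a), d        # cost (4-3), A_opt, d_opt

The template with the largest returned cost is then declared the match.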


The inputs to the detector were occluded versions of shape number 1 in figure 4-1. The occluded versions were constructed by random rotation (within ±60 degrees), scaling by a random factor (between 1 and 2) and translation of two fishes of shape no. 1. Only occluded silhouettes where the ratio of the occluded area to the area of the silhouette itself was between 50% and 60% were considered. Figures 4-2 and 4-3 show a possible occluded version of shape no. 1 and its boundaries respectively.




































Figure 4-2. The occluded fish.


Table 4-1. Recognition and misclassification rates.

Shape number   Proposed method   LMedS (RWR)
1              95%
2 to 16        5%                2%


1000 Monte Carlo (MC) simulations were performed by generating the occluded shape randomly and classifying it as one of the 16 possible known shapes. The table shows the recognition and misclassification rates for the proposed method along with the LMedS method using row weighted residuals (RWR) as reported in [46]. As in all the methods that use the Gaussian kernel, the kernel size σ is important. In this application the kernel size tunes what is considered close and far. The far points are heavily diminished when computing the correntropy value. For instance, in this numerical demonstration, a kernel size of σ = 15 was used. Since σ is the standard deviation term in the Gaussian kernel, points that are within this distance from one another will return much higher















Figure 4-3. The extracted boundary of the fish in figure 4-2.


values when the kernel is evaluated than points that are much farther away. So using a σ of 15 signifies that the outlier-like points caused by occlusion will mostly be much farther than 15 pixels from their corresponding points of the known shape boundary, and hence will have less impact on the evaluation of the correntropy function, but the boundary points that are intact in the occluded image will likely be much closer than 15 pixels to the corresponding points of the undistorted boundary. This concept can be used to roughly estimate the required kernel size, and may further be verified through a few simulations by randomly generating occluded images. This kernel size was chosen after several simulations with random occlusion so as to maximize the percentage of detection. From the results it is clear that our technique provides a simple and elegant method with exceptional results.










CHAPTER 5
WIENER FILTERING AND REGRESSION

5.1 Linear Wiener Filter

For convenience we now present a brief derivation of the linear Wiener filter. Given a real zero mean time series x(n), the output of this linear FIR filter is given by

    y(n) = Σ_{i=0}^{L−1} w_i x(n − i),   (5-1)

where the weights w_i, i = 0, 1, ..., L − 1 are found by minimizing the mean square error (MSE) between the output y(n) and a zero mean desired response d(n), and L is the memory of the finite impulse response system, which is also the delay embedding dimension of the time series when (5-1) is represented in vector form given by


    y(n) = w^T x(n),   (5-2)

where w = [w_0, w_1, ..., w_{L−1}]^T and x(n) = [x(n), x(n − 1), ..., x(n − L + 1)]^T. It is not hard to see that the w that minimizes the MSE between y(n) and d(n) [63] is given by

    w = R_x^{−1} p,   (5-3)

where R_x is the autocorrelation matrix given by R_x = E[x(n)x(n)^T] and p is the cross-correlation vector given by p = E[d(n)x(n)]. Of course, (5-3) assumes stationarity of the time series. Assuming ergodicity, the expected values can be estimated by the corresponding time averages with a sufficiently large window. These assumptions are valid throughout this presentation.
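As a concrete reference, a minimal sketch estimating (5-3) from time averages (variable names ours):

    import numpy as np

    def wiener_weights(x, d, L):
        # Estimate w = R^-1 p of (5-3) from time averages, assuming ergodic,
        # zero-mean signals; rows of X are the embedded vectors x(n).
        X = np.array([x[n - np.arange(L)] for n in range(L - 1, len(x))])
        D = np.asarray(d)[L - 1:]
        R = X.T @ X / len(X)          # autocorrelation matrix estimate
        p = X.T @ D / len(X)          # cross-correlation vector estimate
        return np.linalg.solve(R, p)

    # The filter output is then y(n) = w^T x(n) for each embedded input vector.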

It is well known that the use of mean square error as an optimization criterion is optimal in the maximum likelihood sense for Gaussian processes. Hence all linear estimation and filtering methods that use this criterion, including the Wiener filter presented above, perform optimally for Gaussian processes. But in nature many signals cannot be expected to follow this assumption strictly. Speech, chaotic time series, and biomedical signals are just a few examples. Attempts have been made to create nonlinear solutions to the Wiener filter, mostly based on the Volterra series [25]. The approach of using the Volterra series gives solutions that are very complex with many coefficients. One of the simplest and closest in structure to the linear Wiener filter is the Hammerstein model [64], which basically is a static nonlinearity followed by the linear Wiener filter. Though this method has shown some promise, especially for certain system approximation problems [42], it has not carried over to more general signal processing problems. Using a static nonlinearity to transform the data itself also imposes limitations: a static nonlinearity only addresses the problem for a very small set of nonlinear systems. Moreover, extra complications arise from training for the optimal nonlinearity.

5.2 Correntropy Based Wiener Filter

We can now derive the nonlinear Wiener filter based on correntropy. A relatively straightforward derivation uses the correntropy function given by (2-10). As we have shown in chapter 2, given the random process x(n), (2-10) acts as a correlation function for some Gaussian random process z(n), i.e.,

    E[κ(x(i), x(j))] = E[z(i)z(j)].   (5-4)


z(n) is the transformed random process that is nonlinearly related to the original input x(n), but the exact functional relation of z(n) with x(n) is still unknown, just like in kernel methods where the exact values of the feature vectors are not known explicitly. But this gives a basis for formulating the well known optimal least squares solution with this new process z(n). The desired response used for this Wiener filter is still the original desired signal d(n). So this can be thought of as modifying the original input so that the linear manifold of the new correntropy transformed input can better include the solution to obtain the original d(n). So we are still exploiting a linear structure, but of a manifold that is nonlinearly related to the original input. This mapping from x(n) to z(n) is not simply a static nonlinear relation like with the Hammerstein models [42], but is related to both the choice of the kernel (like in kernel methods) and the statistics of the input data. So this signal z(n) is the input to a linear Wiener filter. Using a filter length of L, and d(n) as the desired signal, the solution is easily given by

    w_z = R_z^{−1} p_z,   (5-5)

and

    y(n) = w_z^T z(n),   (5-6)

where z(n) = [z(n), z(n − 1), ..., z(n − L + 1)]^T and p_z = E[d(n)z(n)]. Further assuming

ergodicity, we approximate the E[·] operator by the time average. So we have

    y(n) = w_z^T z(n),   (5-7)

and the filter output becomes

    y(n) = z(n)^T w_z
         = z(n)^T R_z^{−1} (1/N) Σ_{k=1}^{N} d(k)z(k)
         = (1/N) Σ_{k=1}^{N} Σ_{j=0}^{L−1} Σ_{i=0}^{L−1} z(n − i) r_{ij} z(k − j) d(k)
         = (1/N) Σ_{k=1}^{N} d(k) Σ_{j=0}^{L−1} Σ_{i=0}^{L−1} r_{ij} {z(n − i)z(k − j)}
         ≈ (1/N) Σ_{k=1}^{N} d(k) Σ_{j=0}^{L−1} Σ_{i=0}^{L−1} r_{ij} κ(x(n − i), x(k − j)),   (5-8)

where r_{ij} is the ij-th element of the matrix R_z^{−1}. The final expression is obtained by approximating z(n − i)z(k − j) by κ(x(n − i), x(k − j)), which holds on average because of (5-4). The final output is obtained by matching the variance of y(n) to that of the desired signal. This mismatch in scale is most likely because of the above approximation. This final result is also presented in [36].

From (5-8) it is obvious that the computational cost for producing the output at a given time instant is O(L²N), where N is the number of training data points used. The computation involved is illustrated in figure 5-1. This computation arises from the fact that the mapping to the feature vectors is not explicitly known and hence the weight vector cannot be numerically calculated. Instead, every time, all the training data samples have to be utilized to compute the output directly. This is similar to all the kernel based regression methods. But we do have an advantage in that an inversion of the Toeplitz correntropy matrix is involved (with complexity O(L²)) instead of the inversion of the large Gram matrix (which is O(N³)) required by many kernel methods. The matrix inversions in both cases have to be done once. As we shall see in the next section, we can rearrange the terms in (5-8) to make it comparable with other well known regression methods. We shall use these similarities in order to improve the CF.

Figure 5-1. The computation of the output data sample given an embedded input vector using the correntropy Wiener filter.


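The computation in figure 5-1 can be transcribed directly into code (a sketch; the final variance matching mentioned above is omitted, and all names are ours):

    import numpy as np
    from scipy.linalg import toeplitz

    def cf_output(x_train, d_train, x_new, L, sigma):
        # Correntropy Wiener filter output of (5-8) for one embedded input
        # x_new, with x_new[i] = x(n - i); cost is O(L^2 N) per output.
        gauss = lambda u: np.exp(-u ** 2 / (2 * sigma ** 2))
        Xt = np.array([x_train[k - np.arange(L)] for k in range(L - 1, len(x_train))])
        Dt = np.asarray(d_train)[L - 1:]
        # L x L Toeplitz correntropy matrix; entry (i, j) estimates V(i - j).
        v = [gauss(x_train[m:] - x_train[:len(x_train) - m]).mean() for m in range(L)]
        r = np.linalg.inv(toeplitz(v))                    # the r_ij of R_z^-1
        K = gauss(x_new[None, :, None] - Xt[:, None, :])  # K[k, i, j]
        return np.einsum('k,ij,kij->', Dt, r, K) / len(Dt)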










5.3 Simple Regression Models with Kernels

We can study the correntropy filter (CF) from a regression perspective and propose some improvements. For this let us rewrite the CF equation (5-8) in the following form:

    y(n) = Σ_{j=0}^{L−1} Σ_{i=0}^{L−1} r_{ij} y_{ij}(n),  y_{ij}(n) = (1/N) Σ_{k=0}^{N−1} d(k) κ(x_i(n), x_j(k)),   (5-9)

where x_i(n) is the ith component of the L dimensional vector x(n) and n belongs to an index set which may be time. The CF can be thought of as doing regression with the components of the vectors x(n), obtaining the y_{ij}(n), and then linearly combining them using the weights r_{ij} to get the final estimate y(n). So one means of further improving the CF is to optimize the weight parameters r_{ij} using the training data. This will likely compensate to an extent for the simplistic approximation used to obtain (5-8). Before we proceed, some well known parametric and nonparametric regression estimators are mentioned next.

5.3.1 Radial Basis Function Network

The RBF network [27] is a well known estimator which can be explained by the following equation:

    y(n) = Σ_{k=0}^{N−1} w_k κ(x(n), x(k)),   (5-10)

where the weights w_k are computed so as to minimize the MSE between the estimate y(n) and the desired variable d(n).

5.3.2 Nadaraya-Watson Estimator

Unlike the RBF network, and more like the CF, the Nadaraya-Watson (NW) estimator [26] is a nonparametric regression technique. By nonparametric we mean that no parameter is directly optimized with respect to the desired signal d(n). The output estimate is given by

    y(n) = (Σ_{k=0}^{N−1} d(k) κ(x(n), x(k))) / (Σ_{k=0}^{N−1} κ(x(n), x(k))).   (5-11)









This equation can be derived as an approximation to the optimal mean square error estimate in the following manner:

    y(n) = E{d | x = x(n)} = ∫ d · p(d | x = x(n)) dd,   (5-12)

where p(d | x = x(n)) is the conditional probability density function of d given x = x(n), which can be estimated by Parzen windowing such that

    p(d | x = x(n)) = p(d, x(n)) / p(x(n)) = (Σ_{k=0}^{N−1} κ(d, d(k)) κ(x(n), x(k))) / (Σ_{k=0}^{N−1} κ(x(n), x(k))).   (5-13)

Then from (5-12),

    y(n) = (Σ_{k=0}^{N−1} κ(x(n), x(k)) ∫ d · κ(d, d(k)) dd) / (Σ_{k=0}^{N−1} κ(x(n), x(k))) = (Σ_{k=0}^{N−1} d(k) κ(x(n), x(k))) / (Σ_{k=0}^{N−1} κ(x(n), x(k))),   (5-14)

where ∫ d · κ(d, d(k)) dd gives the center d(k) of the Gaussian kernel.
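Equation (5-11) is short enough to state directly in code (a sketch; names ours):

    import numpy as np

    def nw_estimate(X, D, x_new, sigma):
        # Nadaraya-Watson estimate (5-11) at x_new; X holds the training
        # vectors as rows and D the corresponding desired values.
        k = np.exp(-((X - x_new) ** 2).sum(axis=1) / (2 * sigma ** 2))
        return np.dot(D, k) / k.sum()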

5.3.3 Normalized RBF Network

The normalized RBF network (NRBF) [26] is, in fact, a combination of the above two regression methods. The estimate of the output is given by

    y(n) = (Σ_{k=0}^{N−1} w_k κ(x(n), x(k))) / (Σ_{k=0}^{N−1} κ(x(n), x(k))),   (5-15)

where the weights are again computed so as to minimize the MSE between the estimate and the desired variable d(n).

5.4 Parametric Improvement on Correntropy Filter

If we look at the techniques given by equations (5-10), (5-11) and (5-15) and that given by (5-9), it is clear that the kernels operate on the whole vectors x(n) in (5-10), (5-11) and (5-15), whereas the kernels operate on the components x_i(n) of the vector x(n) in (5-9), resulting in the terms y_{ij}(n). These terms are then linearly combined. To improve the results given by (5-9), we shall find the partial regression terms y_{ij}(n) by means similar to (5-12), and find the weights by minimizing the mean square error at the output. The vector x(k) is L dimensional, given by x(k) = [x_1(k), x_2(k), ..., x_L(k)]^T. Let x = [x_1, x_2, ..., x_L]^T be a random input vector. Given a new point x(n), the regression estimate with the ith component of the input given as the jth component of x(n) is given by

    y_{ij}(n) = (Σ_{k=0}^{N−1} d(k) κ(x_i(n), x_j(k))) / (Σ_{k=0}^{N−1} κ(x_i(n), x_j(k))).   (5-16)

If we take the final output to be a linear combination of these partial regression terms, then

    y(n) = Σ_{j=1}^{L} Σ_{i=1}^{L} c_{ij} y_{ij}(n).   (5-17)

This estimator is similar in form to the CF expression in (5-9) except for the normalization terms. Now the problem is finding the weights c_{ij}. We can easily find an optimal set of weights by minimizing the MSE of the estimate y(n) with respect to the reference d(n). It will bear a closed form solution since the estimate is linear in the weights. We can simply make a long vector of dimension L² and use the Wiener solution. This, unlike the RBF network or kernel regression, has the number of parameters independent of the training data size, and usually it is much smaller than the data size. We shall call the estimator given in (5-17) weighted partial regression (WPR).
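A sketch of WPR training under these definitions (the small ridge term added to the normal equations is our assumption for numerical stability, not part of the derivation):

    import numpy as np

    def partial_regressors(X, D, x_new, sigma):
        # The L x L matrix of y_ij terms of (5-16) for one input vector x_new.
        D = np.asarray(D)
        K = np.exp(-(x_new[:, None, None] - X.T[None, :, :]) ** 2 / (2 * sigma ** 2))
        return (K * D[None, None, :]).sum(-1) / K.sum(-1)   # K[i, j, k] -> y_ij

    def wpr_train(X, D, sigma, ridge=1e-6):
        # Solve for the L^2 weights c_ij of (5-17) by least squares on the
        # training set, i.e., the Wiener solution over the stacked y_ij features.
        F = np.array([partial_regressors(X, D, x, sigma).ravel() for x in X])
        return np.linalg.solve(F.T @ F + ridge * np.eye(F.shape[1]), F.T @ np.asarray(D))

    # Prediction: y = c @ partial_regressors(X, D, x_new, sigma).ravel()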

5.5 Experiments and Results

We shall use the normalized mean square error as a means of comparing the performance of the above described methods. We shall also show the change in performance with the size of the training data. Two different experiments were conducted:

1. System identification of a time-delay neural network (TDNN), basically to show the comparison of the CF with the WF for a simple nonlinear dynamical system.

2. Time series prediction of the Mackey-Glass (MG-30) time series.

5.5.1 System Identification

Synthetically generated input was passed through a TDNN, resulting in an output time series. Given the same input, the CF tried to generate the same output. We shall present this performance along with that of the linear Wiener filter to support our claim that the CF works better in a nonlinear environment. Figure 5-2 shows the TDNN system which was imitated by the CF and the WF. For these results the input to all three systems was x(n) = sin(10n) + 1.2 sin(30n) + 0.2 sin(24n). The parameters for the TDNN were

    [w11 w12]   [−0.69   0.82]
    [w21 w22] = [ 1.25   0.71]
    [w31 w32]   [−1.44   0.86]
    [w41 w42]   [−0.40  −1.59],   (5-18)

    [v1]   [0.57]
    [v2] = [0.69].   (5-19)

Figure 5-2. Time-delay neural network to be modeled.


Figure 5-3 shows the input and the output signals generated by the TDNN (desired response), the CF and the WF. Of course, the output of the TDNN was used as the desired response for the CF and WF. The performance of the CF and the WF can be seen in















Figure 5-3. Input and output signals generated by the TDNN (desired response), CF and WF using a time embedding of 2.


figure 5-4 for various time embedding dimensions. This clearly shows the superior performance of the CF over the WF in a nonlinear environment. Both systems use the same number of parameters as the embedding dimension, but for the CF the parameters (the elements of the correntropy matrix) are determined from the input data beforehand instead of through training or optimization with respect to the desired response. Next we shall compare the CF and the WPR with the RBF network and the Nadaraya-Watson estimator.

5.5.2 Time Series Prediction of Mackey-Glass Time Series

Here we shall show the results of one step prediction of the Mackey-Glass (MG-30) time series. Figure 5-5 shows the comparison of MSE values for different filter lengths (embedding dimensions). As can easily be seen, for the four nonlinear methods the best results were achieved for a filter length of L = 6, and for the WPR it was achieved for L = 7. These values should be close to the optimal embedding length according to Takens




















Figure 5-4. Mean square error values for WF and CF for the system modeling example.




Figure 5-5. Mean square error values for MG time series prediction for various embedding sizes.















Figure 5-6. Mean square error values for MG time series prediction for different sizes of training data.


embedding theorem [65]. Figure 5-6 shows the performance as a function of the amount of training data used for an embedding dimension of 6. The CF, the RBF network, the NW estimator, the normalized RBF, and the WPR used kernel width (σ²) values of 0.56, 2, 0.22, 2, and 0.07, respectively. These values were determined experimentally to give the least MSE for the respective methods. All the estimators used a training data size of 100 samples and were tested using 200 samples. Parameters for the RBF and normalized RBF networks were found by adding a regularization term of 10⁻³ to the diagonal of the Gram matrix during training to avoid ill-conditioning.

We can conclude that the nonlinear Wiener filter (CF) is superior to the linear Wiener filter but still lags behind other nonlinear methods. Since this filter has no parameter trained with respect to the desired response (it uses the data directly), it is unfair to compare it with other parametric methods. With respect to the N-W estimator, the CF still lags a bit for larger data sizes. The derivation of the CF employs a very naive approximation in (5-8) that most likely gives rise to most of this error. Since the exact transformation from the input data points to the space where the linear filtering framework is employed is not known, it is difficult to find a correction to this approximation. Nonetheless, this discussion is still intriguing and motivates us to find a better means of incorporating correntropy in minimum mean square error (MMSE) filtering. One good approach is to devise an on-line scheme using stochastic gradients. This formulation will use a better approximation to derive the correntropy based LMS (least mean square) algorithm. This is discussed in more detail in the next chapter.

It can be observed that the prediction using the weighted partial regression (WPR) method, which was derived by using the same form as the CF but also using ideas from the other simple nonlinear regression techniques, is very similar in performance to the RBF and the normalized RBF networks, and also improves upon the CF for all sizes of the training data. A noteworthy property observed with the correntropy filter (CF), the WPR and the NW regression is that the error during training and that during testing are very similar, unlike for the RBF and the normalized RBF networks. This can be attributed to the absence of trained parameters in the CF and NW regression and the fewer parameters in WPR than in the RBF networks. The WPR has only L² parameters (independent of the data size), L being the embedding dimension, whereas the RBF and the normalized RBF networks have N parameters each, where N is the number of data centers (the whole training data here).









CHAPTER 6
ON-LINE FILTERS

6.1 Background

The least mean squares (LMS) filter is widely used in all areas of adaptive learning, from system identification to channel equalization. The popularity of this algorithm since its introduction by Widrow and Hoff [66] in the 1960s has grown immensely because of its simplicity and effectiveness. The idea is an intelligent simplification of the gradient descent method for learning [29] by using the local estimate of the mean square error. In other words, the LMS algorithm is said to employ a stochastic gradient instead of the deterministic gradient used in the method of steepest descent. By design, it avoids the estimation of correlation functions and matrix inversions. Unfortunately these characteristics do not extend to nonlinear adaptive filters. Nonlinear system adaptation based on local gradients requires separate blocks of data, called the training set, to be made available before operation. Of course, the LMS algorithm can still be easily used to adapt the linear output layer of nonlinear models such as the radial basis function (RBF) network. However, the centers of the basis are selected by the training data (either by centering each Gaussian on a sample or through clustering [27]), so a block of data is still necessary. For multilayer perceptrons (MLPs) the LMS stochastic gradient has to be heavily modified to take into consideration the nonlinearities (the backpropagation algorithm [27]). In MLPs the need for a training set is not fundamental, i.e., backpropagation can be applied one sample at a time, but since all the parameters of the system change with each sample, the performance is so poor early on and the convergence so slow that a training set is practically required. Moreover, the performance surface is most often non-convex with many local minima, which further complicates training.

Kernel methods have also been proposed to produce nonlinear algorithms from linear ones expressed with inner products by employing the famed kernel trick [3]. More recently, there has been an interest in the machine learning community in training kernel regressors or classifiers one sample at a time to counteract the size of the huge datasets of some real world applications. Important theoretical results have proved the convergence of on-line learning algorithms with regularization in reproducing kernel Hilbert spaces [67], and a simple stochastic gradient based algorithm for adaptation has been developed [68]. Here we derive two LMS type filters, one based on correntropy and the other on kernel methods, though both can be related through kernel methods.

6.2 Correntropy LMS

As described in the previous chapter, given the random process x(n), (2-10) acts as a correlation function for some Gaussian random process z(n), i.e.,

    E[κ(x(i), x(j))] = E[z(i)z(j)].   (6-1)

Let this signal z(n) be the input to a linear filter such that the output is given by

    y(n) = w_z^T z(n),   (6-2)

where z(n) = [z(n), z(n − 1), ..., z(n − L + 1)]^T. The least mean squares (LMS) filter can be derived by using the stochastic version of the cost function (obtained by dropping the expected value operator in J(w_z) = E{(w_z^T z(n) − d(n))²}), resulting in Ĵ(w_z) = (w_z^T z(n) − d(n))², whose gradient is

    ∇Ĵ(w_z) = −2e(n)z(n),   (6-3)

where e(n) = d(n) − w_z^T z(n) is the instantaneous error at time n. Since we are minimizing the cost function, we shall apply the method of gradient descent using the stochastic gradient (6-3). Thus the updated weight at each instant n is given by

    w_z(n) = w_z(n − 1) + 2ηe(n − 1)z(n − 1).   (6-4)










From (6-4) it can be easily seen that w_z(n) is related to the initialization w_z(0) such that

    w_z(n) = w_z(0) + 2η Σ_{i=1}^{n−1} e(i)z(i).   (6-5)

We can put w_z(0) = 0, and the output at n is given by

    y(n) = z(n)^T w_z(n) = 2η Σ_{i=1}^{n−1} e(i) {z(i)^T z(n)}
         = 2η Σ_{i=1}^{n−1} e(i) Σ_{k=0}^{L−1} {z(i − k)z(n − k)}
         ≈ 2η Σ_{i=1}^{n−1} e(i) Σ_{k=0}^{L−1} κ(x(i − k), x(n − k)),   (6-6)

where we have approximated Σ_{k=0}^{L−1} {z(i − k)z(n − k)} by Σ_{k=0}^{L−1} κ(x(i − k), x(n − k)). This approximation becomes more accurate as L increases, since

    E[z(i − k)z(n − k)] = E[κ(x(i − k), x(n − k))].   (6-7)

We shall refer to the formulation y(n) = 2η Σ_{i=1}^{n−1} e(i) Σ_{k=0}^{L−1} κ(x(i − k), x(n − k)) as correntropy LMS (CLMS).
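A direct sketch of the CLMS recursion (quadratic in the number of samples as written; the prediction at time n reuses all past errors, and the names are ours):

    import numpy as np

    def clms(x, d, L, eta, sigma):
        # Correntropy LMS of (6-6): y(n) = 2*eta * sum_i e(i) * sum_k kernel terms.
        def kappa(i, n):
            lags = np.arange(L)
            diff = x[i - lags] - x[n - lags]
            return np.exp(-diff ** 2 / (2 * sigma ** 2)).sum()
        y, e = np.zeros(len(x)), np.zeros(len(x))
        for n in range(L, len(x)):
            y[n] = 2 * eta * sum(e[i] * kappa(i, n) for i in range(L, n))
            e[n] = d[n] - y[n]
        return y, e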

6.3 Kernel LMS

In this section we present the kernel LMS algorithm. The basic idea is to perform the linear LMS algorithm given by (6-4) in the kernel feature space. For this let us assume that φ maps the point x(n) in the input space to φ(x(n)) in the kernel feature space with ⟨φ(x(n)), φ(x(m))⟩ = κ(x(n), x(m)), where ⟨·, ·⟩ represents the inner product in the kernel Hilbert space. This transformation to feature space is, for the most widely used kernels, nonlinear, and depending on the choice of the kernel this space can be infinite dimensional. The Gaussian kernel that we use here corresponds to an infinite dimensional Hilbert space. Since this feature space is linear, φ(x(n)) can be considered an infinite dimensional column vector with the usual vector inner product. Let Ω be the weight vector in this space such that the output is y(n) = ⟨Ω(n), φ(x(n))⟩, where Ω(n) is Ω at time















Figure 6-1. Linear filter structure with feature vectors in the feature space.

n. Let d(n) be the desired response. Figure 6-1 shows the nonlinear filtering scheme with the input vector x(n) being transformed to the infinite feature vector φ(x(n)), whose components are then linearly combined by the infinite dimensional weight vector. Note that there is only one layer of weights in this nonlinear filter, but since the size of the feature space is potentially infinite, it is a universal approximator [63]. Now, due to the linear structure of the RKHS, the cost function J_Ω(n) = E[(d(n) − y(n))²] can be minimized with respect to Ω. This can be done in the same way as in (6-4), using the stochastic instantaneous estimate of the gradient vector, which yields

    Ω(n + 1) = Ω(n) + 2ηe(n)φ(x(n)).   (6-8)

Like before, η is the step-size parameter that controls the convergence speed and misadjustment of the adaptation algorithm [63, 69]. The only catch here is that Ω in (6-8) lives in the infinite dimensional feature space and it would be practically impossible to update it directly. Instead we shall use (6-8) to relate each Ω(n) to its initialization











Ω(0). This would easily give

    Ω(n) = Ω(0) + 2η Σ_{i=0}^{n−1} e(i)φ(x(i)).   (6-9)

For convenience we shall choose Ω(0) to be zero (hence e(0) = d(0)). The final expression for Ω(n) becomes

    Ω(n) = 2η Σ_{i=0}^{n−1} e(i)φ(x(i)).   (6-10)

It is here that we shall exploit the kernel trick. Given Ω(n) from (6-10) and the input φ(x(n)), the output at n is given by

    y(n) = ⟨Ω(n), φ(x(n))⟩ = 2η Σ_{i=0}^{n−1} e(i)⟨φ(x(i)), φ(x(n))⟩ = 2η Σ_{i=0}^{n−1} e(i)κ(x(i), x(n)).   (6-11)

We call (6-11) the Kernel LMS algorithm. It is clear that, given the kernel, Kernel LMS



Figure 6-2. Error samples for KLMS in predicting the Mackey-Glass time series.


has a unique solution because it is solving a quadratic problem in feature space. Notice also that the weights of the nonlinear filter are never explicitly used in the Kernel LMS algorithm, so the order of the filter is not user controllable. More importantly, since the present output is determined solely by the previous inputs and all the previous errors, it can be readily computed in the input space. These error samples are similar to the ones in sequential state estimation, where they are also termed innovations [63], as they add new information to improve the output estimate. Each new input sample results in an output and hence a corresponding error, which is never modified further and is incorporated in the estimate of the next output. This recursive computation makes Kernel LMS especially useful for on-line nonlinear signal processing, but the complexity of the algorithm increases linearly as new error samples are used. The error samples obtained for the Mackey-Glass time series (used in our experiments) are shown in figure 6-2, and they show a surprisingly fast convergence. The initial errors in the adaptation tend to prevent the algorithm from overfitting the data. The initial errors are the most dominant, since the errors decay quickly, and they seem to have a prominent role in the estimates of the output. We believe this nonparametric improvement in the output samples is a reason for not requiring any kind of explicit regularization. In practice, if the errors are satisfactorily small after p input samples, then the upper index in the summation in (6-11) can be fixed and the computational complexity for each successive output will be O(p). This is, of course, larger than that for the linear LMS algorithm, but still smaller than for most nonlinear algorithms. The kernel size (the variance parameter in the radial basis function) is a free parameter which affects the overall performance of the algorithm, like in any kernel based algorithm, and can be chosen through a quick cross validation step. Silverman's rule of thumb [53] is another alternative. For our simulations the kernel size chosen by the variance of the data works reasonably well. Here, just like for the linear LMS algorithm, the step size controls the convergence speed and misadjustment of Kernel LMS. Through linear adaptive filter theory it can be expected that for convergence, η is upper bounded by the reciprocal of the largest eigenvalue of the data covariance (in the feature space), which is difficult to estimate and dependent upon the kernel size.
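The on-line recursion of (6-11) is only a few lines (a sketch; errors are computed once, stored, and never revised):

    import numpy as np

    def klms(X, D, eta, sigma):
        # Kernel LMS, equation (6-11); X is a sequence of embedded input vectors
        # and D the desired samples. The feature-space weight vector is never
        # formed: only past inputs and their (fixed) errors are stored.
        centers, errors, preds = [], [], []
        for x, d in zip(X, D):
            if centers:
                k = np.exp(-((np.asarray(centers) - x) ** 2).sum(1) / (2 * sigma ** 2))
                y = 2 * eta * float(np.dot(errors, k))
            else:
                y = 0.0
            preds.append(y)
            centers.append(np.asarray(x, dtype=float))
            errors.append(d - y)
        return np.array(preds), np.array(errors)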

Comparing equations (6-6) and (6-11), we can see that CLMS can be considered a special case of KLMS where, instead of the kernel κ(x(i), x(n)), the kernel Σ_{k=0}^{L−1} κ(x(i − k), x(n − k)) is used, which is a valid symmetric and positive definite kernel since it is a sum of symmetric and positive definite functions. Thus, henceforth we shall discuss KLMS, and the discussion will also imply CLMS unless otherwise stated.

6.4 Self-Regularized Property

Kernel LMS, through the Mercer theorem, is nothing but plain LMS in the feature space (potentially infinite dimensional). Training a large weight vector usually warrants proper regularization, which is invariably done in methods like kernel regression [70] and some RBF networks [27]. But LMS, as we have also observed in our experiments, can be shown to have an inherent regularization. One way to show this is to demonstrate that the weight vector tends to a small norm solution. This is especially true when the weight vector is initialized to zero. We shall discuss this approach in more detail, though a more rigorous argument can also be made through classical regression theory [71].

For this discussion we need to consider the natural modes of the adaptation. These are nothing but the components along the dominant eigenvectors of the data autocorrelation matrix R (this could be an infinite dimensional matrix for KLMS). Let q_i be the ith eigenvector (corresponding to the ith largest eigenvalue λ_i) of R. Let us denote the weight-error vector in the LMS filter by

    ε(n) = w_o − w(n),   (6-12)

where w_o denotes the optimum Wiener solution for the tap-weight vector and w(n) is the estimate produced by the LMS filter at iteration n. Then ε(n) can be written as

    ε(n) = Σ_i ν_i(n)q_i,   (6-13)

where ν_i(n) is the projection of ε(n) on the eigenvector q_i such that

    ν_i(n) = q_i^T ε(n),   (6-14)









and the squared norm of ε(n) is given by

    ||ε(n)||² = Σ_i ν_i²(n).   (6-15)

Similarly, the weight vector w(n) can be written in terms of its natural modes as

    w(n) = Σ_i ω_i(n)q_i,   (6-16)

such that

    ||w(n)||² = Σ_i ω_i²(n).   (6-17)

As derived in [63], when the step-size parameter μ of the LMS filter is small, the first two moments of the natural mode ν_i(n) are given by

    E[ν_i(n)] = ν_i(0)(1 − μλ_i)^n,   (6-18)

and

    E[ν_i²(n)] = ν_i²(0)(1 − μλ_i)^{2n} + (μJ_min/(2 − μλ_i))[1 − (1 − μλ_i)^{2n}],   (6-19)

where J_min is the minimum mean square error. Regularization is mostly necessary when the autocorrelation matrix is ill-conditioned, that is, when a few eigenvalues are very small. This is also true for KLMS. So when an eigenvalue λ_i is small, such that μλ_i ≈ 0,

    E[ν_i(n)] ≈ ν_i(0),   (6-20)

and

    E[ν_i²(n)] ≈ ν_i²(0).   (6-21)

Hence, when we initialize the weight vector w(0) to 0, the natural modes ω_i(n) corresponding to the small eigenvalues will remain almost unchanged, i.e., around 0 in the mean sense. Also, the excess mean square error is defined as the difference between the mean square error J(n) produced by the LMS filter at time n and the minimum mean square error J_min. It has been shown, also in [63], that

    J_ex(n) = J(n) − J_min = Σ_i λ_i E[ν_i²(n)].   (6-22)

This means that the performance of the LMS filter is not affected by the natural modes corresponding to the very small eigenvalues. Thus, for a minimum norm weight vector, one requires the modes ω_i(n) corresponding to the very small eigenvalues to remain around zero, and from (6-20), (6-21) and (6-22) this is what the LMS algorithm achieves when the weight vector is initialized to zero. The fact that the LMS algorithm is regularized can also be demonstrated through conventional regularization theory, which is discussed in more detail in [71].

6.5 Experimental Comparison between LMS and KLMS

To demonstrate the performance of the proposed method we will present some simulation results. The mean square error will be used to compare the performance of the Kernel LMS algorithm (KLMS) and that of the linear LMS algorithm for the one step prediction of the Mackey-Glass (MG-30) time series. The simulations implement equation (6-11) for the KLMS and equation (6-6) for the CLMS. The kernel size and the step size were determined for best results after scanning the parameters. The data was normalized to unit variance beforehand. The kernel size σ² was chosen to be 1 for these experiments. It was also observed that the performance was not very sensitive to kernel sizes between 1 and 4. The values of the step size for the LMS, the KLMS and the CLMS algorithms were chosen as 0.01, 0.5, and 0.05 respectively.

The plots presented include the learning curve and comparisons of MSE values for different embedding dimensions. To plot the learning curve, after each update the learning was frozen and a new batch of 300 data samples was used to estimate the mean square error. Figure 6-3 shows the learning curves for both algorithms (the step size values were chosen for fastest convergence). Surprisingly, the speeds of convergence of both methods (LMS and KLMS) are comparable (i.e., the two learning curves are basically parallel to










each other) even though the KLMS is working in an infinite dimensional Hilbert space, where theoretically the eigenvalue spread is unconstrained and statistical reasoning requires regularization to be performed [72]. This can be attributed to the fact that the scattering of the data, although existing in an infinite dimensional space in theory, is such that far fewer canonical directions are dominant (the corresponding covariance matrix has a few eigenvalues that are dominant, while the others are zero for practical purposes). Since the LMS only searches the space of significant eigenvalues anyway, it tends to concentrate on the signal subspace, suggesting that explicit regularization is not required. Figure 6-3 also includes the regularized case [68], where the results were only satisfactory when the regularization parameter was close to zero. Here we used a regularization parameter of 0.02 and a step size of 0.7. These parameters were chosen after a set of trials to obtain the best MSE after convergence. A systematic way of choosing the regularization parameter for this kind of stochastic learning is still lacking.

The KLMS and CLMS filters achieve the best results for the embedding dimension of 6 (figure 6-4), which is the optimal value for the MG time series according to Takens embedding theorem [65]. For the plots, all the algorithms were trained (on-line) with 200 data samples and were tested using a testing set of 300 new data points. As a reference we have also included the results using an RBF network (trained with the least squares method) with 200 RBFs centered at the samples, and 300 new samples used for testing. The results obtained with KLMS are surprisingly close to those of the RBF network considering that the 200 weights are used so differently: in the RBF all the 200 weights are optimized for the data set, while in the KLMS the errors are computed once and fixed for the subsequent samples. Notice also that training the KLMS and the CLMS filters was much simpler, with no need for matrix computation and inversion. As we have already mentioned, the major drawback of the KLMS method is the continually growing architecture. In the following sections we shall provide a very effective remedy for this.














Figure 6-3. Learning curves for the LMS, the KLMS and the regularized solution.


6.6 Kernel LMS with Restricted Growth

6.6.1 Sparsity Condition for the Feature Vectors

By checking the linear dependency of the input feature vectors, one can find a means of reducing the growing architecture of the kernel LMS algorithm. In this subsection, the basic method of checking the linear dependency is presented. A vector φ(x(n)) is linearly dependent on the vectors in {φ(x(i)) : i ∈ r_{n−1}} (where r_{n−1} is the list of indexes of the vectors prior to time n that are linearly independent) if

    φ(x(n)) = Σ_{k=1}^{length(r_{n−1})} β_k φ(x(r_{n−1,k}))   (6-23)
















Figure 6-4. Comparison of the mean square error for the three methods with varying embedding dimension (filter order for LMS) of the input.


with at least one nonzero β_k, where r_{n−1,k} is the kth element of r_{n−1}. So we can use the following cost function to estimate β and to check for linear dependency:

    J(β) = ||φ(x(n)) − Σ_{k=1}^{length(r_{n−1})} β_k φ(x(r_{n−1,k}))||²
         = κ(x(n), x(n)) − 2k_{n−1}^T β + β^T G_{n−1} β,   (6-24)

where k_{n−1} = [κ(x(n), x(r_{n−1,1})), κ(x(n), x(r_{n−1,2})), ..., κ(x(n), x(r_{n−1,length(r_{n−1})}))]^T, β = [β_1, β_2, ..., β_{length(r_{n−1})}]^T, and G_{n−1} is the Gram matrix formed with the vectors in {φ(x(i)) : i ∈ r_{n−1}}, i.e., G_{n−1} = Φ_{n−1}^T Φ_{n−1}, where Φ_{n−1} is a matrix whose kth column is φ(x(r_{n−1,k})). Minimizing (6-24)









would result in
SGn_1 = k_,T

Or t T G,_1 (6-25)




It is easy to see that G_n is the Gram matrix G_{n-1} with n added to the list r_{n-1}. Thus we just need to find the vector β such that all but possibly the last element of [-β^T 1] G_n are zeros. Let T_n be the transform on G_n such that T_n G_n is upper triangular. Then the last row t_n of T_n is [-β^T 1], and hence from (6-25) t_n gives the least squares solution to the cost function given in (6-24). Now, if the cost function at the least squares solution is sufficiently small (i.e., less than a small positive δ), then we shall consider this new feature vector to be linearly dependent on the vectors given by the indexes in r_{n-1}. Otherwise, the vector is considered to be almost or sufficiently linearly independent of the vectors given by r_{n-1}, and its index n is added to the list of independent vectors r_{n-1}.

Using the optimal value given by (6-25), the cost function would be

J_opt = κ(x(n), x(n)) - k_{n-1}^T β. (6-26)

It is easy to see that J_opt is nothing but the last element of t_n G_n, or equivalently the last diagonal element of the upper triangular matrix T_n G_n. So the method of testing the linear dependency simply requires two steps: first, use a linear transform to make the Gram matrix upper triangular, and second, compare the element in the last row and last column of the resulting matrix to δ as mentioned before. Next we explain the exact numerical technique, using sequential Gaussian elimination, to perform these steps, resulting in the improved algorithm.
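To make the two-step test concrete, here is a minimal sketch in Python with NumPy; the function names, the Gaussian kernel, and the δ value are our illustrative choices, not part of the algorithm's specification:

```python
import numpy as np

def gaussian_kernel(x, y, sigma2=1.0):
    # Gaussian kernel kappa(x, y) = exp(-||x - y||^2 / (2*sigma2))
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma2))

def is_dependent(centers, x_new, delta=0.2, sigma2=1.0):
    """Test whether Phi(x_new) is (almost) linearly dependent on the
    feature vectors of the stored centers, via the Gram matrix."""
    pts = centers + [x_new]
    n = len(pts)
    G = np.array([[gaussian_kernel(a, b, sigma2) for b in pts] for a in pts])
    # Step 1: triangularize the Gram matrix with Gaussian elimination
    # (no pivoting needed: the stored centers are independent by construction,
    # so the leading pivots stay above delta > 0).
    for j in range(n - 1):
        for i in range(j + 1, n):
            G[i, :] -= (G[i, j] / G[j, j]) * G[j, :]
    # Step 2: compare the last diagonal element (J_opt of Eq. 6-26) to delta.
    return G[-1, -1] < delta
```

In the actual algorithm the triangularization is not recomputed from scratch; it is updated with one elimination step per incoming sample, as described next.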










6.6.2 Algorithm

Let the new feature input Φ(x(n)) in (6-11) be linearly dependent on the previous feature vectors Φ(x(i)), i = 0, 1, ..., n-1. Then for some real a_i's,

Φ(x(n)) = Σ_{i=0}^{n-1} a_i Φ(x(i)). (6-27)

Substituting this in (6-11),

y(n + 1) = η Σ_{i=0}^{n-1} e_{a,n}(i) κ(x(i), x(n + 1)), (6-28)

where e_{a,n}(i) = e(i) + a_i e(n). Thus the computation of y(n + 1) requires one less kernel evaluation than that given by (6-11). If more such input feature vectors are linearly dependent on the previous ones, the growth of the algorithm suggested by (6-11) would reduce significantly. Of course, if the data samples in the input space are distinct and a positive definite kernel function is used, none of the corresponding feature vectors will be exactly linearly dependent. But in practice they can be awfully close. That is why large Gram matrices are usually ill conditioned. In fact, we shall effectively employ one step of Gaussian elimination at each time instant on the Gram matrix given by the kernel to determine the linear dependency of the new input feature vector with the previous ones. Before we explain the approach, let us define a few variables. G_n is the kernel Gram matrix corresponding to the inputs {x(i) : i ∈ r_n}, and


v_n = [κ(x(n), x(r_{n,1})), κ(x(n), x(r_{n,2})), ...]^T, (6-29)


where r_n is the vector of indexes of the inputs whose corresponding feature vectors are linearly independent, and r_{n,k} is the kth element of r_n. The definition of r_n will be clear when the steps of the algorithm are presented next (a code sketch follows the steps):

1. Initialize T_1* = T_1 = 1; G_1* = G_1 = 1; r_1 = 1; e_{a,1} = 1.









2. The output and the error are calculated as

y(n) = η Σ_{i=1}^{length(r_{n-1})} e_{a,n-1}(i) κ(x(r_{n-1,i}), x(n)), (6-30)

and e(n) = d(n) - y(n), where e_{a,n}(i) is the ith element of e_{a,n}.

3. Form the new Gram matrix by bordering the old one:

G_n = [ G_{n-1}  v_n ; v_n^T  κ(x(n), x(n)) ].

4. Set T_n = [ T*_{n-1}  0 ; 0  1 ]. It is easy to see that T_n G_n is upper triangular except for its last row.

5. Perform n Gaussian elimination row operations on T_n G_n and the same operations on T_n to obtain G_n* and T_n*. Then at each step G_n* is upper triangular and T_n* G_n = G_n*.



6. If the last element g_n of the last row (and hence the whole row) of G_n* is zero (or less than a small user-defined quantity δ), it is obvious that the Gram matrix G_n is ill-conditioned and hence the new feature vector Φ(x(n)) is (almost) linearly dependent on the previous feature vectors. If this is true and the last row of T_n* is t_n, then

t_n G_n ≈ 0, (6-31)

and hence t_n gives the weights with which Φ(x(n)) is linearly dependent on the previous vectors {Φ(x(i)) : i ∈ r_n}. In this case the feature vector Φ(x(n)) is redundant, and the matrices T_n* and G_n* are reverted back to T*_{n-1} and G*_{n-1} by simply deleting the last row and the last column. The update e_{a,n}(i) = e_{a,n-1}(i) + a_i e(n) is also performed, where e_{a,n}(i) is the ith element of e_{a,n} and a_i = -t_{n,i}, where t_{n,i} is the ith element of t_n.

7. If g_n > δ, then Φ(x(n)) is certainly linearly independent of the previous vectors, and the updates r_n = [r_{n-1}, n] and e_{a,n} = [e_{a,n-1}, e(n)] are performed.



























Figure 6-5. Learning curves of the linear LMS and kernel LMS with restricted growth for three values of δ.


6.7 Experimental Comparison between KLMS and KLMS with Restricted
Growth

To demonstrate the performance of the proposed method we will present some simulation results. The mean square error will be used to compare the performance of the kernel LMS with restricted growth (KLMSR) and that of the linear LMS algorithm for the one-step prediction of the Mackey-Glass (MG30) time series. For these experiments, an embedding dimension of 6 was used, which is the optimal value according to Takens embedding theorem [65]. The simulations implement equation (6-30) for the KLMSR and equation (6-4) for the linear LMS. The kernel size and the step size were determined for best results after scanning the parameters. The data was normalized to unit variance beforehand. The kernel size σ² was chosen to be 1 for these experiments. It was also observed that the performance was not very sensitive to kernel sizes between 1 and 4.
























Figure 6-6. Performance of kernel LMS with restricted growth for various values of δ.


The values of the step size for the LMS and the KLMSR algorithms were chosen as 0.01 and 0.5, respectively.

The plots presented include the learning curve and comparisons of MSE values for different values of the tolerance threshold δ. Note that for δ = 0 we obtain the ever-growing RBF network. To plot the learning curve, after each update the learning was frozen and a new batch of 300 data samples was used to estimate the mean square error. Figure 6-5 shows the learning curves for the LMS and KLMSR algorithms (the step size values were chosen for fastest convergence). It can be observed that there is a slight penalty in performance for larger values of δ, since the RBF filter size is actually very different in these three cases (see figure 6-7). The speeds of convergence for the three different values of δ are practically the same in the beginning, since all three algorithms are still growing, but after the 500th sample they stabilize (i.e., the learning curves are almost parallel to each other). This shows that the learning process still continues in the algorithm


















Figure 6-7. Number of kernel centers after training for various values of δ.


even when new kernel centers are not added. Figure 6-6 shows the performance of the algorithm for a range of values of δ, and figure 6-7 shows the total number of kernel centers accumulated using 600 training data samples. As expected, the number of kernel centers reduces drastically, from 600 to less than 100, as δ increases. Yet the degradation in performance for this data set is comparatively much less, and the overall filter performs much better than the adaptive linear combiner trained with LMS.

6.8 Application to Nonlinear System Identification

6.8.1 Motivation

As we have already mentioned in the previous sections, in addition to Vapnik's widely known support vector machine [3], there have been numerous kernel methods proposed in recent years, like kernel principal component analysis [4], kernel Fisher discriminant analysis [5], and kernel canonical correlation analysis [6], [7], with applications in classification and nonlinear regression. Though they have shown enormous potential










in solving complex problems, their usefulness has been rather limited in many real systems requiring on-line operation. These algorithms cannot be employed online because of the difficulties posed by the basic kernel method, such as the time and memory complexity and the need to avoid overfitting.

One of the early online nonlinear regression algorithms with a well tailored complexity is the resource allocation network (RAN) [73]. The algorithm was presented as a growing radial basis function (RBF) network where, for each new data sample that satisfies some novelty condition, one RBF unit is added to the network. At each step the weights of the RBFs are adapted using the LMS algorithm [63]. Though the RAN is simple and effective, it requires the user to heuristically choose two threshold parameters for the novelty test, which may create inconsistent results. More recently a kernel RLS algorithm has been proposed that dealt with the complexity issues by applying a sparsification procedure to limit the size of the kernel Gram matrix [74]. This algorithm allows for online training but cannot handle time-varying data. An improvement to this approach has also been presented through the sliding window kernel RLS algorithm [57], where an efficient update rule is presented to compute the least squares solution for the kernel weights with data in a window that slides one sample at a time. This approach has one simple flaw: the response time of the system to any abrupt change in the data is roughly the same as the window length, which is chosen a priori. In addition, it requires the choice of a regularization parameter, which is not trivial. These algorithms have been used for nonlinear system identification. In certain cases, like estimating a communication channel, it is preferred that the implementation be online and that it respond to nonstationary variations. To this end we shall employ the kernel LMS algorithm that we presented in a previous chapter. As we shall see through simulations, this approach has certain advantages over the other methods.









6.8.2 Identification of a Wiener System

The Wiener system is a well-known simple nonlinear system which consists of a linear system followed by a static (memoryless) nonlinearity (see figure 6-8). Such a channel can be encountered in digital satellite communications and in digital magnetic recording. Traditionally, this type of nonlinear equalization or channel estimation has been tackled by nonlinear structures such as neural networks and MLPs [75]. Here we shall compare our results with the sliding window kernel RLS (KRLS), since it has already been shown that KRLS performs better than the MLPs. In addition we shall also show the results with the RAN as a reference. We should note that the RAN is the least complex, with computation of O(N) for each time step, where N is the number of RBF (kernel) units present at each step. Our method, KLMS with restricted growth, and KRLS have a computation complexity of O(N²). For KLMSR, N is the number of kernel units present at each step, and for KRLS it is the size of the sliding window. As we shall see shortly, in certain cases the number of kernel units required for KLMSR can be much smaller than the size of the sliding window for KRLS for the same performance, hence KRLS would be far more costly.









Figure 6-8. A nonlinear Wiener system.


6.8.3 Experimental Results

Supervised learning is used for the identification of an unknown Wiener system. To test the tracking capability of the various algorithms, we shall switch the coefficients of the linear system at a given instant. As mentioned in [57], we would expect the response time of the KRLS algorithm to this change to be roughly the window size that is chosen.













































We would also expect the KLMS and RAN algorithms to be much faster, since they don't employ any windowing scheme.

During the first part of the simulation, the linear channel is H₁(z) = 1 + 0.0668z⁻¹ - 0.4764z⁻² + 0.8070z⁻³, and after receiving 600 symbols it is changed to H₂(z) = 1 + 0.4326z⁻¹ - 0.0656z⁻² + 0.7153z⁻³. A binary signal (x_n ∈ {-1, +1}) is sent through this channel, after which the signal is transformed nonlinearly according to the nonlinear function v(n) = tanh(u(n)), where u(n) is the linear filter output. A scatter plot of u(n) versus v(n) can be found in figure 6-9. Finally, 15 dB of additive white Gaussian noise is added. The Wiener system is treated as a black box of which only input and output are known. The sliding window RLS algorithm was applied starting from the time index equal to the window length, since that much data is required.
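For concreteness, here is a sketch of this data generation in Python with NumPy. The sign pattern of H₂ follows our reading of the garbled coefficients in the source, so treat those signs as an assumption; the function name and seed are ours:

```python
import numpy as np

rng = np.random.default_rng(0)

def wiener_system_data(n_symbols=1000, switch_at=600, snr_db=15):
    """Binary symbols through a switched linear channel, a tanh
    nonlinearity, and additive white Gaussian noise (sketch)."""
    s = rng.choice([-1.0, 1.0], size=n_symbols)        # binary input
    h1 = np.array([1.0, 0.0668, -0.4764, 0.8070])
    h2 = np.array([1.0, 0.4326, -0.0656, 0.7153])      # signs assumed
    u = np.empty(n_symbols)
    for n in range(n_symbols):
        h = h1 if n < switch_at else h2
        window = s[max(0, n - 3):n + 1][::-1]          # s_n, s_{n-1}, ...
        u[n] = np.dot(h[:len(window)], window)
    v = np.tanh(u)                                     # static nonlinearity
    noise_var = np.var(v) / (10 ** (snr_db / 10))      # 15 dB SNR
    return s, v + rng.normal(scale=np.sqrt(noise_var), size=n_symbols)
```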




Figure 6-9. Signal in the middle of the Wiener system vs. output signal for binary input
symbols and different indexes n.











System identification was performed using three techniques: (1) kernel LMS with restricted growth (KLMSR), (2) sliding window kernel RLS (KRLS), and (3) the resource allocation network (RAN). For all the methods we applied time embedding techniques in which the length L of the linear channel was known. The input vectors for the algorithms were time-delay vectors of length L, x_n = [x_{n-L+1}, ..., x_n]^T. Since the length of the linear channels H₁ and H₂ was 4, we also used L = 4. Here are the parameter values that were chosen, mostly heuristically, for each method so as to obtain the best performance: (1) KLMSR: threshold for the linear dependency test δ = 0.2; step size = 0.2; kernel size σ² = 1. (2) KRLS: window size N = 120; regularization parameter = 0.001; kernel size σ² = 1. (3) RAN: novelty test error parameter = 0.17 and distance parameter = 0.17; step size = 0.2; kernel size σ² = 1.

The MSE values at different time instances (learning curve) are given in figure 6-10. As expected, the KRLS approach is the slowest and takes a time span approximately





Figure 6-10. Mean square error for the identification of the nonlinear Wiener system with the three methods. The values for KRLS are only shown after the first window.










equal to the length of the sliding window to adjust to a sudden change. The RAN and the proposed KLMSR algorithm both seem to work similarly, with the same response time to an abrupt change in the channel. Though the RAN is computationally more efficient, it requires one more parameter to be decided a priori. Also, both RAN and KLMSR restrict their architecture to only 16 radial basis functions in the network, automatically through their respective criteria. But using 16 as the sliding window length (and hence 16 RBFs) for KRLS is so poor that we have not included the results. KRLS required a window size of around 100 for comparable results. This is because it uses only 16 samples at any given time for estimation, whereas RAN and KLMSR (both LMS-based algorithms), though they use only 16 radial basis functions, incorporate the information of all the past data up to the given instance to estimate the output. RAN and KLMSR are continuously adapting at each step, so these two approaches are better than KRLS. So overall for this example, in terms of design simplicity and performance, KLMSR is the most appropriate choice since it has fewer parameters than RAN, but in terms of computation speed, RAN could be used.

6.9 Summary

We have derived directly Widrow's least mean squares algorithm in an infinite dimensional kernel Hilbert space and modified the algorithm to contain the growth of the resulting nonlinear filter. The algorithm uses the kernel trick to obtain the final output, effectively resulting in the adaptation of a nonlinear filter without the complexities of backpropagation, and utilizing the data pretty much as the conventional linear filter trained with LMS does. Although stochastic learning is known in the machine learning community, the problem of growing complexity had still not been addressed for this class of problems. Here we have proposed a set of sequential steps to test the linear dependency (in the feature space) of each incoming datum with the previous input samples. This, in turn, provides a means of distributing the present error to weight the kernel functions centered at the previous input samples and restrict the growth of the algorithm complexity. This method can also potentially be used for other on-line kernel based algorithms where a batch of data is not available a priori and the complexity of the algorithm needs to be restricted sequentially, one sample at a time.

It is also noteworthy that the KLMS and the improved KLMSR algorithms basically solve the least squares problem (in the feature space), implying that the gradient search is on a smooth quadratic performance surface, resulting in reliable convergence without the hassles of local minima. Another interesting fact when the Gaussian (or any translation invariant) kernel is utilized is that the transformed data lies on the surface of a sphere, i.e., the transformed data is automatically normalized, behaving like the normalized LMS algorithm, which is an important advantage.









CHAPTER 7
APPLICATION TO BLIND EQUALIZATION

7.1 Motivation

In digital communication systems, the transmitted signals are often distorted by a bandlimited channel which introduces intersymbol interference (ISI). Blind equalization refers to the problem of restoring the original digital signal when only the channel's output is available. To solve the problem, blind techniques exploit some knowledge about the statistical properties of the input signal or the structure of the channel [76].

Benveniste et al. [77] were the first to prove that a sufficient condition for perfect equalization is that the pdf of the recovered symbols be equal to the source pdf. According to this idea, some blind algorithms aiming at forcing a given pdf at the output of the equalizer have been proposed [78], [79], [80]. Later, Shalvi and Weinstein relaxed this condition by showing that it is enough to match a few cumulants between the output signal and the source in order to achieve perfect equalization [81]. For instance, the so-called super-exponential algorithm [82] maximizes the kurtosis at the output of the channel subject to a constraint on the output variance. Other algorithms based on this result are the cumulant matching methods proposed by Tugnait [83], and also by Hatzinakos and Nikias [84].

To summarize, the theoretical results in [77] and [81] demonstrate that higher-order statistics are enough to solve the problem. This fact may explain why, to date, the time structure of the source signals has not been explicitly exploited in blind equalization algorithms, even though for most practical cases it is known that the source is correlated (in coded systems, for instance). However, we feel that in any learning or adaptive algorithm we should use as much prior information as possible about the problem. According to this idea, the proposed correntropy function enables us to derive new cost functions for blind equalization that exploit simultaneously our knowledge about both the pdf of the source and its time structure.









7.2 Problem Setting

We consider a baud-rate sampled baseband representation of the digital communication system. As a simple example of a coded digital communications system we consider Alternate Mark Inversion (AMI) line coding, which is a line code used for T1 and E1. In AMI, the logical "0" is represented by no line signal and the logical "1" is represented by positive and negative alternating pulses [19]. After matched filtering and sampling at the baud rate, we have a correlated symbol sequence drawn from the alphabet {-1, 0, +1} with probabilities 1/4, 1/2, and 1/4, respectively.
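A small sketch of the AMI encoder (in Python; the function name and the bit source are our illustrative choices) makes the symbol statistics explicit:

```python
import numpy as np

def ami_sequence(n_bits, rng=np.random.default_rng(0)):
    """Generate an AMI-coded symbol sequence: logical 0 -> 0,
    logical 1 -> pulses alternating between +1 and -1 (sketch)."""
    bits = rng.integers(0, 2, size=n_bits)
    symbols = np.zeros(n_bits)
    polarity = 1
    for i, b in enumerate(bits):
        if b == 1:
            symbols[i] = polarity
            polarity = -polarity       # alternate mark inversion
    return symbols   # values in {-1, 0, +1} with prob 1/4, 1/2, 1/4
```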

Now, let us suppose that the source AMI signal s_n is sent through a linear time-invariant channel with coefficients h_i. The resulting channel output can be expressed as

x_n = Σ_i h_i s_{n-i} + e_n, (7-1)

where e_n is zero-mean white Gaussian noise with variance σ_e². The objective of a blind linear equalizer is to remove the ISI at its output without using any training sequence. Typically, the equalizer is designed as an FIR filter with M coefficients w; then its output is given by

y_n = Σ_{i=0}^{M-1} w_i x_{n-i} = w^T x_n. (7-2)
Probably the most popular adaptive blind algorithms are the family of Godard algorithms [85], which are stochastic gradient descent (SGD) methods for minimizing the cost functions

J_G(w) = E[(|y_n|^p - R_p)²], p = 1, 2, (7-3)

where R_p = E[|s_n|^{2p}] / E[|s_n|^p] depends on the input constellation (pdf). For the particular case p = 2, Eq. (7-3) is the cost function of the constant modulus algorithm (CMA) [85], [86]. Using an SGD minimization approach, the CMA can be written as

w_{n+1} = w_n - μ(|y_n|² - R₂) y_n x_n. (7-4)









As a measure of equalization performance we use the ISI, defined as

ISI = 10 log₁₀ [ (Σ_n |θ_n|² - max_n |θ_n|²) / max_n |θ_n|² ], (7-5)

where θ = h ∗ w is the combined channel-equalizer impulse response, which is a delta function for a zero-forcing equalizer.
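As a sketch, the ISI measure of (7-5) can be computed from the channel and equalizer taps as follows (Python; we read the combined response as the convolution θ = h ∗ w):

```python
import numpy as np

def isi_db(h, w):
    """ISI of Eq. (7-5) in dB for the combined channel-equalizer
    response theta = h * w (sketch)."""
    theta = np.convolve(h, w)
    peak = np.max(np.abs(theta) ** 2)
    return 10 * np.log10((np.sum(np.abs(theta) ** 2) - peak) / peak)
```

A perfect zero-forcing equalizer makes theta a delta function, driving the numerator, and hence the ISI, toward zero (minus infinity in dB).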

7.3 Cost Function and Iterative Algorithm

The theoretical correntropy function for the source AMI signal is given by

V_s(k) = κ(0), k = 0,
V_s(k) = (1/4)κ(0) + (1/2)κ(1) + (1/4)κ(2), k = ±1, (7-6)
V_s(k) = (3/8)κ(0) + (1/2)κ(1) + (1/8)κ(2), otherwise,

where κ(·) is the Gaussian kernel defined in (2-3). This function conveys information about the pdf and the correlation of the source AMI signal. Therefore, we propose to use the following cost function for equalization


J_c(w) = Σ_{k=1}^{P} (V_y(k) - V_s(k))², (7-7)

where V_y(k) is the correntropy function estimated at the output of the equalizer and P is the number of lags (notice that we have removed the zero lag since it is always equal to κ(0)). In order to derive an SGD adaptive algorithm we can use a sliding window approach and update once per input sample according to the following algorithm:

w_{n+1} = w_n + μ Σ_{k=1}^{P} (V_s(k) - V_y(k)) ∂V_y(k)/∂w, (7-8)

where

∂V_y(k)/∂w = -(1/((N_w - k)σ²)) Σ_{i=n-N_w+k+1}^{n} κ(y_i - y_{i-k}) (y_i - y_{i-k}) (x_i - x_{i-k}). (7-9)
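A sketch of one adaptation step, implementing (7-7)-(7-9) over a sliding window, is given below in Python. The constant factor of 2 from the squared cost is absorbed into the step size, and all names and defaults are our illustrative choices:

```python
import numpy as np

def correntropy_step(w, x_buf, s_corr, mu=0.6, sigma=1.0, P=4):
    """One SGD update of Eqs. (7-7)-(7-9) (sketch).
    x_buf: (N_w, M) matrix of the last N_w equalizer input vectors;
    s_corr: target correntropy V_s(k) for k = 1..P."""
    y = x_buf @ w                       # equalizer outputs over the window
    Nw = len(y)
    grad = np.zeros_like(w)
    for k in range(1, P + 1):
        dy = y[k:] - y[:-k]             # y_i - y_{i-k}
        kappa = np.exp(-dy ** 2 / (2 * sigma ** 2))
        Vy_k = kappa.mean()             # estimated correntropy at lag k
        # gradient of the Gaussian kernel estimate w.r.t. the weights
        dV = -(kappa * dy) @ (x_buf[k:] - x_buf[:-k]) / ((Nw - k) * sigma ** 2)
        grad += (s_corr[k - 1] - Vy_k) * dV
    return w + mu * grad                # Eq. (7-8)
```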

A drawback of the proposed approach in an online setting is that a large number of samples is required in order to have reliable, low variance estimates of the correntropy function. To mitigate this problem we propose to use a "stop-and-go" technique [87]. Specifically, the equalizer coefficients are only updated when the error function obtained using hard decisions and the error function using the output of the equalizer have the same sign.

7.4 Simulation Results

In the first example the AMI signal is distorted by an IIR channel H₁(z); then white Gaussian noise is added for a final SNR = 20 dB. A 3-tap equalizer was used and initialized with the center coefficient set to unity and the rest to zero.

The correntropy function was estimated using a window of N_w = 100 samples and P = 4 lags. On the other hand, the kernel size was selected as the standard deviation of the equalizer's output, σ = std(y). Fig. 7-1 shows the ISI curves for correntropy and CMA obtained by averaging 25 independent Monte Carlo simulations. The step-size for correntropy was μ = 0.6, whereas for CMA it was μ_CMA = 0.04 (in both cases these are the largest step-sizes for which all the trials converged). Fig. 7-2 shows the evolution of the coefficients of the equalizer for correntropy (solid line) and CMA (dotted line) starting from w₀ = (0, 1, 0); the zero-ISI solution is (0, 1, -0.5) (dashed line).

We see that for this example the correntropy function effectively extracts more information than the CMA about the source signals and is able to provide a better solution. In order to explain the poor behavior of the CMA for this particular case, we should remember that the use of a nonuniform symbol distribution has the effect of raising the kurtosis, thus making the pdf of the source distribution more Gaussian. Specifically, the kurtosis for the AMI signal of our example is 2, whereas for a uniform BPSK the kurtosis is 1, for a uniform 32-PAM it is 1.798, and for a Gaussian it is 3. Although the source remains sub-Gaussian, the increase in kurtosis has the effect of lifting the CMA cost function (thus increasing the excess mean squared error) and flattening its surface (thus reducing the convergence speed) [88]. Moreover, the use of a correlated source can also cause major problems in the CMA convergence, as was also pointed out in [88]. These










major drawbacks when there is source correlation as well as high kurtosis seem to affect most of the conventional blind equalization techniques. For instance, the algorithm in [82] (which uses the kurtosis as a cost function) was not even able to open the eye diagram for most trials using a window of 100 samples. For this reason we do not include the results obtained with this method here.






Figure 7-1. Inter-symbol interference (ISI) convergence curves for correntropy and CMA under Gaussian noise.


Another interesting property of the correntropy function is its robustness against impulsive noise. This additional advantage is due to the fact that when an outlier is present, the inner product in the feature space computed via the Gaussian kernel tends to be zero (i.e., κ(y_i - y_{i-k}) ≈ 0 when either y_i or y_{i-k} has a large value). To illustrate this point we have used the same simulation example, but this time the channel output is distorted with impulsive noise generated according to the following Gaussian mixture model.











Figure 7-2. Convergence curves of the equalizer coefficients for correntropy and CMA under Gaussian noise. The true solution is w = (0, 1, -0.5).



f(e) = ((1 - ε)/(√(2π) σ₁)) exp(-e²/(2σ₁²)) + (ε/(√(2π) σ₂)) exp(-e²/(2σ₂²)), (7-10)

where typically ε ≪ 1 and σ₂² ≫ σ₁². Specifically, in our simulations we used ε = 0.15, σ₂² = 500σ₁², and chose σ₁² to get a final SNR = 20 dB. Fig. 7-3 shows the ISI curves obtained in this case. Now the step-size for correntropy was μ = 0.4 and μ_CMA = 0.001. We can see that even for this small step-size the CMA is not able to converge due to the large noise spikes at the channel output. On the other hand, with the correntropy function we obtain practically the same convergence behavior as in the Gaussian case.
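A sketch of a sampler for this mixture follows (Python; ε and the variance ratio follow the values quoted above, with σ₁ normalized to 1, and the function name is ours):

```python
import numpy as np

def impulsive_noise(n, eps=0.15, sigma1=1.0, ratio=500.0,
                    rng=np.random.default_rng(0)):
    """Samples from the two-term Gaussian mixture of Eq. (7-10):
    (1 - eps)*N(0, sigma1^2) + eps*N(0, sigma2^2),
    with sigma2^2 = ratio * sigma1^2 (sketch)."""
    sigma2 = np.sqrt(ratio) * sigma1
    outlier = rng.random(n) < eps              # which samples are impulses
    return np.where(outlier,
                    rng.normal(0.0, sigma2, size=n),
                    rng.normal(0.0, sigma1, size=n))
```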

























Figure 7-3. Inter-symbol interference (ISI) convergence curves for correntropy and CMA under impulsive noise.









CHAPTER 8
SUMMARY

8.1 Summary

Our research presents a novel means of extracting information from time structures and applies it to time series analysis (matched filtering and optimal adaptive filtering) based on kernel methods and the newly invented concept in ITL, correntropy. Correntropy not only induces an intriguing RKHS but also generates a finite dimensional covariance function, which, unlike the Gram matrix used in kernel methods, is always well conditioned, so the corresponding solutions do not require explicit regularization. Therefore, it is different from the conventional kernel methods in both scope and detail. We have demonstrated these ideas by proposing the robust matched filter and the correntropy Wiener filter.

The matched filter is a very simple and effective method that can be used in a variety of scenarios by just tuning the kernel size. Our results show that when the noise is Gaussian the matched filter (MF) has an overall higher probability of detection. However, by increasing the kernel size, the CMF approaches this optimal performance with practically the same result. Clearly, the matched filter, being based on second order statistics alone, has shortcomings when non-Gaussian noise is introduced by the channel. In this case, with an appropriate kernel size the CMF outperforms the traditional matched filter, suggesting that the RKHS defined by correntropy is superior, especially for impulsive noise environments, which can readily be observed in low-frequency atmospheric noise, fluorescent lighting systems, combustion engine ignition, radio and underwater acoustic channels, economic stock prices, and biomedical signals. It also outperforms the matched filter using mutual information, which has a much higher computational burden. We have also presented the exceptional behavior of our new method for detection in alpha-stable distributed noise. It is noteworthy that the performance of our matched filter closely rivals that of the locally suboptimal detector, which is exclusively designed









for α-stable distributions but requires the value of α a priori before the detector can be designed.

Results clearly show the superiority and flexibility of the CMF in a wide variety of situations with the least complexity, which otherwise would require heavy training and complex algorithms. This exceptional robustness against impulsive noise can be attributed to the fact that correntropy evaluates the integral of the joint pdf of two random variables along the line that represents their equality, thus focusing more on the likelihood of the two random variables having close values and less on the heavy tails that leptokurtic noise generates. It is also noteworthy that the proposed method works impressively even when the template size is just 64, which is another limitation for complex methods.

We should also mention the comparison of the CMF and other kernel based detection filters. We see that the computational complexity of the CMF is linear in the template length, unlike other kernel based filters with a complexity of at least O(N²). This is due to the use of correntropy as the similarity measure, instead of the large matrix of projected points required by other kernel methods.

Our research also presents an investigation of a new type of Wiener filter based on the recently introduced correntropy function. This dissertation shows a means of directly using correntropy as the autocorrelation function of the projected data. This approach can also be used to extend other linear learning schemes to nonlinear algorithms. The method also avoids the large Gram matrix used by many kernel based methods and the RBF network. Instead, the system order is decoupled from the data size. Since correntropy effectively uses higher order information, the correntropy Wiener filter performs better than the linear filter in a variety of scenarios. Yet the limitations on the performance of the method are due to the relatively naive approximation employed to derive the final formulation. Though the correntropy filter has a structure very similar to the Hammerstein model, the nonlinearity is implicit, derived from the statistics of the data and the kernel employed.










Finally, we have presented nonlinear LMS algorithms that do not need regularization. Correntropy LMS (CLMS) is derived by stochastic gradient minimization of the MSE, using correntropy as the generalized correlation function. It also employs an approximation, one which has been seen to be more accurate than the one employed to derive the correntropy Wiener filter. In addition we have also presented the kernel LMS (KLMS) algorithm, which is derived from kernel methods and incorporates CLMS as a special case. Overall these methods are very simple and efficient and are very comparable to the solution obtained by kernel methods (or RBF networks with centers at the input data points), but without the complexities of large matrix operations and the heuristics of choosing appropriate regularization. Their defining characteristic is that the solutions rely on improving successive output samples based on the errors incurred in the past.

We have also addressed the issue of kernel LMS that could potentially prohibit its use in real world problems, namely the continuous addition of a kernel to the solution with each time step. This results in a constant growth of the filter architecture, thus increasing the storage and computational complexity with each new sample. Since the error values would eventually die out, the growth could be artificially stopped when the error falls below a given tolerance. But this would also stop the continuous learning process that makes LMS so attractive. Instead we present a more intelligent approach by using linear dependency as a criterion for deciding the growth of the filter architecture without hindering the continuous sample-by-sample learning process. We also demonstrated how this could effectively minimize the number of kernel functions with minimal effect on the performance by applying it to identify a Wiener channel with a binary input where the linear filter component of the channel changes abruptly.

The simulation results show that the kernel LMS approach is faster and more cost effective, both in storage and computation, than the recently proposed sliding window kernel RLS. Of course, this could be expected because the response time to an abrupt change for the kernel RLS is approximately the length of the window that is applied. In fact, the same arguments that are used for the superior tracking of the linear LMS algorithm versus the linear RLS algorithm can be used here as well. The sample-by-sample stochastic adaptation involved in LMS provides an exceptional advantage.

We have presented two other useful applications that exemplify the various advantages of using the theory and ideas presented in this dissertation. We have demonstrated the use of the robust correntropy template matching concept in shape classification for occluded objects. Our method uses a fundamentally different approach to deal with the ever so unavoidable occlusions that are encountered in real world image processing applications. Without using the various landmarks that are usually employed for shape detection, we have used the fact that occlusions practically behave like impulsive noise. Hence correntropy has proven to be an effective means of tackling such occlusions.

We have also presented an application to blind equalization. There have been numerous research articles published in this field, but invariably all methods either assume that the transmitted source signal is iid or simply ignore the possible time structure. The assumption has always been that there is no information available about the source beforehand. But this is not always true, especially with communication channels. Usually a well known coding scheme is employed by the transmitter, which translates to a known time structure. In certain cases, the correntropy function of the source signal can be calculated analytically beforehand. So the idea is to effectively use this information about the time structure. We have seen that the use of correntropy gives better results for correlated sources than a leading well known method called the constant modulus algorithm (CMA).

8.2 Other Applications

Though we have presented just a few applications, the concepts and ideas that have been presented are not limited to these. This research suggests an elegant framework, based on the ideas of information theoretic learning and kernel methods, that could solve many difficult problems where the usual assumptions of Gaussianity and linearity are no longer valid. This includes many areas of engineering like










communications, geosensing, computer vision, financial data modeling, and biomedical engineering, just to name a few. The same concept of correntropy as a reproducing kernel and a measure of higher order correlation has also been utilized by Kyu-Hwa Jeong for the correntropy based minimum average correlation energy filter (CMACE), by Jian-Wu Xu for pitch detection in speech, and by Seungju Han for robust beamforming, with very promising results. Certain other applications, like using correntropy for compressive sampling and identifying MIMO channels, are also being explored.

8.3 Future Work

We have presented elegant nonlinear versions of three basic time series problems in signal processing: the matched filter, the Wiener filter, and the LMS filter. At the same time, all these novel methods leave some room for improvement, either in continuing the theoretical study or in bettering their practical implementation. Hence the future work should address these issues.

(a) The correntropy matched filter has proven to be a simple and effective tool for signal detection in a non-Gaussian environment, particularly in impulsive noise. We have also provided a novel interpretation of correntropy in terms of the integral of the pdf function along the line, which explains the working of this filter. Theoretically we have also shown that the correntropy matched filter is actually a linear matched filter for feature vectors nonlinearly related to the input data through the correntropy function itself. We should also note that this nonlinearity is not merely a static nonlinearity but is more complex and related to the statistics of the input data. Though the notion of optimality, in terms of correntropy being a correlation, can be established in the space of the transformed vectors through this relationship, a direct optimality in terms of the input data would render the theory more complete. Given the nature of the nonlinear mapping on the input data and the kernel trick that is employed, it might not be easily feasible to derive this formulation as an exact maximum likelihood criterion, but with some simplification a near-optimal theoretical interpretation would be welcome and possibly more feasible.

(b) The major limitation of the correntropy Wiener filter is the approximation that gives the final expression. It is a simple one and does give a working solution, such that it effectively gives better solutions than the linear Wiener filter. But the approximation step has to be reviewed for the filter to perform consistently better than other comparable nonlinear methods like the Nadaraya-Watson estimator. In fact, like the previous point, this also boils down to understanding the direct relationship between the input data and the transformed feature points on which the correlation function is computed.

(c) The backbone of better elucidating both the theoretical and practical implications of using the various methods presented in this study is, in fact, understanding better the relationship of the input data with the feature space induced by correntropy. This would not only further the theoretical standing of the concepts, but also improve the performance of the signal processing tools developed through this research. A starting point for this could be studying the parametric adjustments made to the correntropy Wiener filter to improve its performance and the formulation of the correntropy LMS algorithm. The theories of kernel methods and information theoretic learning, and various differential geometry concepts, also have an important part to play in furthering the understanding.

(d) Finally, we would want to apply our ideas to more real problems. We have applied these intriguing concepts to blind equalization and nonlinear system identification with promising results, but these applications were tested on synthetically designed experiments. We further applied a variation of the correntropy matched filter for recognizing occluded objects using a database of real fish shapes. The basis of this application was the hypothesis that partial occlusion can be modeled as impulsive noise. This is a deviation from the more conventional approach of using certain landmarks. The performance was far better than recently published results that employed the same hypothesis, where conventional methods are known to fail. This concept has to be further explored, probably to use it as a part of a more complex object recognition algorithm. Other environments where impulsive behavior is abundant are low-frequency atmospheric noise, fluorescent lighting systems, combustion engine ignition, radio and underwater acoustic channels, economic stock prices, etcetera. We have just started to explore various avenues of applicability. A lot of possible real problems could benefit from our approach of reformulating conventional signal processing solutions using the novel similarity and correlation measure as demonstrated through this study.










LIST OF REFERENCES


[1] N. Aronszajn, "Theory of reproducing kernels," Transactions of the American Mathematical Society, vol. 68, pp. 337-404, 1950.

[2] E. Parzen, Statistical Methods on Time Series by Hilbert Space Methods, Applied Mathematics and Statistics Laboratory, Stanford University, Stanford, CA, 1959, Technical Report.

[3] V. Vapnik, The Nature of Statistical Learning Theory, Springer, New York, 1995.

[4] B. Schölkopf, A. Smola, and K.-R. Müller, "Nonlinear component analysis as a kernel eigenvalue problem," Neural Computation, vol. 10, pp. 1299-1319, 1998.

[5] S. Mika, G. Rätsch, J. Weston, B. Schölkopf, and K.-R. Müller, "Fisher discriminant analysis with kernels," in IEEE Workshop on Neural Networks for Signal Processing IX, Madison, WI, USA, 1999, pp. 41-48.

[6] F. R. Bach and M. I. Jordan, "Kernel independent component analysis," Journal of Machine Learning Research, vol. 3, pp. 1-48, 2002.

[7] D. R. Hardoon, S. Szedmak, and J. Shawe-Taylor, "Canonical correlation analysis: an overview with application to learning methods," Neural Computation, vol. 16, no. 12, pp. 2639-2664, 2004.

[8] J. C. Principe, D. Xu, and J. Fisher, "Information theoretic learning," in Unsupervised Adaptive Filtering, S. Haykin, Ed. Wiley, 2000.

[9] D. Erdogmus and J. C. Principe, "An error-entropy minimization algorithm for supervised training of nonlinear adaptive systems," IEEE Transactions on Signal Processing, vol. 50, pp. 1780-1786, 2002.

[10] K. E. Hild II, D. Erdogmus, and J. C. Principe, "Blind source separation using Renyi's mutual information," IEEE Signal Processing Letters, vol. 8, pp. 174-176, 2001.

[11] I. Santamaría, D. Erdogmus, and J. C. Principe, "Entropy minimization for supervised digital communications channel equalization," IEEE Transactions on Signal Processing, vol. 50, no. 5, pp. 1184-1192, 2002.

[12] C. W. Helstrom, Elements of Signal Detection and Estimation, Prentice Hall, New Jersey, 1995.

[13] J. Cardoso, "Blind signal separation: Statistical principles," Proceedings of the IEEE (Special Issue), vol. 86, pp. 2009-2025, 1998.

[14] A. Bell and T. Sejnowski, "An information-maximization approach to blind separation and blind deconvolution," Neural Computation, vol. 7, pp. 1129-1159, 1995.










[15] I. Santamaria, P. P. Pokharel, and J. C. Principe, "Generalized correlation function:
Definition, properties and application to blind equalization," IEEE Transactions on
S~li,..rl Processing, vol. 54, no. 6, pp. 2187-2197, 2006.

[16] A. P. Sage and J. L. Melsa, System I I. ,./I.:. rl..>,n, Academic Press, New York, 1971.

[ 17] M. Pourahmadi, Foundations of time series r, t..; &;. and prediction 'i,, ., ;, Wiley,
New York, 2001.

[18] Y. Zhao and S.-G. HT I__il. I1. lIll, ~I I.)! i s interference self-cancellation scheme for
ofdm mobile communication systems," IEEE Transactions on Communication, vol.
49, no. 7, pp. 1185-1191, 2001.

[19] J. G. Proakis, Digital Communications, McGraw-Hill, New York, 4th edition, 2001.

[20] G. L. Turin, "An introduction to matched filters," IEEE Transactions on Information
The .-<;, vol. 19, pp. 19-28, 1973.

[21] D. Duttweiler and T. K~ailath, "An RK(HS approach to detection and estimation
problems-part IV: NonGaussian detection," IEEE Transactions on Information
The -<;; pp. 310-329, 1960.

[22] H. K~won and N. M. No 1-0 .11I, "Hyperspectral target detection using kernel matched
subspace detector," in ICIP, 2004, pp. 3327-3330.

[23] I. Santamaria, D. Erdogmus, R. Agrawal, and J. C. Principe, "Robust matched
filtering in the feature space," in EUSIPCO, 2005.

[24] D. Erdogmus, R. Agrawal, and J. C. Principe, "A mutual information extension to
the matched filter," S.:ll..rl Processing, vol. 85, pp. 927-935, 2005.

[25] R. J. P. deFigueiredo and Y. Hu, "On nonlinear filtering of non-Gaussian processes
through Volterra series," in V/olterra Equations and Applications, pp. 197-202. Gordon
and Breach Science Publishers, Amsterdam, 2002.

[26] Simon Haykin, Neural Networks: A Comprehensive Foundation, Prentice Hall, New
Jersey, 2nd edition, 1998.

[27] C. M. Bishop, Neural Networks for Pattern Recognition, Oxford University Press,
New York, 1995.

[28] K(. R. Muller, A. J. Smola, Gunner Ratsch, B. Scholkopf, J. K~ohlmorgen, and
V. Vapnik, "Usingf support vector machines for time series prediction," in Advances
in Kernel M~ethods. MIT press, Cambridgfe, 1999.

[29] B. Widrow and S. D. Stearns, Adaptive S~ll,..rl Processing, Prentice-Hall, Englewood
Cliffs, New Jersey, 1985.

[30] S. Saitoh, Theory of Reproducing Kernels and its Applications, Longman Scientific
and Technical, U.K(., 1988.










[31] G. Wahba, "Spline models for observational data," in CBMS-NSF Regional Conference Series in Applied Mathematics, Philadelphia, USA, 1990, vol. 59.

[32] R. Jenssen, D. Erdogmus, J. C. Principe, and T. Eltoft, "The Laplacian PDF distance: A cost function for clustering in a kernel feature space," in Neural Information Processing Systems, 2004.

[33] R. Jenssen, D. Erdogmus, J. C. Principe, and T. Eltoft, "Towards a unification of information theoretic learning and kernel methods," in IEEE International Workshop on Machine Learning for Signal Processing, 2004, pp. 443-451.

[34] T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley, New York, 1991.

[35] J.-W. Xu, D. Erdogmus, R. Jenssen, and J. C. Principe, "An information-theoretic perspective to kernel independent component analysis," in ICASSP 2005, Philadelphia, USA, 2005.

[36] P. Pokharel, J. W. Xu, D. Erdogmus, and J. C. Principe, "A closed form solution for a nonlinear Wiener filter," in IEEE International Conference on Acoustics, Speech, and Signal Processing, 2006, vol. 3, pp. 720-723.

[37] K. H. Jeong and J. C. Principe, "The correntropy MACE filter for image recognition," accepted for IEEE International Workshop on Machine Learning for Signal Processing, 2006.

[38] G. H. Golub and C. F. Van Loan, Matrix Computations, Johns Hopkins University Press, Baltimore, 2nd edition, 1989.

[39] S. Kay, Fundamentals of Statistical Signal Processing, Volume 1: Estimation Theory, Prentice Hall, New Jersey, 1993.

[40] T. Evgeniou, M. Pontil, and T. Poggio, "Regularization networks and support vector machines," Advances in Computational Mathematics, vol. 13, no. 1, pp. 1-50, 2000.

[41] K. Tsuda, M. Kawanabe, G. Rätsch, and S. Sonnenburg, "A new discriminative kernel from probabilistic models," Neural Computation, vol. 14, pp. 2397-2414, 2002.

[42] W. Greblicki, "Nonlinearity estimation in Hammerstein systems based on ordered observations," IEEE Transactions on Signal Processing, vol. 44, no. 5, pp. 1224-1233, 1996.

[43] L. L. Scharf, Statistical Signal Processing: Detection, Estimation, and Time Series Analysis, Addison-Wesley, New York, 1991.

[44] J. H. McCulloch, "Financial applications of stable distributions," in Handbook of Statistics, G. S. Maddala and C. R. Rao, Eds., vol. 14, pp. 393-425. Elsevier, 1996.










[45] C. L. Nikias and M. Shao, Signal Processing with Alpha-Stable Distributions and Applications, John Wiley and Sons, 1995.

[46] A. B. Salberg, A. Hanssen, and L. L. Scharf, "Robust multidimensional matched subspace classifiers based on weighted least-squares," IEEE Transactions on Signal Processing, vol. 55, pp. 873-880, 2007.

[47] S. M. Zabin and H. V. Poor, "Efficient estimation of the class A parameters via the EM algorithm," IEEE Transactions on Information Theory, vol. 37, pp. 60-72, 1991.

[48] H. V. Poor and M. Tanda, "Multiuser detection in flat fading non-Gaussian channels," IEEE Transactions on Communications, vol. 50, pp. 1769-1777, 2002.

[49] V. M. Zolotarev, One-Dimensional Stable Distributions, American Mathematical Society, Providence, R.I., 1986.

[50] C. L. Brown, "Score functions for locally suboptimum and locally suboptimum rank detection in alpha-stable interference," in 11th IEEE Signal Processing Workshop on Statistical Signal Processing, SSP2001, Singapore, 2001, pp. 58-61.

[51] S. A. Kassam, Signal Detection in Non-Gaussian Noise, Springer-Verlag, New York, 1988.

[52] L. Devroye and G. Lugosi, Combinatorial Methods in Density Estimation, Springer, New York, 2001.

[53] B. W. Silverman, Density Estimation for Statistics and Data Analysis, Chapman and Hall, London, 1986.

[54] R. P. W. Duin, "On the choice of smoothing parameters for Parzen density estimators of probability density functions," IEEE Transactions on Computers, vol. 25, no. 11, pp. 1175-1179, 1976.

[55] S. Kay, Fundamentals of Statistical Signal Processing, Volume 2: Detection Theory, Prentice Hall, New Jersey, 1998.

[56] R. Weron, "On the Chambers-Mallows-Stuck method for simulating skewed stable random variables," Statistics and Probability Letters, vol. 28, pp. 165-171, 1996.

[57] S. V. Vaerenbergh, J. Via, and I. Santamaría, "Nonlinear system identification using a new sliding-window kernel RLS algorithm," Journal of Communications, vol. 2, no. 3, pp. 1-8, 2007.

[58] I. L. Dryden and K. V. Mardia, Statistical Shape Analysis, Wiley, Chichester, U.K., 1998.

[59] D. G. Kendall, D. Barden, T. K. Carne, and H. Le, Shape and Shape Theory, Wiley, Chichester, U.K., 1999.










[60] N. Ansari and E. J. Delp, "Partial shape recognition: A landmark-based approach," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 12, no. 5, pp. 470-483, 1990.

[61] J. Zhang, X. Zhang, H. Krim, and G. G. Walter, "Object representation and recognition in shape spaces," Pattern Recognition, vol. 36, pp. 1143-1154, 2003.

[62] L. L. Scharf and B. Friedlander, "Matched subspace detectors," IEEE Transactions on Signal Processing, vol. 42, pp. 2146-2157, 1994.

[63] S. Haykin, Adaptive Filter Theory, Prentice-Hall, Upper Saddle River, NJ, fourth edition, 2002.

[64] A. Balestrino and A. Caiti, "Approximation of Hammerstein/Wiener dynamic models," in International Joint Conference on Neural Networks, 2000, vol. 1, pp. 70-74.

[65] F. Takens, "Detecting strange attractors in turbulence," in Lecture Notes in Mathematics, vol. 898 of Dynamical Systems and Turbulence, pp. 366-381. Springer Verlag, 1981.

[66] B. Widrow, Adaptive Filters I: Fundamentals (TR 6764-6), Stanford Electronics Laboratories, Stanford, CA, 1966, Technical Report.

[67] S. Smale and Y. Yao, "Online learning algorithms," Foundations of Computational Mathematics, vol. 6, pp. 145-170, 2006.

[68] J. Kivinen, A. Smola, and R. Williamson, "Online learning with kernels," IEEE Transactions on Signal Processing, vol. 52, pp. 2165-2176, 2004.

[69] A. H. Sayed, Fundamentals of Adaptive Filtering, John Wiley, New Jersey, 2003.

[70] B. Schölkopf, Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, MIT Press, Cambridge, 2002.

[71] W. Liu, P. P. Pokharel, and J. C. Principe, "Kernel least mean square algorithm," submitted to IEEE Transactions on Signal Processing.

[72] A. N. Tikhonov and V. Y. Arsenin, Solution of Ill-Posed Problems, Wiley, New York, 1977.

[73] J. Platt, "A resource-allocating network for function interpolation," Neural Computation, vol. 3, no. 2, pp. 213-225, 1991.

[74] Y. Engel, S. Mannor, and R. Meir, "The kernel recursive least-squares algorithm," IEEE Transactions on Signal Processing, vol. 52, no. 8, pp. 2275-2285, 2004.

[75] D. Erdogmus, D. Rende, J. C. Principe, and T. F. Wong, "Nonlinear channel equalization using multilayer perceptrons with information-theoretic criterion," in IEEE Workshop on Neural Networks for Signal Processing XI, North Falmouth, MA, USA, 2001, pp. 443-451.

[76] Z. Ding and Y. Li, Blind Equalization and Identification, Marcel Dekker, New York, 2001.

[77] A. Benveniste, M. Goursat, and G. Ruget, "Robust identification of a non-minimum phase system: blind adjustment of a linear equalizer in data communications," IEEE Transactions on Automatic Control, vol. 25, no. 3, pp. 385-399, 1980.

[78] J. Sala-Alvarez and G. Vazquez-Grau, "Statistical reference criteria for adaptive signal processing in digital communications," IEEE Transactions on Signal Processing, vol. 45, no. 1, pp. 14-31, 1997.

[79] T. Adali, X. Liu, and M. K. Sonmez, "Conditional distribution learning with neural networks and its application to channel equalization," IEEE Transactions on Signal Processing, vol. 45, pp. 1051-1064, 1997.

[80] M. Lazaro, I. Santamaría, C. Pantaleon, D. Erdogmus, and J. C. Principe, "Matched pdf-based blind equalization," in IEEE International Conference on Acoustics, Speech and Signal Processing, Hong Kong, China, 2003, vol. 4, pp. 297-300.

[81] O. Shalvi and E. Weinstein, "New criteria for blind deconvolution of nonminimum phase channels," IEEE Transactions on Information Theory, vol. 36, no. 2, pp. 312-321, 1990.

[82] O. Shalvi and E. Weinstein, "Super-exponential methods for blind deconvolution," IEEE Transactions on Information Theory, vol. 39, no. 2, pp. 504-519, 1993.

[83] J. K. Tugnait, "Blind estimation and equalization of digital communication FIR channels using cumulant matching," IEEE Transactions on Communications, vol. 43, pp. 1240-1245, 1995.

[84] D. Hatzinakos and C. L. Nikias, "Blind equalization based on higher-order statistics (HOS)," in Blind Deconvolution, S. Haykin, Ed. Prentice-Hall, Englewood Cliffs, NJ, 1994.

[85] D. N. Godard, "Self-recovering equalization and carrier tracking in two-dimensional data communication systems," IEEE Transactions on Communications, vol. 28, pp. 1867-1875, 1980.

[86] J. R. Treichler and B. G. Agee, "A new approach to multipath correction of constant modulus signals," IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. 31, pp. 349-372, 1983.

[87] G. Picchi and G. Prati, "Blind equalization and carrier recovery using a stop-and-go decision-directed algorithm," IEEE Transactions on Communications, vol. 35, pp. 877-887, 1987.

[88] C. R. Johnson, Jr., P. Schniter, T. J. Endres, J. D. Behm, D. R. Brown, and R. A. Casas, "Blind equalization using the constant modulus criterion: A review," Proceedings of the IEEE, vol. 86, no. 10, pp. 1927-1950, 1998.









BIOGRAPHICAL SKETCH

Puskal P. Pokharel was born in Lalitpur, Nepal, in 1981. He received his Bachelor of Technology in electronics and communication engineering from the Indian Institute of Technology (IIT), Roorkee, India, in 2003 and his Master of Science in electrical and computer engineering from the University of Florida in 2005. He has worked at the Computational NeuroEngineering Laboratory (CNEL) as a research assistant pursuing his Ph.D. under the supervision of Dr. Jose C. Principe. He is a member of the Tau Beta Pi honor society and a student member of the IEEE. His current research interests include digital signal processing, machine learning, information theoretic learning, and their applications.






PAGE 6

4.2ProblemModelandSolution .......................... 55 5WIENERFILTERINGANDREGRESSION .................... 62 5.1LinearWienerFilter .............................. 62 5.2CorrentropyBasedWienerFilter ....................... 63 5.3SimpleRegressionModelswithKernels .................... 66 5.3.1RadialBasisFunctionNetwork ..................... 66 5.3.2NadarayaWatsonEstimator ...................... 66 5.3.3NormalizedRBFNetwork ....................... 67 5.4ParametricImprovementonCorrentropyFilter ............... 67 5.5ExperimentsandResults ............................ 68 5.5.1SystemIdentication .......................... 69 5.5.2TimeSeriesPredictionofMackey-GlassTimeSeries ......... 70 6ON-LINEFILTERS ................................. 74 6.1Background ................................... 74 6.2CorrentropyLMS ................................ 75 6.3KernelLMS ................................... 76 6.4Self-RegularizedProperty ........................... 80 6.5ExperimentalComparisonbetweenLMSandKLMS ............. 82 6.6KernelLMSwithRestrictedGrowth ..................... 84 6.6.1SparsityConditionfortheFeatureVectors .............. 84 6.6.2Algorithm ................................ 87 6.7ExperimentalComparisonbetweenKLMSandKLMSwithRestrictedGrowth 89 6.8ApplicationtoNonlinearSystemIdentication ................ 91 6.8.1Motivation ................................ 91 6.8.2IdenticationofaWienerSystem ................... 93 6.8.3ExperimentalResults .......................... 93 6.9Summary .................................... 96 7APPLICATIONTOBLINDEQUALIZATION .................. 98 7.1Motivation .................................... 98 7.2ProblemSetting ................................. 99 7.3CostFunctionandIterativeAlgorithm .................... 100 7.4SimulationResults ............................... 101 8SUMMARY ...................................... 105 8.1Summary .................................... 105 8.2OtherApplications ............................... 108 8.3FutureWork ................................... 109 LISTOFREFERENCES ................................. 112 BIOGRAPHICALSKETCH ................................ 119 6

PAGE 7

LISTOFTABLES Table page 3-1ValuesforthestatisticforthetwocasesusingtheGaussiankernel. ....... 38 4-1Recognitionandmisclassicationrates. ....................... 60 7

PAGE 8

LISTOFFIGURES Figure page 1-1Setupofatypicaltimeseriesproblem. ....................... 13 1-2Generaloverviewforsolvingtimeseriesproblemsinthefeaturespace ...... 14 2-1ProbabilisticinterpretationofVXYthemaximumofeachcurvehasbeennormalizedto1forvisualconvenience. ............................. 31 3-1ReceiveroperatingcharacteristiccurvesforsynchronousdetectioninAWGNchannelwithkernelvariance2CMF=152MI=15thecurvesforMFandCMFfor10dBoverlap. ............................ 45 3-2ReceiveroperatingcharacteristiccurvesforasynchronousdetectioninAWGNwithkernelvariance2CMF=152MI=15. ................ 46 3-3Receiveroperatingcharacteristiccurvesforsynchronousdetectioninadditiveimpulsivenoisewithkernelvariance2CMF=5,2MI=2. ........ 47 3-4Receiveroperatingcharacteristiccurvesforasynchronousdetectioninadditiveimpulsivenoisewithkernelvariance2CMF=5,2MI=2. ........ 48 3-5Receiveroperatingcharacteristiccurvesforsynchronousdetectioninadditivewhite-stabledistributednoise,kernelvariance2=3,=1:1. ......... 49 3-6Receiveroperatingcharacteristiccurvesforsynchronousdetectioninadditivewhite-stabledistributednoise,kernelvariance2=3,SNR=15dB,theplotsforMIandCMFalmostcoincideforbothvaluesof. ............... 50 3-7Receiveroperatingcharacteristiccurvesforasynchronousdetectioninadditivewhite-stabledistributednoise,kernelvariance2=3,SNR=15dB,theplotsforMIandCMFalmostcoincideforbothvaluesof. ............... 51 3-8AreaundertheROCforvariouskernelsizevaluesforadditivealpha-stabledistributednoiseusingsynchronousdetection,=1:1. .................... 52 3-9AreaundertheROCforvariouskernelsizevaluesforadditiveimpulsivenoisewiththemixtureofGaussiansusingsynchronousdetection. ........... 53 3-10Triangularfunctionthatcanbeusedasakernel. ................. 53 3-11ReceiveroperatingcharacteristiccurvesforSNRof5dBforthevariousdetectionmethods. ........................................ 54 4-1Theshtemplatedatabase. ............................. 57 4-2Theoccludedsh. .................................. 60 4-3Theextractedboundaryoftheshin 4-2 ..................... 61 8

PAGE 9

5-1ThecomputationoftheoutputdatasamplegivenanembeddedinputvectorusingthecorrentropyWienerlter. ......................... 65 5-2Timedelayedneuralnetworktobemodeled. .................... 69 5-3InputandoutputsignalsgeneratedbytheTDNNdesiredresponse,CFandWFusingatimeembedingof2. ........................... 70 5-4MeansquareerrorvaluesforWFandCFforthesystemmodelingexample. .. 71 5-5MeansquareerrorvaluesforMGtimeseriespredictionforvariousembeddingsize. .......................................... 71 5-6MeansquareerrorvaluesforMGtimeseriespredictionfordierentsizeoftrainingdata. .......................................... 72 6-1Linearlterstructurewithfeaturevectorsinthefeaturespace. ......... 77 6-2ErrorsamplesforKLMSinpredictingMackey-Glasstimeseries. ......... 78 6-3LearningcurvesfortheLMS,theKLMSandtheregularizedsolution. ...... 84 6-4ComparisonofthemeansquareerrorforthethreemethodswithvaryingembeddingdimensionlterorderforLMSoftheinput. ................... 85 6-5LearningcurveoflinearLMSandkernelLMSwithrestrictedgrowthforthreevaluesof. ...................................... 89 6-6PerformanceofkernelLMSwithrestrictedgrowthforvariousvaluesof. .... 90 6-7Numberofkernelcentersaftertrainingforvariousvaluesof. .......... 91 6-8AnonlinearWienersystem. ............................. 93 6-9SignalinthemiddleoftheWienersystemvs.outputsignalforbinaryinputsymbolsanddierentindexesn. ........................... 94 6-10MeansquareerrorfortheidenticationofthenonlinearWienersystemwiththethreemethods.ThevaluesforKRLSisonlyshownafterthetherstwindow. 95 7-1IntersymbolinterferenceISIconvergencecurvesforcorrentropyandCMAunderGaussiannoise. ................................ 102 7-2ConvergencecurvesoftheequalizercoecientsforcorrentropyandCMAunderGaussiannoise.Thetruesolutionisw=;1;)]TJ/F15 11.955 Tf 9.298 0 Td[(0:5. ............... 103 7-3IntersymbolinterferenceISIconvergencecurvesforthecorrentropyandCMAunderimpulsivenoise. ................................ 104 9

PAGE 10

AbstractofDissertationPresentedtotheGraduateSchooloftheUniversityofFloridainPartialFulllmentoftheRequirementsfortheDegreeofDoctorofPhilosophyTIMESERIESANALYSISWITHINFORMATIONTHEORETICLEARNINGANDKERNELMETHODSByPuskalP.PokharelDecember2007Chair:JoseC.PrincipeMajor:ElectricalandComputerEngineeringThemajorgoalofourresearchistodevelopsimpleandeectivenonlinearversionsofsomebasictimeseriestoolsforsignaldetection,optimalltering,andon-lineadaptiveltering.TheseextensionsshallbebasedonconceptsbeingdevelopedininformationtheoreticlearningITLandkernelmethods.IngeneralallITLalgorithmscanbeinterpretedfromkernelmethodsbecauseITLisbasedonextractinghigherorderinformationthatbeyondsecondorderasgivenbytheautocorrelationfunctiondirectlyfromthedatasamplesbyexploitingnonparametricdensityestimationusingtranslationinvariantkernelfunctions.ITLingeneralisstilllackinginprovidingtoolstobetterexploittimestructuresinthedatabecausetheassumptionofindependentlydistributeddatasamplesisusuallyanessentialrequirement.KernelmethodsprovideanelegantmeansofobtainingnonlinearversionsoflinearalgorithmsexpressedintermsofinnerproductsbyusingthesocalledkerneltrickandMercer'stheorem.ThishasgivenrisetoavarietyofalgorithmsintheeldofmachinelearningbutmostofthemarecomputationallyveryexpensiveusingalargeGrammatrixofdimensionsameasthenumberofdatapoints.Sincetheselargematricesareusuallyill-conditioned,theyrequireanadditionalregularizationstepinmethodslikekernelregression.OurgoalistodesignbasicsignalanalysistoolsfortimesignalsthatextracthigherorderinformationfromthedatadirectlylikeITLandalsoavoidthecomplexitiesofmanykernelmethods. 10

PAGE 11

WepresentnewmethodsfortimeseriesanalysismatchedlteringandoptimaladaptivelteringbasedonthenewlyinventedconceptinITL,correntropyandkernelmethods.CorrentropyinducesaRKHSthathasthesamedimensionalityastheinputspacebutisnonlinearlyrelatedtoit.Itisdierentfromtheconventionalkernelmethods,inbothscopeanddetail.Thisineecthelpsustoderivesomeelegantversionsofafewtoolsthatformthebasicbuildingblockofsignalprocessinglikethematchedltercorrelationreceiver,theWienerlter,andtheleastmeansquareLMSlter. 11

PAGE 12

CHAPTER1INTRODUCTIONNaturalprocessesofinterestforengineeringarecomposedoftwobasiccharacteristics:statisticaldistributionofamplitudesandtimestructure.Timeinitselfisveryfundamentalandiscrucialtomanyworldproblems,andtheinstantaneousrandomvariablesarehardlyeverindependentlydistributed,i.e.,stochasticprocessespossessatimestructure.Forthisreasontherearewidelyusedmeasuresthatquantifythetimestructureliketheautocorrelationfunction.Ontheotherhand,thereareanumberofmethodsthataresolelybasedonthestatisticaldistribution,ignoringthetimestructure.Asinglemeasurethatincludesbothoftheseimportantcharacteristicscouldgreatlyenhancethetheoryofstochasticrandomprocesses.ThefactthatreproducingkernelsarecovariancefunctionsasdescribedbyAronszajn[ 1 ]andParzen[ 2 ]explainstheirearlyroleininferenceproblems.Morerecently,numerousalgorithmsusingkernelmethodsincludingSupportVectorMachinesSVM[ 3 ],kernelprincipalcomponentanalysisK-PCA[ 4 ],kernelFisherdiscriminantanalysisK-FDA[ 5 ],andkernelcanonicalcorrelationanalysisK-CCA[ 6 ],[ 7 ]havebeenproposed.Likewise,advancesininformationtheoreticlearningITL,havebroughtoutanumberofapplications,whereentropyanddivergenceemployParzen'snonparametricestimation[ 8 ],[ 9 ],[ 10 ],[ 11 ].Manyofthesealgorithmshavegivenveryelegantsolutionstocomplicatednonlinearproblems.Mostofallthesecontemporaryalgorithmsarebasedonassumptionsofindependentdistributionofdata,whichinmanycasesisnotrealistic.Obviously,accuratedescriptionofastochasticprocessrequiresboththeinformationofthedistributionandthatofitstimestructure.Wecandescribetimeseriesproblemsseeninengineeringinbroadlytwomajorcategories.ADetection:Thisproblemhasageneralsetupgivenbygure 1-1 .Aknownsignaltemplateusuallychosenfromanitesetofpossibilitiespassesthroughachannelwhichmodiesthesignalbasedonanunderlyingsystemmodel.Usuallyanadditionsource 12

PAGE 13

Figure1-1.Setupofatypicaltimeseriesproblem. independentofthechannelandthetemplateisalsopresentwhichiscalledthenoise.Thisresultsinanobservedsignalthatisadistortedversionoftheoriginalsignal.Theproblemthenissimplytodetectwhichsignaltemplatewastransmittedthroughthechannelbasedontheobservedsignal,usuallywithoutknowingorcalculatingthechannelexplicitly[ 12 ].BEstimation:Estimationusuallyinvolvesestimatingorcalculatinganycomponentorapartofitofthetimeseriesproblemmodelgiveningure 1-1 basedontheobservedsignal.Thosecomponentscanbetheoriginalinputsignal,channelparametersorevencertainnoisecharacteristics.Theproblemmayormaynothavepriorknowledgeoftheinputsignal.Ifsuchinformationisabsentthensuchanestimationproblemiscalledblind.Goodexamplesofblindestimationareblindsourceseparation[ 13 ],blinddeconvolution[ 14 ]andblindequalization[ 15 ].Therecanbeotherestimationproblemsaswelllikenoiseestimationusuallyinvolvescalculatingcertainnoiseparameterslikecovariance,systemidentication[ 16 ],timeseriesprediction[ 17 ],interferencecancellation[ 18 ],etcetera.Theintegralcomponentforboththesecategoriesofproblemsisacorrelationormoregenerallysimilaritymeasurethatextractsinformationaboutthetimestructure.Theseideasaresummarizedingure 1-2 .Itisherethatoutproposedmethodofhigherordercorrelationwillmakeanimpactfortheimprovementofthesebasictasksintimeseriesanalysis.Presently,themostwidelyusedmethodofsimilarityisthecorrelationfunctionwhichappearsnaturallywhentheunderlyingassumptionsarelimitedtolinearityandGaussianity.Thekernelmethodsthatwerementionedearlieralsoemployasimilarity 13

PAGE 14

Figure1-2.Generaloverviewforsolvingtimeseriesproblemsinthefeaturespace measure,thekernelfunction.Butherethesimilarityispoint-wiseamongpairsofdatasamples.ThustorepresentthedatasetforsolvingmostoftheproblemsoneisrequiredtoevaluatethekernelpairwisewithalltheavailabledatasamplescreatingalargeGrammatrix.Thoughthesetypeofmethodscansolvecertaincomplexproblems,theyareusuallyquiteburdensomebothontimeandmemory.YetmanyresearchersareattractedtowardkernelmethodsbecauseitcansolvecomplexproblemselegantlyusingconventionaloptimalsignalprocessingtheorybutinarichkernelinducedreproducingkernelHilbertspaceRKHS.Thisspaceisusuallyveryhighdimensional,butmostsolutionsintheRKHScanbereadilycalculatedintheinputspaceusingthekernelfunctionwhichactsasaninnerproductofthefeaturevectors.Throughourstudy,weshallpresentanewfunctionthatisastatisticalmeasurehavingthesameformasthecorrelationfunctionandlikethethekernelwillhaveanassociatedRKHS.Thisprovidesacompletelynewavenueofresearch,potentiallysolvingcomplexproblemsmoreaccuratelyandconveniently.Specically,weshalldeneageneralizedcorrelationfunction,whichduetoitsintriguing 14

PAGE 15

relationshipwithRenyi'sQuadraticentropyandpropertiessimilartocorrelation,istermedascorrentropy.Thisnewfunctioncanprovidenovelmeansofperformingvarioussignalprocessingtasksinvolvingdetectionandestimation.Thespecictasksthatwediscussinthisdissertationaresignaldetectioncorrelationreceiverandoptimalltering.Theintegralcomponentforbothoftheseproblemsisasimilaritymeasure,usingtheconceptsofcorrentropyandkernelmethods.Nowweintroducethesetasksmorespecically.1.1SignalDetectionDetectionofknownsignalstransmittedthroughlinearandnonlinearchannelsisanimportantfundamentalprobleminsignalprocessingtheorywithawiderangeofapplicationsincommunications,radar,andbiomedicalengineeringtonamejustafew[ 12 19 ].Thelinearcorrelationlterormatchedlterhasbeenthebasicbuildingblockforthemajorityoftheseapplications.Thelimitationsofthematchedlter,though,arealreadydenedbytheassumptionsunderwhichitsoptimalitycanbeproved.ItiswellknownthatforthedetectionofaknownsignallinearlyaddedtowhiteGaussiannoiseAWGNthematchedltermaximizesthesignaltonoiseratioSNRamongalllinearlters[ 20 ].Theoretically,thismeansthatthematchedlteroutputisamaximumlikelihoodstatisticforhypothesistestingundertheassumptionsoflinearityandGaussianity.Theoptimalityispredicatedonthesuciencyofsecondorderstatisticstocharacterizethenoise.Unfortunately,mostrealworldsignalsarenotcompletelycharacterizedbytheirsecondorderstatisticsandsub-optimalityinevitablycreepsin.OptimaldetectioninnonGaussiannoiseornon-linearenvironmentsrequirestheuseofthecharacteristicfunctionandismuchmorecomplex[ 21 ]becauseitrequireshigherorderstatisticstoaccuratelymodelthenoise.Thismotivatestherecentinterestinnonlinearlterskernelmatchedlters[ 22 23 ]ornonlinearcostfunctions[ 24 ],butthecomputationalcomplexityofsuchsystemsoutweighstheirusefulnessinapplicationswherehighprocessingdelaycannotbetoleratedsuchasinradarandmobilecommunicationsystems.Withkernel 15

PAGE 16

methods,anonlinearversionofthetemplatematchingproblemisrstformulatedinkernelfeaturespacebyusinganonlinearmappingandthesocalledkerneltrick[ 22 ]isemployedtogiveacomputationallytractableformulation.Butthecorrelationmatrixformedinthisinnitedimensionalfeaturespaceisalsoinnitelylargeandtheresultingformulationiscomplexusingalargesetoftrainingdata.Alternativelyitcanbeformulatedasadiscriminantfunctioninkernelspace[ 23 ],butstillsuersfromtheneedtotrainthesystembeforehandandstorethetrainingdata.Thematchedlterbasedonquadraticmutualinformationisanotherrecentlyintroducednonlinearlterthatmaximizesthemutualinformationbetweenthetemplateandtheoutputofthelter[ 24 ].Thismethoddoesnotrequireaninitialtrainingstepsinceitisnon-parametric.However,themethodrequirestheestimationofthequadraticmutualinformationwithkernelsandisideallyvalidonlyforidenticallyandindependentlydistributediidsamples,whichisrarelythecaseinreality.Moreover,thecomputationalloadisstillON2atbest.Thederivationofthemethodintroducedinthisdissertationusesarecentlyintroducedpositivedenitefunctioncalledcorrentropy[ 15 ],whichquantieshigherordermomentsofthenoisedistributionandhasacomputationalcomplexityofON,thusprovidingausefulcombinationofgoodrepresentationandlesscomputationalcomplexity.1.2OptimalandAdaptiveFilteringDuetothepowerofthesolutionandtherelativelyeasyimplementation,Wienerltershavebeenextensivelyusedinalltheareasofelectricalengineering.Despitethiswidespreaduse,Wienerltersaresolutionsinlinearvectorspaces.Therefore,manyattemptshavebeenmadetocreatenonlinearsolutionstotheWienerltermostlybasedonVolterraseries[ 25 ],butunfortunatelythesolutionsareverycomplexwithmanycoecients.Therearealsotwotypesofnonlinearmodelsthathavebeencommonlyused:TheHammersteinandtheWienermodels.Theyarecomposedofastaticnonlinearityandalinearsystem,wherethelinearsystemisadaptedusingtheWienersolution.However,thechoiceofthenonlinearityiscriticalforgoodperformance,becauselinearsolutionis 16

PAGE 17

obtainedinthetransformedspace.Therecentadvancesofnonlinearsignalprocessinghaveusednonlinearlters,commonlyknownasdynamicneuralnetworks[ 26 ]thathavebeenextensivelyusedinthebasicsameapplicationsofWienerlterswhenthesystemunderstudyisnonlinear.However,thereisnoanalyticalsolutiontoobtaintheparametersofmulti-layeredneuralnetworks.Theyarenormallytrainedusingthebackpropagationalgorithmoritsmodications.Insomeothercases,anonlineartransformationoftheinputisrstimplementedandaregressioniscomputedattheoutput.GoodexamplesofthisaretheradialbasisfunctionRBFnetwork[ 27 ]andthekernelmethods[ 3 28 ].Thedisadvantageofthesealternatetechniquesofprojectionisthetremendousamountofcomputationrequiredduetotherequiredinversionofahugematrixandthereisusuallyaneedforregularization.WeshowhowtoextendtheanalyticsolutioninlinearvectorspacesproposedbyWienertoanonlinearmanifoldthatisobtainedthroughareproducingkernelHilbertspace.ThemainideaistotransformtheinputdatanonlinearlytobeGaussiandistributedandthenapplythelinearWienerltersolution.OurmethodactuallyencompassesandenrichestheHammersteinmodelbyinducingnonlinearitieswhichmaynotbeachievedviastaticnonlinearity.Forviablecomputation,thisapproachutilizesarathersimplisticapproximationduetowhichtheresultsarelessoptimalbutstillperformsbetterthanthelinearWienerlter.ToimproveitsresultsweproposetoobtaintheleastmeansquareerrorsolutiononlinefollowingtheapproachemployedinWidrow'sfamousLMSalgorithms[ 29 ]. 17

PAGE 18

CHAPTER2NEWSIMILARITYANDCORRELATIONMEASURE2.1Kernel-BasedAlgorithmsandITLInthelastyearsanumberofkernelmethods,includingSupportVectorMachinesSVM[ 3 ],kernelprincipalcomponentanalysisK-PCA[ 4 ],kernelFisherdiscriminantanalysisK-FDA[ 5 ],andkernelcanonicalcorrelationanalysisK-CCA[ 6 ],[ 7 ]havebeenproposedandsuccessfullyappliedtoimportantsignalprocessingproblems.Thebasicideaofkernelalgorithmsistotransformthedataxifromtheinputspacetoahighdimensionalfeaturespaceofvectorsxi,wheretheinnerproductscanbecomputedusingapositivedenitekernelfunctionsatisfyingMercer'sconditions[ 3 ]:xi;xj=hxi;xji.Thissimpleandelegantideaallowsustoobtainnonlinearversionsofanylinearalgorithmexpressedintermsofinnerproducts,withoutevenknowingtheexactmapping.AparticularlyinterestingcharacteristicofthefeaturespaceisthatitisareproducingkernelHilbertspaceRKHS:i.e.,thespanoffunctionsf;x:x2XgdenesauniquefunctionalHilbertspace[ 1 ],[ 2 ],[ 30 ],[ 31 ].Thecrucialpropertyofthesespacesisthereproducingpropertyofthekernelfx=h;x;fi;8f2F:{1Inparticular,wecandeneournonlinearmappingfromtheinputspacetoaRKHSasx=;x,thenwehavehx;yi=h;x;;yi=x;y;{2andthusx=;xdenestheHilbertspaceassociatedwiththekernel.Withoutlossofgenerality,inthischapterwewillonlyconsiderthetranslation-invariantGaussiankernel,whichisthemostwidelyusedMercerkernel.x)]TJ/F37 11.955 Tf 11.955 0 Td[(y=1 p 2exp)]TJ/F26 11.955 Tf 11.291 20.444 Td[(kx)]TJ/F37 11.955 Tf 11.955 0 Td[(yk2 22!{3 18

PAGE 19

Ontheotherhand,InformationTheoreticLearningITLaddressestheissueofextractinginformationdirectlyfromdatainanon-parametricmanner[ 8 ].Typically,Renyi'sentropyorsomeapproximationtotheKullback-LeiblerdistancehavebeenusedasITLcostfunctionsandtheyhaveachievedexcellentresultsonanumberofproblems:e.g.,timeseriesprediction[ 9 ],blindsourceseparation[ 10 ]orequalization[ 11 ].IthasbeenrecentlyshownthatITLcostfunctions,whenestimatedusingtheParzenmethod,canalsobeexpressedusinginnerproductsinakernelfeaturespacewhichisdenedbytheParzenkernel,thussuggestingacloserelationshipbetweenITLandkernelmethods[ 32 ],[ 33 ].Forinstance,ifwehaveadatasetx1;;xN2Rd,andthecorrespondingsetoftransformeddatapointsxi;;xN,thenitturnsoutthatthesquaredmeanofthetransformedvectors,i.e.,kmk2=*1 NNXi=1xi;1 NNXj=1xj+=1 N2NXi=1NXj=1xi)]TJ/F37 11.955 Tf 11.956 0 Td[(xj;{4istheinformationpotentialVxasdenedin[ 8 ] 1 .AnotherinterestingconceptininformationtheoreticlearningistheCauchy-Schwarzpdfdistance,whichhasbeenprovedtobeeectiveformeasuringtheclosenessbetweentwoprobabilitydensityfunctionsandhasbeenusedsuccessfullyfornon-parametricclustering.IfpXxandpYyarethetwopdfs,Cauchy-Schwarzpdfdistanceisdened[ 8 ]byDpX;pY=)]TJ/F21 11.955 Tf 9.298 0 Td[(logRpXxpYxdx q Rp2XxdxRp2Yxdx0:{5MutualinformationMIindicatestheamountofsharedinformationbetweentwoormorerandomvariables.Ininformationtheory,theMIbetweentworandomvariablesXandYis 1ThequadraticRenyi'sentropyisdenedasHR=)]TJ/F15 11.955 Tf 11.291 0 Td[(logVx. 19

PAGE 20

traditionallydenedbyShannonas[ 34 ].IsX;Y=ZZpXYx;ylogpXYx;y pXxpYydxdy{6wherepXYx;yisthejointprobabilitydensityfunctionpdfofXandY,andPXxandPYyarethemarginalpdfs.ThecrucialpropertyofmutualinformationforourpurposesisthefactthatitmeasuresthedependencyevennonlinearbetweentworandomvariablesXandY.IfXandYareindependent,MIbecomeszero[ 34 ].Inasense,MIcanbeconsideredageneralizationofcorrelationtononlineardependencies;thatisMIcanbeusedtodetectnonlineardependenciesbetweentworandomvariables,whereastheusefulnessofcorrelationislimitedtolineardependencies.However,inordertoestimateMIonehastoassumethatthesamplesareiid,whichisnotthecasefortemplatesthatarewaveforms.AlthoughShannonsMIisthetraditionallypreferredmeasureofsharedinformation,essentiallyitisameasureofdivergencebetweenthevariablesXandYfromindependence.Basedonthisunderstanding,adierent,butqualitativelysimilarmeasureofindependencecanbeobtainedusingtheCauchy-Schwartzinequalityforinnerproductsinvectorspaces:hx;yikxkkyk.ThefollowingexpressionisdenedastheCaucy-SchwartzMutualInformationCS-QMIbetweenXandY[ 8 ]:IsX;Y=1 2logRRp2XYx;ydxdyRRpX2xpY2ydxdy )]TJ 5.48 -0.054 Td[(RRpXYx;ypXxpYydxdy2{7Withdataavailable,IsX;YcanbeestimatedusingParzenwindowdensityestimationandcanbeusedasastatisticforsignaldetectionasin[ 24 ].WeshalluseCS-QMIasacomparisonagainstourproposedmethodaswellsincethisisadirecttemplatematchingschemethatrequiresnotrainingandshowsimprovedperformanceinnon-Gaussianandnonlinearsituations.Similarly,theequivalencebetweenkernelindependentcomponentanalysisK-ICAandaCauchy-Schwartzindependencemeasurehasbeenpointedoutin[ 35 ].Infact,alllearningalgorithmsthatusenonparametricprobabilitydensityfunctionpdfestimatesintheinputspaceadmitanalternativeformulationaskernel 20

PAGE 21

methodsexpressedintermsofdotproducts.Thisinterestinglinkallowsustogainsomegeometricalunderstandingofkernelmethods,aswellastodeterminetheoptimalkernelparametersbylookingatthepdfestimatesintheinputspace.SincethecostfunctionsoptimizedbyITLalgorithmsor,equivalently,bykernelmethodsinvolvepdfestimates,thesetechniquesareabletoextractthehigherorderstatisticsofthedataandthatexplainstosomeextenttheimprovementovertheirlinearcounterpartsobservedinanumberofproblems.Despiteitsevidentsuccess,amajorlimitationofallthesetechniquesisthattheyassumeindependentandidenticallydistributedi.i.d.inputdata.However,inpracticemostofthesignalsinengineeringhavesomecorrelationortemporalstructure.Moreover,thistemporalstructurecanbeknowninadvanceforsomeproblemsforinstanceindigitalcommunicationsworkingwithcodedsourcesignals.Therefore,itseemsthatmostoftheconventionalITLmeasuresarenotusingalltheavailableinformationinthecaseoftemporallycorrelatednon-whiteinputsignals.ThemaingoalistopresentanewfunctionthatunlikeconventionalITLmeasureseectivelyexploitsboththestatisticalandthetime-domaininformationabouttheinputsignal.Thisnewfunction,whichwerefertoascorrentropyfunction,willbepresentedinthenextsection.2.2GeneralizedCorrelationFunction2.2.1CorrentropyasGeneralizedCorrelationAnewmeasureofgeneralizedcorrelationcalledcorrentropywaspresentedin[ 15 ].Thedenitionisasfollows:Letfxt;t2TgbeastochasticprocesswithTbeinganindexset.ThegeneralizedcorrelationfunctionVs;tisdenedasafunctionfromTTintoR+givenbyVs;t=E[xs;xt];{8 21

PAGE 22

whereE[]denotesmathematicalexpectation.UsingaseriesexpansionfortheGaussiankernel,thecorrelationfunctioncanberewrittenasVs;t=1 p 21Xn=0)]TJ/F15 11.955 Tf 9.299 0 Td[(1n 2n2nn!Ekxs)]TJ/F37 11.955 Tf 11.955 0 Td[(xtk2n;{9whichinvolvesalltheeven-ordermomentsoftherandomvariablekxs)]TJ/F37 11.955 Tf 11.955 0 Td[(xtk.Specically,thetermcorrespondington=1in 2{9 isproportionaltoEkxsk2+Ekxtk2)]TJ/F15 11.955 Tf -429.578 -23.908 Td[(2Ekxsxtk2=2xs+2xt)]TJ/F15 11.955 Tf 12.487 0 Td[(2Rxs;t.Thisshowsthattheinformationprovidedbytheconventionalautocorrelationfunctionisincludedwithinthenewfunction.From 2{9 ,wecanseethatinordertohaveaunivariatecorrelationfunction,alltheeven-ordermomentsmustbeinvarianttoatimeshift.Thisisastrongerconditionthanwidesensestationarity,whichinvolvesonlysecond-ordermoments.Moreprecisely,asucientconditiontohaveVt;t)]TJ/F21 11.955 Tf 12.276 0 Td[(=Visthattheinputstochasticprocessmustbestrictlystationaryontheevenmoments;thismeansthatthejointpdfpxt;xt+;mustbeunaectedbyachangeoftimeorigin.WewillassumethisconditionintherestofthedissertationwhenusingV.Foradiscrete-timestationarystochasticprocesswedenethegeneralizedcorrelationfunctionasV[m]=E[xn)]TJ/F37 11.955 Tf 11.955 0 Td[(xn)]TJ/F22 7.97 Tf 6.586 0 Td[(m],whichcanbeeasilyestimatedthroughthesamplemean^V[m]=1 N)]TJ/F21 11.955 Tf 11.955 0 Td[(m+1NXn=mxn)]TJ/F37 11.955 Tf 11.955 0 Td[(xn)]TJ/F22 7.97 Tf 6.586 0 Td[(m:{10Thisformhigherordercorrelationhasbeentermedcorrentropybecauseofitsuniquecapabilitytoprovideinformationbothonthetimestructurecorrelationofthesignalaswellasonthedistributionoftherandomvariableaverageacrossdelaysorindexesistheinformationpotential[ 15 ].Correntropyispositivedeniteandbecauseofwhich,denesawholenewreproducingkernelHilbertspaceRKHS[ 1 ].Italsohasamaximumwhenthetwoindexessandtareequal.Thesepropertiesarealsosatisedbythewidelyknowncorrelationfunction[ 2 ].Thesefactshavemadeitpossibletoexploreawholenewavenueofresearchwhichinessencecombinesthebenetsofkernelmethodsandinformationtheoreticlearning[ 36 ],[ 37 ].Inthesamewayacovarianceversionofcorrentropycan 22

PAGE 23

bedenedbyemployingvectorsthatarecenteredinthefeaturespacedenedbytheMercer'skernel.Givenanytworandomvariablesx1andx2,thecorrespondingfeaturevectorsx1andx2canbecenteredas,~xi=xi)]TJ/F21 11.955 Tf 11.956 0 Td[(E[xi];i=1;2:{11ThentheinnerproductbetweenthesetwocenteredfeaturevectorsbecomesD~x1;~x2E=hx1;x2i+E[x1]:E[x2]hE[x1];x2ihx1;E[x2]i:{12EmployingastatisticalexpectationonbothsidesoftheaboveequationwegetEhD~x1;~x2Ei=E[hx1;x2i])-222(hE[x1];E[x2]i=E[x1;x2])]TJ/F21 11.955 Tf 11.955 0 Td[(Ex2Ex1[x1;x2]:{13Thisresultsinthecenteredcorrentropybetweenthetworandomvariables.Nowwecandenethecenteredcorrentropyforrandomvectors:Givenarandomvectorx=[x1;x2;:::;xL]T,itscenteredcorrentropycanbedenedasamatrixVsuchthatthei;jthelementisVi;j=E[xi;xj])]TJ/F21 11.955 Tf 11.955 0 Td[(ExiExj[xi;xj]:{14ThefollowingtheoremisthebasisofthenovelWienerlterformulation. Theorem1. Foranysymmetricpositivedenitekerneli.e.,Mercer'sKernelxi;xjdenedon<
PAGE 24

anysetofrealnumbers1;2;:::;Lnotallzeros.Then,LXi=1LXj=1ijVi;j=LXi=1LXj=1ijE[xi;xj])]TJ/F21 11.955 Tf 11.955 0 Td[(ExiExj[xi;xj]{15Using 2{13 ,LXi=1LXj=1ijVi;j=E"LXi=1LXj=1ijhxi)]TJ/F21 11.955 Tf 11.955 0 Td[(E[xi];xj)]TJ/F21 11.955 Tf 11.956 0 Td[(E[xj]i#=E"*LXi=1i[xi)]TJ/F21 11.955 Tf 11.955 0 Td[(E[xi]];LXj=1j[xj)]TJ/F21 11.955 Tf 11.955 0 Td[(E[xj]]+#=E24LXi=1i[xi)]TJ/F21 11.955 Tf 11.955 0 Td[(E[xi]]2350:{16ThusVisnon-negativedeniteandsymmetrical.Now,itcanbeprovedthatRistheauto-covariancefunctionofarandomprocessifandonlyifRisasymmetricnon-negativekernel[ 2 ].Thetheoremthenfollows.ThismeansthatgivenVi;jforarandomprocessxn,thereexistsaGaussianprocessznsuchthatE[zi:zj])]TJ/F21 11.955 Tf 10.451 0 Td[(E[zi]:E[zj]=Vi;j. Theorem2. Foranyidenticallyandindependentlydistributedrandomvectorx,thecorrespondingGaussianrandomvectorzinthefeaturespacedenedbycorrentropyisuncorrelated,i.e.,thecovariancematrixisdiagonal.Proof:LetVbethecenteredcorrentropymatrixforxandhencecovariancematrixforzwithitsi;jthelementwithi6=jasVi;j=E[zi:zj])]TJ/F21 11.955 Tf 11.955 0 Td[(E[zi]:E[zj]=E[xi;xj])]TJ/F21 11.955 Tf 11.955 0 Td[(ExiExj[xi;xj]=ExiExj[xi;xj])]TJ/F21 11.955 Tf 11.955 0 Td[(ExiExj[xi;xj]=0:{17Italsoeasytoseethatsincexisidenticallydistributed,allthediagonaltermsarealsoequal. 24

PAGE 25

2.2.2PropertiesSomeimportantpropertiesoftheGCFcanbelistedasfollows:Property1:Foranysymmetricpositivedenitekerneli.e.,Mercerkernelxs;xtdenedonRR,thegeneralizedcorrelationfunctiondenedasVs;t=E[xs;xt]isareproducingkernel.Proof:Sincexs;xtissymmetrical,itisobviousthatVs;tisalsosymmetrical.Now,sincexs;xtispositivedenite,foranysetofnpointsfx1;;xngandanysetofrealnumbersfa1;;ang,notallzeronXi=1nXj=1aiajxi;xj>0:{18Itisalsotruethatforanystrictlypositivefunctiong;oftworandomvariablesxandy,E[gx;y]>0.ThenE"nXi=1nXj=1aiajxi;xj#>0nXi=1nXj=1aiajE[xi;xj]=nXi=1nXj=1aiajVi;j>0:{19Thus,Vs;tisbothsymmetricandpositivedenite.Now,theMoore-Aronszajntheorem[ 1 ]provesthatforeveryrealsymmetricpositivedenitefunctionoftworealvariables,thereexistsauniquereproducingkernelHilbertspaceRKHSwithasitsreproducingkernel.Hence,Vs;t=E[xs;xt]isareproducingkernel.Thisconcludesthedemonstration.Thefollowingpropertiesconsideradiscrete-timestochasticprocess,obviously,thepropertiesarealsosatisedforcontinuous-timeprocesses.Property2:V[m]isasymmetricfunction:V[)]TJ/F21 11.955 Tf 9.299 0 Td[(m]=V[m].Property3:V[m]reachesitsmaximumattheorigin,i.e.,V[m]V[0],8m.Property4:V[m]0andV[0]=1 p 2. 25

PAGE 26

Allthesepropertiescanbeeasilyproved.Properties2and3arealsosatisedbytheconventionalautocorrelationfunction,whereasProperty4isadirectconsequenceofthepositivenessoftheGaussiankernel.Property5:Letfxn;n=0;;N)]TJ/F15 11.955 Tf 11.956 0 Td[(1gbeasetofi.i.ddatadrawnaccordingtosomedistributionpx.ThemeanvalueoftheGCFestimator 2{10 coincideswiththeestimateofinformationpotentialobtainedthroughParzenwindowingwithGaussiankernels.Proof:TheParzenpdfestimateisgivenby^px=1 NPN)]TJ/F20 7.97 Tf 6.587 0 Td[(1n=0x)]TJ/F21 11.955 Tf 12.933 0 Td[(xn,andtheestimateoftheinformationpotentialisV=Z11 NN)]TJ/F20 7.97 Tf 6.586 0 Td[(1Xn=0x)]TJ/F21 11.955 Tf 11.955 0 Td[(xn!2dx=1 N2N)]TJ/F20 7.97 Tf 6.587 0 Td[(1Xi=0N)]TJ/F20 7.97 Tf 6.586 0 Td[(1Xn=0~xn)]TJ/F21 11.955 Tf 11.955 0 Td[(xi;{20where~denotesaGaussiankernelwithtwicethekernelsizeof.OntheotherhandtheGFCestimateis^V[m]=1 NjmjPN)]TJ/F20 7.97 Tf 6.587 0 Td[(1n=mxn)]TJ/F37 11.955 Tf 12.322 0 Td[(xn)]TJ/F22 7.97 Tf 6.587 0 Td[(m,for)]TJ/F15 11.955 Tf 9.299 0 Td[(N)]TJ/F15 11.955 Tf 12.322 0 Td[(1mN)]TJ/F15 11.955 Tf 12.322 0 Td[(1,andthereforeitsmeanvalueisD^V[m]E=1 2N)]TJ/F15 11.955 Tf 11.955 0 Td[(1N)]TJ/F20 7.97 Tf 6.587 0 Td[(1Xm=)]TJ/F22 7.97 Tf 6.587 0 Td[(N+11 N)-222(jmjxn)]TJ/F37 11.955 Tf 11.955 0 Td[(xn)]TJ/F22 7.97 Tf 6.586 0 Td[(m:{21Finally,itistrivialtocheckthatallthetermsin 2{20 arealsoin 2{21 .Thisconcludestheproof.Property5clearlydemonstratesthatthisgeneralizationincludesinformationaboutthepdf.Ontheotherhand,wealsoshowedthatitalsoconveysinformationaboutthecorrelation.Forthisreasons,inthesequelwewillrefertoV[m]ascorrentropy. 26

PAGE 27

Property6:GivenV[m]form=0;;P)]TJ/F15 11.955 Tf 13.264 0 Td[(1,thenthefollowingToeplitzcorrentropymatrixofdimensionsPPV=0BBBBBBB@V[0]V[1]V[P)]TJ/F15 11.955 Tf 11.955 0 Td[(1]V[1]V[0]V[P)]TJ/F15 11.955 Tf 11.955 0 Td[(2]............V[P)]TJ/F15 11.955 Tf 11.956 0 Td[(1]V[P)]TJ/F15 11.955 Tf 11.955 0 Td[(2]V[0]1CCCCCCCA;{22ispositivedenite.Proof:MatrixVcanbedecomposedasV=PNn=mAnwhereAnisgivenbyAn=0BBBBBBB@xn)]TJ/F37 11.955 Tf 11.955 0 Td[(xnxn)]TJ/F37 11.955 Tf 11.956 0 Td[(xn)]TJ/F20 7.97 Tf 6.587 0 Td[(1xn)]TJ/F37 11.955 Tf 11.955 0 Td[(xn)]TJ/F22 7.97 Tf 6.586 0 Td[(P)]TJ/F20 7.97 Tf 6.587 0 Td[(1xn)]TJ/F37 11.955 Tf 11.955 0 Td[(xn)]TJ/F20 7.97 Tf 6.587 0 Td[(1xn)]TJ/F37 11.955 Tf 11.955 0 Td[(xnxn)]TJ/F37 11.955 Tf 11.955 0 Td[(xn)]TJ/F22 7.97 Tf 6.586 0 Td[(P)]TJ/F20 7.97 Tf 6.587 0 Td[(2............xn)]TJ/F37 11.955 Tf 11.955 0 Td[(xn)]TJ/F22 7.97 Tf 6.587 0 Td[(P)]TJ/F20 7.97 Tf 6.586 0 Td[(1xn)]TJ/F37 11.955 Tf 11.955 0 Td[(xn)]TJ/F22 7.97 Tf 6.586 0 Td[(P)]TJ/F20 7.97 Tf 6.587 0 Td[(2xn)]TJ/F37 11.955 Tf 11.955 0 Td[(xn1CCCCCCCA;{23ifxi;xjisakernelsatisfyingMercer'sconditions,thenAnisapositivedenitematrix8n.Ontheotherhand,thesumofpositivedenitematricesisalsopositivedenite[ 38 ];thisprovesthatVisapositivedenitematrix.Property7:Letfxn;n2Tgbeadiscrete-timew.s.s.zero-meanGaussianprocesswithautocorrelationfunctionr[m]=E[xnxn)]TJ/F22 7.97 Tf 6.586 0 Td[(m].ThecorrentropyfunctionforthisprocessisgivenbyV[m]=8><>:1 p 2;m=01 p 22+2[m];m6=0{24whereisthekernelsizeand2[m]=2r[0])]TJ/F21 11.955 Tf 11.955 0 Td[(r[m].Proof:ThecorrentropyfunctionisdenedasV[m]=E[xn)]TJ/F21 11.955 Tf 11.956 0 Td[(xn)]TJ/F22 7.97 Tf 6.586 0 Td[(m].Sincexnisazero-meanGaussianrandomprocess,form6=0zm=xn)]TJ/F21 11.955 Tf 12.55 0 Td[(xn)]TJ/F22 7.97 Tf 6.587 0 Td[(misalsoazero-meanGaussianrandomvariablewithvariance2[m]=2r[0])]TJ/F21 11.955 Tf 11.955 0 Td[(r[m].ThereforeV[m]=Z1zm1 p 2[m]exp)]TJ/F26 11.955 Tf 11.291 16.857 Td[(z2m 22[m]dzm:{25 27

PAGE 28

SinceweareconsideringaGaussiankernelwithvariance2,equation 2{25 istheconvolutionoftwozero-meanGaussiansofvariances2and[m]2evaluatedattheorigin;thisyields 2{24 immediately.Property7clearlyreectsthatcorrentropyconveysinformationaboutthetimestructureoftheprocessandalsoaboutitspdfviaquadraticRenyi'sentropy.AsaconsequenceofProperty7,iffxn;n2Tgisawhitezero-meanGaussianprocesswithvariance2xwehavethatV[m]=1 p 22+2x,8m6=0,whichcoincideswiththemeanvalueofthefunctionand,ofcourse,istheinformationpotential 2 ofaGaussianrandomvariableofvariance2x,whenitspdfhasbeenestimatedviaParzenwindowingwithaGaussiankernelofsize2.Property8:Thecorrentropyestimator 2{10 isunbiasedandasymptoticallyconsistent.Thepropertiesoftheestimatorcanbederivedfollowingthesamelinesusedfortheconventionalcorrelationfunction[ 39 ].2.3SimilarityMeasureFortemplatematchinganddetectionthedecisionstatisticisingeneralasimilarityfunction.Formatchedlteringthecross-correlationfunctionisused.Herewedeneanon-linearsimilaritymetricwhichcanbeexplainedinprobabilisticterms.Inspiredbytheconceptofcorrentropy,wedeneameansofmeasuringsimilaritybetweentworandomprocesses.Thecross-correntropyatiandjfortworandomprocessesXkandYkcanbedenedby:VXYi;j=E[Xi;Yj]:{26 2R^px2dx=1 p 22+2x 28

PAGE 29

Withtheassumptionsofergodicityandstationarity,thecrosscorrentropyatlagmbetweenthetworandomprocessescanbeestimatedby^VXYm=1 NNXk=1xk)]TJ/F22 7.97 Tf 6.587 0 Td[(m;yk{27Fordetectionifweassumethatweknowthetimingofthesignalwejustneedtouse^VXY=1 NNXk=1xk;yk:{28Property9:Thecross-correntropyestimatedenedin 2{28 fortworandomvariablesxkandyk,eachiid,approachesthroughParzendensityestimationtoameasureofprobabilitydensityalongalinegivenbyDXY=Z1Z1PXYx;yx)]TJ/F21 11.955 Tf 11.955 0 Td[(ydxdy;{29wherex)]TJ/F21 11.955 Tf 12.216 0 Td[(yistheDirac'sdeltafunction.DXYin 2{29 denestheintegralalongthex=yofthejointprobabilitydensityfunctionPDFPXY.Hence,thisgivesadirectprobabilisticmeasureonhowsimilarthetworandomvariablesare.Proof:TheestimateofthejointpdfusingParzenwindowingwithaGaussiankernelandavailableNsamplescanbewrittenas^PXYx;y=1 NNXk=1x;xky;yk2{30whereistheGaussiankernelwithvariance2.Usingtheestimateofthepdfwehave^DXY{31=1 NNXk=11Z1Zx;xky;ykx;ydxdy{32=1 NNXn=11Zx;xkx;ykdx:{33 29

PAGE 30

HerewecanusetheconvolutionpropertyoftheGaussiankernelresultingin^DXY=1 NNXk=1p 2xk;yk2{34Thiscoincideswith 2{28 withthekernelhavingvariance22Thiscanalsobeinferredfrom 2{26 sinceVXY=E[x;y]=1Z1Zx;yPXYx;ydxdy:{35Thisisthejusttheintegralalongastripdenedbythekernelonthex=yline.For!0,thekernelwouldbetheDirac'sdeltafunctioncoincidingwith 2{29 .Figure 2-1 illustratesthisgraphically.ThismeansthatVXYincreasesasmoredatapointsinthejointspacelieclosertothelinex=yandkernelsizeregulateswhatisconsideredclose.ThisalsogivethemotivationofusingVXYasasimilaritymeasurebetweenXandY.Property10:Thecross-correntropyestimatedenedin 2{28 fortworandomvariablesxiandyi,eachiid,isdirectlyrelatedthroughParzendensityestimationtoCauchy-Schwarzpdfdistancedenedin 2{5 .Toexploretherelationshipbetweenthesetwoquantitiesletusestimatethepdf'softheserandomvariablesusingParzenestimation:^pXx=1 NNXi=1x;xi{36^pYy=1 NNXi=1y;yi{37Thenumeratorin 2{5 canbewrittenasR^pXx^pYxdx=1 N2NPj=1NPi=1Rx;xix;yidx=1 N2NPj=1NPi=1p 2xi;yjEYEX[p 2X;Y]=E[p 2X;Y]1 NNPi=1p 2xi;yi=^VXY:{38 30

PAGE 31

Figure2-1.ProbabilisticinterpretationofVXYthemaximumofeachcurvehasbeennormalizedto1forvisualconvenience. Heretheapproximationsareforswitchingbetweenstatisticalexpectationsandsampleaverages,marginalexpectationsabovewithrespecttothesubscriptintheexpectationoperatorarereplacedwithajointexpectationwithoutthesubscript.Thesestepsarevalidaslongastherandomvariablesareindependent.^VXYisthesameasin 2{28 withakernelvarianceof22.Thusfrom 2{5 ,theCauchy-SchwarzpdfdistancecanbewrittenasD^pX;^pY=)]TJ/F15 11.955 Tf 11.291 0 Td[(logVXY+1 2logGX+1 2logGY;{39whereGXandGYareestimatesofinformationpotentialsdenedin 2{4 forXandY,respectively.NotethatVXYaccountsfortheinteractionbetweenthetworandomvariablesintheCauchy-Schwarzpdfdistance,sinceitgivesanestimateoftheinnerproductbetweenthetwopdf'sasshownin 2{38 31

PAGE 32

Property11:ThecrosscorrentropyfunctionbetweentworandomvariablesXandYdenedin 2{26 isthecrosscorrelationoftworandomvariablesUandZ.Thatis,^VXY=1 NNXk=1xk;yk=1 NNXk=1ukzk=^RUZ;{40where^RUZistheestimateofthecrosscorrelationbetweenUandZusingsampleaverage.Thecorrentropymatrixforthevector[XY]Tgivenby^VXY=264^VXX^VXY^VYX^VYY375{41ispositivedenitethenusingsimilararguments[ 2 ]usedtoprovetheorems1and2,arandomvector[UZ]Twillexistwhosecorrelationmatrixisgivenbythesamematrix.Thus^RUZ=264^RUU^RUZ^RZU^RZZ375=^VXY:{42HencetworandomvariablesUandZwillexistsatisfying 2{40 .Thisshowsthatusingthecrosscorrentropyfunctionisequivalenttosimplycomputingcrosscorrelationoftwootherrandomvariablesnonlinearlyrelatedtooriginaldatathroughcorrentropy.2.4OptimalSignalProcessingBasedonCorrentropyAswehavediscussedearliergiventhedatasamplesfxigNi=1thecorrentropykernelcreatesanotherdatasetfzxigNi=1preservingthesimilaritymeasureasE[zxizxj]=E[kxi;xj]=Vi;j.In[ 2 ]ParzenhasdescribedaninterestingconceptoftheHilbertspacerepresentationofarandomprocess.ForastochasticprocessfXt;t2TgwithTbeinganindexset,theauto-correlationfunctionRXt;sdenedasE[XtXs]issymmetricandpositive-denite,thusdeninganRKHS.ParzenshowedthattheinnerproductstructureofthisRKHSisequivalenttotheconventionaltheoryofsecondorderstochasticprocesses.AccordingtoParzen,asymmetricnon-negativekernelisthecovariancekernelofarandomfunctioninotherwordsrandomprocessandviceversa.Therefore,givena 32

PAGE 33

randomprocessfXt;t2Tgthereexistsanotherrandomprocessfzt;t2TgsuchthatE[ztzs]=Vt;s=E[kXt;Xs]:{43ztisnonlinearlyrelatedto,butofthesamedimensionasXt,whilepreservingthesimilaritymeasureinthesenseof 2{43 .Meanwhile,thelinearmanifoldoffzt;t2TgformsacongruentRKHSwithcorrentropyVt;sasthereproducingkernel.Thevectorsinthisspacewouldbenonlinearlyrelatedtotheinputspacebasedontheunderlyinginputstatisticsandhenceanylinearsignalprocessingsolutionusingacovariancefunctiongivenbycorrentropyinthisfeaturespacewouldautomaticallygiveanonlinearformulationwithrespecttotheoriginalinputdata.Forinstance,ifweconstructadesiredsignalbytheoptimalprojectiononthespaceof,thislterwouldalsobealinearoptimallterinthefeaturespaceandcorrespondinganonlinearlterwouldbeobtainedwithrespecttotheoriginalinput,thuspotentiallymodelingthenonlineardependenciesbetweentheinputandthedesiredinput.Thisconceptofusingthefeaturespacegivenbycorrentropycanbeclearlyextendedformostsupervisedlearningmethodsingeneral.Inthefollowingchaptersweshallpresenttheusefulnessofthecorrentropyfunctionbothasasimilarityandcorrelationmeasureaswellasameansofoptimalsignalprocessinginthecorrespondingfeaturespace.WeconcludethischapterbypointingoutthedierencesandsimilaritiesbetweentheRKHSinducedbycorrentropyVRKHSversustheRKHSinducedbytheGaussiankernelGRKHSasusedinkernelmethods,andtheoldParzenformulationoftheRKHSPRKHSbasedonautocorrelationofrandomprocesses.TheprimarycharacteristicoftheGRKHSisthenonlineartransformationtoahighdimensionalspacecontrolledbythenumberofinputdatasamples.Thiscreatesdicultiesinunderstandingsomesolutionsintheinputspacee.g.kernelPCA,andrequiresregularizedsolutions,sincethenumberofunknownsisequaltothenumberofsamples.Powerfulandelegantmethodshavebeenusedtosolvethisdiculty[ 3 ],[ 40 ],buttheycomplicatethesolution.Infact,insteadof 33

PAGE 34

straightleastsquaresforkernelregression,onehastocomputeaconstrainedquadraticoptimizationproblem.Thenonlinearityiscontrolledsolelybythekernelutilized,andmostofthetimesitisunclearhowtoselectitoptimally,althoughsomeimportantworkshaveaddressedthediculty[ 41 ].AlthoughcorrentropyusingtheGaussiankernelisrelatedtotheGRKHS,VRKHSisconceptually,andpracticallymuchclosertoPRKHS.Sincecorrentropyisdierentfromcorrelationinthesensethatitinvolveshigh-orderstatisticsofinputsignals,theVRKHSinducedbyauto-correntropyisnotequivalenttostatisticalinferenceonGaussianprocesses.ThetransformationfromtheinputspacetoVRKHSisnonlinearandtheinnerproductstructureofVRKHSprovidesthepossibilityofobtainingcloseformoptimalnonlinearltersolutionsbyutilizinghigh-orderstatisticsasweshallalsodemonstrate.AnotherimportantdierencecomparedwithexistingmachinelearningmethodsbasedontheGRKHSisthatthefeaturespacehasthesamedimensionoftheinputspace.ThishasadvantagesbecauseinVRKHSthereisnoneedtoregularizethesolution,whenthenumberofsamplesislargecomparedwiththeinputspacedimension.Furtherworkneedstobedoneregardingthispoint,butwehypothesizethatinourmethodology,regularizationisautomaticallyachievedduetotheinbuiltconstraintonthedimensionalityofthedata.ThexeddimensionalityalsocarriesdisadvantagesbecausetheuserhasnocontroloftheVRKHSdimensionality.Therefore,thequalityofthenonlinearsolutiondependssolelyonthenonlineartransformationbetweentheinputspaceandVRKHS.AnotherimportantattributeoftheVRKHSisthatthenonlineartransformationismediatedbythedatastatistics.AsitiswellknownfromthetheoryofHammerstein-Wienermodels[ 42 ]fornonlinearfunctionapproximationandtoacertainextentinSVMtheory[ 3 ],thenonlinearityplaysanimportantroleinperformance.Inotherwords,intheHammersteinmodels,astaticnonlinearitywhichisindependentofthedataischosenaprioriwhereasourapproachinducesthenonlinearityimplicitlybydeningageneralizedsimilaritymeasurewhichisdatadependent.OurmethodactuallyencompassesandenrichesHammerstein-Wienermodelsbyinducing 34

PAGE 35

nonlinearitieswhichmaynotbeachievedviastaticnonlinearities,andaretunedtothedatai.e.thesamekernelmayinducedierentnonlineartransformationsfordierentdatasets.TheRKHSinducedbycorrentropyisindeeddierentfromthetwomostwidelyusedRKHSstudiedinmachinelearningandstatisticalsignalprocessing.Beingdierentdoesnotmeanthatitwillalwaysbebetter,buttheseresultsshowthatthisprovidesapromisingopeningforboththeoreticalandappliedresearch.Stillcertainfacetsofthisresearch,liketheexactrelationshipoftheinputdatawiththefeaturedata,requiresfurtherinvestigationforamoreconcreteunderstandingoftheimpactonthebettermentofconventionalsignalprocessing. 35

PAGE 36

CHAPTER3CORRENTROPYBASEDMATCHEDFILTERING3.1DetectionStatisticsandTemplateMatching3.1.1LinearMatchedFilterDetectingaknownsignalinnoisecanbestatisticallyformulatedasahypothesistestingproblem,choosingbetweentwohypotheses,onewiththesignals1;kpresentH1andtheotherwiththesignals0;kpresentH0.Whenthechannelisadditivewithnoise,nk,independentofthesignaltemplates,thetwohypothesesareH0:rk=nk+s0;kH1:rk=nk+s1;k:{1WhenboththehypothesesaregovernedbyGaussianprobabilitydistribution,theoptimaldecisionstatisticisgivenbytheloglikelihood[ 43 ]asLr=logjR0j1 2 jR1j1 2)]TJ/F15 11.955 Tf 13.151 8.088 Td[(1 2r)]TJ/F37 11.955 Tf 11.956 0 Td[(m1TR)]TJ/F20 7.97 Tf 6.586 0 Td[(11r)]TJ/F37 11.955 Tf 11.955 0 Td[(m1)]TJ/F15 11.955 Tf 11.955 0 Td[(r)]TJ/F37 11.955 Tf 11.955 0 Td[(m0TR)]TJ/F20 7.97 Tf 6.586 0 Td[(10r)]TJ/F37 11.955 Tf 11.955 0 Td[(m0{2whereR0andR1aretherespectivecovariancematrices,andm0andm1aretherespectivemeanvectorsofthehypotheses.Lristhencomparedtoathreshold,andifthestatisticisgreater,hypothesisH1otherwiseH0ischosen.Thisisacomplicatedquadraticforminthedatathatisusuallynotpreferredtobecomputed.Assumptionofindependentlyandidenticallydistributediidzeromeannoisewouldreducethistoamuchsimplerexpression.Then,R0=R1=2nI,m0=s0andm1=s1,where2nisthenoisevariance,Iistheidentitymatrix,ands0ands1arethetwopossibletransmittedsignalvectors.Lr=0)]TJ/F20 7.97 Tf 18.119 4.707 Td[(1 22nr)]TJ/F37 11.955 Tf 11.955 0 Td[(s1Tr)]TJ/F37 11.955 Tf 11.955 0 Td[(s1)]TJ/F15 11.955 Tf 11.955 0 Td[(r)]TJ/F37 11.955 Tf 11.955 0 Td[(s0Tr)]TJ/F37 11.955 Tf 11.955 0 Td[(s0=)]TJ/F20 7.97 Tf 15.463 4.707 Td[(1 22nrTr)]TJ/F15 11.955 Tf 11.955 0 Td[(2rTs1+sT1s1)]TJ/F37 11.955 Tf 11.955 0 Td[(rTr+2rTs0)]TJ/F37 11.955 Tf 11.955 0 Td[(sT0s0=)]TJ/F20 7.97 Tf 15.464 4.708 Td[(1 22n)]TJ/F15 11.955 Tf 9.299 0 Td[(2rTs1+sT1s1+2rTs0)]TJ/F37 11.955 Tf 11.955 0 Td[(sT0s0{3 36

PAGE 37

Thetermsnotdependingonrcanbedroppedandtheexpressioncanberescaledwithoutanyeectonthenaldecision.Thesechangeswillbereectedonthethresholdtowhichthestatisticsiscompared.So,wecanusethefollowingdecisionstatisticwhichisnothingbutthedierencebetweenthecorrelationbetweenthereceivedsignalrandthetemplates.~Lr=rTs1)]TJ/F37 11.955 Tf 11.955 0 Td[(s0:{4This,infact,istheoutputykofthematchedlteratthetimeinstantk=,thesignallength,suchthatyk=rkhk{5wheredenotesconvolutionandhk=s1;)]TJ/F22 7.97 Tf 6.586 0 Td[(k)]TJ/F21 11.955 Tf 10.304 0 Td[(s0;)]TJ/F22 7.97 Tf 6.587 0 Td[(kisthematchedlterimpulseresponse.Thus,thelteroutputiscomposedofasignalandanoisecomponent.TheoutputachievesitsmaximumvalueatthetimeinstantwhenthereismaximumcorrelationbetweenthematchedlterimpulseresponseandthetemplatetherebymaximizingthesignaltonoiseratioSNRdenedastheratioofthetotalenergyofthesignaltemplatetothenoisevarianceSNR=1 2nTXk=0s2k:{6Thematchedlterisoneofthefundamentalbuildingblocksofalmostallcommunicationreceivers,automatictargetrecognitionsystems,andmanyotherapplicationswheretransmittedwaveformsareknown.ThewideapplicabilityofthematchedlterprincipleisduetoitssimplicityandoptimalityunderthelinearadditivewhiteGaussiannoiseAWGNframework.3.1.2CorrentropyasDecisionStatisticNowinspiredbyproperties9and10inchapter2,weshalldenethedecisionstatisticusedhenceforth.Thepropertiesabovedemonstratethatcross-correntropyisaprobabilisticsimilaritybetweentworandomvectors.Letusassumethatwehaveareceiverandachannelwithwhiteadditivenoise.Forsimplicitywetakethebinary 37

PAGE 38

Table3-1.ValuesforthestatisticforthetwocasesusingtheGaussiankernel. ReceivedsignalStatistic,LC r=n+s01 Np 22NPi=1e)]TJ/F20 7.97 Tf 7.782 6.083 Td[(s1;i)]TJ/F22 7.97 Tf 6.587 0 Td[(s0;i+ni2 22)]TJ/F20 7.97 Tf 27.677 4.707 Td[(1 Np 22NPi=1e)]TJ/F20 7.97 Tf 7.782 5.698 Td[(ni2 22 r=n+s11 Np 22NPi=1e)]TJ/F20 7.97 Tf 7.782 5.699 Td[(ni2 22)]TJ/F20 7.97 Tf 27.677 4.707 Td[(1 Np 22NPi=1e)]TJ/F20 7.97 Tf 7.782 6.084 Td[(s1;i)]TJ/F22 7.97 Tf 6.587 0 Td[(s0;i+ni2 22 detectionproblemwherethereceivedvectorisrwithtwopossiblecases:awhenthesignals0ispresenthypothesisH0,r0=n+s0andbwhenthesignals1ispresenthypothesis,r1=n+s1.Nowwebasicallywanttocheckwhetherthereceivedvectoris`closer'tos1validatingH1ortos0validatingH0basedonoursimilaritymeasurecorrentropy.Thuswhenthetimingsofthesignalisknownandtheyaresynchronizedwedenethefollowingasthecorrentropymatchedlterstatistic:LCr=1 NNXi=1ri;s1;i)]TJ/F20 7.97 Tf 14.818 4.707 Td[(1 NNXi=1ri;s0;i:{7Withthetwocasesofeithertransmittingr1signals1innoiseorr0signals0innoisetable3-1summarizesthevaluesofthestatisticsforthelinearmatchedlterandthecorrentropymatchedlterusingtheGaussiankernel.Sincethepdfofthenoiseniisconsideredsymmetrical,niand)]TJ/F21 11.955 Tf 9.299 0 Td[(niarestatisticallythesameandhenceLCr1=)]TJ/F21 11.955 Tf 9.298 0 Td[(LCr0.Weshouldalsonotethatthecorrentropymatchedltergivenby 3{7 defaultstothelinearmatchedlterbecauseaccordingtoproperty11inchapter2, 3{7 isalsomeasuringthedierenceincorrelationbetweentherandomfeaturevectorsderivedformcorrentropycorrespondingtor,s0ands1.Thusthecorrentropymatchedltercreatesalinearmatchedlterthatisnonlinearlyrelatedtotheoriginalinputdata.Ifthetransmittedsignalislaggedbyacertaindelayandthisisnotknownapriori,thefollowingcorrentropy 38

PAGE 39

decisionstatisticshouldbeused:LCr=maxm1 NNXi=1ri)]TJ/F22 7.97 Tf 6.587 0 Td[(m;s1i)]TJ/F15 11.955 Tf 11.956 0 Td[(maxm1 NNXi=1ri)]TJ/F22 7.97 Tf 6.587 0 Td[(m;s0i;{8whereri)]TJ/F22 7.97 Tf 6.586 0 Td[(mistheithsampleinthereceivedsignalvectorofsamplesinthesymbolwindowdelayedbym.Nowdependingonwhetherthedetectionschemeissynchronousornot,allthatisrequiredisforLCtobecomparedwithathresholdtodecidewhenthesignaltemplatewastransmitted.3.1.3InterpretationfromKernelMethodsSofarwehaveintroducedthecorrentropymatchedlterfromaninformationtheoreticlearningperspectiveITL.Wearriveatexpression 3{7 fromkernelmethodswithafewassumptions.Weshalltransformthedatafromtheinputspacetothekernelfeaturespaceandcompute 3{4 inthefeaturespacebyusingthekerneltrick.Butinsteadofusingtheoriginalkernelweshallusethesumofkernels,denedbyr;s=1 NNXi=1ri;si;{9wherer=[r1;r2;:::;rN]Tands=[s1;s2;:::;sN]Taretheinputvectors.Notethatisavalidkernelbyallmeans.ItistrivialtoshowthatissymmetricalandpositivedenitemerelyfromthefactthatitisasumofsymmetricalandpositivedenitefunctionsimplyingthatitisaMercerkernel.Hencecanbewrittenasr;s=Trs;{10wheremapstheinputvectortoapossiblydependingonthekernelchoseninnitedimensionalfeaturevector.Withthetwopossibletransmittedvectorss0ands1andr0asthereceivedsignalintheinputspace,thecorrespondingfeaturevectorswillbes0,s1andr,respectively.Nowapplyingthethedecisionstatistic 3{4 inthefeature 39

PAGE 40

spacewegetLr=)]TJ/F15 11.955 Tf 6.78 -6.662 Td[(s1)]TJ/F15 11.955 Tf 13.256 3.022 Td[(s0Tr=1 NNXi=1ri;s1;i)]TJ/F15 11.955 Tf 12.878 8.088 Td[(1 NNXi=1ri;s0;i:{11Lcoincideswith 3{7 .OfcourseLisnotthemaximumlikelihoodstatisticinthefeaturespace,sincethedatainthefeaturespaceisnotguaranteedtobeGaussian.Butsecondorderinformationinthefeaturespaceisknowntoextracthigherorderinformationintheinputspace.Forexample 2{4 showsthatthesquarednormofthemeanofthefeaturevectorsistheinformationpotentialwhichgivesRenyi'squadraticentropy.Wecanexpectthesameeecthereand,asweshallseelater,isalsodemonstratedintheresults.3.1.4ImpulsiveNoiseDistributionsSinceweaimtoshowtheeectivenessoftheproposedmethodinimpulsivenoiseenvironments,weshallbrieyintroducethemostcommonlyusedpdfmodelsforsuchdistributions.Thesedistributionsarecommonlyusedtomodelnoiseobservedinlow-frequencyatmosphericnoise,uorescentlightingsystems,combustionengineignition,radioandunderwateracousticchannels,economicstockprices,andbiomedicalsignals[ 44 ],[ 45 ],[ 46 ].Therearetwomainmodelsusedinliteraturethatwepresentnext.Toourknowledge,asingledetectionmethodthatcanbeappliedeasilytobothsuchmodelshasnotbeenpresentedinliteraturesofar.WeshalldemonstratethattheproposedCMFisanexception.3.1.4.1Two-termGaussianmixturemodelThetwo-termGaussianmixturemodel,whichisanapproximationtothemoregeneralMiddletonClassAnoisemodel[ 47 ]hasbeenusedtotestvariousalgorithmsunderanimpulsivenoiseenvironment[ 23 ],[ 46 ],[ 48 ].ThenoiseisgeneratedasamixtureoftwoGaussiandensityfunctionssuchthatthenoisedistributionfNn=)]TJ/F21 11.955 Tf 12.572 0 Td[("N;21+"N;22,where"isthepercentageofnoisespikesandusually22>>21. 40

PAGE 41

3.1.4.2Alpha-stabledistribution-stabledistributionsarealsowidelyusedtomodelimpulsivenoisebehavior[ 44 ],[ 45 ].ThisdistributiongraduallydeviatesfromGaussianityasdecreasesfrom2to1whenthedistributionbecomesCauchy.Thisrangeisalsoappropriatebecauseeventhoughthehighermomentsdiverge,themeanisstilldened.Thoughthepdfofan-stabledistributiondoesnothaveaclosedformexpression,itcanbeexpressedintermsofitscharacteristicfunctionFouriertransformofthepdf.Thegeneralformofthecharacteristicfunctionandmanydetailsonthe-stabledistributioncanbefoundin[ 49 ].Hereweshallonlyconsiderthesymmetric-stablenoise.Thecharacteristicfunctionofsuchnoiseisgivenbyu=e)]TJ/F22 7.97 Tf 6.586 0 Td[(juj;{12whererepresentsascaleparameter,similartoastandarddeviation.Sucharandomvariablehasnomomentgreaterorequalto,exceptforthecase=2[ 49 ].Itisnoteworthythatthedeterministicordegeneratecase=0,theGaussiancase=2,theCauchycase=1andtheLevyorPearsondistributioninthenon-symmetricframework,for=0.5aretheonlycasesforwhichthepdfpossessesaclosedformexpression[ 49 ].3.1.4.3LocallysuboptimalreceiverWhenwepresentthesimulationsinalatersection,foronecaseusingadditive-stablenoise,weshallcomparetheperformanceoftheproposedCMFdetectorwiththelocallysuboptimumLSOdetector,whichgivesanimpressiveperformancewithminimumcomplexity[ 50 ].Soweshallbrieyintroducetheseconcepts.Fordetailspleaserefertotherespectivecitations.TheLSOdetectorisderiveddirectlyfromthelocallyoptimumLOdetector[ 51 ],whoseteststatisticisgivenbyTLOr=NXk=1skgLOrk;{13 41

PAGE 42

wherethenonlinearscorefunctionisgLOx=)]TJ/F15 11.955 Tf 14.342 11.242 Td[(_fnx fnx;{14and_fnxistherstderivativeoffnx,thepdfoftheadditivenoise.Sincefnxdoesnothaveaclosedformexpressionwhenthenoiseis-stabledistributed.gLOcannotbefoundexactly.TheLSOdetectorusesanapproximationforthescorefunctiongLSOx=8><>:cx;jxj+1 x;jxj>;{15wherec=+1 2and=2:73)]TJ/F15 11.955 Tf 12.409 0 Td[(1:75istheempiricalestimateofthepeakofgLO.Asitcanbeeasilyseen,adrawbackoftheLSOdetectoristhatoneneedstoknowthevalueofalphabeforeitcanbeemployed.Weshallassumethatthisinformationavailableinthesimulationsthatwillbepresentedattheendofthischapter.3.1.5SelectionofKernelSizeThekernelsizevarianceparameterintheGaussiankernelisafreeparameterinkernelbasedandinformationtheoreticmethods,soithastobechosenbytheuser.Thereareinfactnumerouspublicationsinstatisticsonselectingtheproperkernelsize[ 52 ],[ 53 ],[ 54 ],intherealmofdensityestimation,butasystematicstudyontheeectofkernelsizeinITLisfarfromcomplete.Fortheparticularcaseofthecorrentropymatchedlterwecanshowthefollowinginterestingproperty.Property1:ThedecisionusingstatisticLCreducestotheoptimalGaussianMLdecisionintheinputspacegivenby 3{4 asthekernelsizevarianceoftheGaussianKernelincreases.Proof:UsingtheTaylor'sseriesexpansionwegetrn;sn=1)]TJ/F15 11.955 Tf 13.151 8.088 Td[(rn)]TJ/F21 11.955 Tf 11.955 0 Td[(sn2 22+rn)]TJ/F21 11.955 Tf 11.955 0 Td[(sn4 2222)-222({16 42

PAGE 43

3.1.5 Selection of Kernel Size

The kernel size (the variance parameter in the Gaussian kernel) is a free parameter in kernel based and information theoretic methods, so it has to be chosen by the user. There are in fact numerous publications in statistics on selecting the proper kernel size in the realm of density estimation [52], [53], [54], but a systematic study of the effect of the kernel size in ITL is far from complete. For the particular case of the correntropy matched filter we can show the following interesting property.

Property 1: The decision using statistic L_C reduces to the optimal Gaussian ML decision in the input space given by (3-4) as the kernel size (variance) of the Gaussian kernel increases.

Proof: Using the Taylor series expansion we get

κ(r(n), s(n)) = 1 − (r(n) − s(n))²/(2σ²) + (r(n) − s(n))⁴/(2(2σ²)²) − ···   (3-16)

Since the order of the terms increases by 2 with each term, as σ increases the contribution of the higher order terms becomes less significant compared to the lower order terms. Then κ(r(n), s(n)) ≈ 1 − (r(n) − s(n))²/(2σ²), and it is easy to see that using L_C is equivalent to using L̃, where L̃ is given by (3-4).

This means that the kernel size acts as a means of tuning the correntropy matched filter, with a larger kernel size adjusting it toward linearity and Gaussianity. For instance, in AWGN, by Property 1 the kernel size should be relatively larger than the dynamic range of the received signal, so that the performance defaults to that of the linear matched filter, which is optimal in this case. In the experiments presented below, the kernel size is chosen heuristically so that the best performance is observed. In the cases where the correntropy matched filter is expected to bring benefits, like for additive impulsive noise, the kernel size should be selected according to density estimation rules so as to represent the template well (i.e. Silverman's rule [53]), or on the order of the variance of the template signal. Silverman's rule is given by σ_opt = σ_x {4N⁻¹(2d+1)⁻¹}^(1/(4+d)), where σ_x is the standard deviation of the data, N is the data size and d is the dimension of the data. As shall be illustrated later, choosing the appropriate kernel size for the detection problem at hand is not as exacting as in many other kernel estimation problems, since a wide interval provides optimal performance for a given noise environment.

3.2 Experiments and Results

Receiver operating characteristic (ROC) curves [55] were used to compare the performance of the linear matched filter (MF), the matched filter based on mutual information (MI) and the proposed correntropy matched filter (CMF). ROC curves plot the probability of detection (PD) against the probability of false alarm (PFA) for a range [0, 1) of threshold values; the highest threshold corresponds to the origin. The area under the ROC can be used as a means of measuring the overall performance of a detector. The ROC curves were plotted for various values of signal to noise ratio (SNR), defined as the ratio of the total energy of the signal template to the noise variance.
For α-stable distributions, where the variance of the noise is not defined, the SNR was estimated as the ratio of signal power to the squared scale of the noise. 10,000 Monte-Carlo (MC) simulations were run for each of the following cases. The template is a sinusoid of length 64 given by s(i) = sin(i/10), i = 1, 2, ..., 64. Segments (chips) of length equal to the signal template, some containing the signal and others without it, were generated with a 1/2 probability of transmitting a signal. Simulations were performed under the following situations:

1. Additive white Gaussian noise (linear channel), r(k) = s(k) + n(k).

2. Additive white impulsive noise (linear channel), where the noise is generated as a mixture of two Gaussian density functions such that the noise distribution is f_N(n) = (1 − ε)N(0, σ₁²) + εN(0, σ₂²), with the percentage of noise spikes ε = 0.15 and σ₂² = 50σ₁².

3. Additive zero mean α-stable noise channel.

For each case two sets of ROC curves shall be presented: one for the case when the timing of the received symbols is known and the signal is synchronized, using the statistic (3-7), and the other when there might be an unknown delay in the received symbols, using the statistic (3-8). These two cases shall be referred to respectively as synchronous detection and asynchronous detection in the following sections. For the latter, the delay was simulated to be not larger than 20 samples, and hence the corresponding similarity measures (correlation (MF), correntropy (CMF), mutual information (MI) and correlation with the suboptimal score (LSO)) were maximized over the lags less than 20. We have used the method given in [56] with the skewness parameter β = 0 to generate the symmetric α-stable distributed samples.
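Both noise models are easy to reproduce. The sketch below (Python/NumPy, ours) draws the Gaussian-mixture noise exactly as specified (ε = 0.15, σ₂² = 50σ₁²), and symmetric α-stable samples through the standard Chambers-Mallows-Stuck construction, which we take to correspond to the method of [56]; the γ^(1/α) scaling is our reading of the convention in (3-12).

```python
import numpy as np

rng = np.random.default_rng(0)

def mixture_noise(n, var1, eps=0.15, spike_ratio=50.0):
    """Two-term Gaussian mixture: (1 - eps) N(0, var1) + eps N(0, 50 var1)."""
    spikes = rng.random(n) < eps
    std = np.where(spikes, np.sqrt(spike_ratio * var1), np.sqrt(var1))
    return std * rng.standard_normal(n)

def sas_noise(n, alpha, gamma=1.0):
    """Symmetric alpha-stable samples (skewness beta = 0), CMS construction."""
    v = rng.uniform(-np.pi / 2, np.pi / 2, n)
    w = rng.exponential(1.0, n)
    x = (np.sin(alpha * v) / np.cos(v) ** (1.0 / alpha)
         * (np.cos((1.0 - alpha) * v) / w) ** ((1.0 - alpha) / alpha))
    return gamma ** (1.0 / alpha) * x    # scale per phi(u) = exp(-gamma |u|^alpha)
```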

3.2.1 Additive White Gaussian Noise

For the AWGN case the normal matched filter is optimal, so it should outperform all the other filters under test. But the CMF, although obtained in a different way, provides almost the same performance, as can be expected from (3-16) for large kernel size and observed in figure 3-1 for synchronous detection and figure 3-2 for asynchronous detection. In fact, the performance of the CMF will approach arbitrarily close to that of the MF as the kernel size increases. The MI based matched filter also approaches similar performance when the kernel size is increased to high values, but the results are a bit inferior because the computation of CS-QMI [24] disregards the ordering of the samples.

Figure 3-1. Receiver operating characteristic curves for synchronous detection in AWGN channel with kernel variance σ²_CMF = 15, σ²_MI = 15 (the curves for MF and CMF for 10 dB overlap).

3.2.2 Additive Impulsive Noise by Mixture of Gaussians

When the additive noise is impulsive, the proposed CMF clearly outperforms both the MI detector and the linear MF (see figure 3-3 for synchronous detection and figure 3-4 for asynchronous detection). The increased robustness to impulsive noise can be attributed to the properties of the correntropy function using the Gaussian kernel, as mentioned earlier (heavy attenuation of large data values, i.e. of the outliers).
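The detector exercised in these experiments is itself a one-line computation. The sketch below assumes, from our description above, that the synchronous statistic (3-7) is the mean Gaussian-kernel similarity between the received chip and the template, and that the asynchronous statistic (3-8) maximizes it over the candidate lags; both readings are ours.

```python
import numpy as np

def cmf_stat(r, s, sigma2):
    """CMF decision statistic: mean Gaussian-kernel similarity between the
    received chip r and the template s (our reading of statistic (3-7))."""
    return float(np.mean(np.exp(-(r - s) ** 2 / (2.0 * sigma2))))

def cmf_stat_async(r, s, sigma2, max_lag=20):
    """Asynchronous detection (our reading of (3-8)): maximize over lags."""
    L = len(s)
    return max(cmf_stat(r[d:d + L], s, sigma2) for d in range(max_lag + 1))
```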

Figure 3-2. Receiver operating characteristic curves for asynchronous detection in AWGN with kernel variance σ²_CMF = 15, σ²_MI = 15.

3.2.3 Alpha-Stable Noise in Linear Channel

Now let us observe the behavior of the detection methods for α-stable distributed noise. In this case the comparison with the LSO detector is also presented, since it is close to optimal for additive α-stable distributed noise [50]. We assume that the value of alpha is known beforehand in order to use this detector. Since these distributions have second moments tending to infinity, the matched filter utterly fails. Figures 3-5 and 3-6 show these results for synchronous detection. It can be seen that the CMF and the LSO detector give almost identical performance, but the LSO detector requires one to know the exact value of α a priori. Of course, as alpha increases and approaches 2, the performance of the linear MF improves, since the noise then becomes Gaussian (see figure 3-6). The curves in figure 3-6 for the two values of α corresponding to each nonlinear detector (CMF, LSO and MI) almost coincide.
Since the variance of the α-stable noise is not well defined, SNR here again means the ratio of signal power to the squared scale of the noise. Figure 3-7 shows the ROC plots for the asynchronous detection. Once again, the performance of the CMF rivals that of the LSO detector, which is exclusively designed for the α-stable noise for a given α. These simulations demonstrate the effectiveness of the proposed CMF for this widely used impulsive noise model.

Figure 3-3. Receiver operating characteristic curves for synchronous detection in additive impulsive noise with kernel variance σ²_CMF = 5, σ²_MI = 2.

3.2.4 Effect of Kernel Size

The choice of kernel size, though important, is not as critical as in many other kernel methods and density estimation problems. For detection with the correntropy matched filter, we plot the area under the ROC for different values of the kernel size to evaluate its effect. As can be seen in figures 3-9 and 3-8, a wide range of kernel sizes works well.
Figure 3-4. Receiver operating characteristic curves for asynchronous detection in additive impulsive noise with kernel variance σ²_CMF = 5, σ²_MI = 2.

For the case with impulsive noise using the two-term Gaussian model, the values given by Silverman's rule (.2 and 4.05 for SNR values of 5 dB and 0 dB respectively) fall in the high performance region, but for the α-stable noise Silverman's rule is not computable, since the variance of the noise is ill-defined. However, it is trivial to choose a value for the kernel size through a quick scan to select the best performance for the particular application.

3.2.5 Low Cost CMF Using Triangular Kernel

Though we have presented most of our arguments using the Gaussian kernel, these arguments hold for any valid kernel. In fact, the type of kernel is usually chosen based on the type of problem or simply based on convenience. For example, if it is known beforehand that the nonlinearity involved in a system to be estimated is polynomial, then a polynomial kernel would be used [57]. Likewise, if the problem solution is based on Parzen's pdf estimation, then a pdf kernel is used, like the Gaussian, Laplacian, etcetera. Here we shall demonstrate the use of a triangular kernel so as to simplify the implementation of the CMF even more.
Figure 3-5. Receiver operating characteristic curves for synchronous detection in additive white α-stable distributed noise, kernel variance σ² = 3, α = 1.1.

Though the Gaussian kernel gives superior performance, a machine usually evaluates it through a polynomial based expansion, and bearing this complexity may not always be necessary. For instance, in our problem here, using the triangular kernel (which is a valid Parzen kernel) simplifies the CMF implementation. The triangular kernel evaluated between two real points x and y is given by

κ_a(x, y) = T((x − y)/a),   (3-17)

where a is a real positive number and

T(x) = 1 − |x| for −1 ≤ x ≤ 1, and 0 otherwise.   (3-18)

Figure 3-10 shows this function.
Figure 3-6. Receiver operating characteristic curves for synchronous detection in additive white α-stable distributed noise, kernel variance σ² = 3, SNR = 15 dB; the plots for MI and CMF almost coincide for both values of α.

All of the ideas and arguments presented in this chapter remain valid for this kernel. Note that using this kernel, one does not require even a single multiplication operation. Figure 3-11 shows the ROC plot of the previous methods along with the CMF using the triangular kernel. The SNR is 5 dB, with the noise generated with a mixture of Gaussian pdfs as discussed before. The triangular kernel (3-17) used a width a = 1.5. The parameter values for the other methods are the same as shown in figure 3-3.
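A sketch of the low cost variant (again illustrative Python, not from the text) follows; with a fixed width a, the 1/a scaling can be folded into a precomputed threshold, leaving essentially additions and comparisons in the inner loop.

```python
import numpy as np

def tri_kernel(x, y, a):
    """Triangular Parzen kernel (3-17)-(3-18): T((x - y)/a)."""
    u = np.abs(x - y) / a
    return np.where(u <= 1.0, 1.0 - u, 0.0)

def cmf_tri(r, s, a=1.5):
    """Low cost CMF decision statistic with the triangular kernel of width a."""
    return float(np.mean(tri_kernel(r, s, a)))
```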

Figure 3-7. Receiver operating characteristic curves for asynchronous detection in additive white α-stable distributed noise, kernel variance σ² = 3, SNR = 15 dB; the plots for MI and CMF almost coincide for both values of α.
Figure 3-8. Area under the ROC for various kernel size values for additive alpha-stable distributed noise using synchronous detection, α = 1.1.
Figure 3-9. Area under the ROC for various kernel size values for additive impulsive noise with the mixture of Gaussians, using synchronous detection.

Figure 3-10. Triangular function that can be used as a kernel.
Figure 3-11. Receiver operating characteristic curves for SNR of 5 dB for the various detection methods.
CHAPTER 4
APPLICATION TO SHAPE CLASSIFICATION OF PARTIALLY OCCLUDED OBJECTS

4.1 Introduction

A vast amount of papers and books in the statistical and pattern analysis literature has been devoted to the problem of recognizing shapes using various landmarks [58], [59], [60]. Nevertheless, one of the major challenges yet to be solved is the automatic extraction of landmarks for the shape when the object is partially occluded [60], [61], [46]. The problem is mostly caused by misplaced landmarks corresponding to the occluded part of the object. Using the proposed robust detector, the influence of the occluded portions will automatically be less significant. The idea is that the occlusion may be treated as outliers, and hence may be modeled by some kind of heavy tailed distribution; in a previous chapter we have already seen that the correntropy matched filter is robust against impulsive noise. With our approach there is no need to extract so called landmarks based on certain properties of the shape boundary, like change in curvature or presence of corners. Instead, our algorithm uses the shape boundaries directly, trying to match them with one of the known shape templates. This is done by simultaneously adjusting for the possible rotations and scaling through a fixed number of parameter updates.

4.2 Problem Model and Solution

A large class of classification problems can be solved using the classical linear model. Consider the following M-ary classification problem. For m = 0, 1, ..., M−1, the mth hypothesis is

Y = AX_m + E,   (4-1)

where Y is a K×N measurement matrix, AX_m is the measurement component due to the signal of interest when model (or class) m is in force, and E is an additive zero mean noise matrix of size K×N. A is an unknown K×P matrix, and X_m is a P×N matrix. The signal component can also be called a subspace signal [46].
All these quantities may also be complex valued. Hence, given the measurement matrix Y, the problem is to classify which of the M signal components AX_m is present. If some of the model parameters are unknown, as here, a usual approach for detectors is the generalized likelihood ratio (GLR) principle [62]. The GLR detector replaces the unknown parameters by their maximum likelihood estimates (MLE). However, if the pdf of the noise matrix is unknown, or it is not possible to find the MLEs of those parameters, GLR based methods are hard to implement. This is usually true when the noise model deviates from the usual assumptions of Gaussianity, and it is true in our example as well. In this chapter we shall demonstrate the proposed robust detector on shape classification of partially occluded fish silhouettes. Figure 4-1 shows the set of 16 fish silhouettes, downloaded from www.lems.brown.edu/vision/software, that constitute the templates applied in the numerical example. For each silhouette the boundary was extracted. In relation to the model given by (4-1), X_m represents the matrix each column of which is a point on the boundary of the shape, and A is a scaling and rotation matrix of the form

A = [ a  −b
      b   a ].   (4-2)

A distorted version of one of the shapes in the template database is given. That shape is then matched against each of the database elements; hence, in our case, 16 template matching operations are performed. For each case, the optimal scaling, rotation and translation parameters are found by maximizing the correntropy between the database shape boundary and the boundary of the given distorted shape. Thus the corresponding cost function is

J(A, d) = Σ_{i=1}^{N} exp(−‖Ax_i − d − y_i‖² / (2σ²)),   (4-3)

where x_i and y_i are the ith columns of X_m and Y respectively. This function is maximized with respect to the scaling and rotation matrix A and the translation vector d.
Figure 4-1. The fish template database.

Let

a = [a, b]^T.   (4-4)

Then, by differentiating (4-3) with respect to a and equating the gradient to zero, one can get the following fixed point update rule:

a ← [Σ_{i=1}^{N} z_i exp(−‖Ax_i − d − y_i‖²/(2σ²))] / [Σ_{i=1}^{N} ‖x_i‖² exp(−‖Ax_i − d − y_i‖²/(2σ²))],   (4-5)
with

z_i = [ x_{i1}y_{i1} + x_{i2}y_{i2},  x_{i1}y_{i2} − x_{i2}y_{i1} ]^T,   (4-6)

where x_{ik} and y_{ik} are the kth components of the vectors x_i and y_i respectively. Similarly, by differentiating with respect to d and equating the gradient to zero, one can get the following fixed point update rule for d:

d ← [Σ_{i=1}^{N} (Ax_i − y_i) exp(−‖Ax_i − d − y_i‖²/(2σ²))] / [Σ_{i=1}^{N} exp(−‖Ax_i − d − y_i‖²/(2σ²))].   (4-7)

After the optimization process is finished for each of the templates in the database, the shape corresponding to the largest value of the cost function is chosen; thus the occluded shape is recognized. Since there were 16 silhouettes, a database of the 16 templates is stored by extracting the boundaries for each of them. Since a shape is completely described by its boundary, we can use any conventional technique to get the boundaries. For instance, in our example, for each silhouette template in the database, 100 points were evenly extracted to represent the boundary. We also have to extract the boundary points of the occluded shape. When using the proposed robust detector, we have to extract an ordered set of points, like a time series, and a fixed number of boundary points, the same as the number of boundary points in the templates. Note that these boundary points have to be properly ordered, since the kernel function is evaluated for each point in the boundary with one point in the template. An easy and apparently effective way is to order the points as follows: for k = 1, 2, ..., database size, pick the kth point in the boundary of the template and choose the closest point in the boundary of the occluded object to be its kth point. We shall call this the nearest point ordering (NPO).
Since the perimeter of the occluded object might be very large, it is advisable to extract a larger number of boundary points for the occluded object than for the templates in the database. Here we have used twice as many points for the occluded shape as for the database templates. Thus, for each shape template there will be an ordered set of vector points, i.e., the matrix X_m (m = 1, 2, ..., M) of boundary points, and also a corresponding matrix of boundary points of the occluded object given by Y. M is the number of templates or classes (here the database size, 16). The summary of the algorithm is presented next.

Center the boundary points of the received occluded shape by subtracting the mean vector from each point.
For template index m = 1 to M
... For fixed point update index i = 1 to (number of iterations)
... ... Perform nearest point ordering of Y with respect to X_m to get the new Y.
... ... Perform an update of (A, d) using (4-5) and (4-7).
... End (For loop), getting (A_opt, d_opt).
... Compute J_opt(m) = J(A_opt, d_opt).
End (For loop), getting J_opt(m), m = 1, 2, ..., M.
m_opt = argmax_m J_opt(m).
Choose the m_opt-th shape template as the most likely shape for the received occluded object.

The inputs to the detector were occluded versions of shape number 1 in figure 4-1. The occluded versions were constructed by random rotation within 60 degrees, scaling by a random factor between 1 and 2, and translation of two fishes of shape no. 1. Only occluded silhouettes where the ratio of the occluded area to the area of the silhouette itself was between 50% and 90% were considered. Figures 4-2 and 4-3 show a possible occluded version of shape no. 1 and its boundary respectively.
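A compact sketch of the inner loop is given below (Python/NumPy, ours; X is the N×2 template boundary, Yall the boundary of the occluded shape with twice as many points, and the iteration count is illustrative). It implements the nearest point ordering and the fixed point updates (4-5)-(4-7); matching the nearest points in the transformed frame is our reading of the NPO step.

```python
import numpy as np

def npo(X, Yall, A, d):
    """Nearest point ordering: for the kth template point pick the closest
    point of the occluded boundary (matched here in the transformed frame)."""
    P = X @ A.T - d
    dist2 = ((P[:, None, :] - Yall[None, :, :]) ** 2).sum(-1)
    return Yall[np.argmin(dist2, axis=1)]

def match_template(X, Yall, sigma2=15.0 ** 2, iters=20):
    """Fixed point maximization of (4-3) over the scale/rotation pair (a, b)
    of (4-2) and the translation d; returns the cost and the fit."""
    a = np.array([1.0, 0.0])                    # (a, b) of (4-4)
    d = np.zeros(2)
    for _ in range(iters):
        A = np.array([[a[0], -a[1]], [a[1], a[0]]])
        Y = npo(X, Yall, A, d)                  # re-order points each sweep
        resid = X @ A.T - d - Y
        w = np.exp(-(resid ** 2).sum(1) / (2.0 * sigma2))
        z = np.stack([X[:, 0] * Y[:, 0] + X[:, 1] * Y[:, 1],
                      X[:, 0] * Y[:, 1] - X[:, 1] * Y[:, 0]], axis=1)  # (4-6)
        a = (w @ z) / (w @ (X ** 2).sum(1))                            # (4-5)
        A = np.array([[a[0], -a[1]], [a[1], a[0]]])
        d = (w @ (X @ A.T - Y)) / w.sum()                              # (4-7)
    return w.sum(), A, d
```

The classification loop then simply runs this for each of the M templates and keeps the index maximizing J_opt(m), exactly as in the summary above.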

Figure 4-2. The occluded fish.

Table 4-1. Recognition and misclassification rates.

Shape number    Proposed method    LMedS (RWR)
1               95%                63%
2 to 16         5%                 27%

1000 Monte Carlo (MC) simulations were performed by generating the occluded shape randomly and classifying it to one of the 16 possible known shapes. The table shows the recognition and misclassification rates for the proposed method, along with the LMedS method using row weighted residuals (RWR) as reported in [46]. As in all the methods that use the Gaussian kernel, the kernel size is important. In this application the kernel size tunes what is considered close and far: the far points are heavily attenuated when computing the correntropy value. For instance, in this numerical demonstration a kernel size of σ = 15 was used.
Figure 4-3. The extracted boundary of the fish in figure 4-2.

Since σ is the standard deviation term in the Gaussian kernel, points that are within a distance σ of one another will return much higher values when the kernel is evaluated than points that are much farther away. So using a σ of 15 signifies that the outlier-like points caused by occlusion will mostly be much more than 15 pixels away from their corresponding points of the known shape boundary, and hence will have less impact on the evaluation of the correntropy function, while the boundary points that are intact in the occluded image will likely be much closer than 15 pixels to the corresponding points of the undistorted boundary. This concept can be used to roughly estimate the required kernel size, which may further be verified through a few simulations with randomly generated occluded images. This kernel size was chosen after several simulations with random occlusions so as to maximize the percentage of detection. From the results it is clear that our technique provides a simple and elegant method with exceptional results.
CHAPTER 5
WIENER FILTERING AND REGRESSION

5.1 Linear Wiener Filter

For convenience we now present a brief derivation of the linear Wiener filter. Given a real zero mean time series x(n), the output of this linear FIR filter is given by

y(n) = Σ_{i=0}^{L−1} w_i x(n−i),   (5-1)

where the weights w_i, i = 0, 1, ..., L−1 are found by minimizing the mean square error (MSE) between the output y(n) and a zero mean desired response d(n), and L is the memory of the finite impulse response system, which is also the delay embedding dimension of the time series when (5-1) is represented in the vector form

y(n) = w^T x(n),   (5-2)

where w = [w_0, ..., w_{L−1}]^T and x(n) = [x(n), x(n−1), ..., x(n−L+1)]^T. It is not hard to see that the w that minimizes the MSE between y(n) and d(n) [63] is given by

w = R⁻¹p,   (5-3)

where R is the autocorrelation matrix given by R = E[x(n)x(n)^T] and p is the cross-correlation vector given by p = E[d(n)x(n)]. Of course, (5-3) assumes stationarity of the time series. Assuming ergodicity, the expected values can be estimated by the corresponding time averages with a sufficiently large window. These assumptions are valid throughout this presentation.
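In code, (5-3) with time-average estimates is just a few lines. The sketch below (Python/NumPy; the function names are ours and illustrative, not the implementation used in the experiments) builds the delay-embedded data matrix and solves for w.

```python
import numpy as np

def embed(x, L):
    """Delay embedding: row n is [x(n), x(n-1), ..., x(n-L+1)]."""
    return np.stack([x[n::-1][:L] for n in range(L - 1, len(x))])

def wiener(x, d, L):
    """Linear Wiener solution (5-3), with R and p estimated by time averages."""
    X = embed(x, L)
    D = d[L - 1:]
    R = X.T @ X / len(X)       # sample autocorrelation matrix
    p = X.T @ D / len(X)       # sample cross-correlation vector
    return np.linalg.solve(R, p)
```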

It is well known that the use of mean square error as an optimization criterion is optimal in the maximum likelihood sense for Gaussian processes. Hence all linear estimation and filtering methods that use this criterion, including the Wiener filter presented above, perform optimally for Gaussian processes. But in nature many signals cannot be expected to follow this assumption strictly; speech, chaotic time series, and biomedical signals are just a few examples. Attempts have been made to create nonlinear solutions to the Wiener filter, mostly based on the Volterra series [25]. The Volterra series approach gives solutions that are very complex, with many coefficients. One of the simplest models, and the closest in structure to the linear Wiener filter, is the Hammerstein model [64], which basically is a static nonlinearity followed by the linear Wiener filter. Though this method has shown some promise, especially for certain system approximation problems [42], it has not been applicable to more general signal processing problems. Using a static nonlinearity to transform the data itself also imposes limitations: a static nonlinearity only addresses the problem for a very small set of nonlinear systems, and extra complications arise from training for the optimal nonlinearity.

5.2 Correntropy Based Wiener Filter

We can now derive the nonlinear Wiener filter based on correntropy. A relatively straightforward derivation uses the correntropy function given by (2-10). As we have shown in chapter 2, given the random process x(n), (2-10) acts as a correlation function for some Gaussian random process z(n), i.e.,

E[κ(x_i, x_j)] = E[z_i z_j].   (5-4)

z(n) is a transformed random process that is nonlinearly related to the original input x(n), but the exact functional relation of z(n) to x(n) is unknown, just like in kernel methods, where the exact values of the feature vectors are not known explicitly. This, however, gives a basis for formulating the well known optimal least squares solution with the new process z(n). The desired response used for this Wiener filter is still the original desired signal d(n). So this can be thought of as modifying the original input so that the linear manifold of the new correntropy transformed input can better include the solution that reproduces the original d(n). We are still exploiting a linear structure, but of a manifold that is nonlinearly related to the original input. This mapping from x(n) to z(n) is not simply a static nonlinear relation as in the Hammerstein models [42], but is related both to the choice of the kernel, as in kernel methods, and to the statistics of the input data.
So this signal z(n) is the input to a linear Wiener filter. Using a filter length of L and d(n) as the desired signal, the solution is easily given by

w = R_z⁻¹ p_z,   (5-5)

and

y(n) = w^T z(n),   (5-6)

where z(n) = [z(n), z(n−1), ..., z(n−L+1)]^T and p_z = E[d(n)z(n)]. Further assuming ergodicity, we approximate the E[·] operator by the time average. So we have

y(n) = w^T z(n),   (5-7)

and the filter output becomes

y(n) = z(n)^T w = z(n)^T R_z⁻¹ (1/N) Σ_{k=1}^{N} d(k) z(k)
     = (1/N) Σ_{k=1}^{N} Σ_{j=0}^{L−1} Σ_{i=0}^{L−1} z(n−i) r̃_{ij} z(k−j) d(k)
     = (1/N) Σ_{k=1}^{N} d(k) Σ_{j=0}^{L−1} Σ_{i=0}^{L−1} r̃_{ij} {z(n−i)z(k−j)}
     ≈ (1/N) Σ_{k=1}^{N} d(k) Σ_{j=0}^{L−1} Σ_{i=0}^{L−1} r̃_{ij} κ(x(n−i), x(k−j)),   (5-8)

where r̃_{ij} is the ij-th element of the matrix R_z⁻¹. The final expression is obtained by approximating z(n−i)z(k−j) by κ(x(n−i), x(k−j)), which holds on average because of (5-4). The final output is obtained by matching the variance of y(n) to that of the desired signal; this mismatch in scale is most likely a consequence of the above approximation. This final result is also presented in [36]. From (5-8) it is obvious that the computational cost of producing the output at a given time instant is O(L²N), where N is the number of training data points used. The computation involved is illustrated in figure 5-1.
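The following sketch makes the O(L²N) computation of (5-8) explicit (Python/NumPy, our names; the final variance matching mentioned above is left to a rescaling step outside the function). The correntropy matrix is built from sample averages over lags, which also makes its Toeplitz structure apparent.

```python
import numpy as np

def correntropy_lags(x, L, sigma2):
    """Sample correntropy v(m) = E[kappa(x(n), x(n-m))] for lags m = 0..L-1."""
    N = len(x)
    return np.array([np.mean(np.exp(-(x[m:] - x[:N - m]) ** 2 / (2 * sigma2)))
                     for m in range(L)])

def cf_output(x_tr, d_tr, x_new, L, sigma2):
    """Correntropy filter output (5-8) for the embedded vector ending at the
    last sample of x_new (x_new must hold at least L samples)."""
    v = correntropy_lags(x_tr, L, sigma2)
    Rz = np.array([[v[abs(i - j)] for j in range(L)] for i in range(L)])
    r_inv = np.linalg.inv(Rz)              # L x L Toeplitz correntropy matrix
    zn = x_new[-1::-1][:L]                 # [x(n), x(n-1), ..., x(n-L+1)]
    N = len(x_tr)
    y = 0.0
    for k in range(L - 1, N):
        zk = x_tr[k::-1][:L]
        K = np.exp(-(zn[:, None] - zk[None, :]) ** 2 / (2 * sigma2))
        y += d_tr[k] * float(np.sum(r_inv * K))   # sum_ij r_ij kappa(.,.)
    return y / N
```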

This computation arises from the fact that the mapping to feature vectors is not explicitly known, and hence the weight vector cannot be numerically calculated. Instead, all the training data samples have to be utilized every time to compute the output directly. This is similar to all the kernel based regression methods. But we do have an advantage: the inversion involved is that of the Toeplitz correntropy matrix, with complexity O(L²), instead of the large Gram matrix, which is O(N³) and which is required for many kernel methods. The matrix inversion in both cases has to be done only once. As we shall see in the next section, we can rearrange the terms in (5-8) to make it comparable with other well known regression methods. We shall use these similarities in order to improve the CF.

Figure 5-1. The computation of the output data sample given an embedded input vector using the correntropy Wiener filter.
5.3 Simple Regression Models with Kernels

We can study the correntropy filter (CF) from a regression perspective and propose some improvements. For this, let us rewrite the CF equation (5-8) in the following form:

y(n) = Σ_{j=0}^{L−1} Σ_{i=0}^{L−1} r̃_{ij} (1/N) Σ_{k=0}^{N−1} d(k) κ(x_i(n), x_j(k)) = Σ_{j=0}^{L−1} Σ_{i=0}^{L−1} r̃_{ij} y_{ij}(n),   (5-9)

where x_i(n) is the ith component of the L dimensional vector x(n), and n belongs to an index set which may be time. The CF can be thought of as doing regression with the components of the vectors x(n), obtaining the terms y_{ij}(n), and then linearly combining them using the weights r̃_{ij} to get the final estimate y(n). So one means of further improving the CF is to optimize the only weight parameters r̃_{ij} using the training data. This will likely compensate, to an extent, for the simplistic approximation used to obtain (5-8). Before we proceed, some well known parametric and nonparametric regression estimators are mentioned next.

5.3.1 Radial Basis Function Network

The RBF network [27] is a well known estimator which can be described by the following equation:

y(n) = Σ_{k=0}^{N−1} w_k κ(x(n), x(k)),   (5-10)

where the weights w_k are computed so as to minimize the MSE between the estimate y(n) and the desired variable d(n).

5.3.2 Nadaraya-Watson Estimator

Unlike the RBF network, and more like the CF, the Nadaraya-Watson (NW) estimator [26] is a nonparametric regression technique. By nonparametric we mean that no parameter is directly optimized with respect to the desired signal d(n). The output estimate is given by

y(n) = Σ_{k=0}^{N−1} d(k) κ(x(n), x(k)) / Σ_{k=0}^{N−1} κ(x(n), x(k)).   (5-11)
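In code, (5-11) is a two-line kernel-weighted average; a minimal sketch (ours, with Xtr holding the embedded training vectors row-wise) follows before the derivation below.

```python
import numpy as np

def nw(Xtr, dtr, xq, sigma2):
    """Nadaraya-Watson estimate (5-11) at the query vector xq; note that
    nothing here is fitted to the desired signal."""
    k = np.exp(-((Xtr - xq) ** 2).sum(1) / (2.0 * sigma2))
    return float(k @ dtr) / float(k.sum())
```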

This equation can be derived as an approximation to the optimal mean square error estimate E{d | x} in the following manner:

y(n) = E{d | x = x(n)} = ∫ d p(d | x = x(n)) dd,   (5-12)

where p(d | x = x(n)) is the conditional probability density function of d given x = x(n), which can be estimated by Parzen windowing such that

p(d | x) = p(d, x(n))/p(x(n)) = Σ_{k=0}^{n−1} κ(d, d(k)) κ(x(n), x(k)) / Σ_{k=0}^{n−1} κ(x(n), x(k)).   (5-13)

Then, from (5-12),

y(n) = Σ_{k=0}^{n−1} κ(x(n), x(k)) ∫ d κ(d, d(k)) dd / Σ_{k=0}^{n−1} κ(x(n), x(k)) = Σ_{k=0}^{N−1} d(k) κ(x(n), x(k)) / Σ_{k=0}^{N−1} κ(x(n), x(k)),   (5-14)

where ∫ d κ(d, d(k)) dd gives the center d(k) of the Gaussian kernel.

5.3.3 Normalized RBF Network

The normalized RBF network (NRBF) [26] is, in fact, a combination of the above two regression methods. The estimate of the output is given by

y(n) = Σ_{k=0}^{N−1} w_k κ(x(n), x(k)) / Σ_{k=0}^{N−1} κ(x(n), x(k)),   (5-15)

where the weights are again computed so as to minimize the MSE between the estimate and the desired variable d(n).

5.4 Parametric Improvement on the Correntropy Filter

If we look at the techniques given by equations (5-10), (5-11) and (5-15) and that given by (5-9), it is clear that the kernels operate on the whole vectors x(n) in (5-10), (5-11) and (5-15), whereas the kernels operate on the components x_i(n) of the vector x(n) in (5-9), resulting in the terms y_{ij}(n). These terms are then linearly combined.
To improve the results given by (5-9), we shall find the partial regression terms y_{ij}(n) by means similar to (5-12) and find the weights by minimizing the mean square error at the output. The vector x(k) is L dimensional, given by x(k) = [x_1(k), x_2(k), ..., x_L(k)]^T. Let x = [x_1, x_2, ..., x_L]^T be a random input vector. Given a new point x(n), the regression estimate with the ith component of the input given as the jth component of x(n) is

y_{ij}(n) = E{d | x_i = x_j(n)} = Σ_{k=0}^{N−1} d(k) κ(x_j(n), x_i(k)) / Σ_{k=0}^{N−1} κ(x_j(n), x_i(k)).   (5-16)

If we take the final output to be a linear combination of these partial regression terms, then

y(n) = Σ_{i=1}^{L} Σ_{j=1}^{L} β_{ij} y_{ij}(n) = Σ_{i=1}^{L} Σ_{j=1}^{L} β_{ij} Σ_{k=0}^{N−1} d(k) κ(x_j(n), x_i(k)) / Σ_{k=0}^{N−1} κ(x_j(n), x_i(k)).   (5-17)

This estimator is similar in form to the CF expression in (5-9), except for the normalization terms. Now the problem is finding the weights β_{ij}. We can easily find an optimal set of weights by minimizing the MSE of the estimate y(n) with respect to the reference d(n). This has a closed form solution, since the estimate is linear in the weights: we can simply stack the terms into a long vector of dimension L² and use the Wiener solution. Unlike the RBF network or kernel regression, the number of parameters here is independent of the training data size and usually much smaller than it. We shall call the estimator given in (5-17) weighted partial regression (WPR).
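A sketch of WPR training (Python/NumPy, our names) follows: it builds the L² partial regressions of (5-16) as features and solves the small least squares problem for β. The optional ridge term is our addition for numerical safety, not part of the method.

```python
import numpy as np

def wpr_features(Xtr, dtr, Xq, sigma2):
    """Partial regression terms y_ij of (5-16), stacked as L*L features per query."""
    N, L = Xtr.shape
    F = np.empty((Xq.shape[0], L * L))
    for i in range(L):
        for j in range(L):
            k = np.exp(-(Xq[:, [j]] - Xtr[:, i][None, :]) ** 2 / (2.0 * sigma2))
            F[:, i * L + j] = (k @ dtr) / k.sum(axis=1)
    return F

def wpr_fit(Xtr, dtr, sigma2, ridge=0.0):
    """Least squares weights beta of (5-17)."""
    F = wpr_features(Xtr, dtr, Xtr, sigma2)
    return np.linalg.solve(F.T @ F + ridge * np.eye(F.shape[1]), F.T @ dtr)

def wpr_predict(Xtr, dtr, Xq, beta, sigma2):
    return wpr_features(Xtr, dtr, Xq, sigma2) @ beta
```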

5.5 Experiments and Results

We shall use the normalized mean square error as a means of comparing the performance of the methods described above, and shall also show the change in performance with the size of the training data. Two different experiments were conducted:

1. System identification of a time-delay neural network (TDNN), basically to compare the CF with the WF for a simple nonlinear dynamical system.

2. Time series prediction of the Mackey-Glass (MG-30) time series.

5.5.1 System Identification

Synthetically generated input was passed through a TDNN, resulting in an output time series. Given the same input, the CF tried to generate the same output. We present this performance along with that of the linear Wiener filter to support our claim that the CF works better in a nonlinear environment. Figure 5-2 shows the TDNN system which was imitated by the CF and the WF. For these results the input to all three systems was x(n) = sin(ω₀n) + 1.2 sin(ω₁n) + 0.2 sin(ω₂n), a sum of three sinusoids. The parameters for the TDNN were

[w₁₁, w₁₂, w₂₁, w₃₁, w₄₁, w₂₂, w₃₂, w₄₂]^T = [−0.69, 0.86, 1.25, −1.44, −0.40, −1.59, 0.57, 0.69]^T,   (5-18)

[v₁, v₂]^T = [0.82, 0.71]^T.   (5-19)

Figure 5-2. Time delayed neural network to be modeled.

Figure 5-3 shows the input and the output signals generated by the TDNN (desired response), the CF and the WF. Of course, the output of the TDNN was used as the desired response for the CF and WF. The performance of the CF and the WF can be seen in figure 5-4 for various time embedding dimensions.
Figure 5-3. Input and output signals generated by the TDNN (desired response), CF and WF using a time embedding of 2.

This clearly shows the superior performance of the CF over the WF in a nonlinear environment. Both systems use the same number of parameters as the embedding dimension, but for the CF the parameters (elements of the correntropy matrix) are determined from the input data beforehand, instead of through training or optimization with respect to the desired response. Next we shall compare the CF and the WPR with the RBF network and the Nadaraya-Watson estimator.

5.5.2 Time Series Prediction of Mackey-Glass Time Series

Here we show the results of one step prediction of the Mackey-Glass (MG-30) time series. Figure 5-5 shows the comparison of MSE values for different filter lengths (embedding dimensions). As can easily be seen, for the four nonlinear methods the best results were achieved for a filter length of L = 6, and for the WPR for L = 7. These values should be close to the optimal embedding length according to Takens' embedding theorem [65].
Figure 5-4. Mean square error values for WF and CF for the system modeling example.

Figure 5-5. Mean square error values for MG time series prediction for various embedding sizes.
Figure 5-6. Mean square error values for MG time series prediction for different sizes of training data.

Figure 5-6 shows the performance as a function of the amount of training data used, for an embedding dimension of 6. The CF, the RBF network, the NW estimator, the normalized RBF, and the WPR used kernel width (σ²) values of 0.56, 2, 0.22, 2, and 0.07 respectively; these values were determined experimentally to give the least MSE for the respective methods. All the estimators used a training data size of 100 samples and were tested using 200 samples. Parameters for the RBF and normalized RBF networks were found by adding a regularization term of 10⁻³ to the diagonal of the Gram matrix during training to avoid ill-conditioning. We can conclude that the nonlinear Wiener filter (CF) is superior to the linear Wiener filter but still lags other nonlinear methods. Since this filter has no parameter trained with respect to the desired response (it uses the data directly), it is unfair to compare it with the parametric methods. With respect to the NW estimator, the CF is still a bit behind for larger data sizes. The derivation of the CF employed a very naive approximation in (5-8), which most likely gives rise to most of this error.
Since the exact transformation from the input data points to the space where the linear filtering framework is employed is not known, it is difficult to find a correction for this approximation. Nonetheless, this discussion is still intriguing, and it motivates us to find a better means of incorporating correntropy in minimum mean square error (MSE) filtering. One good approach is to devise an on-line scheme using stochastic gradients. This formulation will use a better approximation to derive the correntropy based LMS (least mean square) algorithm, discussed in more detail in the next chapter. It can be observed that the prediction using the weighted partial regression (WPR) method, which was derived by using the same form as the CF together with ideas from the other simple nonlinear regression techniques, is very similar in performance to the RBF and the normalized RBF networks, and it also improves upon the CF for all sizes of the training data. A noteworthy property observed with the correntropy filter (CF), the WPR and the NW regression is that the error during training and that during testing are very similar, unlike for the RBF and the normalized RBF networks. This can be attributed to the absence of trained parameters in the CF and NW regression, and to the fewer parameters in WPR than in the RBF networks. The WPR has only L² parameters independent of the data size, L being the embedding dimension, whereas the RBF and the normalized RBF networks have N parameters each, where N is the number of data centers (here the whole training data).
CHAPTER 6
ON-LINE FILTERS

6.1 Background

The least mean squares (LMS) filter is widely used in all areas of adaptive learning, from system identification to channel equalization. The popularity of this algorithm since its introduction by Widrow and Hoff [66] in the 1960s has grown immensely because of its simplicity and effectiveness. The idea is an intelligent simplification of the gradient descent method for learning [29], using the local estimate of the mean square error. In other words, the LMS algorithm is said to employ a stochastic gradient instead of the deterministic gradient used in the method of steepest descent. By design, it avoids the estimation of correlation functions and matrix inversions. Unfortunately these characteristics do not extend to nonlinear adaptive filters. Nonlinear system adaptation based on local gradients requires separate blocks of data, called the training set, to be made available before operation. Of course, the LMS algorithm can still be easily used to adapt the linear output layer of nonlinear models such as the radial basis function (RBF) network. However, the centers of the basis functions are selected from the training data, either by centering each Gaussian on a sample or through clustering [27], so a block of data is still necessary. For multilayer perceptrons (MLPs) the LMS stochastic gradient has to be heavily modified to take the nonlinearities into consideration (the backpropagation algorithm [27]). In MLPs the need for a training set is not fundamental, i.e. backpropagation can be applied one sample at a time, but since all the parameters of the system change with each sample, the performance is so poor early on and the convergence so slow that a training set is practically required. Moreover, the performance surface is most often non-convex with many local minima, which further complicates training. Kernel methods have also been proposed to produce nonlinear algorithms from linear ones expressed with inner products, by employing the famed kernel trick [3].
More recently, there has been interest in the machine learning community in training kernel regressors or classifiers one sample at a time, to counteract the size of the huge data sets of some real world applications. Important theoretical results have proved the convergence of online learning algorithms with regularization in reproducing kernel Hilbert spaces [67], and a simple stochastic gradient based algorithm for adaptation has been developed [68]. Here we derive two LMS type filters, one based on correntropy and the other on kernel methods, though both can be related through kernel methods.

6.2 Correntropy LMS

As described in the previous chapter, given the random process x(n), (2-10) acts as a correlation function for some Gaussian random process z(n), i.e.,

E[κ(x_i, x_j)] = E[z_i z_j].   (6-1)

Let this signal z(n) be the input to a linear filter such that the output is given by

y(n) = w_z^T z(n),   (6-2)

where z(n) = [z(n), z(n−1), ..., z(n−L+1)]^T. The least mean squares (LMS) filter can be derived by using the stochastic version of the cost function, obtained by dropping the expected value operator in J(w_z) = E[{w_z^T z(n) − d(n)}²], resulting in Ĵ(w_z) = {w_z^T z(n) − d(n)}², whose gradient is given by

∇Ĵ(w_z) = −2e(n)z(n),   (6-3)

where e(n) = d(n) − w_z^T z(n) is the instantaneous error at time n. Since we are minimizing the cost function, we apply the method of gradient descent using the stochastic gradient (6-3). Thus the updated weight at each instant n is given by

w_z(n) = w_z(n−1) + 2μe(n−1)z(n−1).   (6-4)
From (6-4) it can easily be seen that w_z(n) is related to the initialization w_z(0) by

w_z(n) = w_z(0) + 2μ Σ_{i=1}^{n−1} e(i)z(i).   (6-5)

We can put w_z(0) = 0, and the output at n is then given by

y(n) = z(n)^T w_z(n) = 2μ Σ_{i=1}^{n−1} e(i){z(i)^T z(n)} = 2μ Σ_{i=1}^{n−1} e(i) Σ_{k=0}^{L−1} {z(i−k)z(n−k)} ≈ 2μ Σ_{i=1}^{n−1} e(i) Σ_{k=0}^{L−1} κ(x(i−k), x(n−k)),   (6-6)

where we have approximated Σ_{k=0}^{L−1} {z(i−k)z(n−k)} by Σ_{k=0}^{L−1} κ(x(i−k), x(n−k)). This approximation becomes more accurate as L increases, since

E[z(i−k)z(n−k)] = E[κ(x(i−k), x(n−k))].   (6-7)

We shall refer to the formulation y(n) = 2μ Σ_{i=1}^{n−1} e(i) Σ_{k=0}^{L−1} κ(x(i−k), x(n−k)) as correntropy LMS (CLMS).
6.3 Kernel LMS

In this section we present the kernel LMS algorithm. The basic idea is to perform the linear LMS algorithm given by (6-4) in the kernel feature space. For this, let us assume that Φ maps the point x(n) in input space to Φ(x(n)) in the kernel feature space, with ⟨Φ(x(n)), Φ(x(m))⟩ = κ(x(n), x(m)), where ⟨·,·⟩ represents the inner product in the kernel Hilbert space. This transformation to feature space is nonlinear for the most widely used kernels, and depending on the choice of the kernel this space can be infinite dimensional. The Gaussian kernel that we use here corresponds to an infinite dimensional Hilbert space. Since this feature space is linear, Φ(x(n)) can be considered an infinite dimensional column vector with the usual vector inner products. Let Ω be the weight vector in this space, such that the output is y(n) = ⟨Ω(n), Φ(x(n))⟩, where Ω(n) is Ω at time n. Let d(n) be the desired response. Figure 6-1 shows the nonlinear filtering scheme, with the input vector x(n) being transformed to the infinite feature vector Φ(x(n)), whose components are then linearly combined by the infinite dimensional weight vector.

Figure 6-1. Linear filter structure with feature vectors in the feature space.

Note that there is only one layer of weights in this nonlinear filter, but since the size of the feature space is potentially infinite, it is a universal approximator [63]. Now, due to the linear structure of the RKHS, the cost function J(n) = E[(d(n) − y(n))²] can be minimized with respect to Ω. This can be done in the same way as in (6-4), using the stochastic instantaneous estimate of the gradient vector, which yields

Ω(n+1) = Ω(n) + 2μe(n)Φ(x(n)).   (6-8)

Like before, μ is the step-size parameter that controls the convergence, speed, and misadjustment of the adaptation algorithm [63], [69]. The only catch here is that Ω in (6-8) lives in the infinite dimensional feature space, and it would be practically impossible to update Ω directly. Instead we shall use (6-8) to relate each Ω(n) to its initialization Ω(0).
This easily gives

Ω(n) = Ω(0) + 2μ Σ_{i=0}^{n−1} e(i)Φ(x(i)).   (6-9)

For convenience we choose Ω(0) to be zero, hence e(0) = d(0). The final expression for Ω(n) becomes

Ω(n) = 2μ Σ_{i=0}^{n−1} e(i)Φ(x(i)).   (6-10)

It is here that we exploit the kernel trick. Given Ω(n) from (6-10) and the input x(n), the output at n is given by

y(n) = ⟨Ω(n), Φ(x(n))⟩ = 2μ Σ_{i=0}^{n−1} e(i)⟨Φ(x(i)), Φ(x(n))⟩ = 2μ Σ_{i=0}^{n−1} e(i)κ(x(i), x(n)).   (6-11)

We call (6-11) the Kernel LMS algorithm. It is clear that, given the kernel, Kernel LMS has a unique solution, because it is solving a quadratic problem in feature space. Notice also that the weights of the nonlinear filter are never explicitly used in the Kernel LMS algorithm, so the order of the filter is not user controllable.

Figure 6-2. Error samples for KLMS in predicting the Mackey-Glass time series.
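The entire algorithm fits in a few lines. The sketch below (Python/NumPy, our names; the rows of X are the embedded input vectors) trains on a sequence and stores only the past inputs and their scaled errors, exactly as (6-11) prescribes.

```python
import numpy as np

def klms(X, d, mu=0.5, sigma2=1.0):
    """Kernel LMS (6-11): each past input is a center weighted by 2*mu*e(i)."""
    centers, werr, yhat = [], [], np.zeros(len(d))
    for n, (x, dn) in enumerate(zip(X, d)):
        if centers:
            k = np.exp(-((np.asarray(centers) - x) ** 2).sum(1) / (2 * sigma2))
            yhat[n] = float(np.dot(werr, k))
        centers.append(x)
        werr.append(2.0 * mu * (dn - yhat[n]))   # error is fixed once computed
    return np.asarray(centers), np.asarray(werr), yhat

def klms_predict(x, centers, werr, sigma2=1.0):
    k = np.exp(-((centers - x) ** 2).sum(1) / (2 * sigma2))
    return float(werr @ k)
```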

More importantly, since the present output is determined solely by previous inputs and all the previous errors, it can be readily computed in the input space. These error samples are similar to the ones in sequential state estimation, where they are also termed innovations [63], as they add new information to improve the output estimate. Each new input sample results in an output and hence a corresponding error, which is never modified further and is incorporated in the estimate of the next output. This recursive computation makes Kernel LMS especially useful for on-line nonlinear signal processing, but the complexity of the algorithm increases linearly as new error samples are used. The error samples obtained for the Mackey-Glass time series used in our experiments are shown in figure 6-2, and they show a surprisingly fast convergence. The initial errors in the adaptation tend to prevent the algorithm from overfitting the data: they are the most dominant, since the errors decay quickly, and they seem to have a prominent role in the estimates of the output. We believe this nonparametric improvement of the output samples is a reason for not requiring any kind of explicit regularization. In practice, if the errors are satisfactorily small after p input samples, then the upper index in the summation in (6-11) can be fixed and the computational complexity for each successive output will be O(p). This is, of course, larger than that of the linear LMS algorithm, but still smaller than most nonlinear algorithms. The kernel size (the variance parameter in the radial basis function) is a free parameter which affects the overall performance, as in any kernel based algorithm, and can be chosen through a quick cross validation step; Silverman's rule of thumb [53] is another alternative. For our simulations the kernel size chosen by the variance of the data works reasonably well. Here, just like for the linear LMS algorithm, the step size controls the convergence, speed, and misadjustment of Kernel LMS. From linear adaptive filter theory it can be expected that for convergence μ is upper bounded by the largest eigenvalue of the data covariance in the feature space, which is difficult to estimate and dependent upon the kernel size. Comparing equations (6-6) and (6-11), we can see that CLMS can be considered a special case of KLMS where, instead of the kernel κ(x(i), x(n)), the kernel Σ_{k=0}^{L−1} κ(x(i−k), x(n−k)) is used, which is a valid symmetric and positive definite kernel, since it is a sum of symmetric and positive definite functions.
Thus, henceforth, we shall discuss KLMS, and the discussion will also apply to CLMS unless otherwise stated.

6.4 Self-Regularized Property

Kernel LMS, through the Mercer theorem, is nothing but plain LMS in the (potentially infinite dimensional) feature space. Training a large weight vector usually warrants proper regularization, which is invariably done in methods like kernel regression [70] and some RBF networks [27]. But LMS, as we have also observed in our experiments, can be shown to have an inherent regularization. One way to show this is to demonstrate that the weight vector tends to a small norm solution; this is especially true when the weight vector is initialized to zero. We shall discuss this approach in more detail, though a more rigorous argument can also be made through classical regression theory [71]. For this discussion we need to consider the natural modes of adaptation. These are nothing but the components along the (dominant) eigenvectors of the data autocorrelation matrix R (this could be an infinite dimensional matrix for KLMS). Let q_i be the ith eigenvector, corresponding to the ith largest eigenvalue λ_i of R. Let us denote the weight-error vector in the LMS filter by

ε(n) = w* − w(n),   (6-12)

where w* denotes the optimum Wiener solution for the tap-weight vector and w(n) is the estimate produced by the LMS filter at iteration n. Then ε(n) can be written as

ε(n) = Σ_i v_i(n)q_i,   (6-13)

where v_i(n) is the projection of ε(n) on the eigenvector q_i, such that

v_i(n) = q_i^T ε(n),   (6-14)
and the squared norm of ε(n) is given by

‖ε(n)‖² = Σ_i v_i²(n).   (6-15)

Similarly, the weight vector w(n) can be written in terms of its natural modes as

w(n) = Σ_i w̃_i(n)q_i,   (6-16)

such that

‖w(n)‖² = Σ_i w̃_i²(n).   (6-17)

As derived in [63], when the step-size parameter μ of the LMS filter is small, the first two moments of the natural mode v_i(n) are given by

E[v_i(n)] = v_i(0)(1 − μλ_i)^n,   (6-18)

and

E[v_i²(n)] = μJ_min/(2 − μλ_i) + (1 − μλ_i)^{2n} (v_i²(0) − μJ_min/(2 − μλ_i)),   (6-19)

where J_min is the minimum mean square error, as given by the optimal solution w*. Regularization is mostly necessary when the autocorrelation matrix is ill-conditioned, that is, when a few eigenvalues are very small; this is also true for KLMS. So when an eigenvalue λ_i is small, such that μλ_i ≪ 1,

E[v_i(n)] ≈ v_i(0),   (6-20)

and

E[v_i²(n)] ≈ v_i²(0).   (6-21)

Hence, when we initialize the weight vector w(0) to 0, the natural modes w̃_i(n) of w(n) corresponding to the small eigenvalues will remain almost unchanged, i.e. around 0, in the mean sense.
Also, the excess mean square error is defined as the difference between the mean square error J(n) produced by the LMS filter at time n and the minimum mean-square error J_min. It has been shown, also in [63], that

J_ex(n) = J(n) − J_min = Σ_i λ_i E[v_i²(n)].   (6-22)

This means that the performance of the LMS filter is not affected by the natural modes corresponding to the very small eigenvalues. For a minimum norm weight vector, one requires the modes w̃_i(n) corresponding to the very small eigenvalues to remain around zero, and from (6-20), (6-21) and (6-17) this is what the LMS algorithm achieves when the weight vector is initialized at zero. The fact that the LMS algorithm is regularized can also be demonstrated through conventional regularization theory, which is discussed in more detail in [71].

6.5 Experimental Comparison between LMS and KLMS

To demonstrate the performance of the proposed method we present some simulation results. The mean square error is used to compare the performance of the Kernel LMS algorithm (KLMS) and that of the linear LMS algorithm for the one step prediction of the Mackey-Glass (MG-30) time series. The simulations implement equation (6-11) for the KLMS and equation (6-6) for the CLMS. The kernel size and the step size were determined for best results after scanning the parameters, and the data was normalized to unit variance beforehand. The kernel size σ² was chosen to be 1 for these experiments; it was also observed that the performance was not very sensitive to kernel sizes between 1 and 4. The step sizes for the LMS, the KLMS and the CLMS algorithms were chosen as 0.01, 0.5, and 0.05 respectively. The plots presented include the learning curve and comparisons of MSE values for different embedding dimensions. To plot the learning curve, after each update the learning was frozen and a new batch of 300 data samples was used to estimate the mean square error. Figure 6-3 shows the learning curves for both algorithms (the step size values were chosen for fastest convergence).
Surprisingly, the speeds of convergence of both methods (LMS and KLMS) are comparable, i.e. the two learning curves are basically parallel to each other, even though the KLMS is working in an infinite dimensional Hilbert space, where theoretically the eigenvalue spread is unconstrained and statistical reasoning requires regularization to be performed [72]. This can be attributed to the fact that the scattering of the data, although existing in an infinite dimensional space in theory, is such that far fewer canonical directions are dominant (the corresponding covariance matrix has a few dominant eigenvalues, while the others are zero for practical purposes). Since the LMS only searches the space of significant eigenvalues anyway, it tends to concentrate on the signal subspace, which suggests that explicit regularization is not required. Figure 6-3 also includes the regularized case [68], where results were only satisfactory when the regularization parameter was close to zero. Here we used a regularization parameter of 0.02 and a step size of 0.7; these parameters were chosen after a set of trials to obtain the best MSE after convergence. A systematic way of choosing the regularization parameter for this kind of stochastic learning is still lacking. The KLMS and CLMS filters achieve the best results for an embedding dimension of 6 (figure 6-4), which is the optimal value for the MG time series according to Takens' embedding theorem [65]. For the plots, all the algorithms were trained on-line with 200 data samples and were tested using a testing set of 300 new data points. As a reference we have also included the results using an RBF network trained with the least squares method, with 200 RBFs centered at the samples and 300 new samples used for testing. The results obtained with KLMS are surprisingly close to those of the RBF network, considering how differently the 200 "weights" are used: in the RBF network all 200 weights are optimized for the data set, while in the KLMS the errors are computed and then fixed for the subsequent samples. Notice also that training the KLMS and the CLMS filters was much simpler, with no need for matrix computation and inversion. As we have already mentioned, the major drawback of the KLMS method is the continually growing architecture. In the following sections we provide a very effective remedy for this.
Figure 6-3. Learning curves for the LMS, the KLMS and the regularized solution.

6.6 Kernel LMS with Restricted Growth

6.6.1 Sparsity Condition for the Feature Vectors

By checking the linear dependency of the input feature vectors, one can find a means of reducing the growing architecture of the kernel LMS algorithm. In this subsection the basic method of checking the linear dependency is presented. A vector Φ(x(n)) is linearly dependent on the vectors in {Φ(x(i)) : i ∈ r(n−1)}, where r(n−1) is the list of indexes of the vectors prior to time n that are linearly independent, if

Φ(x(n)) = Σ_{k=1}^{length(r(n−1))} γ_k Φ(x(r(n−1, k)))   (6-23)
with at least one nonzero γ_k, where r(n−1, k) is the kth element of r(n−1). So we can use the following cost function to estimate the γ_k and to check for linear dependency:

J(γ̄) = ‖Φ(x(n)) − Σ_{k=1}^{length(r(n−1))} γ_k Φ(x(r(n−1, k)))‖²
      = ‖Φ(x(n)) − Φ̄(n−1)γ̄‖²
      = κ(x(n), x(n)) − 2Φ(x(n))^T Φ̄(n−1)γ̄ + γ̄^T Φ̄(n−1)^T Φ̄(n−1)γ̄
      = κ(x(n), x(n)) − 2k(n−1)^T γ̄ + γ̄^T G(n−1)γ̄,   (6-24)

where k(n−1) = [κ(x(n), x(r(n−1, 1))), κ(x(n), x(r(n−1, 2))), ...]^T, γ̄ = [γ_1, γ_2, ..., γ_{length(r(n−1))}]^T, G(n−1) is the Gram matrix formed with the vectors in {Φ(x(i)) : i ∈ r(n−1)}, and Φ̄(n−1) is the matrix whose kth column is Φ(x(r(n−1, k))).

Figure 6-4. Comparison of the mean square error for the three methods with varying embedding dimension (filter order for the LMS) of the input.
Minimizing (6-24) would result in

γ̄^T G(n−1) = k^T(n−1), or equivalently [γ̄^T, −1] [ G(n−1) ; k^T(n−1) ] = 0.   (6-25)

It is easy to see that

[ G(n−1), k(n−1) ; k^T(n−1), κ(x(n), x(n)) ]

is the Gram matrix G(n) with n added to the list r(n−1). Thus we just need to find the vector γ̄ such that all but possibly the last element of [γ̄^T, −1]G(n) are zeros. Let T(n) be the transform on G(n) such that T(n)G(n) is upper triangular. Then the last row t(n) of T(n) satisfies t(n)[G(n−1); k^T(n−1)] = 0, and hence, from (6-25), t(n) coincides with [γ̄^T, −1] up to sign. Thus t(n) gives the least squares solution to the cost function given in (6-24). Now, if the cost function at the least squares solution is sufficiently small (say, less than a small positive ε), then we shall consider this new feature vector to be linearly dependent on the vectors given by the indexes in r(n−1). Otherwise, the vector is considered to be sufficiently linearly independent of the vectors given by r(n−1), and its index n is added to the list of independent vectors r(n−1). Using the optimal value given by (6-25), the cost function becomes

J_opt = κ(x(n), x(n)) − k^T(n−1)γ̄.   (6-26)

It is easy to see that J_opt is nothing but the last element of t(n)G(n), or, accordingly, the last diagonal element of the upper triangular matrix T(n)G(n). So the method of testing the linear dependency simply requires two steps: first, use a linear transform to make the Gram matrix upper triangular, and second, compare the element in the last row and last column of the resulting matrix to ε, as mentioned before. Next we explain the exact numerical technique, using sequential Gaussian elimination, to perform these steps, resulting in the improved algorithm.
6.6.2 Algorithm

Let the new feature input Φ(x(n)) in (6-11) be linearly dependent on the previous feature vectors Φ(x(i)), i = 0, 1, ..., n−1. Then, for some real γ_i,

Φ(x(n)) = Σ_{i=0}^{n−1} γ_i Φ(x(i)).   (6-27)

Substituting this in (6-11) (and absorbing the factor 2μ into the error terms),

y(n+1) = Σ_{i=0}^{n−1} e_γ(n, i) κ(x(i), x(n+1)),   (6-28)

where e_γ(n, i) = e(i) + γ_i e(n). Thus the computation of y(n+1) requires one less kernel evaluation than that given by (6-11). If more such input feature vectors are linearly dependent on the previous ones, the growth of the algorithm suggested by (6-11) is reduced significantly. Of course, if the data samples in the input space are distinct and a positive definite kernel function is used, none of the corresponding feature vectors will be exactly linearly dependent; but in practice they can be awfully close, which is why large Gram matrices are usually ill conditioned. In fact, we shall effectively employ one step of Gaussian elimination at each time instant on the Gram matrix given by the kernel to determine the linear dependency of the new input feature vector with the previous ones. Before we explain the approach, let us define a few variables. G(n) is the kernel Gram matrix corresponding to the inputs {x(i) : i ∈ r(n)}, and

v(n) = [κ(x(n), x(r(n, 1))), κ(x(n), x(r(n, 2))), ...]^T,   (6-29)

where r(n) is the vector of indexes of the inputs whose corresponding feature vectors are linearly independent, and r(n, k) is the kth element of r(n). The definition of r(n) will become clear when the steps of the algorithm are presented next:

1. Initialize T(1) = T̄(1) = 1; G(1) = Ġ(1) = 1; r(1) = 1; e_γ(1) = e(1).
2. The output and the error are calculated as

y(n) = e_γ(n)^T v(n) = Σ_{i=1}^{length(r(n−1))} e_γ(n, i) κ(x(r(n−1, i)), x(n))   (6-30)

and e(n) = d(n) − y(n), where e_γ(n, i) is the ith element of e_γ(n).

3. Ġ(n) = [ Ḡ(n−1), T(n−1)v(n) ; v(n)^T, 1 ].

4. T̄(n) = [ T(n−1), 0 ; 0, 1 ]. It is easy to see that T̄(n)G(n) = Ġ(n).

5. Perform Gaussian elimination row operations on Ġ(n), and the same operations on T̄(n), to obtain Ḡ(n) and T(n). Then at each step Ḡ(n) is upper triangular and T(n)G(n) = Ḡ(n).

6. If the last element g(n) of the last row (and hence the whole row) of Ḡ(n) is zero or less than a small user defined quantity ε, it is obvious that the Gram matrix G(n) is ill-conditioned, and hence the new feature vector Φ(x(n)) is almost linearly dependent on the previous feature vectors. If this is true, and the last row of T(n) is t(n), then

t(n)G(n) ≈ 0,   (6-31)

and hence t(n) gives the weights with which Φ(x(n)) is linearly dependent on the previous vectors {Φ(x(i)) : i ∈ r(n)}. In this case the feature vector Φ(x(n)) is redundant, and the matrices T(n) and Ḡ(n) are reverted back to T(n−1) and Ḡ(n−1) by simply deleting the last row and the last column. The update e_γ(n, i) = e_γ(n−1, i) + γ_i e(n) is also performed, where e_γ(n, i) is the ith element of e_γ(n) and γ_i = −t(n, i), with t(n, i) the ith element of t(n).

7. If g(n) > ε, then Φ(x(n)) is certainly linearly independent of the previous vectors, and the updates r(n) = [r(n−1); n] and e_γ(n) = [e_γ(n−1); e(n)] are performed.
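The steps above translate almost line for line into the following sketch (Python/NumPy, our names). We store the scaled errors 2μe(i) as the kernel weights, and assume the Gaussian kernel so that κ(x, x) = 1 as in step 3; both are bookkeeping choices of ours, not prescriptions of the text.

```python
import numpy as np

class KLMSR:
    """Kernel LMS with restricted growth, following steps 1-7 above."""

    def __init__(self, mu=0.5, sigma2=1.0, eps=0.2):
        self.mu, self.sigma2, self.eps = mu, sigma2, eps
        self.centers, self.eg = [], []   # kept centers and weights 2*mu*e(i)
        self.T = np.ones((1, 1))         # T(n), with T(n) G(n) upper triangular
        self.Gbar = np.ones((1, 1))      # Gbar(n) = T(n) G(n)

    def _kernel(self, x):
        C = np.asarray(self.centers)
        return np.exp(-((C - x) ** 2).sum(1) / (2 * self.sigma2))

    def update(self, x, d):
        if not self.centers:                    # step 1: first sample
            self.centers, self.eg = [x], [2 * self.mu * d]
            return 0.0
        v = self._kernel(x)
        y = float(np.dot(self.eg, v))           # step 2, eq (6-30)
        e = d - y
        m = len(self.centers)
        Gdot = np.zeros((m + 1, m + 1))         # step 3
        Gdot[:m, :m] = self.Gbar
        Gdot[:m, m] = self.T @ v
        Gdot[m, :m] = v
        Gdot[m, m] = 1.0                        # kappa(x, x) = 1 (Gaussian)
        Tbar = np.zeros((m + 1, m + 1))         # step 4
        Tbar[:m, :m] = self.T
        Tbar[m, m] = 1.0
        for j in range(m):                      # step 5: eliminate last row
            f = Gdot[m, j] / Gdot[j, j]
            Gdot[m, :] -= f * Gdot[j, :]
            Tbar[m, :] -= f * Tbar[j, :]
        if Gdot[m, m] < self.eps:               # step 6: redundant center
            gamma = -Tbar[m, :m]                # gamma_i = -t(n, i)
            self.eg = list(np.asarray(self.eg) + gamma * 2 * self.mu * e)
        else:                                   # step 7: keep the new center
            self.centers.append(x)
            self.eg.append(2 * self.mu * e)
            self.T, self.Gbar = Tbar, Gdot
        return y
```

Running over a stream is then just a loop of update(x, d) calls; the number of stored centers is controlled by ε alone.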

Figure 6-5. Learning curve of linear LMS and kernel LMS with restricted growth for three values of ε.

6.7 Experimental Comparison between KLMS and KLMS with Restricted Growth

To demonstrate the performance of the proposed method we present some simulation results. The mean square error is used to compare the performance of the Kernel LMS with restricted growth (KLMSR) and that of the linear LMS algorithm for the one step prediction of the Mackey-Glass (MG-30) time series. For these experiments an embedding dimension of 6 was used, which is the optimal value according to Takens' embedding theorem [65]. The simulations implement equation (6-30) for the KLMSR and equation (6-4) for the linear LMS. The kernel size and the step size were determined for best results after scanning the parameters, and the data was normalized to unit variance beforehand. The kernel size σ² was chosen to be 1 for these experiments; it was also observed that the performance was not very sensitive to kernel sizes between 1 and 4.
Figure 6-6. Performance of kernel LMS with restricted growth for various values of ε.

The step sizes for the LMS and the KLMSR algorithms were chosen as 0.01 and 0.5 respectively. The plots presented include the learning curve and comparisons of MSE values for different values of the tolerance threshold ε. Note that for ε = 0 we obtain the ever growing RBF network. To plot the learning curve, after each update the learning was frozen and a new batch of 300 data samples was used to estimate the mean square error. Figure 6-5 shows the learning curves for the LMS and KLMSR algorithms (the step size values were chosen for fastest convergence). It can be observed that there is a slight penalty in performance for larger values of ε, since the RBF filter size is actually very different in these three cases (see figure 6-7). The speeds of convergence for the three different values of ε are practically the same in the beginning, since all three algorithms are still growing, but after the 500th sample they stabilize, i.e. the learning curves become almost parallel to each other.
Figure 6-7. Number of kernel centers after training for various values of ε.

This shows that the learning process still continues in the algorithm even when new kernel centers are not added. Figure 6-6 shows the performance of the algorithm for a range of values of ε, and figure 6-7 shows the total number of kernel centers accumulated using 600 training data samples. As expected, the number of kernel centers reduces drastically, from 600 to less than 100, as ε increases. Yet the degradation in performance for this data set is comparatively small, and the overall filter performs much better than the adaptive linear combiner trained with LMS.

6.8 Application to Nonlinear System Identification

6.8.1 Motivation

As we have already mentioned in the previous sections, in addition to Vapnik's widely known support vector machine [3], there have been numerous kernel methods proposed in recent years, like kernel principal component analysis [4], kernel Fisher discriminant analysis [5], and kernel canonical correlation analysis [6], [7], with applications in classification and nonlinear regression.
Though they have shown enormous potential in solving complex problems, their usefulness has been rather limited in many real systems requiring on-line operation. These algorithms cannot be employed online because of the difficulties posed by the basic kernel method, such as the time and memory complexity and the need to avoid overfitting. One of the early online nonlinear regression algorithms with a well tailored complexity is the resource allocation network (RAN) [73]. The algorithm was presented as a growing radial basis function (RBF) network where, for each new data sample that satisfies some novelty condition, one RBF unit is added to the network. At each step the weights of the RBFs are adapted using the LMS algorithm [63]. Though the RAN is simple and effective, it requires the user to heuristically choose two threshold parameters for the novelty test, which may create inconsistent results. More recently, a kernel RLS algorithm has been proposed that deals with the complexity issues by applying a sparsification procedure to limit the size of the kernel Gram matrix [74]. This algorithm allows for online training but cannot handle time-varying data. An improvement to this approach has been presented through the sliding window kernel RLS algorithm [57], where an efficient update rule computes the least squares solution for the kernel weights with the data in a window that slides one sample at a time. This approach has one simple flaw: the response time of the system to any abrupt change in the data is roughly the same as the window length, which is chosen a priori. In addition, it requires the choice of a regularization parameter, which is not trivial. These algorithms have been used for nonlinear system identification. In certain cases, like estimating a communication channel, it is preferred that the implementation is online and that it responds to nonstationary variations. To this end we shall employ the Kernel LMS algorithm presented in a previous chapter. As we shall see through simulations, this approach has certain advantages over the other methods.
6.8.2 Identification of a Wiener System

The Wiener system is a well-known simple nonlinear system consisting of a linear system followed by a static memoryless nonlinearity (see figure 6-8). Such a channel can be encountered in digital satellite communications and in digital magnetic recording. Traditionally, this type of nonlinear equalization or channel estimation has been tackled by nonlinear structures such as neural networks and MLPs [75]. Here we shall compare our results with the sliding window kernel RLS (KRLS), since it has already been shown that KRLS performs better than the MLPs. In addition we shall also show the results with the RAN as a reference. We should note that the RAN is the least complex, with computation of O(N) for each time step, where N is the number of RBF kernel units present at each step. Our method, KLMS with restricted growth, and KRLS have a computational complexity of O(N²); for KLMSR, N is the number of kernel units present at each step, and for KRLS it is the size of the sliding window. As we shall see shortly, in certain cases the number of kernel units required for KLMSR can be much smaller than the size of the sliding window for KRLS for the same performance, in which case KRLS is far more costly.

Figure 6-8. A nonlinear Wiener system.
We would also expect the KLMS and RAN algorithms to be much faster, since they don't employ any windowing scheme. During the first part of the simulation the linear channel is $H_1(z) = 1 + 0.0668z^{-1} - 0.4764z^{-2} + 0.8070z^{-3}$, and after receiving 600 symbols it is changed to $H_2(z) = 1 + 0.4326z^{-1} - 0.0656z^{-2} + 0.7153z^{-3}$. A binary signal $x(n) \in \{-1, +1\}$ is sent through this channel, after which the signal is transformed nonlinearly according to the nonlinear function $v(n) = \tanh(u(n))$, where $u(n)$ is the linear filter output. A scatter plot of $u(n)$ versus $v(n)$ can be found in figure 6-9. Finally, 15 dB of additive white Gaussian noise is added. The Wiener system is treated as a black box of which only input and output are known. The sliding-window RLS algorithm was applied starting from the time index equal to the window length, since that much data is required.

Figure 6-9. Signal in the middle of the Wiener system vs. output signal for binary input symbols and different indexes n.
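For reference, a minimal sketch that generates data from this switched Wiener system follows; the function name and the random seeding are our own, and the noise power is set to impose the 15 dB SNR at the output of the nonlinearity.

```python
import numpy as np

def wiener_system_data(n_samples=1200, switch_at=600, snr_db=15, seed=0):
    """Binary input through a switched FIR channel, a tanh nonlinearity,
    and additive white Gaussian noise; only (x, d) are observable."""
    rng = np.random.default_rng(seed)
    h1 = np.array([1.0, 0.0668, -0.4764, 0.8070])    # H1(z), first 600 symbols
    h2 = np.array([1.0, 0.4326, -0.0656, 0.7153])    # H2(z), after the switch
    x = rng.choice([-1.0, 1.0], size=n_samples)      # binary input symbols
    u = np.empty(n_samples)
    for n in range(n_samples):
        h = h1 if n < switch_at else h2              # abrupt channel change
        past = np.concatenate((np.zeros(max(0, 3 - n)), x[max(0, n - 3):n + 1]))
        u[n] = np.dot(h, past[::-1])                 # u(n) = sum_i h_i x(n-i)
    v = np.tanh(u)                                   # static nonlinearity
    noise_var = np.var(v) / (10 ** (snr_db / 10))    # 15 dB SNR on v(n)
    d = v + rng.normal(scale=np.sqrt(noise_var), size=n_samples)
    return x, d
```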
System identification was performed using three techniques: kernel LMS with constrained growth (KLMS-R), sliding-window kernel RLS (KRLS), and the resource-allocating network (RAN). For all the methods we applied time-embedding techniques in which the length L of the linear channel was known. The input vectors for the algorithms were time-delayed vectors of length L, $\mathbf{x}(n) = [x(n-L+1), \ldots, x(n)]^T$. Since the length of the linear channels $H_1$ and $H_2$ was 4, we also used L = 4. Here are the parameter values that were chosen, mostly heuristically, for each method so as to obtain the best performance:

KLMS-R: threshold for the linear dependency test = 0.2; step size = 0.2; kernel size σ² = 1.
KRLS: window size N = 120; regularization parameter = 0.001; kernel size σ² = 1.
RAN: novelty test parameters, error parameter = 0.17 and distance parameter = 0.17; step size = 0.2; kernel size σ² = 1.

The MSE values at different time instances (the learning curve) are given in figure 6-10. As expected, the KRLS approach is the slowest and takes a time span approximately

Figure 6-10. Mean square error for the identification of the nonlinear Wiener system with the three methods. The values for KRLS are only shown after the first window.
equal to the length of the sliding window to adjust for a sudden change. The RAN and the proposed KLMS-R algorithm both seem to work similarly, with the same response time to an abrupt change in the channel. Though the RAN is computationally more efficient, it requires one more parameter to be decided a priori. Also, both RAN and KLMS-R restrict their architecture to only 16 radial basis functions in the network, automatically through their respective criteria. But using 16 as the sliding window length, and hence 16 RBFs for KRLS, is so poor that we have not included the results; KRLS required a window size of around 100 for comparable results. This is because KRLS is only using 16 samples at any given time for estimation, whereas RAN and KLMS-R (both LMS-based algorithms), though they use only 16 radial basis functions, incorporate the information of all past data up to the given instance to estimate the output. RAN and KLMS-R are continuously adapting at each step, so these two approaches are better than KRLS. So overall for this example, in terms of design simplicity and performance, KLMS-R is the most appropriate choice since it has fewer parameters than RAN, but in terms of computation speed, RAN could be used.

6.9 Summary

We have derived Widrow's least mean squares algorithm directly in an infinite-dimensional kernel Hilbert space and modified the algorithm to contain the growth of the resulting nonlinear filter. The algorithm uses the kernel trick to obtain the final output, effectively resulting in the adaptation of a nonlinear filter without the complexities of backpropagation, utilizing the data much as the conventional linear filter trained with LMS does. Although stochastic learning is known in the machine learning community, the problem of growing complexity had still not been addressed for this class of problems. Here we have proposed a set of sequential steps to test the linear dependency, in the feature space, of each incoming datum with the previous input samples. This, in turn, provides a means of distributing the present error to weight the kernel functions centered at the previous input samples and restrict the growth of the algorithm
complexity. This method can also potentially be used for other on-line kernel based algorithms where a batch of data is not available a priori and the complexity of the algorithm needs to be restricted sequentially, one sample at a time. It is also noteworthy that the KLMS and the improved KLMS-R algorithms basically solve the least squares problem in the feature space, implying that the gradient search is on a smooth quadratic performance surface, resulting in reliable convergence without the hassles of local minima. Another interesting fact, when the Gaussian (or any translation-invariant) kernel is utilized, is that the transformed data lies on the surface of a sphere; i.e., the transformed data is automatically normalized, behaving like the normalized LMS algorithm, which is an important advantage.
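The flavor of the restricted-growth update can be sketched as follows. This is only an illustration: a simple closest-center novelty check stands in for the full sequential linear dependency test in the feature space, and all names and the threshold mapping are ours.

```python
import numpy as np

def gauss(a, b, sigma2=1.0):
    """Gaussian kernel between two input vectors."""
    diff = a - b
    return np.exp(-np.dot(diff, diff) / (2.0 * sigma2))

def klms_r(X, d, eta=0.2, delta=0.2, sigma2=1.0):
    """Kernel LMS with restricted growth: store a new center only when the
    input is sufficiently novel; otherwise the present error updates the
    coefficient of the closest stored center."""
    centers, alphas = [X[0]], [eta * d[0]]        # first sample seeds the filter
    for n in range(1, len(d)):
        k = np.array([gauss(c, X[n], sigma2) for c in centers])
        y = np.dot(alphas, k)                     # output via the kernel trick
        e = d[n] - y                              # a priori error
        j = int(np.argmax(k))                     # closest existing center
        if k[j] < 1.0 - delta:                    # novel: grow the network
            centers.append(X[n])
            alphas.append(eta * e)
        else:                                     # redundant: redistribute error
            alphas[j] += eta * e
    return centers, alphas
```

Run on time-embedded inputs from the experiment above, such a rule keeps the number of stored kernels far below the number of training samples, which is the kind of behavior figure 6-7 reports.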
CHAPTER 7
APPLICATION TO BLIND EQUALIZATION

7.1 Motivation

In digital communication systems, the transmitted signals are often distorted through a bandlimited channel which introduces intersymbol interference (ISI). Blind equalization refers to the problem of restoring the original digital signal when only the channel's output is available. To solve the problem, blind techniques exploit some knowledge about the statistical properties of the input signal or the structure of the channel [76]. Benveniste et al. [77] were the first to prove that a sufficient condition for perfect equalization is that the pdf of the recovered symbols be equal to the source pdf. According to this idea, some blind algorithms aiming at forcing a given pdf at the output of the equalizer have been proposed [78], [79], [80]. Later, Shalvi and Weinstein relaxed this condition by showing that it is enough to match a few cumulants between the output signal and the source in order to achieve perfect equalization [81]. For instance, the so-called super-exponential algorithm [82] maximizes the kurtosis at the output of the channel subject to a constraint on the output variance. Other algorithms based on this result are the cumulant matching methods proposed by Tugnait [83], and also by Hatzinakos and Nikias [84].

To summarize, the theoretical results in [77] and [81] demonstrate that higher-order statistics are enough to solve the problem. This fact may explain why, to date, the time structure of the source signals has not been explicitly exploited in blind equalization algorithms, even though for most practical cases it is known that the source is correlated (in coded systems, for instance). However, we feel that in any learning or adaptive algorithm we should use as much prior information as possible about the problem. According to this idea, the proposed correntropy function enables us to derive new cost functions for blind equalization that exploit simultaneously our knowledge about both the pdf of the source and its time structure.
7.2 Problem Setting

We consider a baud-rate sampled baseband representation of the digital communication system. As a simple example of a coded digital communications system we consider Alternate Mark Inversion (AMI) line coding, a line code used for T1 and E1. In AMI, the logical "0" is represented by no line signal and the logical "1" is represented by positive and negative alternate pulses [19]. After matched filtering and sampling at the baud rate we have a correlated symbol sequence drawn from the alphabet {−1, 0, +1} with probabilities 1/4, 1/2 and 1/4, respectively. Now, let us suppose that the source AMI signal s(n) is sent through a linear time-invariant channel with coefficients h(n). The resulting channel output can be expressed as

\[ x(n) = \sum_i h_i\, s(n-i) + e(n), \tag{7-1} \]

where e(n) is a zero-mean white Gaussian noise with variance $\sigma_e^2$. The objective of a blind linear equalizer is to remove the ISI at its output without using any training sequence. Typically, the equalizer is designed as an FIR filter with M coefficients w; then, its output is given by

\[ y(n) = \sum_{i=0}^{M-1} w_i\, x(n-i) = \mathbf{w}^T \mathbf{x}(n). \tag{7-2} \]

Probably the most popular adaptive blind algorithms are the family of Godard algorithms [85], which are stochastic gradient descent (SGD) methods for minimizing the cost functions

\[ J_G(\mathbf{w}) = E\left[\left(|y(n)|^p - R_p\right)^2\right], \quad p = 1, 2, \tag{7-3} \]

where $R_p = E[|s(n)|^{2p}]\,/\,E[|s(n)|^p]$ depends on the input constellation pdf. For the particular case p = 2, Eq. (7-3) is the cost function of the constant modulus algorithm (CMA) [85], [86]. Using an SGD minimization approach, the CMA can be written as

\[ \mathbf{w}(n+1) = \mathbf{w}(n) - \mu \left( |y(n)|^2 - R_2 \right) y(n)\, \mathbf{x}(n). \tag{7-4} \]
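As a concrete reference, a minimal sketch of the AMI encoder and the CMA update of eq. (7-4) follows. The center-spike initialization matches the experiment below; the helper computing the ISI measure anticipates eq. (7-5), and all names are ours.

```python
import numpy as np

def ami_encode(bits):
    """AMI line code: logical 0 -> 0, logical 1 -> alternating +1/-1 pulses."""
    s, polarity = np.zeros(len(bits)), 1.0
    for i, b in enumerate(bits):
        if b:
            s[i], polarity = polarity, -polarity
    return s

def cma(x, M=3, mu=0.04, R2=1.0):
    """Constant modulus algorithm, eq. (7-4). For the AMI source
    R2 = E[|s|^4]/E[|s|^2] = (1/2)/(1/2) = 1; center-spike initialization."""
    w = np.zeros(M)
    w[M // 2] = 1.0
    for n in range(M - 1, len(x)):
        xn = x[n - M + 1:n + 1][::-1]       # regressor [x(n), ..., x(n-M+1)]
        y = np.dot(w, xn)
        w -= mu * (y**2 - R2) * y * xn      # SGD step on the Godard cost, p = 2
    return w

def isi_db(theta):
    """Residual ISI of a combined channel-equalizer response theta = h * w."""
    p = np.abs(theta) ** 2
    return 10 * np.log10((p.sum() - p.max()) / p.max())
```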
As a measure of equalization performance we use the ISI, defined as

\[ \mathrm{ISI} = 10 \log_{10} \frac{\sum_n |\theta_n|^2 - \max_n |\theta_n|^2}{\max_n |\theta_n|^2}, \tag{7-5} \]

where θ = h ∗ w is the combined channel-equalizer impulse response, which is a delta function for a zero-forcing equalizer.

7.3 Cost Function and Iterative Algorithm

The theoretical correntropy function for the source AMI signal is given by

\[ V_s(k) = \begin{cases} \kappa(0), & k = 0, \\ \tfrac{1}{4}\kappa(0) + \tfrac{1}{2}\kappa(1) + \tfrac{1}{4}\kappa(2), & |k| = 1, \\ \tfrac{3}{8}\kappa(0) + \tfrac{1}{2}\kappa(1) + \tfrac{1}{8}\kappa(2), & |k| > 1, \end{cases} \tag{7-6} \]

where κ(·) is the Gaussian kernel defined in (2-3). This function conveys information about both the pdf and the correlation of the source AMI signal. Therefore, we propose to use the following cost function for equalization:

\[ J_{ce}(\mathbf{w}) = \sum_{k=1}^{P} \left( V_s(k) - V_y(k) \right)^2, \tag{7-7} \]

where $V_y(k)$ is the correntropy function estimated at the output of the equalizer and P is the number of lags (notice that we have removed the zero lag, since it is always equal to κ(0)). In order to derive an SGD adaptive algorithm we can use a sliding window approach and update once per input sample according to

\[ \mathbf{w}(n+1) = \mathbf{w}(n) + \mu \sum_{k=1}^{P} \left( V_s(k) - V_y(k) \right) \frac{\partial V_y(k)}{\partial \mathbf{w}}, \tag{7-8} \]

where

\[ \frac{\partial V_y(k)}{\partial \mathbf{w}} = -\frac{1}{(N_w - k)\,\sigma^2} \sum_{i=n-N_w+k+1}^{n} \kappa(y_i - y_{i-k})\, (y_i - y_{i-k})\, (\mathbf{x}_i - \mathbf{x}_{i-k}). \tag{7-9} \]

A drawback of the proposed approach in an online setting is that a large number of samples is required in order to have reliable and low variance estimates of the correntropy
function. To mitigate this problem somewhat, we propose to use a "stop-and-go" technique [87]. Specifically, the equalizer coefficients are only updated when the error function obtained using hard decisions and the error function using the output of the equalizer have the same sign.

7.4 Simulation Results

In the first example the AMI signal is distorted by an IIR channel, $H_1(z) = \frac{1}{1 - 0.5z^{-1}}$; then white Gaussian noise is added for a final SNR = 20 dB. A 3-tap equalizer was used and initialized with the center coefficient set to unity and the rest to zero. The correntropy function was estimated using a window of $N_w = 100$ samples and P = 4 lags. On the other hand, the kernel size was selected as the standard deviation of the equalizer's output, σ = std(y). Figure 7-1 shows the ISI curves for correntropy and CMA obtained by averaging 25 independent Monte Carlo simulations. The step-size for correntropy was μ = 0.6, whereas for CMA it was μ_cma = 0.04 (in both cases these are the largest step-sizes for which all the trials converged). Figure 7-2 shows the evolution of the coefficients of the equalizer for correntropy (solid line) and CMA (dotted line) starting from w(0) = (0, 1, 0); the zero-ISI solution is (0, 1, −0.5) (dashed line). We see that for this example the correntropy function effectively extracts more information about the source signal than the CMA, and it is able to provide a better solution.

In order to explain the poor behavior of CMA in this particular case, we should remember that the use of a nonuniform symbol distribution has the effect of raising the kurtosis, thus making the pdf of the source distribution more Gaussian. Specifically, the kurtosis for the AMI signal of our example is $E[|s(n)|^4]\,/\,E[|s(n)|^2]^2 = 2$, whereas for uniform BPSK the kurtosis is 1, for uniform 32-PAM it is 1.798, and for a Gaussian it is 3. Although the source remains sub-Gaussian, the increase in kurtosis has the effect of lifting the CMA cost function (thus increasing the excess mean squared error) and flattening its surface (thus reducing the convergence speed) [88]. Moreover, the use of a correlated source can also cause major problems in the CMA convergence, as was also pointed out in [88]. These
major drawbacks in the presence of source correlation as well as high kurtosis seem to affect most of the conventional blind equalization techniques. For instance, the algorithm in [82], which uses the kurtosis as a cost function, was not even able to open the eye diagram for most trials using a window of 100 samples. For this reason we do not include the results obtained with this method here.

Figure 7-1. Intersymbol interference (ISI) convergence curves for correntropy and CMA under Gaussian noise.

Another interesting property of the correntropy function is its robustness against impulsive noise. This additional advantage is due to the fact that when an outlier is present, the inner product in the feature space computed via the Gaussian kernel tends to be zero (i.e., $\kappa(y_i - y_{i-k}) \approx 0$ whenever either $y_i$ or $y_{i-k}$ has a large value). To illustrate this point we have used the same simulation example, but this time the channel output is distorted with impulsive noise generated according to the following Gaussian mixture model.
Figure 7-2. Convergence curves of the equalizer coefficients for correntropy and CMA under Gaussian noise. The true solution is w = (0, 1, −0.5).

\[ f(n) = \frac{\epsilon}{\sqrt{2\pi}\,\sigma_1} \exp\left(-\frac{n^2}{2\sigma_1^2}\right) + \frac{1-\epsilon}{\sqrt{2\pi}\,\sigma_2} \exp\left(-\frac{n^2}{2\sigma_2^2}\right), \tag{7-10} \]

where typically ε ≪ 1 and σ₁² ≫ σ₂². Specifically, in our simulations we used ε = 0.15 and σ₁² = 50σ₂², and chose σ₂ to get a final SNR = 20 dB. Figure 7-3 shows the ISI curves obtained in this case. Now the step-size for correntropy was μ = 0.4 and μ_cma = 0.001. We can see that even for this small step-size the CMA is not able to converge, due to the large noise spikes at the channel output. On the other hand, with the correntropy function we obtain practically the same convergence behavior as in the Gaussian case.
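For concreteness, here is a minimal sketch of one correntropy-matching step, eqs. (7-8) and (7-9), together with a sampler for the mixture noise of eq. (7-10). The stop-and-go gate of Section 7.3 is omitted and all names are ours.

```python
import numpy as np

def correntropy_step(w, X, Vs, mu, sigma, P=4):
    """One SGD step of eq. (7-8). X holds the last N_w regressors as rows;
    Vs[k-1] is the theoretical source correntropy V_s(k) from eq. (7-6)."""
    y = X @ w                                      # equalizer outputs y_i = w^T x_i
    grad = np.zeros_like(w)
    for k in range(1, P + 1):
        dy = y[k:] - y[:-k]                        # y_i - y_{i-k}, N_w - k pairs
        dX = X[k:] - X[:-k]                        # x_i - x_{i-k}
        kv = np.exp(-dy**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)
        Vy_k = kv.mean()                           # window estimate of V_y(k)
        dV = -(kv * dy) @ dX / (len(dy) * sigma**2)   # eq. (7-9)
        grad += (Vs[k - 1] - Vy_k) * dV
    return w + mu * grad

def impulsive_noise(n, eps=0.15, s2=0.1, rng=None):
    """Two-component Gaussian mixture of eq. (7-10) with sigma_1^2 = 50 sigma_2^2:
    with probability eps an impulsive sample, otherwise a nominal one."""
    rng = np.random.default_rng() if rng is None else rng
    s1 = np.sqrt(50.0) * s2
    big = rng.random(n) < eps
    return np.where(big, rng.normal(0.0, s1, n), rng.normal(0.0, s2, n))
```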
Figure 7-3. Intersymbol interference (ISI) convergence curves for correntropy and CMA under impulsive noise.
CHAPTER 8
SUMMARY

8.1 Summary

Our research presents a novel means of extracting information from time structures and applies it to time series analysis (matched filtering and optimal adaptive filtering) based on kernel methods and the newly invented concept in ITL, correntropy. Correntropy induces an intriguing RKHS but also generates a finite-dimensional covariance function which, unlike the Gram matrix of kernel methods, is always well conditioned, so the corresponding solutions do not require explicit regularization. Therefore, it is different from the conventional kernel methods in both scope and detail. We have demonstrated these ideas by proposing the robust matched filter and the correntropy Wiener filter.

The matched filter is a very simple and effective method that can be used in a variety of scenarios by just tuning the kernel size. Our results show that when the noise is Gaussian the matched filter (MF) has an overall higher probability of detection. However, by increasing the kernel size, the CMF approaches this optimal performance with practically the same result. Clearly, the matched filter, being based on second order statistics alone, has shortcomings when non-Gaussian noise is introduced by the channel. In this case, with an appropriate kernel size, the CMF outperforms the traditional matched filter, suggesting that the RKHS defined by the correntropy is superior, especially for impulsive noise environments, which can readily be observed in low-frequency atmospheric noise, fluorescent lighting systems, combustion engine ignition, radio and underwater acoustic channels, economic stock prices, and biomedical signals. It also outperforms the matched filter using mutual information, which has a much higher computational burden. We have also presented the exceptional behavior of our new method for detection in alpha-stable distributed noise. It is noteworthy that the performance of our matched filter closely rivals that of the locally suboptimal detector, which is exclusively designed
for α-stable distributions but requires a priori the value of α before the detector can be designed. Results clearly show the superiority and flexibility of the CMF in a wide variety of situations with the least complexity, situations which otherwise would require heavy training and complex algorithms. This exceptional robustness against impulsive noise can be attributed to the fact that correntropy evaluates the integral of the joint pdf of two random variables along the line that represents their equality, thus focusing more on the likelihood of the two random variables having close values and less on the heavy tails that leptokurtic noise generates. It is also noteworthy that the proposed method works impressively even when the template size is just 64, which is another limitation for complex methods. We should also mention the comparison of the CMF and other kernel based detection filters. The computational complexity of the CMF is linear in the template length, unlike other kernel based filters with a complexity of at least O(N²). This is due to the use of correntropy as the similarity measure, instead of the large matrix of projected points required by other kernel methods.

Our research also presents an investigation of a new type of Wiener filter based on the recently introduced correntropy function. This dissertation shows a means of directly using correntropy as the autocorrelation function of the projected data. This approach can also be used to extend other linear learning schemes to nonlinear algorithms. The method also avoids the large Gram matrix used by many kernel based methods and the RBF network; instead, the system order is decoupled from the data size. Since correntropy effectively uses higher order information, the correntropy Wiener filter performs better than the linear filter in a variety of scenarios. Yet the limitations on the performance of the method are due to the relatively naive approximation employed to derive the final formulation. Though the correntropy filter has a structure very similar to the Hammerstein model, the nonlinearity is implicit, derived from the statistics of the data and the kernel employed.
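One way to state the along-the-line interpretation of correntropy mentioned above precisely is the following sketch, with $\kappa_\sigma$ the Gaussian kernel and $p_{XY}$ the joint pdf:

\[ v_\sigma(X, Y) = E\left[\kappa_\sigma(X - Y)\right] = \iint \kappa_\sigma(x - y)\, p_{XY}(x, y)\, dx\, dy \;\longrightarrow\; \int p_{XY}(u, u)\, du \quad (\sigma \to 0), \]

since the Gaussian kernel concentrates on a delta ridge along the line x = y as the kernel size shrinks. Outliers live far from this line and therefore contribute little to the statistic, which is the robustness mechanism exploited by the CMF.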
Finally, we have presented nonlinear LMS algorithms that do not need regularization. Correntropy LMS (CLMS) is derived by minimizing the stochastic gradient of the MSE while using correntropy as the generalized correlation function. It also employs an approximation, one which has been seen to be more accurate than the one employed to derive the correntropy Wiener filter. In addition we have also presented the kernel LMS (KLMS) algorithm, which is derived from kernel methods and incorporates CLMS as a special case. Overall these methods are very simple and efficient, and are very comparable to the solution obtained by kernel methods or RBF networks with centers at the input data points, but without the complexities of large matrix operations and the heuristics of choosing appropriate regularization. Their defining characteristic is that the solutions rely on improving successive output samples based on the errors incurred in the past.

We have also addressed the issue of kernel LMS that could potentially prohibit its use in real world problems, namely the continuous addition of a kernel to the solution with each time step. This results in a constant growth of the filter architecture, thus increasing the storage and computational complexity with each new sample. Since the error values would eventually die out, the growth could be artificially stopped when the error falls below a given tolerance. But this would also stop the continuous learning process that makes LMS so attractive. Instead we present a more intelligent approach by using linear dependency as a criterion for deciding the growth of the filter architecture without hindering the continuous sample-by-sample learning process. We also demonstrated how this could effectively minimize the number of kernel functions with minimal effect on the performance by applying it to identify a Wiener channel with a binary input where the linear filter component of the channel changes abruptly. The simulation results show that the kernel LMS approach is faster and more cost effective, both in storage and computation, than the recently proposed sliding-window kernel RLS. Of course, this could be expected because the response time to an abrupt change for the kernel RLS is approximately the length of the window that is applied.
In fact, the same arguments that are used for the superior tracking of the linear LMS algorithm versus the linear RLS algorithm can be used here as well: the sample-by-sample stochastic adaptation involved in LMS provides an exceptional advantage.

We have presented two other useful applications that exemplify the various advantages of using the theory and ideas presented in this dissertation. We have demonstrated the use of the robust correntropy template matching concept in shape classification for occluded objects. Our method uses a fundamentally different approach to deal with the ever unavoidable occlusions that are encountered in real world image processing applications. Without using the various landmarks that are usually employed for shape detection, we have used the fact that occlusions practically behave like impulsive noise. Hence correntropy has proven to be an effective means of tackling such occlusions.

We have also presented an application to blind equalization. There have been numerous research articles published in this field, but invariably all methods either assume that the transmitted source signal is i.i.d. or simply ignore the possible time structure. The assumption has always been that there is no information available about the source beforehand. But this is not always true, especially with communication channels: usually a well-known coding scheme is employed by the transmitter, which translates to a known time structure. In certain cases, the correntropy function of the source signal can be calculated analytically beforehand. So the idea is to effectively use this information about the time structure. We have seen that the use of correntropy gives better results for correlated sources than a leading well-known method, the constant modulus algorithm (CMA).

8.2 Other Applications

Though we have presented just a few applications, the concepts and ideas presented here are not limited to just these. This research suggests an elegant framework, based on the ideas of information theoretic learning and kernel methods, that could solve many difficult problems where the usual assumptions of Gaussianity and linearity are no longer valid. This would include many areas of engineering like
communications, geosensing, computer vision, financial data modeling, and biomedical engineering, just to name a few. The same concept of correntropy as a reproducing kernel and a measure of higher order correlation has also been utilized by Kyu-Hwa Jeong for the correntropy based minimum average correlation energy filter (CMACE), by Jian-Wu Xu for pitch detection in speech, and by Seungju Han for robust beamforming, with very promising results. Certain other applications, like using correntropy for compressive sampling and for identifying MIMO channels, are also being explored.

8.3 Future Work

We have presented elegant nonlinear versions of three basic time series problems in signal processing: the matched filter, the Wiener filter, and the LMS filter. At the same time, all these novel methods leave some room for improvement, either in continuing the theoretical study or in bettering their practical implementation. Hence future work should address the following issues.

(a) The correntropy matched filter has proven to be a simple and effective tool for signal detection in a non-Gaussian environment, particularly in impulsive noise. We have also provided a novel interpretation of correntropy in terms of the integral of the pdf function along the line, which explains the working of this filter. Theoretically, we have also shown that the correntropy matched filter is actually a linear matched filter for feature vectors nonlinearly related to the input data through the correntropy function itself. We should also note that this nonlinearity is not merely a static nonlinearity but is more complex and related to the statistics of the input data. Though the notion of optimality, in terms of correntropy being a correlation, can be established in the space of the transformed vectors through this relationship, a direct optimality in terms of the input data would render the theory more complete. Given the nature of the nonlinear mapping on the input data and the kernel trick that is employed, it might not be easily feasible to derive this formulation as an exact maximum likelihood criterion, but with
some simplification a near-optimal theoretical interpretation would be welcome and possibly more feasible.

(b) The major limitation of the correntropy Wiener filter is the approximation that gives the final expression. It is a simple one and does give a working solution, such that it effectively gives better solutions than the linear Wiener filter. But the approximation step has to be reviewed for the filter to perform consistently better than other comparable nonlinear methods like the Nadaraya-Watson estimator. In fact, like the previous point, this also boils down to understanding the direct relationship between the input data and the transformed feature points on which the correlation function is computed.

(c) The backbone of better elucidating both the theoretical and practical implications of using the various methods presented in this study is, in fact, understanding better the relationship of the input data with the feature space induced by correntropy. This would not only further the theoretical standing of the concepts, but also improve the performance of the signal processing tools developed through this research. A starting point could be studying the parametric adjustments made to the correntropy Wiener filter to improve its performance, and the formulation of the correntropy LMS algorithm. The theories of kernel methods and information theoretic learning, and various differential geometry concepts, also have an important part to play in furthering this understanding.

(d) Finally, we would want to apply our ideas to more real problems. We have applied these intriguing concepts to blind equalization and nonlinear system identification with promising results, but these applications were tested on synthetically designed experiments. We further applied a variation of the correntropy matched filter to recognizing occluded objects using a database of real fish shapes. The basis of this application was the hypothesis that partial occlusion can be modeled as impulsive noise. This is a deviation from the more conventional approach of using certain landmarks.
The performance was far better than recently published results that employed the same hypothesis, in cases where conventional methods are known to fail. This concept has to be explored further, probably to use it as a part of a more complex object recognition algorithm. Other environments where impulsive behavior is abundant are low-frequency atmospheric noise, fluorescent lighting systems, combustion engine ignition, radio and underwater acoustic channels, economic stock prices, et cetera. We have just started to explore the various avenues of applicability. A lot of possible real problems could benefit from our approach of reformulating conventional signal processing solutions using the novel similarity and correlation measure demonstrated through this study.
LIST OF REFERENCES

[1] N. Aronszajn, "Theory of reproducing kernels," Transactions of the American Mathematical Society, vol. 68, pp. 337-404, 1950.

[2] E. Parzen, Statistical Methods on Time Series by Hilbert Space Methods, Applied Mathematics and Statistics Laboratory, Stanford University, Stanford, CA, Technical Report, 1959.

[3] V. Vapnik, The Nature of Statistical Learning Theory, Springer, New York, 1995.

[4] B. Schölkopf, A. Smola, and K.-R. Müller, "Nonlinear component analysis as a kernel eigenvalue problem," Neural Computation, vol. 10, pp. 1299-1319, 1998.

[5] S. Mika, G. Rätsch, J. Weston, B. Schölkopf, and K.-R. Müller, "Fisher discriminant analysis with kernels," in IEEE Workshop on Neural Networks for Signal Processing IX, Madison, WI, USA, 1999, pp. 41-48.

[6] F. R. Bach and M. I. Jordan, "Kernel independent component analysis," Journal of Machine Learning Research, vol. 3, pp. 1-48, 2002.

[7] D. R. Hardoon, S. Szedmak, and J. Shawe-Taylor, "Canonical correlation analysis: an overview with application to learning methods," Neural Computation, vol. 16, no. 12, pp. 2639-2664, 2004.

[8] J. C. Principe, D. Xu, and J. Fisher, "Information theoretic learning," in Unsupervised Adaptive Filtering, S. Haykin, Ed., Wiley, 2000.

[9] D. Erdogmus and J. C. Principe, "An error-entropy minimization algorithm for supervised training of nonlinear adaptive systems," IEEE Transactions on Signal Processing, vol. 50, pp. 1780-1786, 2002.

[10] K. E. Hild II, D. Erdogmus, and J. C. Principe, "Blind source separation using Renyi's mutual information," IEEE Signal Processing Letters, vol. 8, pp. 174-176, 2001.

[11] I. Santamaría, D. Erdogmus, and J. C. Principe, "Entropy minimization for supervised digital communications channel equalization," IEEE Transactions on Signal Processing, vol. 50, no. 5, pp. 1184-1192, 2002.

[12] C. W. Helstrom, Elements of Signal Detection and Estimation, Prentice Hall, New Jersey, 1995.

[13] J. Cardoso, "Blind signal separation: Statistical principles," Proceedings of the IEEE (Special Issue), vol. 86, pp. 2009-2025, 1998.

[14] A. Bell and T. Sejnowski, "An information-maximization approach to blind separation and blind deconvolution," Neural Computation, vol. 7, pp. 1129-1159, 1995.
[15] I. Santamaría, P. P. Pokharel, and J. C. Principe, "Generalized correlation function: Definition, properties and application to blind equalization," IEEE Transactions on Signal Processing, vol. 54, no. 6, pp. 2187-2197, 2006.

[16] A. P. Sage and J. L. Melsa, System Identification, Academic Press, New York, 1971.

[17] M. Pourahmadi, Foundations of Time Series Analysis and Prediction Theory, Wiley, New York, 2001.

[18] Y. Zhao and S.-G. Häggman, "Intercarrier interference self-cancellation scheme for OFDM mobile communication systems," IEEE Transactions on Communication, vol. 49, no. 7, pp. 1185-1191, 2001.

[19] J. G. Proakis, Digital Communications, McGraw-Hill, New York, 4th edition, 2001.

[20] G. L. Turin, "An introduction to matched filters," IEEE Transactions on Information Theory, vol. 6, pp. 310-329, 1960.

[21] D. Duttweiler and T. Kailath, "An RKHS approach to detection and estimation problems, Part IV: Non-Gaussian detection," IEEE Transactions on Information Theory, vol. 19, pp. 19-28, 1973.

[22] H. Kwon and N. M. Nasrabadi, "Hyperspectral target detection using kernel matched subspace detector," in ICIP, 2004, pp. 3327-3330.

[23] I. Santamaría, D. Erdogmus, R. Agrawal, and J. C. Principe, "Robust matched filtering in the feature space," in EUSIPCO, 2005.

[24] D. Erdogmus, R. Agrawal, and J. C. Principe, "A mutual information extension to the matched filter," Signal Processing, vol. 85, pp. 927-935, 2005.

[25] R. J. P. de Figueiredo and Y. Hu, "On nonlinear filtering of non-Gaussian processes through Volterra series," in Volterra Equations and Applications, pp. 197-202, Gordon and Breach Science Publishers, Amsterdam, 2002.

[26] S. Haykin, Neural Networks: A Comprehensive Foundation, Prentice Hall, New Jersey, 2nd edition, 1998.

[27] C. M. Bishop, Neural Networks for Pattern Recognition, Oxford University Press, New York, 1995.

[28] K. R. Müller, A. J. Smola, G. Rätsch, B. Schölkopf, J. Kohlmorgen, and V. Vapnik, "Using support vector machines for time series prediction," in Advances in Kernel Methods, MIT Press, Cambridge, 1999.

[29] B. Widrow and S. D. Stearns, Adaptive Signal Processing, Prentice-Hall, Englewood Cliffs, New Jersey, 1985.

[30] S. Saitoh, Theory of Reproducing Kernels and its Applications, Longman Scientific and Technical, U.K., 1988.
[31] G. Wahba, "Spline models for observational data," in CBMS-NSF Regional Conference Series in Applied Mathematics, vol. 59, Philadelphia, USA, 1990.

[32] R. Jenssen, D. Erdogmus, J. C. Principe, and T. Eltoft, "The Laplacian PDF distance: A cost function for clustering in a kernel feature space," in Neural Information Processing Systems, 2004.

[33] R. Jenssen, D. Erdogmus, J. C. Principe, and T. Eltoft, "Towards a unification of information theoretic learning and kernel methods," in IEEE International Workshop on Machine Learning for Signal Processing, 2004, pp. 443-451.

[34] T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley, New York, 1991.

[35] J.-W. Xu, D. Erdogmus, R. Jenssen, and J. C. Principe, "An information-theoretic perspective to kernel independent component analysis," in ICASSP 2005, Philadelphia, USA, 2005.

[36] P. Pokharel, J. W. Xu, D. Erdogmus, and J. C. Principe, "A closed form solution for a nonlinear Wiener filter," in IEEE International Conference on Acoustics, Speech, and Signal Processing, 2006, vol. 3, pp. 720-723.

[37] K. H. Jeong and J. C. Principe, "The correntropy MACE filter for image recognition," accepted for IEEE International Workshop on Machine Learning for Signal Processing, 2006.

[38] G. H. Golub and C. F. Van Loan, Matrix Computations, Johns Hopkins University Press, Baltimore, 2nd edition, 1989.

[39] S. Kay, Fundamentals of Statistical Signal Processing, Volume 1: Estimation Theory, Prentice Hall, New Jersey, 1993.

[40] T. Evgeniou, M. Pontil, and T. Poggio, "Regularization networks and support vector machines," Advances in Computational Mathematics, vol. 13, no. 1, pp. 1-50, 2000.

[41] K. Tsuda, M. Kawanabe, G. Rätsch, and S. Sonnenburg, "A new discriminative kernel from probabilistic models," Neural Computation, vol. 14, pp. 2397-2414, 2002.

[42] W. Greblicki, "Nonlinearity estimation in Hammerstein systems based on ordered observations," IEEE Transactions on Signal Processing, vol. 44, no. 5, pp. 1224-1233, 1996.

[43] L. L. Scharf, Statistical Signal Processing: Detection, Estimation, and Time Series Analysis, Addison-Wesley, New York, 1991.

[44] J. H. McCulloch, "Financial applications of stable distributions," in Handbook of Statistics, G. S. Madala and C. R. Rao, Eds., vol. 14, pp. 393-425, Elsevier, 1996.
[45] C. L. Nikias and M. Shao, Signal Processing with Alpha-Stable Distributions and Applications, John Wiley and Sons, 1995.

[46] A. B. Salberg, A. Hanssen, and L. L. Scharf, "Robust multidimensional matched subspace classifiers based on weighted least-squares," IEEE Transactions on Signal Processing, vol. 55, pp. 873-880, 2007.

[47] S. M. Zabin and H. V. Poor, "Efficient estimation of the Class A parameters via the EM algorithm," IEEE Transactions on Information Theory, vol. 37, pp. 60-72, 1991.

[48] H. V. Poor and M. Tanda, "Multiuser detection in flat fading non-Gaussian channels," IEEE Transactions on Communications, vol. 50, pp. 1769-1777, 2002.

[49] V. M. Zolotarev, One-Dimensional Stable Distributions, American Mathematical Society, Providence, R.I., 1986.

[50] C. L. Brown, "Score functions for locally suboptimum and locally suboptimum rank detection in alpha-stable interference," in 11th IEEE Signal Processing Workshop on Statistical Signal Processing (SSP 2001), Singapore, 2001, pp. 58-61.

[51] S. A. Kassam, Signal Detection in Non-Gaussian Noise, Springer-Verlag, New York, 1988.

[52] L. Devroye and G. Lugosi, Combinatorial Methods in Density Estimation, Springer, New York, 2001.

[53] B. W. Silverman, Density Estimation for Statistics and Data Analysis, Chapman and Hall, London, 1986.

[54] R. P. W. Duin, "On the choice of smoothing parameters for Parzen estimators of probability density functions," IEEE Transactions on Computers, vol. 25, no. 11, pp. 1175-1179, 1976.

[55] S. Kay, Fundamentals of Statistical Signal Processing, Volume 2: Detection Theory, Prentice Hall, New Jersey, 1998.

[56] R. Weron, "On the Chambers-Mallows-Stuck method for simulating skewed stable random variables," Statistics and Probability Letters, vol. 28, pp. 165-171, 1996.

[57] S. Van Vaerenbergh, J. Vía, and I. Santamaría, "Nonlinear system identification using a new sliding-window kernel RLS algorithm," Journal of Communications, vol. 2, no. 3, pp. 1-8, 2007.

[58] I. L. Dryden and K. V. Mardia, Statistical Shape Analysis, Wiley, Chichester, U.K., 1998.

[59] D. G. Kendall, D. Barden, T. K. Carne, and H. Le, Shape and Shape Theory, Wiley, Chichester, U.K., 1999.
[60] N. Ansari and E. J. Delp, "Partial shape recognition: A landmark-based approach," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 12, no. 5, pp. 470-483, 1990.

[61] J. Zhang, X. Zhang, H. Krim, and G. G. Walter, "Object representation and recognition in shape spaces," Pattern Recognition, vol. 36, pp. 1143-1154, 2003.

[62] L. L. Scharf and B. Friedlander, "Matched subspace detectors," IEEE Transactions on Signal Processing, vol. 42, pp. 2146-2157, 1994.

[63] S. Haykin, Adaptive Filter Theory, Prentice-Hall, Upper Saddle River, NJ, 4th edition, 2002.

[64] A. Balestrino and A. Caiti, "Approximation of Hammerstein/Wiener dynamic models," in International Joint Conference on Neural Networks, 2000, vol. 1, pp. 70-74.

[65] F. Takens, "Detecting strange attractors in turbulence," in Lecture Notes in Mathematics, vol. 898, Dynamical Systems and Turbulence, pp. 366-381, Springer Verlag, 1981.

[66] B. Widrow, Adaptive Filters I: Fundamentals (TR 6764-6), Stanford Electronics Laboratories, Stanford, CA, Technical Report, 1966.

[67] S. Smale and Y. Yao, "Online learning algorithms," Foundations of Computational Mathematics, vol. 6, pp. 145-170, 2006.

[68] J. Kivinen, A. Smola, and R. Williamson, "Online learning with kernels," IEEE Transactions on Signal Processing, vol. 52, pp. 2165-2176, 2004.

[69] A. H. Sayed, Fundamentals of Adaptive Filtering, John Wiley, New Jersey, 2003.

[70] B. Schölkopf, Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, MIT Press, Cambridge, 2002.

[71] W. Liu, P. P. Pokharel, and J. C. Principe, "Kernel least mean square algorithm," submitted to IEEE Transactions on Signal Processing.

[72] A. N. Tikhonov and V. Y. Arsenin, Solution of Ill-Posed Problems, Wiley, New York, 1977.

[73] J. Platt, "A resource-allocating network for function interpolation," Neural Computation, vol. 3, no. 2, pp. 213-225, 1991.

[74] Y. Engel, S. Mannor, and R. Meir, "The kernel recursive least-squares algorithm," IEEE Transactions on Signal Processing, vol. 52, no. 8, pp. 2275-2285, 2004.

[75] D. Erdogmus, D. Rende, J. C. Principe, and T. F. Wong, "Nonlinear channel equalization using multilayer perceptrons with information-theoretic criterion," in
IEEE Workshop on Neural Networks for Signal Processing XI, North Falmouth, MA, USA, 2001, pp. 443-451.

[76] Z. Ding and Y. Li, Blind Equalization and Identification, Marcel Dekker, New York, 2001.

[77] A. Benveniste, M. Goursat, and G. Ruget, "Robust identification of a nonminimum phase system: blind adjustment of a linear equalizer in data communications," IEEE Transactions on Automatic Control, vol. 25, no. 3, pp. 385-399, 1980.

[78] J. Sala-Alvarez and G. Vazquez-Grau, "Statistical reference criteria for adaptive signal processing in digital communications," IEEE Transactions on Signal Processing, vol. 45, no. 1, pp. 14-31, 1997.

[79] T. Adali, X. Liu, and M. K. Sonmez, "Conditional distribution learning with neural networks and its application to channel equalization," IEEE Transactions on Signal Processing, vol. 45, pp. 1051-1064, 1997.

[80] M. Lázaro, I. Santamaría, C. Pantaleón, D. Erdogmus, and J. C. Principe, "Matched pdf-based blind equalization," in IEEE International Conference on Acoustics, Speech and Signal Processing, Hong Kong, China, 2003, vol. 4, pp. 297-300.

[81] O. Shalvi and E. Weinstein, "New criteria for blind deconvolution of nonminimum phase channels," IEEE Transactions on Information Theory, vol. 36, no. 2, pp. 312-321, 1990.

[82] O. Shalvi and E. Weinstein, "Super-exponential methods for blind deconvolution," IEEE Transactions on Information Theory, vol. 39, no. 2, pp. 504-519, 1993.

[83] J. K. Tugnait, "Blind estimation and equalization of digital communication channels using cumulant matching," IEEE Transactions on Communications, vol. 43, pp. 1240-1245, 1995.

[84] D. Hatzinakos and C. L. Nikias, "Blind equalization based on higher-order statistics (HOS)," in Blind Deconvolution, S. Haykin, Ed., Prentice-Hall, Englewood Cliffs, NJ, 1994.

[85] D. N. Godard, "Self-recovering equalization and carrier tracking in two-dimensional data communication systems," IEEE Transactions on Communications, vol. 28, pp. 1867-1875, 1980.

[86] J. R. Treichler and B. G. Agee, "A new approach to multipath correction of constant modulus signals," IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. 31, pp. 349-372, 1983.

[87] G. Picchi and G. Prati, "Blind equalization and carrier recovery using a stop-and-go decision-directed algorithm," IEEE Transactions on Communications, vol. 35, pp. 877-887, 1987.
[88] C. R. Johnson, Jr., P. Schniter, T. J. Endres, J. D. Behm, D. R. Brown, and R. A. Casas, "Blind equalization using the constant modulus criterion: A review," Proceedings of the IEEE, vol. 86, no. 10, pp. 1927-1950, 1998.
BIOGRAPHICAL SKETCH

Puskal P. Pokharel was born in Lalitpur, Nepal, in 1981. He received his Bachelor of Technology in Electronics and Communication Engineering from the Indian Institute of Technology (IIT), Roorkee, India, in 2003 and his Master of Science in Electrical and Computer Engineering from the University of Florida in 2005. He has worked at the Computational NeuroEngineering Laboratory (CNEL) as a research assistant pursuing his Ph.D. under the supervision of Dr. Jose C. Principe. He is a member of the Tau Beta Pi honor society and a student member of the IEEE. His current research interests include digital signal processing, machine learning, information theoretic learning, and their applications.