Fusing Probability Distributions with Information Theoretic Centers and Its Application to Data Retrieval

812eb7b3c44797ab49e3a7d1971af25c
333ab23e8899cd65ae12c6b5031f713fdfe2fa7d
71649 F20101124_AACAPX spellman_e_Page_43.jpg
bd4c328366f136bc58c5c2761bb35651
a4c95f9c1f32504f51ec8f38d5d983817b9f45ab
41558 F20101124_AACAZS spellman_e_Page_52.pro
fdfcb017f47626d3c7129924761ac922
ccf7d80169f7780978d11bc0d9b2fc5597e84fec
1498 F20101124_AACBBS spellman_e_Page_20.txt
4f0ffa0d7c45ebd4f797ac2ebca805d8
c3c12ea95a610e496d978ff388f03ef27531c1a6
17448 F20101124_AACBGQ spellman_e_Page_45.QC.jpg
f20afb633df326765bd59c103007b3fc
354a0f2c057f3a78f2d175432de92f5568c9a159
F20101124_AACAUW spellman_e_Page_06.tif
3bfeba6c9ad4e75e97fe83c926560b95
2f4fea5cc41655a968c81868e5cd42e47e86849e
5733 F20101124_AACANA spellman_e_Page_84thm.jpg
56aa2731818f308f4b2e5d84af1be467
3983ebf166f4afe76d6823225c463654b3fdf1c1
80647 F20101124_AACAPY spellman_e_Page_44.jpg
a019a4012052973518252bab70c0cda0
4eb9f64f82ee28d000dced1d1da09edb767b4ea0
53504 F20101124_AACAZT spellman_e_Page_53.pro
34e23b7820ee204d64e2090918fb7001
b9ad8660ae836de1cc03eb9ce7b7b798160a5f51
164 F20101124_AACBBT spellman_e_Page_21.txt
6bf0f020d9a51dd373a810ff190a2e0e
476c992676429958575d613109a5b0f3d47ae8e0
21220 F20101124_AACBGR spellman_e_Page_72.QC.jpg
b5e36913103a90fd8efcdd2233589a0d
6c9aab5e3648ab5a8026b701c88639a776b84860
F20101124_AACAUX spellman_e_Page_07.tif
eb6baebee61c4925db0efb4d76774fda
695dfc2e4a7501aa7cb52ed6ee075d5eb0705a90
2016 F20101124_AACANB spellman_e_Page_73.txt
31d957e1964c3e360c9424a4f22d54d0
6bfbb166fc2e54a48eb1915edd94e774ea7c6aa5
58105 F20101124_AACAPZ spellman_e_Page_45.jpg
9ab6ac66ab406a3ba30743e64ca99783
5172a9287df378d3d2ff09c06bbeb3f0156eff71
40878 F20101124_AACAZU spellman_e_Page_54.pro
6ead54cb592d648b2ff47d8398d3e82d
19adb8fa916f7acc51d722c7c8c7e07ed6e3413a
1401 F20101124_AACBBU spellman_e_Page_22.txt
3198f18fd302f7a48d9a84e3575d65cd
55039ce1ff2b433e9bc7e888a51d5deb3e0c63db
3192 F20101124_AACBGS spellman_e_Page_82thm.jpg
12281996955ba9460db411966cb9c7b5
b195dcb7f7934d22c94f9398e718400b792a5067
F20101124_AACAUY spellman_e_Page_08.tif
45441a8881b5b86a2388d2bc9bb825ee
c079cb1ab6c46b438a5fd664f2c35d7957dbaa7b
25670 F20101124_AACANC spellman_e_Page_01.jp2
c72c8d5fe4284f959bdb6b2988daf5c4
a297a6214980d5e02f2fa7e508f763414e5bbb5f
53351 F20101124_AACAZV spellman_e_Page_56.pro
54e66aae33b895c734cb748b3856eb84
24797c3026e5ab2b97595ac3c0ee2ef047115cea
1378 F20101124_AACBBV spellman_e_Page_24.txt
e3ac4920e1acfcd95483e83046b41ebd
7c24eb2fe7289f3713d5709e367f1ce3be8c76f1
6156 F20101124_AACBGT spellman_e_Page_08.QC.jpg
5e5fca573243688a293949cd65aa4be7
fc0cc4224eac3fc076ef94d912a3b9aad3b33d0c
F20101124_AACAUZ spellman_e_Page_09.tif
fa8141cb4853637dcac0c55a5f43376b
95a0b77fcc8597f1dd83fa5c75f6c77cde496133
27074 F20101124_AACAND spellman_e_Page_26.pro
7dde706319d2ccec5a4986db57aa6609
914d26b7a8f72a685335475dcbae8e0c7cd72b29
39034 F20101124_AACAZW spellman_e_Page_57.pro
8a12ce9f8804cc2077a4a091243d23d1
400f0618895f0c83bcccf6f1617dba51bf7fa463
1706 F20101124_AACBBW spellman_e_Page_25.txt
2fb83ba80b59dd8e1da7cbf3685fa63d
490028b56b7e11e9cf9c4986cf96493b0926692d
1051964 F20101124_AACASA spellman_e_Page_15.jp2
feed34bcbb8c30647899e29b1f03327b
11e26c4b1f2dc5203b2094e0f6d3b053cab20b76
6134 F20101124_AACBGU spellman_e_Page_76.QC.jpg
1c84b8c653a883338e1ecc1abe3c8240
97f5736aebfb72543c9877a5268a14508a9e06fb
50749 F20101124_AACANE spellman_e_Page_39.pro
b4a70a153b9285916347b1f4deadc2ad
53e394084e04a5bcdf699bca47007a2ed3e184aa
32699 F20101124_AACAZX spellman_e_Page_58.pro
dcde384f0ae2824285e7eeb387051565
2ba6f7809948329e40a908843220cc774e28b755
1436 F20101124_AACBBX spellman_e_Page_26.txt
0563375e6309bdc23caf3cd9d58a5ec4
fb7c14b5900dcb8b66098a64ceafc91c70a256f6
1051979 F20101124_AACASB spellman_e_Page_16.jp2
0cfd65bb2de7173701c23c2b3e006679
e41bd25f39480d9b61b0b780fb1f4cc91db63036
5953 F20101124_AACBGV spellman_e_Page_48thm.jpg
c0cb320e85f1788fab127c77c3929dd8
0535b8ca6e383b5583c02dd68a8fd8b03aae9c2e
803 F20101124_AACBBY spellman_e_Page_27.txt
fba286eb3c23ad05399ebecd3bc57a89
dce023fc93dfb29cf656945383a6e49dc4669c46
42060 F20101124_AACANF spellman_e_Page_55.pro
3d21c3f8b90054951c1bf5179804b034
72bab11bb6c64bd5273e81ae7cd062ef975a0a5e
47151 F20101124_AACAZY spellman_e_Page_59.pro
392bda3245b75786165294e025b49851
a11e1ace1d592d1b283ddecb1892122f8568c9db
1051978 F20101124_AACASC spellman_e_Page_17.jp2
4a0d4a829fde5e584a9d8ea6c7b4abdf
014abf9829568166301e735c81195699c53cdabc
1350 F20101124_AACBGW spellman_e_Page_02thm.jpg
d34b8c6b97fa8803d2c15128177c892d
06919054c3947e563635ea006a5cc2a3658637f0
926 F20101124_AACBBZ spellman_e_Page_28.txt
b7e95634cc6281b8d05b64edfc028d95
9ef82f91db6114ef2001af38367b1de42dfb2c8f
2272 F20101124_AACANG spellman_e_Page_48.txt
d621f5ce5b8874470a9e93fd4c3c7e94
35eafb9db6f5f6c06c5440bcce7c99fb351e93d8
F20101124_AACAXA spellman_e_Page_62.tif
b962b70040ec2332c5671a967d09ce9f
e29e232871d3c4e8fdec704322367338a7f8ad8a
46594 F20101124_AACAZZ spellman_e_Page_60.pro
dfb31e25be14404e99297e7c8efdea32
45cf1a1ded418d6226d2f9f5518f3ccded5d6381
476338 F20101124_AACASD spellman_e_Page_18.jp2
1d6b865226425a42b12e61d32e92ecef
03ce3ded34b39fee8245efb2884def3d9f1ad8dc
6090 F20101124_AACBGX spellman_e_Page_41thm.jpg
4cfa44bde62c06c1b05f183a5b5928cb
fbb790e4b7b0074dcbcd787eca2300dd230ec6c7
8463 F20101124_AACANH spellman_e_Page_30.pro
73e7f9037cd7f111975bf6b870ab19fc
198c33f5b4f43b0683f0c6b40e2c620b9311a00b
F20101124_AACAXB spellman_e_Page_63.tif
237fff3a1497c9f186acb3f4c86e79e4
bc1d1bf8e4b3d5b894718c6df9b4b303b87ee4b5
70070 F20101124_AACASE spellman_e_Page_19.jp2
ad8aa91f7802eb3f9dda8eb67ec1458a
e28f19c5520d7328a68afd161c0f713d243cf3ac
23200 F20101124_AACBGY spellman_e_Page_10.QC.jpg
a2545afb884e5de88915d9511f05b2ae
980d51a327527059ce53d563cd5fe96f53f5e771
1607725 F20101124_AACBEA spellman_e.pdf
e7c6e4bba6b55608adf8ffcd9a20db59
77c5caeb7a0bf24852cd4cc84fd5bba29866fc1d
321 F20101124_AACANI spellman_e_Page_29.txt
8102f8ec6c97b5ceec19d91944ad3392
bcebb54915327a2855d0b9d52fba596bfeab01b5
F20101124_AACAXC spellman_e_Page_64.tif
10db90e3c8da0c8f6cd850c5710421c0
a7cc7554ff87d42218308935d7efc01a2ba0199b
671688 F20101124_AACASF spellman_e_Page_20.jp2
6af54da06fad74277de0c19ea2ce06d0
6193843e8cbbfc7d8f665b076795d5326091bc7f
6096 F20101124_AACBGZ spellman_e_Page_77thm.jpg
54c97bce74031540804e6982d4e7d420
9b1cef26fb854c6eaccf6aa30915d84b13639225
6039 F20101124_AACBEB spellman_e_Page_35thm.jpg
9789b03177f96ca71836405418c74f03
bbd6ebfa8e49904c05b9f81913cdc7cacf48be8a
1940 F20101124_AACANJ spellman_e_Page_59.txt
7ced7b90924b9d64e6dcfcc84c2b87c1
f43a1b31366efd46017e2bbe780b496823270bc4
F20101124_AACAXD spellman_e_Page_65.tif
0dcdbc845e06750198b927f0fc7960c8
b5768733cf69220a53a2c7b8108030bc98416c32
27039 F20101124_AACASG spellman_e_Page_21.jp2
406969cc6ca523eda2cc4bfab02b2fad
4721e7cb9568d3b25619d898a73adc25649a7aa4
15473 F20101124_AACBEC spellman_e_Page_22.QC.jpg
435c7a539ab5d5d79308228ff565e5c9
fbd0b708cb8ed352375119576e84c15373cc001b
66818 F20101124_AACANK spellman_e_Page_06.jpg
dd71c399adf2dd724b6479db0e87a9b2
addc0274d3e0eb2de8bd085b560bef54da5445ac
F20101124_AACAXE spellman_e_Page_66.tif
b117642dd6da75a4b224892b42f068d6
d0a4a7f1b36365c8eff4f8870f8273263fd89d77
650325 F20101124_AACASH spellman_e_Page_22.jp2
fd04b87b5a2e2d2cbb533c8ff11e0d21
c7f4ff421ba617b7c762956a39c7818b71425f47
3534 F20101124_AACBJA spellman_e_Page_12thm.jpg
2dd0578fd9788cd89a7290ab638bc873
c8478f85607ee88c18b0c3440fc07440dae82239
13961 F20101124_AACBED spellman_e_Page_80.QC.jpg
303b77e4db1dad0357a41965af87e465
c6fa96f2c22461c3f0a7f6ac3f7b5b0e2a757c0d
6664 F20101124_AACANL spellman_e_Page_47thm.jpg
7808e9d0c4a5b63bdf7ff7a62949b0b5
bf0dd525fa4f869988c4ca88dfbd7f60b96c08ed
F20101124_AACAXF spellman_e_Page_67.tif
fd8b57162d0ef01191f970cf8917448c
7df84e7d5611482d6cdbdd5c2a169a27647cdc05
626348 F20101124_AACASI spellman_e_Page_23.jp2
88ffae86f5517925759de0379aae3f38
2b48474cb962c9c07d425bf1669698fac3ce69dd
19589 F20101124_AACBJB spellman_e_Page_11.QC.jpg
d6827f1013d6ffde520b3d1b1939d640
fb349f4e026ab7730de35e0d770144a3df827c02
20289 F20101124_AACBEE spellman_e_Page_70.QC.jpg
d1cecd9b45eeebd20d2b8f999876c833
51d8a860c6702f63641b16e7453092c7d9df763e
1051891 F20101124_AACANM spellman_e_Page_75.jp2
1d8606dc1649a8b0ccfeadf690d571f4
2c18bfb63f7fe467e9c1df523fb4e408941a2a4e
F20101124_AACAXG spellman_e_Page_68.tif
897b45f536074628d5e4e722196221c2
be6740f00f3e2ee959663a6b1d7de202c41d5fbd
634436 F20101124_AACASJ spellman_e_Page_24.jp2
91426949f9f143dc4882d8859ec066f5
d1581ed8f8e217e2e6844e49dd57d1500dd47aa4
3640 F20101124_AACBJC spellman_e_Page_27thm.jpg
1f73578a427637eafe117d33bdeaf7eb
377d5805cc591d4e64b80d9a3a7f39ae7468da67
5280 F20101124_AACBEF spellman_e_Page_57thm.jpg
2a720ba066bda2ecca350095e35207e3
8c680c50f3d2e0eebb33518d0e01cf15add0c998
441450 F20101124_AACANN spellman_e_Page_65.jp2
3c2c5693b4703c7650d9898f80bfc250
15b02d3313827f490a48e6c25351a0042a06e24d
F20101124_AACAXH spellman_e_Page_69.tif
119737e696b0d385cd95613ac878464a
0beacd3748edd95ea4411901f92f45d2d0b797f4
853683 F20101124_AACASK spellman_e_Page_25.jp2
b0f3d916f73daf3afa1b59e5b55bc2af
752f86cb6216e437d9ba75fbc63b5dcd23e59b38
4454 F20101124_AACBJD spellman_e_Page_22thm.jpg
411178e4cc95dd78b6aaf18d2018ef2e
b152cf80a8594d48c22d0c0086482e766eed9459
9696 F20101124_AACBEG spellman_e_Page_29.QC.jpg
dd02972081459640f06b582e9b4e9e6c
e82be16bd9a28689278c43ebffe408d6bcd53fe5
F20101124_AACANO spellman_e_Page_83.tif
f95d34cc778f5f14571e3ca186b96d13
b90d432971f36df1e63375b577122eedf6677bb6
F20101124_AACAXI spellman_e_Page_70.tif
4e38d871d237ecd35e91749f3ce17f44
122df78dc11f930428f4d98380f5c24fbd54c009
59503 F20101124_AACASL spellman_e_Page_26.jp2
572b6d7ba5974b55cf4daafadf1ec1fc
e89cb746de4eb057f499a6f91113e6aa023e3da1
6148 F20101124_AACBJE spellman_e_Page_86thm.jpg
b7dfe1e4683dfefe89f597a4fd2d76f6
60fe86b7b36fbc96bdd0fac7ec97f233fe9d9ea4
5729 F20101124_AACBEH spellman_e_Page_09thm.jpg
6afb93f32d05482b0da3ef1d1bd03d48
4d28eff5475a115446f70170c76c3c6dcb0a810d
711 F20101124_AACANP spellman_e_Page_66.txt
f47ed93dd2afd9a2bfc3bdedf5d8e0fd
acd95aa363a72cff1095c1da387e0fe563076c2b
F20101124_AACAXJ spellman_e_Page_71.tif
559d04f56c576f44d80f9bc5a593ee94
18319f532555fa46f4f3f2088db119815d3327ea
422597 F20101124_AACASM spellman_e_Page_27.jp2
8dc7876d1e720a0d2ba8e427d8384bd8
41dc149b0434cc2e69bcd693aec36e511fc23418
12189 F20101124_AACBJF spellman_e_Page_34.QC.jpg
5e2403eb5d718d7501620f88305c516f
a559550c662168f4d429485c5ecf5ec6f7ccb00e
10244 F20101124_AACBEI spellman_e_Page_32.QC.jpg
2036d8c1ad78d09c720f2e08b20b74e7
52095367a0ae60f3d4464b6ef4a3c314580bea46
18808 F20101124_AACANQ spellman_e_Page_25.QC.jpg
8e25b12da0f9e30917d0e2001b85ee3b
76eafc800ee22e06a86800ae6b903fc9083e734b
F20101124_AACAXK spellman_e_Page_73.tif
76d4cedff68f30c3d0db9b8f6704d60f
36e33342afb4a7aaa4bc866ecdd005c12faa7c5a
1051977 F20101124_AACASN spellman_e_Page_28.jp2
7624a0f68bde9ab7dd6c1796a97369fa
35a42eaffb75325acdf75eb5395e9b4a5f3b3a5a
22383 F20101124_AACBJG spellman_e_Page_59.QC.jpg
31039ac1120db2610f26c901f158582a
1d08c3f1efed817fb9d7be6a56508f9230f51fe0
6351 F20101124_AACBEJ spellman_e_Page_71thm.jpg
185995c05fa1a245eb350ce6b94d356e
5c26fc98364b72aa31c5310cf79a719059075c67
20910 F20101124_AACANR spellman_e_Page_36.QC.jpg
f582225bfc3ca628b633e565502fb64a
ee86285dae990e7ee652bb333b280270c638f51e
F20101124_AACAXL spellman_e_Page_74.tif
85067db20c78fad00154f43530ab7882
78e53b325713b3091f33d2bc8ed86085010d7b9c
286392 F20101124_AACASO spellman_e_Page_29.jp2
71e84a834fe0eca6c7ecde9355887dcb
1b7c1b42ef3318d12856cb8b5e172f65ac91a816
20646 F20101124_AACBJH spellman_e_Page_52.QC.jpg
ce7c1206eba8b77c259c23450a5651cb
61f5ed8874ffd3520a7775d735790239e028d375
5765 F20101124_AACBEK spellman_e_Page_43thm.jpg
3cf85ff80890f210840e19051d5f1327
b6e3f5b5d269fcc62056bdd968b4420b2ffabc0f
65013 F20101124_AACANS spellman_e_Page_55.jpg
1ffef733129682daac634ddea5a45cd2
6c92955b7a4b0ff5a097df6d1c1b76dd8ec17fe5
F20101124_AACAXM spellman_e_Page_75.tif
49169dd9f4ea747f338cf981f8946fcd
f8b75000baa6f7342cde5b0f49b123c068093562
248733 F20101124_AACASP spellman_e_Page_30.jp2
c2312d72969014c81551e30819394e62
69b6922ee0f376d545284c92b27bbc8763a1969e
24396 F20101124_AACBJI spellman_e_Page_42.QC.jpg
103a70b45b043c52d7cd1f0649e9ac59
8e5f68b0981b79570a0bae0125868cb876119caa
F20101124_AACAXN spellman_e_Page_76.tif
f952e331c4647ad00d633b9736da6923
d476557595d630cebab6fd6ccc882a34aacede44
1051944 F20101124_AACASQ spellman_e_Page_31.jp2
9361a5f6097f0d4112fe792252a51450
b9ebafe3dede906f464929aa5577b84d0a9d1d4e
F20101124_AACANT spellman_e_Page_72.tif
220c44296dd4069d5cd38dd470e86a66
c1af5d865debe0637a65610587df5eb7c09fad83
19696 F20101124_AACBJJ spellman_e_Page_04.QC.jpg
c9f3a886e92296b425b7e152aa475227
bac56f668d9a3c9406e26ba291b7c2bd850579eb
3802 F20101124_AACBEL spellman_e_Page_34thm.jpg
969874e0d9e39b66c8826745c207fd01
3a0caebdf8d27e68e8b184415ee66069f5034138
F20101124_AACAXO spellman_e_Page_77.tif
8fb16c557bc880d2961ce8619a2c48ef
715195abe8efb5b40a5d68d3e72e5b193ad49f02
496444 F20101124_AACASR spellman_e_Page_32.jp2
afedad86a4987072423d0d73eaa4b2de
439a6f99e18bbd15ae4d5ab730171f42b44dc30d
1944 F20101124_AACANU spellman_e_Page_81.txt
812bdca07d8c88b0c9365d17d5749d18
432dcb82308c9e5bccabc3e8cc1eae6b0057859d
17827 F20101124_AACBJK spellman_e_Page_54.QC.jpg
7054b261fba6d46ccd88c1d951cdd9ee
52621089d5cd452605d1d403aae28c3439bb77cb
16719 F20101124_AACBEM spellman_e_Page_67.QC.jpg
9ce9f08d72ce7ee44b2427d20b657688
2ea6bce89f4e7d59340c6164f30bd88a68d2fd38
F20101124_AACAXP spellman_e_Page_78.tif
db71dc7f2bb6796aedb02c8b24dfbefc
acd3183dd9a851b91f276e3761d6872d45c0f9ca
623002 F20101124_AACASS spellman_e_Page_33.jp2
04b872e1db4ea014072ace74e43267fd
0c1a5c4077a238b4c72ebc77859683737c45bde7
19061 F20101124_AACBJL spellman_e_Page_57.QC.jpg
4d6399c96504ad47a2f9c93e85394772
c403986a049096a673b84ba8bd920a8cf82f9800
5357 F20101124_AACBEN spellman_e_Page_55thm.jpg
54816011bb002c3166a77f9eb6670239
50cc0666de47aab96b6d1a0e40644fd439568876
F20101124_AACAXQ spellman_e_Page_79.tif
8e674864981669991478b7a394cd6e4a
42a914fe6d688b2d5041fab589d09f2412634492
55006 F20101124_AACAST spellman_e_Page_34.jp2
993c6e33004c8fe5e359e6e03a07ffc6
38e1560c5738d4da8b6d888f83c95567f56f8612
F20101124_AACANV spellman_e_Page_11.txt
ecf2c09941f807b81ea3c8e02a1816aa
b7c1fa5b4f2908edcb7330d590ef719529b2bb6c



FUSING PROBABILITY DISTRIBUTIONS WITH INFORMATION THEORETIC CENTERS AND ITS APPLICATION TO DATA RETRIEVAL

By

ERIC SPELLMAN

A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA
2005

Copyright 2005 by Eric Spellman

I dedicate this work to my dearest Kayla, with whom I have already learned how to be a Doctor of Philosophy.

ACKNOWLEDGMENTS

For his supportive guidance during my graduate career, I thank Dr. Baba C. Vemuri, my doctoral advisor. He taught me the field, offered me an interesting problem to explore, pushed me to publish (in spite of my terminal procrastination), and tried his best to instill in me the intangible secrets to a productive academic career. Also, the other members of my committee have helped me greatly in my career at the University of Florida, and I thank them all: Dr. Brett Presnell delivered the first lecture I attended as an undergraduate, and I am glad he could attend the last lecture I gave as a doctoral candidate. I have also benefitted from and appreciated Dr. Anand Rangarajan's lectures, professional advice, and philosophical discussions. While I did not have the pleasure of attending Dr. Arunava Banerjee's or Dr. Jeff Ho's classes, I have appreciated their insights and their examples as successful early researchers. I would also like to thank Dr. Murali Rao for stimulating debates, Dr. Shun-ichi Amari for proposing the idea of the e-center and proofs of the related theorems, and numerous anonymous reviewers.

My professional debts extend beyond the faculty, however, to my fellow comrades-in-research. With them, I have muttered all manner of things about the aforementioned group in the surest confidence that my mutterings would not be betrayed. Dr. Jundong Liu, Dr. Zhizhou Wang, Tim McGraw, Fei Wang, Santosh Kodipaka, Nick Lord, Bing Jian, Vinh Nghiem, and Evren Ozarslan all deserve thanks. Also deserving are the Department staff members whose hard work keeps this place afloat, and Ron Smith for designing a word-processing template without which the process of writing a dissertation might itself require a Ph.D.

For the permission to reproduce copyrighted material within this dissertation, I thank the IEEE (Chapter 3) and Springer-Verlag (Chapter 4). For data I thank the people behind the Yale Face Database (images from which I used in Fig. 2-5 and Fig. 2-8) and Michael Black for tracking data. And for the financial support which made this work possible, I acknowledge the University of Florida's Stephen C. O'Connell Presidential Fellowship, NIH grant RO1 NS42075, and travel grants from the Computer and Information Science and Engineering department, the Graduate Student Council, and the IEEE.

And finally, most importantly, I thank my family. I thank my mother-in-law Donna Lea for all of her help these past few weeks, and Neil, Abra, and Peter for letting us take her away for that time. I thank my mother and father for everything, and my brother, too. And I of course thank my dearest Kayla and my loudest, most obstinate, and sweetest Sophia.

TABLE OF CONTENTS

ACKNOWLEDGMENTS
LIST OF TABLES
LIST OF FIGURES
ABSTRACT

CHAPTER
1 INTRODUCTION
  1.1 Information Theory
  1.2 Alternative Representatives
  1.3 Minimax Approaches
  1.4 Outline of Remainder
2 THEORETICAL FOUNDATION
  2.1 Preliminary: Euclidean Information Center
  2.2 Mixture Information Center of Probability Distributions
    2.2.1 Global Optimality
  2.3 Exponential Center of Probability Distributions
  2.4 Illustrations and Intuition
    2.4.1 Gaussians
    2.4.2 Normalized Gray-level Histograms
    2.4.3 The e-Center of Gaussians
    2.4.4 e-ITC of Histograms
  2.5 Previous Work
    2.5.1 Jensen-Shannon Divergence
    2.5.2 Mutual Information
    2.5.3 Channel Capacity
    2.5.4 Boosting
3 RETRIEVAL WITH m-CENTER
  3.1 Introduction
  3.2 Experimental Design
    3.2.1 Review of the Texture Retrieval System
    3.2.2 Improving Efficiency
  3.3 Results and Discussion
    3.3.1 Comparison to Pre-existing Efficiency Scheme
  3.4 Conclusion
4 RETRIEVAL WITH e-CENTER
  4.1 Introduction
  4.2 Lower Bound
  4.3 Shape Retrieval Experiment
  4.4 Retrieval with JS-divergence
  4.5 Comparing the m- and e-ITCs
5 TRACKING
  5.1 Introduction
  5.2 Background: Particle Filters
  5.3 Problem Statement
  5.4 Motivation
    5.4.1 Binary State Space
    5.4.2 Self-information Loss
  5.5 Experiment: One Tracker to Rule Them All?
    5.5.1 Preliminaries
    5.5.2 m-ITC
  5.6 Conclusion
6 CONCLUSION
  6.1 Limitations
  6.2 Summary
  6.3 Future Work
REFERENCES
BIOGRAPHICAL SKETCH

LIST OF TABLES

5-1 Average time-till-failure of the tracker based on the arithmetic mean (AM) and on the m-ITC
5-2 Average time-till-failure of the tracker based on the arithmetic mean (AM) and on the m-ITC, with the second set of models

LIST OF FIGURES

2-1 The center and its supports, each denoted by its own marker
2-2 The ensemble of 50 Gaussians with means evenly spaced on the interval [-30, 30] and sigma = 5
2-3 The components from Fig. 2-2 scaled by their weights in the m-ITC
2-4 The m-ITC (solid) and arithmetic mean (AM, dashed) of the ensemble of Gaussians shown in Fig. 2-2
2-5 Seven faces of one person under different expressions with an eighth face from someone else; above each image is the weight p_i which the ITC assigns to the distribution arising from that image
2-6 The normalized gray-level histograms of the faces from Fig. 2-5, with the KL-divergence from each distribution to the m-ITC; parentheses indicate values equal to the KL-radius of the set
2-7 In the left column, the arithmetic mean resembles the first distribution more closely than the m-ITC does; in the right column, the opposite holds for the eighth distribution
2-8 Eight images of faces which yield normalized gray-level histograms; the number above each face weighs the corresponding distribution in the e-ITC
2-9 KL(C, P_i) for each distribution, for C equal to the e-ITC and the geometric mean, respectively; the horizontal bar represents the value of the e-radius
3-1 Using the triangle inequality to prune
3-2 Example images from the CUReT database
3-3 On average for probes from each texture class, the speed-up relative to an exhaustive search achieved by the metric tree with the m-ITC as the representative
3-4 The excess comparisons performed by the arithmetic mean relative to the m-ITC within each texture class, as a proportion of the total database
3-5 The excess comparisons performed by the best medoid relative to the m-ITC within each texture class, as a proportion of the total database
4-1 Intuitive proof of the lower bound in equation 4.7 (see text)
4-2 The speed-up factor versus an exhaustive search when using the e-ITC, as a function of each class in the shape database
4-3 The relative percent of additional prunings which the e-ITC achieves beyond the geometric center, again for each class number
4-4 Speed-up factor for each class resulting from using the e-ITC over an exhaustive search
4-5 Excess searches performed using the geometric mean relative to the e-ITC, as a proportion of the total database
4-6 Excess searches performed using the e-ITC relative to the m-ITC, as a proportion of the total database
4-7 The ratio of the maximal chi-squared distance from each center to all of the elements in a class
5-1 Frame from test sequence
5-2 Average time till failure as a function of angle disparity between the true motion and the tracker's motion model
5-3 Average time till failure as a function of the weight on the correct motion model, for 10 and 20 particles

Abstract of Dissertation Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy

FUSING PROBABILITY DISTRIBUTIONS WITH INFORMATION THEORETIC CENTERS AND ITS APPLICATION TO DATA RETRIEVAL

By Eric Spellman

August 2005
Chair: Baba C. Vemuri
Major Department: Computer and Information Science and Engineering

This work presents two representations for a collection of probability distributions (or densities), dubbed information theoretic centers (ITCs). Like the common arithmetic mean, the first new center is a convex combination of its constituent densities in the mixture family. Analogously, the second ITC is a weighted geometric mean of densities in the exponential family. In both cases, the weights in the combinations vary as one changes the distributions. These centers minimize the maximum Kullback-Leibler divergence from each distribution in their collections to themselves and exhibit an equi-divergence property, lying equally far from most elements of their collections. The properties of these centers have been established in information theory through the study of channel capacity and universal codes; drawing on this rich theoretical basis, this work applies them to the problems of indexing for content-based retrieval and to tracking.

Many existing techniques in image retrieval cast the problem in terms of probability distributions. That is, these techniques represent each image in the database, as well as incoming query images, as probability distributions, thus reducing the retrieval problem to one of finding a nearest probability distribution under some dissimilarity measure. This work presents an indexing scheme for such techniques wherein an ITC stands in for a subset of distributions in the database. If the search finds that a query lies sufficiently far from such an ITC, the search can safely disregard (i.e., without fear of reducing accuracy) the associated subset of probability distributions without further consideration, thus speeding search.

Often in tracking, one represents knowledge about the expected motion of the object of interest by a probability distribution on its next position. This work considers the case in which one must specify a tracker capable of tracking any one of several objects, each with different probability distributions governing its motion. Related is the case in which one object undergoes different "phases" of motion, each of which can be modeled independently (e.g., a car driving straight vs. turning). In this case, an ITC can fuse these different distributions into one, creating one motion model to handle any of the several objects.

CHAPTER 1
INTRODUCTION

Given a set of objects, one commonly wishes to represent the entire set with one object. In this work, we address this concern for the case in which the objects are probability distributions. Particularly, we present a novel representative for a set of distributions whose behavior we can describe in the language of information theory. For contrast, let us first consider the familiar arithmetic mean: the arithmetic mean is a uniformly weighted convex combination of a set of objects (e.g., distributions) which minimizes the sum of squared Euclidean distances from the objects in turn to itself. However, later sections contain examples of applications in which such a representative is not ideal. As an alternative we describe a representative which minimizes the maximal Kullback-Leibler divergence from itself to the set of objects. The central idea of this work is that such a minimax representation is better than more commonly used representatives (e.g., the arithmetic mean) in some computer vision problems. In exploring this thesis, we present the theoretical properties of this representative and results of experiments using it.

We examine two applications in particular, comparing the minimax representation to the more common arithmetic mean and other representatives. These applications are indexing collections of images (or textures or shapes) and choosing a motion prior under uncertainty for tracking. In the first of these applications, we follow a promising avenue of work in using a probability distribution as the signature of a given object to be indexed. Then, using an established data structure, the representative can fuse several signatures into one, thus making searches more efficient. In the tracking application, we consider the case in which one has some uncertainty as to the motion model that governs the object to be tracked. Specifically, the governing model will be drawn from a known family of models. We suggest using a minimax representative to construct a single prior distribution to describe the expected motion of an object in a way that fuses all of the models in the family and yields one model with the best worst-case performance.

1.1 Information Theory

As discussed in Chapter 2, the minimax representative can be characterized in terms of the Kullback-Leibler divergence and the Jensen-Shannon divergence. Hence, a brief review of these concepts is in order. The Kullback-Leibler (KL) divergence [1] (also known as the relative entropy) between two distributions p and q is defined as

KL(p, q) = \sum_i p(i) \log \frac{p(i)}{q(i)}.   (1.1)

It is convex in p, non-negative (though not necessarily finite), and equals zero if and only if p = q. In information theory it has an interpretation in terms of the length of encoded messages from a source which emits symbols according to a probability distribution. While the familiar Shannon entropy gives a lower bound on the average length per symbol a code can achieve, the KL-divergence between p and q gives the penalty in length per symbol incurred by encoding a source with distribution p under the assumption it really has distribution q; this penalty is commonly called redundancy. To illustrate this, consider the Morse code, designed to send messages in English. The Morse code encodes the letter "E" with a single dot and the letter "Q" with a sequence of four dots and dashes. Because "E" is used frequently in English and "Q" seldom, this makes for efficient transmission. However, if one wanted to use the Morse code to send messages in Chinese pinyin, which might use "Q" more frequently, he would find the code less efficient. If we assume, counterfactually, that the Morse code is optimal for English, this difference in efficiency is the redundancy.
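To make equation 1.1 concrete, the snippet below is a minimal sketch of the discrete KL-divergence (NumPy assumed; the function name and conventions are illustrative and not part of any system described in this work). Terms with p(i) = 0 contribute nothing, and a zero in q where p is positive makes the divergence infinite.

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL-divergence KL(p, q) = sum_i p(i) log(p(i)/q(i)), in nats."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    support = p > 0                      # 0 * log(0/q) is taken as 0
    if np.any(q[support] == 0):          # p puts mass where q does not
        return np.inf
    return float(np.sum(p[support] * np.log(p[support] / q[support])))

# Example: the divergence is asymmetric in its arguments
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))   # roughly 0.511 nats
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))   # roughly 0.368 nats
```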

Also playing a role in this work is the Jensen-Shannon (JS) divergence. It is defined between two distributions p and q as

JS_\lambda(p, q) = \lambda\, KL(p, \lambda p + (1-\lambda) q) + (1-\lambda)\, KL(q, \lambda p + (1-\lambda) q),   (1.2)

where \lambda \in (0, 1) is a fixed parameter [2]; we will also consider its straightforward generalization to n distributions.
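A corresponding sketch of equation 1.2 and of its n-distribution generalization, reusing the kl_divergence helper above; as before, the names are illustrative and the code assumes discrete distributions on a common support.

```python
import numpy as np

def js_divergence(p, q, lam=0.5):
    """Two-distribution JS-divergence with mixing weight lam (equation 1.2)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = lam * p + (1.0 - lam) * q                 # the mixture both terms compare against
    return lam * kl_divergence(p, m) + (1.0 - lam) * kl_divergence(q, m)

def js_divergence_n(dists, weights):
    """n-distribution generalization: H(sum_i w_i P_i) - sum_i w_i H(P_i)."""
    D = np.asarray(dists, float)                  # rows are the distributions
    w = np.asarray(weights, float)
    entropy = lambda r: -float(np.sum(r[r > 0] * np.log(r[r > 0])))
    return entropy(w @ D) - float(np.sum([wi * entropy(d) for wi, d in zip(w, D)]))
```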

1.2 Alternative Representatives

Also of interest is the body of work which computes averages of sets of objects using non-Euclidean distances, since the representative presented in this work plays a similar role. One example of this appears in computing averages on a manifold of shapes [3, 4] by generalizing the minimization characterization of the arithmetic mean away from the squared Euclidean distance and to the geodesic distance. Linking manifolds on one hand and distributions on the other is the field of information geometry [5]. Using notions from information geometry, one can find the mean on a manifold of parameterized distributions by using the geodesic distances derived from the Fisher information metric.

In this work, the representative does not minimize a function of this geodesic distance, but rather the "extrinsic" KL-divergence. Furthermore, the representatives here are restricted to simple manifolds of distributions, namely the family of weighted arithmetic means (i.e., convex combinations) and of normalized, weighted geometric means (sometimes referred to as the "exponential family") of the constituent distributions. These are simple yet interesting families of distributions and can accommodate non-parametric representations. These two manifolds are dual in the sense of information geometry [6], and so, as one might expect, the representatives have a similar dual relationship. Pelletier forms a barycenter based on the KL-divergence on each of these manifolds [7]. That barycenter, in the spirit of the arithmetic mean and in contrast to the representative in this work, minimizes a sum of KL-divergences. Another "mean-like" representative on the family of Gaussian distributions seeks to minimize the sum of squared J-divergences (also known as the symmetrized KL) [8]. The key difference between most of these approaches and this work is that this work seeks to present a representative which minimizes the maximum of KL-divergences to the objects in a set.

1.3 Minimax Approaches

Ingrained in the computer vision culture is a heightened awareness of noise in data. This is a defensive trait which has evolved over the millennia to allow researchers to survive in a harsh environment full of imperfect sensors. With this justified concern in mind, one naturally asks if minimizing a maximum distance will be doomed by over-sensitivity to noise. As with many such concerns, this depends on the application, and we argue that an approach which minimizes a max-based function is well-suited to the applications in this work. But to show that such a set of appropriate applications is non-empty, consider the successful work of Huttenlocher et al. as a proof of concept [9]: they seek to minimize the Hausdorff distance H between two point sets A and B. As one can see, minimizing the Hausdorff distance,

H(A, B) = \max( h(A, B), h(B, A) ),
h(A, B) = \max_{a \in A} \min_{b \in B} \mathrm{dist}(a, b),

minimizes a max-based function.
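For completeness, here is a small, direct transcription of the Hausdorff distance just defined (NumPy assumed; a library routine such as scipy.spatial.distance.directed_hausdorff could likely be substituted).

```python
import numpy as np

def directed_hausdorff_dist(A, B):
    """h(A, B) = max over a in A of the distance from a to its nearest b in B."""
    A, B = np.asarray(A, float), np.asarray(B, float)
    pairwise = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
    return float(pairwise.min(axis=1).max())

def hausdorff_dist(A, B):
    """H(A, B) = max(h(A, B), h(B, A)), the max-based measure discussed above."""
    return max(directed_hausdorff_dist(A, B), directed_hausdorff_dist(B, A))
```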

Another example appears in the work of Ho et al. for tracking. In Ho et al. [10], at each frame they find a linear subspace which minimizes the maximal distance to a set of previous observations. Additionally, Hartley and Schaffalitzky minimize a max-based cost function to do geometric reconstruction [11]. Cognizant that the method is sensitive to outliers, they recommend using it on data from which the outliers have been removed. These examples demonstrate that one cannot dismiss a priori a method as overly sensitive to noise just because it minimizes a max-based measure.

Closer in spirit to the approach in this work is work using the minimax entropy principle. Zhu et al. [12] and Liu et al. [13] use this principle to learn models of textures and faces. The first part of this principle (the well-known maximum entropy or minimum prejudice principle) seeks to learn a distribution which, given the constraint that some of its statistics fit sample statistics, maximizes entropy. Csiszár has shown that this process is related to an entropy projection [14]. The rationale for selecting a distribution with maximal entropy is that it is random (or simple) except for the parts explained by the data. Coupled with this is the minimum entropy principle, which further seeks to impose fidelity on the model. To satisfy this principle, Zhu et al. choose a set of statistics (constraints) to which a maximal entropy model must adhere such that its entropy is minimized. By minimizing the entropy, they show that they minimize the KL-divergence between the true distribution and the learned model. For a set S of constraints on statistics, they summarize the approach as

S^* = \arg\min_S \max_{p \in \Omega(S)} \mathrm{entropy}(p),   (1.3)

where \Omega(S) is the set of all probability distributions which satisfy the constraints in S. Then, with the optimal S^*, one need only find the p \in \Omega(S^*) with maximal entropy.

1.4 Outline of Remainder

In the next chapters, we rigorously define the minimax representatives and present theoretical results on their properties. We also connect one of these to its alternative and better-known identity in the information theory results for channel capacity. Thereafter follow two chapters on using the representatives to index databases for the sake of making retrieval more efficient. In Chapter 3 we present a texture retrieval experiment; in this experiment the accuracy of the retrieval is determined by the Jensen-Shannon divergence, the square root of which is a true metric. Later, in Chapter 4, we present an experiment in shape retrieval where the dissimilarity measure is the KL-divergence. We propose a search method which happens to work in the particular case of this dataset, but in general has no guarantee that it will not degrade in accuracy. The second area in which we demonstrate the utility of a minimax representative is tracking. In Chapter 5 we present experiments in which the representative stands in for an unknown motion model. Lastly, we end with some concluding points and thoughts for future work.

CHAPTER 2
THEORETICAL FOUNDATION

In this chapter we define the minimax representatives and enumerate their properties. First, we present a casual, motivating treatment in Euclidean space to give intuition to the idea. Then come the section's central results: the ITC for the mixture family and the ITC on the dual manifold, the exponential family. After defining them, we present several illustrations to lend intuition to their behavior, and then finally we show their well-established interpretation in terms of channel capacity.

2.1 Preliminary: Euclidean Information Center

Let S = \{f_1, \ldots, f_n\} be a set of n points in the Euclidean space R^m. Throughout our development we will consider the maximum dispersion of the members of a set S about a point f,

D(f; S) = \max_i |f - f_i|^2.   (2.1)

We will look for the point f_c that minimizes D(f; S) and call it the center.

Definition 1. The center of S is defined by

f_c(S) = \arg\min_f D(f; S).   (2.2)

The following properties are easily proved.

Theorem 1. The center of S is unique and is given by a convex combination of the elements of S,

f_c = \sum_{i=1}^n p_i f_i,   (2.3)

where 0 \le p_i \le 1 and \sum p_i = 1. We call a point f_i for which p_i > 0 a support.

Theorem 2. Let f_c be the center of S. Then, for F = \{i : p_i > 0\}, the set of indices called the supports, we have an equi-distance property,

|f_i - f_c|^2 = r^2, if i \in F,   (2.4)
|f_i - f_c|^2 \le r^2, otherwise,   (2.5)

where r^2 is the square of the radius of the sphere coinciding with the supports and centered at the center.

Now that we have characterized the center first by its minimax property (Definition 1) and then by its equi-distance property (Theorem 2), we can give yet another characterization, this one useful computationally. Define the simplex of probability distributions on n symbols,

\Delta = \{ p = (p_1, \ldots, p_n) : 0 \le p_i, \; \sum p_i = 1 \}.

We define a function of p on \Delta by

D_E(p; S) = -\frac{1}{2} \Big\| \sum p_i f_i \Big\|^2 + \frac{1}{2} \sum p_i \|f_i\|^2,   (2.6)

and now use it to find the center.

Theorem 3. D_E(p; S) is strictly concave in p, and the center of S is given by

f_c = \sum \tilde{p}_i f_i,   (2.7)

where

\tilde{p} = \arg\max_p D_E(p; S)   (2.8)

is the unique maximal point of D_E(p; S).

An example of the center is given in Fig. 2-1. In general, the support set consists of a relatively sparse subset of points. The points that are most "extraordinary" are given high weights.
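One simple way to carry out the maximization in Theorem 3 is Frank-Wolfe (conditional gradient) ascent on the concave objective D_E(p; S): at each step all of the new weight goes to the point currently farthest from the candidate center. The sketch below is illustrative only (NumPy assumed; the names and iteration budget are arbitrary choices), but it recovers the familiar minimum enclosing ball center.

```python
import numpy as np

def euclidean_minimax_center(points, n_iter=2000):
    """Approximate the center of Definition 1 by maximizing D_E(p; S) of
    equation 2.6 over the simplex with Frank-Wolfe steps."""
    F = np.asarray(points, dtype=float)        # shape (n, m)
    n = F.shape[0]
    p = np.full(n, 1.0 / n)                    # start at uniform weights
    for t in range(1, n_iter + 1):
        c = p @ F                              # candidate center sum_i p_i f_i
        # grad_i D_E = 0.5*||f_i||^2 - <f_i, c>; its argmax is the farthest point
        far = int(np.argmax(np.sum((F - c) ** 2, axis=1)))
        step = 1.0 / (t + 1)
        p *= (1.0 - step)
        p[far] += step
    return p, p @ F                            # the weights and the center itself
```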

Figure 2-1: The center and its supports, each denoted by its own marker.

2.2 Mixture Information Center of Probability Distributions

We now move to the heart of the results. While the following derivations use densities, they could just as easily work in the discrete case. Let f(x) be a probability density. Given a set S of n linearly independent densities f_1, \ldots, f_n, we consider the space M consisting of their mixtures,

M = \Big\{ \sum p_i f_i : 0 \le p_i, \; \sum p_i = 1 \Big\}.   (2.9)

The dimension m of M satisfies m \le n - 1. M is a dually flat manifold equipped with a Riemannian metric and a pair of dual affine connections [5]. The KL divergence from f_1(x) to f_2(x) is defined by

KL(f_1, f_2) = \int f_1(x) \log \frac{f_1(x)}{f_2(x)} \, dx,   (2.10)

which plays the role of the squared distance in Euclidean space. The KL-divergence from S to f is defined by

KL(S, f) = \max_i KL(f_i, f).   (2.11)

Let us define the mixture information center (m-ITC) of S.

Definition 2. The m-center of S is defined by

f_m(S) = \arg\min_f KL(S, f).   (2.12)

In order to analyze properties of the m-center, we define the m-sphere of radius r centered at f by the set

S_m(f, r) = \{ f' : KL(f', f) \le r^2, \; f' \in M \}.   (2.13)

In order to obtain an analytical solution for the m-center, we remark that the negative entropy

\varphi(f) = \int f(x) \log f(x) \, dx   (2.14)

is a strictly convex function of f \in M. This is the dual potential function in M whose second derivatives give the Fisher information matrix [5]. Given a point f in M,

f = \sum p_i f_i, \quad \sum p_i = 1, \quad p_i \ge 0,   (2.15)

and using \varphi above, we write the Jensen-Shannon (JS) divergence [2] of f with respect to S as

D_m(f; S) = -\varphi(f) + \sum p_i \varphi(f_i).   (2.16)

When we regard D_m(f; S) as a function of p = (p_1, \ldots, p_n)^T \in \Delta, we denote it by D_m(p; S). It is easy to see that

D_m(f; S) is strictly concave in p \in \Delta,   (2.17)
D_m(f; S) is invariant under permutation of \{f_i\},   (2.18)
D_m(f; S) = 0 if p is an extreme point of the simplex.   (2.19)

Hence, from equations 2.17 and 2.19, D_m(f; S) has a unique maximum in \Delta. In terms of the KL-divergence, we have

D_m(p; S) = \sum p_i KL(f_i, f).   (2.20)

Theorem 4. The m-center of S is given by the unique maximizer of D_m(f; S), that is,

f_m = \sum \tilde{p}_i f_i,   (2.21)
\tilde{p} = \arg\max_p D_m(p; S).   (2.22)

Moreover, for supports (for which \tilde{p}_i > 0), KL(f_i, f_m) = r^2, and for non-supports (for which \tilde{p}_i = 0), KL(f_i, f_m) \le r^2, where r^2 = \max_i KL(f_i, f_m).

By using the Lagrange multiplier \lambda for the constraint \sum p_i = 1, we calculate

\frac{\partial}{\partial p_i} \Big\{ D_m(p; S) - \lambda \Big( \sum p_i - 1 \Big) \Big\} = 0.   (2.23)

From the definition of D_m and because of

\frac{\partial}{\partial p_i} \varphi(f) = \frac{\partial}{\partial p_i} \int \Big( \sum p_j f_j \Big) \log \Big( \sum p_k f_k \Big) dx = \int f_i (\log f + 1) \, dx,   (2.24)

equation 2.23 is rewritten as

-\int f_i \log f \, dx - 1 + \varphi(f_i) - \lambda = 0   (2.25)

when p_i > 0, and then as

KL(f_i, f) = \lambda + 1.   (2.26)

However, because of the constraints p_j \ge 0, this is not necessarily satisfied for some f_j for which p_j = 0. Hence, at the extreme point f_m(S), we have

KL(f_i, f) = r^2   (2.27)

for supports f_i (p_i > 0), but for non-supports (p_i = 0),

KL(f_i, f) \le r^2.   (2.28)

This distinction occurs because \tilde{p} maximizes \sum p_i KL(f_i, f), and if we assume for a non-support density f_i that KL(f_i, f) > r^2, we arrive at a contradiction. We remark that numerical optimization of D_m can be accomplished efficiently (e.g., with a Newton scheme [15]), because it is concave in p.
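Because the objective is concave in p, a variety of numerical schemes work. The sketch below uses the classical Blahut-Arimoto fixed point, which is a natural choice given the channel-capacity reading of the m-ITC in Section 2.5.3. It assumes the densities have been discretized into the rows of a NumPy array; the function name and tolerance are illustrative.

```python
import numpy as np

def m_itc_weights(dists, n_iter=1000, tol=1e-10):
    """Weights p~ of Theorem 4 for discrete distributions (rows of `dists`),
    computed with the Blahut-Arimoto multiplicative update."""
    F = np.asarray(dists, dtype=float)
    p = np.full(F.shape[0], 1.0 / F.shape[0])
    for _ in range(n_iter):
        mix = p @ F                                              # current candidate f_m
        with np.errstate(divide="ignore", invalid="ignore"):
            logratio = np.where(F > 0, np.log(F / mix), 0.0)
        d = np.sum(np.where(F > 0, F * logratio, 0.0), axis=1)   # KL(f_i, f_m) per row
        new_p = p * np.exp(d)
        new_p /= new_p.sum()
        if np.max(np.abs(new_p - p)) < tol:
            return new_p
        p = new_p
    return p
```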

2.2.1 Global Optimality

Lastly, it should be noted that while it appears that the results so far suggest that the m-ITC is the minimax representative over merely the simplex of mixtures, a result by Merhav and Feder [16], based on work by Csiszár [17], shows that the m-ITC is indeed the global minimax representative over all probability distributions. To argue so, consider a distribution g which a skeptic puts forward as the minimizer of equation 2.12. Next, we consider the I-projection of g onto the set of mixtures M, defined as

\tilde{g} = \arg\min_{f \in M} KL(f, g).   (2.29)

And from the properties of I-projections [17], we know for all f \in M that

KL(f, g) \ge KL(f, \tilde{g}) + KL(\tilde{g}, g) \ge KL(f, \tilde{g}),   (2.30)

since the last term on the right-hand side is positive. This tells us that the mixture \tilde{g} performs at least as well as g; so we may restrict our minimization to the set of mixtures, knowing we will find the global minimizer.

2.3 Exponential Center of Probability Distributions

Given n densities f_i(x), f_i(x) > 0, we define the exponential family of densities E,

E = \Big\{ f \;\Big|\; f(x) = \exp\Big( \sum p_i \log f_i(x) - \psi(p) \Big), \; 0 \le p_i, \; \sum p_i = 1 \Big\},   (2.31)

instead of the mixture family M. The dimension m of E satisfies m \le n - 1. E is also a dually flat Riemannian manifold. E and M are "dual" [5], and we can establish similar structures in E. The potential function \psi(p) is convex, and is given by

\psi(p) = \log \int \exp\Big( \sum p_i \log f_i(x) \Big) dx.   (2.32)

The potential \psi(p) is the cumulant generating function, and is connected with the negative entropy \varphi by the Legendre transformation. It is called the free energy in statistical physics. Its second derivatives give the Fisher information matrix in this coordinate system. An e-sphere (exponential sphere) in E centered at f is the set of points

S_e(f, r) = \{ f' : KL(f', f) \le r^2, \; f' \in E \},   (2.33)

where r is the e-radius.

Definition 3. The e-center of S is defined by

f_e(S) = \arg\min_f KL(f, S),   (2.34)

where

KL(f, S) = \max_i KL(f, f_i).   (2.35)

We further define the JS-like divergence in E,

D_e(p; S) = -\psi(p) + \sum p_i \psi(f_i).   (2.36)

Because

\psi(f_i) = \log \int \exp(\log f_i) \, dx = 0,   (2.37)

we have

D_e(p; S) = -\psi(p).   (2.38)

The function D_e = -\psi is strictly concave, and has a unique maximum. It is an interesting exercise to show, for f = \exp\{ \sum p_i \log f_i - \psi(p) \}, that

-\psi(p) = \sum p_i KL(f, f_i).   (2.39)

Analogous to the case of M, we can prove the following.

Theorem 5. The e-center of S is unique and given by the maximizer of D_e(p; \{f_i\}), as

f_e(S) = \exp\Big( \sum \tilde{p}_i \log f_i - \psi(\tilde{p}) \Big),   (2.40)
\tilde{p} = \arg\max_p D_e(p; S).   (2.41)

Moreover,

KL(f_e, f_i) = r^2 for supporting f_i (p_i \ne 0),   (2.42)

and

KL(f_e, f_i) \le r^2 for non-supporting f_i (p_i = 0),   (2.43)

where

r^2 = \max_i KL(f_e, f_i).   (2.44)

We calculate the derivative of -\psi(p) - \lambda( \sum p_i - 1 ) and put

\frac{\partial}{\partial p_i} \Big\{ -\psi(p) - \lambda \Big( \sum p_i - 1 \Big) \Big\} = 0.   (2.45)

For

f = \exp\Big( \sum p_i \log f_i(x) - \psi(p) \Big),   (2.46)

we have

\frac{\partial}{\partial p_i} \psi(p) = \frac{\partial}{\partial p_i} \log \int \exp\Big( \sum p_i \log f_i(x) \Big) dx = \int f(x) \log f_i(x) \, dx.   (2.47)

Hence, 2.45 becomes

\int f(x) \log f_i(x) \, dx = \mathrm{const}   (2.48)

and hence

KL(f, f_i) = \mathrm{const} = r^2   (2.49)

for f_i with p_i > 0. For p_i = 0, KL(f, f_i) is not larger than r^2, because of 2.39.
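A parallel sketch for the e-ITC weights of Theorem 5: D_e(p; S) = -\psi(p) is concave, and a Frank-Wolfe step moves weight toward the density farthest in KL-divergence from the current candidate e-center. The code assumes discretized, strictly positive densities as rows of a NumPy array; the names and step rule are illustrative.

```python
import numpy as np

def e_itc_weights(dists, n_iter=5000):
    """Weights of the e-ITC for strictly positive discrete densities,
    by Frank-Wolfe ascent on -psi(p) (equation 2.38)."""
    logF = np.log(np.asarray(dists, dtype=float))             # rows: log f_i(x)
    p = np.full(logF.shape[0], 1.0 / logF.shape[0])
    for t in range(1, n_iter + 1):
        s = p @ logF                                          # sum_i p_i log f_i(x)
        psi = np.log(np.sum(np.exp(s - s.max()))) + s.max()   # stable log-partition
        center = np.exp(s - psi)                              # candidate e-center f_e(x)
        # grad_i of -psi is -E_center[log f_i]; its argmax is the density with
        # the largest KL(f_e, f_i), i.e. the current worst case
        far = int(np.argmin(logF @ center))
        step = 2.0 / (t + 2)
        p *= (1.0 - step)
        p[far] += step
    return p
```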

2.4 Illustrations and Intuition

In this section we convey some intuition regarding the behavior of the ITCs. Particularly, we show examples in which the ITCs choose a relatively sparse subset of densities to be support densities (assigning zero weights to the other densities in the set); we also find that the ITC tends to assign disproportionately high weights to members of the set that are most "extraordinary" (i.e., those that are most distinct from the rest of the set).

Figure 2-2: The ensemble of 50 Gaussians with means evenly spaced on the interval [-30, 30] and \sigma = 5.

Figure 2-3: The components from Fig. 2-2 scaled by their weights in the m-ITC.

Figure 2-4: The m-ITC (solid) and arithmetic mean (AM, dashed) of the ensemble of Gaussians shown in Fig. 2-2.

2.4.1 Gaussians

First, we examine a synthetic example in which we analytically specify a set of densities and then numerically compute the m-ITC of that set. We construct a set of 50 one-dimensional Gaussian distributions with means evenly spaced on the interval [-30, 30] and with \sigma = 5. Fig. 2-2 shows all of the densities in this set. When we compute the m-ITC, we see both of the properties mentioned above: out of the 50 densities specified, only eight become support densities (sparsity); and additionally, the densities which receive the largest weight in the m-ITC are the outermost densities, with means at -30 and 30 (highlighting "extraordinary" elements). Fig. 2-3 shows the eight support densities scaled according to the weight which the m-ITC assigns them. For the sake of comparison, Fig. 2-4 shows the m-ITC compared with the arithmetic mean.
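To reproduce the flavor of this example numerically, one can discretize the ensemble on a grid and feed it to the m_itc_weights sketch from Section 2.2; the grid and threshold below are arbitrary choices, so the exact weights will differ slightly from the continuous computation.

```python
import numpy as np

x = np.linspace(-60.0, 60.0, 2001)                 # discretization grid
means = np.linspace(-30.0, 30.0, 50)
sigma = 5.0
F = np.exp(-0.5 * ((x[None, :] - means[:, None]) / sigma) ** 2)
F /= F.sum(axis=1, keepdims=True)                  # one normalized row per Gaussian

p = m_itc_weights(F)                               # sketch from Section 2.2
supports = np.flatnonzero(p > 1e-4)                # indices carrying non-negligible weight
print(supports, p[supports])                       # few supports; heaviest at the extremes
```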

2.4.2 Normalized Gray-level Histograms

Next we consider an example with distributions arising from image data. Fig. 2-5 shows eight images from the Yale Face Database. The first seven images are of the same person under different facial expressions, while the eighth image is of a different person. We consider each image's normalized histogram of gray-level intensities as a distribution and show them in Fig. 2-6. When we take the m-ITC of these distributions, we again notice the sparsity and the favoritism of the boundary elements. The numbers above each face in Fig. 2-5 are the weights that the distribution arising from that face received in the m-ITC. Again, we find that only three out of the eight distributions are support distributions.

In the previous example, we could concretely see how the m-ITC favored densities on the "boundary" of the set because the densities had a clear geometric relationship among themselves, with their means arrayed on a line. In this example the notion of a boundary is not quite so obvious; yet, we think that if one examines the eighth image and the eighth distribution, one can qualitatively agree that it is the most "extraordinary."

Figure 2-5: Seven faces of one person under different expressions with an eighth face from someone else. Above each image is the weight p_i which the ITC assigns to the distribution arising from that image.

Figure 2-6: The normalized gray-level histograms of the faces from Fig. 2-5. Above each distribution is the KL-divergence from that distribution to the m-ITC. Parentheses indicate that the value is equal to the KL-radius of the set. Note that, as predicted by theory, the distributions which have maximum KL-divergence are the very ones which received non-zero weights in the m-ITC.

Figure 2-7: In the left column, we can see that the arithmetic mean (solid, lower left) resembles the distribution arising from the first face more closely than the m-ITC (solid, upper left) does. In the right column, we see the opposite: the m-ITC (upper right) more closely resembles the eighth distribution than does the arithmetic mean (lower right).

Returning briefly to Fig. 2-6, we examine the KL-divergences between each distribution and the m-ITC. We report these values above each distribution and indicate with parentheses the maximal values. Since the three distributions with maximal KL-divergence to the ITC are exactly the three support distributions, this example unsurprisingly complies with Theorem 4.

Finally, we again compare the m-ITC and the arithmetic mean in Fig. 2-7, but this time we overlay in turn the first and eighth distributions. Examining the left column of the figure, we see that the arithmetic mean more closely resembles the first distribution than does the m-ITC. The KL-divergences bear this out, with the KL-divergence from the first distribution to the arithmetic mean being 0.0461, which compares to 0.1827 to the m-ITC. Conversely, when we do a similar comparison in the right column of Fig. 2-7 for the eighth (extraordinary) distribution, we find that the m-ITC most resembles it. Again, the KL-divergences quantify this observation: whereas the KL-divergence from the eighth distribution to the m-ITC is again 0.1827, we find the KL-divergence to the arithmetic mean to be 0.5524. This result falls in line with what we would expect from Theorem 4 and suggests a more refined bit of intuition: the m-ITC better represents extraordinary distributions, but sometimes at the expense of the more common-looking distributions. Yet overall, that trade-off yields a minimized maximum KL-divergence.

2.4.3 The e-Center of Gaussians

We consider a very simple example consisting of multivariate Gaussian distributions with unit covariance matrix,

f_i(x) = c \exp\Big( -\frac{1}{2} |x - \mu_i|^2 \Big),   (2.50)

where x, \mu_i \in R^m.

We have

\sum p_i \log f_i = -\frac{1}{2} \Big| x - \sum p_i \mu_i \Big|^2 + \frac{1}{2} \Big| \sum p_i \mu_i \Big|^2 - \frac{1}{2} \sum p_i \|\mu_i\|^2 + \log c,   (2.51)

\psi(p) = -\frac{1}{2} \sum p_i \|\mu_i\|^2 + \frac{1}{2} \Big\| \sum p_i \mu_i \Big\|^2.   (2.52)

Hence, E too consists of Gaussian distributions. This case is special because E is not only dually flat but also Euclidean, where the Fisher information matrix is the identity matrix. The KL-divergence is

KL(f_j, f_i) = \frac{1}{2} |\mu_j - \mu_i|^2,   (2.53)

given by a half of the squared Euclidean distance. Hence, the e-center in E of \{f_i\} is the same as the center of the points \{\mu_i\} in the Euclidean space. When m = 1, that is, when x is univariate, \{\mu_i\} is a set of points on the real line. When \mu_1 < \mu_2 < \cdots …
2.4.4 e-ITC of Histograms

…

Figure 2-8: Eight images of faces which yield normalized gray-level histograms. We choose an extraordinary distribution for number eight to contrast how the representative captures variation within a class. The number above each face weighs the corresponding distribution in the e-ITC.

Figure 2-9: KL(C, P_i) for each distribution, for C equal to the e-ITC and the geometric mean, respectively. The horizontal bar represents the value of the e-radius.

… previous section. By now examining Fig. 2-9, we can see that KL(C, P_i) is equal to the e-radius (indicated by the horizontal bar) for the three support distributions (i = 1, 7, 8) and is less for the others. This illustrates the equi-divergence property stated previously. In Figs. 2-8 and 2-9, the worst-case KL-divergence from the geometric mean is 2.5 times larger than the worst case from the e-ITC. Of course, this better worst-case performance comes at the price of the e-ITC's larger distance to the other seven distributions; but it is our thesis that in some applications we are eager to make this trade.

2.5 Previous Work

2.5.1 Jensen-Shannon Divergence

First introduced by Lin [2], the Jensen-Shannon divergence has appeared in several contexts. Sibson referred to it as the information radius [18]. While his moniker is tantalizingly similar to the previously mentioned notions of m- and e-radii, he strictly uses it to refer to the divergence, not the optimal value. Jardine and he later used it as a discriminatory measure for classification [19]. Others have also used the JS-divergence and its variations as a dissimilarity measure, for image registration and retrieval applications [20, 21], and in the retrieval experiment of Chapter 3 this work will follow suit. That experiment also makes use of the important fact that the square root of the JS-divergence, in the case when its parameter is fixed to 1/2, is a metric [22]. Topsøe provides another take on the JS-divergence, calling a scaled, special case of it the capacitory discrimination [23]. This name hints at the next, perhaps most important interpretation of the JS-divergence, namely as a measurement of mutual information. This alternative identity is widely known (cf. Burbea and Rao [24] as just one example), and this understanding can help illuminate the context of the m-center within information theory.

2.5.2 Mutual Information

But the obvious question arises: the mutual information between what? First, the mutual information between two random variables X and Y with distributions P_X and P_Y is defined as

MI(X; Y) = H(Y) - H(Y|X),   (2.54)

where H(Y) = -\sum P_Y \log P_Y is Shannon's entropy and the conditional entropy is

H(Y|X) = -\sum_x \sum_y P_{XY}(x, y) \log P_{Y|X}(y|x).   (2.55)

To see the connection to the m-center and JS-divergence, let us first identify the random variable X above as a random index into a set of random variables \{Y_1, \ldots, Y_n\} with probability distributions \{P_1, \ldots, P_n\}. Note that X merely takes on a number from one to n. If we now consider Y as a random variable whose value y results from first sampling X = i and then sampling Y_i = y, we can certainly begin to appreciate that these concocted random variables X and Y are dependent. That is, learning the value that X takes will let you guess more accurately what value Y will take. Conversely, learning the value of Y hints at which distribution Y_i that value was drawn from, and hence at the value that X took.

Returning to equation 2.54 and paying particular attention to the definition of the conditional entropy term, we can use our definitions of X and Y to get an expression for their joint distribution,

P_{XY}(i, y) = P_X(i) P(y|i).   (2.56)

Then, by observing that P(y|i) = P_i(y), plugging this into equation 2.54, and pulling P_X(i) outside of the summations with respect to y, we have

MI(X; Y) = H(Y) - \sum_i P_X(i) H(P_i).   (2.57)

And this is precisely the Jensen-Shannon divergence JS(P_1, \ldots, P_n) with coefficients P_X(1), \ldots, P_X(n). So when we evaluate the JS-divergence with a particular choice of those coefficients, we directly specify a distribution for the random variable X, indirectly specify a distribution (a mixture) for the random variable Y, and evaluate the mutual information between them. And when we maximize the JS-divergence with respect to those same coefficients, we specify the distributions of X and Y which have maximum mutual information.

There are several equivalent definitions for mutual information, and by considering a different one we can gain insight into the selection of support distributions in the m-center. The mirror image of equation 2.54 gives us

MI(X; Y) = H(X) - H(X|Y).   (2.58)

Starting from here, and for convenience letting f_y(i) = P(i|y), we can derive that

MI(X; Y) = H(X) - E_Y[ H(f_y(i)) ].   (2.59)

Here f_y(i) describes the contributions at a value y of each of the distributions which make up the mixture. By maximizing MI(X; Y) we minimize the expected value over y of the entropy of this distribution. This means that, on average, we encourage as few contributors of probability to a location as possible; this is particularly the case at locations y with high marginal probability. Acting as a regularization term of sorts, to prevent the process from assigning all the probability to one component (thus driving the right-hand term to zero), is the first term, which encourages uniform weights. Maximizing the whole expression means that we balance the impulses for few contributors and for uniform weights.
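The identity between the weighted JS-divergence and MI(X; Y) is easy to check numerically. The sketch below builds the joint distribution of the concocted pair (X, Y) from P_X and the rows P_i and compares the two quantities; the toy numbers and function names are illustrative, and js_divergence_n is the sketch from Chapter 1.

```python
import numpy as np

def mutual_information(px, rows):
    """MI(X; Y) for P(X = i) = px[i] and P(Y = y | X = i) = rows[i, y]."""
    px = np.asarray(px, float)
    rows = np.asarray(rows, float)
    joint = px[:, None] * rows                     # P_XY(i, y)
    py = joint.sum(axis=0)                         # marginal of Y, i.e. the mixture
    indep = px[:, None] * py[None, :]
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / indep[mask])))

px = np.array([0.2, 0.3, 0.5])
rows = np.array([[0.7, 0.2, 0.1],
                 [0.1, 0.8, 0.1],
                 [0.3, 0.3, 0.4]])
print(mutual_information(px, rows))                # identical up to rounding ...
print(js_divergence_n(rows, px))                   # ... to the weighted JS-divergence
```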

2.5.3 Channel Capacity

Of interest to information theory from its inception has been the question of how much information one can reliably transmit over a channel. In this case, one interprets the ensemble of distributions \{P_1, \ldots, P_n\} making up the conditional probability P(y|i) = P_i(y) as the channel. That is, given an input symbol i, the channel will transmit an output symbol y with probability P_i(y). Given such a channel (discrete and memoryless, since each symbol is independent of the symbols before and after), one will achieve different rates of transmission depending on the distribution over the source symbols, P_X. These rates are precisely the mutual information between X and Y, and to find the maximum capacity of the channel, one picks the P_X yielding a maximum value. In this context, many of the results from this section have been developed and presented quite clearly in texts by Gallager [25, Section 4.2] and Csiszár [26, Section 2.3].

Related also is the field of universal coding. As reviewed earlier, choosing an appropriate code depends upon the characteristics of the source which will be encoded (e.g., Morse code for English). Universal coding concerns itself with the problem of selecting a code suitable for any source out of a family, i.e., a code which will have certain universal performance characteristics across the family. Although this work initially developed in ignorance of that field, the key result in that field which this work touches upon is the Redundancy-Capacity Theorem. This theorem, independently proven in several places [27, 28] and later strengthened [16], states that the best code for which one can hope will have redundancy equal to the capacity (or maximum transmission rate [23]) of a certain channel which takes input from the parameters of the family and outputs the symbols to be encoded.
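Under this reading, computing the m-ITC and computing channel capacity are the same task: the capacity-achieving input distribution is the m-ITC weight vector, and the capacity itself is the squared m-radius. A short sketch, reusing the earlier illustrative helpers:

```python
import numpy as np

def channel_capacity(rows, n_iter=1000):
    """Capacity (in nats) of the discrete memoryless channel P(y | x = i) = rows[i, y],
    via the m-ITC weights (Blahut-Arimoto sketch of Section 2.2)."""
    p = m_itc_weights(rows, n_iter=n_iter)          # capacity-achieving input weights
    mix = p @ np.asarray(rows, float)               # output distribution = the m-ITC
    kl = np.array([kl_divergence(r, mix) for r in rows])
    # at the optimum the supports are equidistant from the center,
    # and that common KL value is the capacity (the squared m-radius)
    return float(kl.max()), p
```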


CHAPTER 3
RETRIEVAL WITH m-CENTER

3.1 Introduction

There are two key components in most retrieval systems -- the signature stored for an object and the dissimilarity measure used to find a closest match. These two choices completely determine accuracy in a nearest neighbor-type system if one is willing to endure a potentially exhaustive search of the database. However, because databases can be quite large, comparing each query image against every element in the database is obviously undesirable. In this chapter and the next we focus on speeding up retrieval without compromising accuracy in the case in which the object and query signatures are probability distributions. In a variety of domains, researchers have achieved impressive results utilizing such signatures in conjunction with a variety of appropriate dissimilarity measures. This includes work in shape-based retrieval [31], image retrieval [32, 33], and texture retrieval [34, 35, 36, 37], the last of which we review in great detail in Section 3.2. For any such retrieval system which uses a distribution and a metric, we present a way to speed up queries while guaranteeing no drop in accuracy by representing a cluster of distributions with an optimally close representative which we call the m-ITC. Tomes of research have concentrated on speeding up nearest neighbor searches in non-Euclidean metric spaces. We build on this work by refining it to better suit the case in which the metric is on probability distributions. In low-dimensional Euclidean spaces, the familiar k-d tree and R*-tree can index point sets handily. But in spaces with a non-Euclidean metric, one must resort to other techniques. These include ball trees [38], vantage point trees [39], and metric trees [40, 41].


Our system utilizes a metric tree, but our main contribution is picking a single object to represent a set. Picking a single object to describe a set of objects is one of the most common ways to condense a large amount of data. The most obvious way to accomplish this (when possible) is to compute the arithmetic mean or the centroid. To contrast with the properties of our choice (the m-ITC), we point out that the arithmetic mean minimizes the sum of squared distances from it to the elements of its set. In the case in which the data are known or forced to lie on a manifold, it is useful to pick an intrinsic mean which has the mean's minimization property but which also lies on a manifold containing the data. This has been explored for manifolds of shape [4, 3] and of parameterized probability densities [5].

Metric trees. To see how our representative fits into the metric tree framework, a brief review of metric trees helps. Given a set of points and a metric (which by definition satisfies positive-definiteness, symmetry, and the triangle inequality), a leaf of a metric tree indexes some subset of points {p_i} and contains two fields -- a center c and a radius r -- which satisfy d(c, p_i) <= r for all p_i, where d is the metric. Proceeding hierarchically, an interior node indexes all of the points indexed by its children and also ensures that its fields satisfy the constraint above. Hence, using the triangle inequality, for any subtree, one can find a lower bound on the distance from a query point q to the entire set of points contained in that subtree --

d(q, p_i) >= d(q, c) - r,    (3.1)

as illustrated in Fig. 3-1. And during a nearest neighbor search, one can recursively use this lower bound (costing only one comparison) to prune out subtrees which contain points too distant from the query.

Importance of picking a center. Returning to the choice of the center, we see that if the radius r in equation 3.1 is large, then the lower bound is not very tight and subsequently, pruning will not be very efficient.


Figure 3-1: Using the triangle inequality to prune

On the other hand, a center which yields a small radius will likely lead to more pruning and efficient searches. If we reexamine Fig. 3-1, we see that we can decrease r by moving the center toward the point p_i in the figure. This is precisely how the m-ITC behaves. We claim that the m-ITC yields tighter clusters than the commonly used centroid and the best medoid, allowing more pruning and better efficiency because it respects the natural metrics used for probability distributions. We show theoretically that under the KL-divergence, the m-ITC uniquely yields the smallest radius for a set of points. And we demonstrate that it also performs well under other metrics. In the following sections we first define our m-ITC and enumerate its properties. Then in Section 3.2 we review a pre-existing texture classification system [37] which utilizes probability distributions as signatures and discuss how we build our efficiency experiment atop this system. Then in Section 3.3 we present results showing that the m-ITC, when incorporated in a metric tree, can improve query efficiency; also we compare how much improvement the m-ITC achieves relative to other representatives when each is placed in a metric tree.


Lastly, we summarize our contributions.

3.2 Experimental Design

The central claim of this chapter is that the m-ITC tightly represents a set of distributions under typical metrics; and thus, that this better representation allows for more efficient retrieval. With that goal in mind, we compare the m-ITC against the centroid and the best medoid -- two commonly used representatives -- in order to find which scheme retrieves the nearest neighbor with the fewest comparisons. The centroid is the common arithmetic mean, and the best medoid of a set {p_i} in this case is the distribution p* satisfying

p* = argmin_{p' in {p_i}} max_j KL(p_j, p').

Notice that we restrict ourselves to the pre-existing set of distributions instead of finding the best convex combination (which is the m-ITC). In the experiment, each representative will serve as the center of the nodes of a metric tree on the database. We control for the topology of the metric tree, keeping it unchanged for each representative. Since the experiment will examine efficiency of retrieval, not accuracy, we will choose the same signature as Varma and Zisserman [37], and this along with the metric will completely determine accuracy, allowing us to exclusively examine efficiency.

3.2.1 Review of the Texture Retrieval System

The texture retrieval system of Varma and Zisserman [37][42] builds on an earlier system [43]. It uses a probability distribution as a signature to represent each image. A query image is matched to a database image by nearest neighbor search based on the chi-square dissimilarity between distributions. Although the system contains a method for increasing efficiency, we do not implement it and delay discussion of it to the next section.
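For concreteness, here is a small sketch (mine, not the authors') of the best-medoid selection just defined: among the existing distributions of a cluster, pick the one whose worst-case KL-divergence from the other members is smallest. The eps smoothing is an assumption to guard against empty bins, and the true m-ITC would instead optimize over all convex combinations, which is not shown here.

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """KL(p, q) in nats for discrete distributions; eps guards empty bins."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def best_medoid(cluster):
    """Return the member p' minimizing max_j KL(p_j, p'), plus that minimax radius."""
    cluster = [np.asarray(p, dtype=float) for p in cluster]
    radii = [max(kl(pj, cand) for pj in cluster) for cand in cluster]
    i = int(np.argmin(radii))
    return cluster[i], radii[i]
```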


Figure 3-2: Example images from the CUReT database

Texton dictionary. Before computing any image's signature, this system requires that we first construct a texton dictionary. To construct this dictionary, we first extract from each pixel in each training image a feature describing the texture in its neighborhood. This feature can be a vector of filter responses at that pixel [42] or simply a vector of intensities in the neighborhood about that pixel [37]. We choose the latter approach. After clustering this ensemble of features, we take a small set of cluster centers for the dictionary, 610 centers in total.

Computing a signature. To compute the signature of an image, one first finds for each pixel the closest texton in the dictionary and then retains the label of that texton. At this stage, one can imagine an image transformed from having intensities at each pixel to having indices into the texton dictionary at each pixel, as would result from vector quantization. In the next step one simply histograms these labels in the same way that one would form a global gray-level histogram. This normalized histogram on texton labels is the signature of an image.

Data. Again as in [37], we use the Columbia-Utrecht Reflectance and Texture (CUReT) database [44] for texture data.


This database contains 61 varieties of texture with each texture imaged under various (sometimes extreme) illumination and viewing angles. Fig. 3-2 illustrates the extreme intra-class variability and the inter-class similarity which make this database challenging for texture retrieval systems. In this figure each row contains realizations from the same texture class with each column corresponding to different viewing and illumination angles. Selecting 92 images from each of the 61 texture classes, we randomly partition each of the classes into 46 training images, which make up the database, and 46 test images, which make up the queries. Preprocessing consists of conversion to grayscale and mean and variance normalization.

Dissimilarity measures. Varma and Zisserman [37] measured dissimilarity between a query q and a database element p (both distributions) using the chi-square significance test,

\chi^2(p, q) = \sum_i \frac{(p_i - q_i)^2}{p_i + q_i},    (3.2)

and returned the nearest neighbor under this dissimilarity. In our work, we require a metric, so we take the square root of equation 3.2 [23]. Note that since the square root is a monotonic function, this does not alter the choice of nearest neighbor and therefore maintains the accuracy of the system. Additionally we use another metric [22], the square root of the Jensen-Shannon divergence between two distributions:

JS_{1/2}(p, q) = H((p + q)/2) - (1/2) H(p) - (1/2) H(q)
             = (1/2) KL(p, (p + q)/2) + (1/2) KL(q, (p + q)/2).

Note that in contrast to equation 2.16, we now take only two distributions and fix the mixture parameter as 1/2. Unlike the first metric, this change affects accuracy, but in our experiments, the change in retrieval accuracy from the original system did not exceed one percentage point. This is not surprising since the JS-divergence has served well in registration and retrieval applications in the past [20].
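A hedged sketch of the two dissimilarities just described, together with the node-level pruning test of equation 3.1, is given below; it is illustrative only, and the function and argument names are my own. A nearest-neighbor routine can call node_lower_bound on a metric-tree node and skip the node whenever the bound already meets or exceeds the best distance found so far, which is what preserves exhaustive-search accuracy.

```python
import numpy as np

def chi2_metric(p, q):
    """Square root of the chi-square statistic of equation 3.2 (a metric [23])."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    s = p + q
    m = s > 0
    return np.sqrt(np.sum((p[m] - q[m]) ** 2 / s[m]))

def js_metric(p, q):
    """Square root of the two-distribution JS-divergence with mixture weight 1/2."""
    def h(x):
        x = x[x > 0]
        return -np.sum(x * np.log(x))
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sqrt(h(0.5 * (p + q)) - 0.5 * h(p) - 0.5 * h(q))

def node_lower_bound(query, center, radius, dist):
    """Equation 3.1: no point stored under (center, radius) can be closer than this."""
    return dist(query, center) - radius
```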


3.2.2 Improving Efficiency

After selecting the signature and dissimilarity measure (in our case a metric), we have fixed the accuracy of the nearest neighbor search; so we can now turn our attention to improving the efficiency of the search. We construct a metric tree on the elements of the database to improve efficiency. And in our experiment we will vary the representative used in the nodes of the metric tree, finding which representative prunes most. We hold constant the topology of the metric tree, determining each element's membership in a tree node based on the texture variety from which it came. Specifically, each of the 61 texture varieties has a corresponding node and each of these nodes contains the 46 elements arising from the realizations of that texture. Given this fixed node membership, we can construct the appropriate representative for each node. Then, given each representative's version of the metric tree, we perform searches and count the number of comparisons required to find the nearest neighbor.

3.3 Results and Discussion

The results with each dissimilarity measure were very similar, and theoretical results [22] bear this observed similarity out by showing that the JS-divergence and chi-square are asymptotically related. Below we report the results for the JS-divergence. Fig. 3-3 shows the speed-ups which the metric tree with the m-ITC achieves for each texture class. These data relate the total number of comparisons required for an average query using the metric tree to the number of comparisons in an exhaustive search. On average, the m-ITC discards 68.9% of the database, yielding a factor of 3.2 improvement in efficiency. It is not surprising that the indexing outperforms an exhaustive search, so we now consider what happens when we vary the representative in the metric tree. On average, the arithmetic mean discards 47.6% of the database, resulting in a speed-up factor of 1.9.


Figure 3-3: On average for probes from each texture class, the speed-up relative to an exhaustive search achieved by the metric tree with the m-ITC as the representative


Figure 3-4: The excess comparisons performed by the arithmetic mean relative to the m-ITC within each texture class as a proportion of the total database


Figure 3-5: The excess comparisons performed by the best medoid relative to the m-ITC within each texture class as a proportion of the total database

Fig. 3-4 plots the excess proportion of the database which the arithmetic mean searches relative to the m-ITC for each probe. The box and whiskers respectively plot the median, quartiles, and range of the data for all the probes within a class. On average, the metric tree with the arithmetic mean explores an additional 21.3% of the total database relative to the metric tree with the m-ITC. In only 2.0% of the queries did the m-ITC fail to out-perform the arithmetic mean. Since the proportion of excess comparisons is a positive value for 98.0% of the probes, we can conclude that the m-ITC almost always offers some improvement and occasionally avoids searching more than a third of the database.


Fig. 3-5 shows a similar plot for the best medoid. Again, the box and whiskers respectively plot the median, quartiles, and range of the data for all the probes within a class. On average, the metric tree with the best medoid explores an additional 22.1% of the total database relative to the metric tree with the m-ITC. Here the m-ITC improves even more than it did over the arithmetic mean; in only 0.1% of the queries did the m-ITC fail to out-perform the best medoid -- never once doing more poorly.

3.3.1 Comparison to Pre-existing Efficiency Scheme

Varma and Zisserman propose a different approach to increase the efficiency of queries [37]. They decrease the size of the database in a greedy fashion: Initially, each texture class in the database has 46 models (one arising from each training image). Then for each class, they discard the model whose absence impairs retrieval performance the least on a subset of the training images. And this is repeated till the number of models is suitably small and the estimated accuracy is suitably high. While this method performed well in practice -- achieving comparable accuracy to an exhaustive search and reducing the average number of models per class from 46 to eight or nine -- it has several potential shortcomings. The first is this method's computational expense: Although the model selection process occurs off-line and time is not critical, the number of times one must validate models against the entire training subset scales quadratically in the number of models. Additionally and more importantly, the model selection procedure depends upon the subset of the training data used to validate it. It offers no guarantees of its accuracy relative to an exhaustive search which utilizes all the known data. In contrast, our method can compute the m-ITC efficiently, and more importantly we guarantee that the accuracy of the more efficient search is identical to the accuracy that an exhaustive search would achieve.


Although it must be noted that the method of Varma and Zisserman performed fewer comparisons, we believe that building a multi-layered metric tree will bridge this gap. Lastly, the two methods can co-exist simultaneously: Since the pre-existing approach focuses on reducing the size of the database while ours indexes the database (solving an orthogonal problem), nothing stops us from taking the smaller database resulting from their method and performing our indexing atop it for further improvement.

3.4 Conclusion

Our goal was to select the best single representative for a class of probability distributions. We chose the m-ITC which minimizes the maximum KL-divergence from each distribution in the class to it; and when we placed it in the nodes of a metric tree, it allowed us to prune more efficiently. Experimentally, we demonstrated significant speed-ups over exhaustive search in a state-of-the-art texture retrieval system on the CUReT database. The metric tree approach to nearest neighbor searches guarantees accuracy identical to an exhaustive search of the database. Additionally, we showed that the m-ITC outperforms the arithmetic mean and the best medoid when these other representatives are used analogously. Probability distributions are a popular choice for retrieval in many domains, and as the retrieval databases grow large, there will be a need to condense many distributions into one representative. We have shown that the m-ITC is a useful choice for such a representative with well-behaved theoretical properties and empirically superior results.


CHAPTER 4
RETRIEVAL WITH e-CENTER

4.1 Introduction

In the course of designing a retrieval system, one must usually consider at least three broad elements:
1. a signature that will represent each element, allowing for compact storage and fast comparisons,
2. a dissimilarity measure that will discriminate between a pair of signatures that are close and a pair that are far from each other, and
3. an indexing structure or search strategy that will allow for efficient, non-exhaustive queries.
The first two of these elements mostly determine the accuracy of a system's retrieval results. The focus of this chapter, like the last, is on the third point. A great deal of work has been done on retrieval systems that utilize a probability distribution as a signature. This work has covered a variety of domains including shape [31], texture [34], [45], [35], [36], [37], and general images [32], [33]. Of these, some have used the Kullback-Leibler (KL) divergence [1] as a dissimilarity measure [35], [36], [32]. The KL-divergence has many nice theoretical properties, particularly its relationship to maximum-likelihood estimation [46]. However, in spite of this, it is not a metric. This makes it challenging to construct an indexing structure which respects the divergence. Many basic methods exist to speed up search in Euclidean space, including k-d trees and R*-trees. And there are even some methods for general metric spaces such as ball trees [38], vantage point trees [39], and metric trees [40]. Yet little work has been done on efficiently finding exact nearest neighbors under KL-divergence.


In this chapter, we present a novel means of speeding nearest neighbor search (and hence retrieval) in a database of probability distributions when the nearest neighbor is defined as the element that minimizes the KL-divergence to the query. This approach does have a significant drawback which does not impair it on this particular dataset, but in general it cannot guarantee retrieval accuracy equal to that of an exhaustive search. The basic idea is a common one in computer science and reminiscent of the last chapter: We represent a set of elements by one representative. During a search, we compare the query object against the representative, and if the representative is sufficiently far from the query, we discard the entire set that corresponds to it without further comparisons. Our contribution lies in selecting this representative in an optimal fashion; ideally we would like to determine the circumstances under which we may discard the set without fear of accidentally discarding the nearest neighbor, but this cannot always be guaranteed. For this application, we will utilize the exponential information theoretic center (e-ITC). In the remaining sections we first derive the expression upon which the system makes pruning decisions. Thereafter we present the experiment showing increased efficiency in retrieval over an exhaustive search and over the uniformly weighted geometric mean -- a reasonable alternate representative. Finally, we return to the texture retrieval example of the previous chapter to compare the m- and e-centers, evaluating which forms the tightest clusters as evaluated by the chi-square metric (equation 3.2).

4.2 Lower Bound

In this section we attempt to derive a lower bound on the KL-divergence from a database element to a query which only depends upon the element through its e-ITC. This lower bound guides the pruning and results in the subsequent increased search efficiency which we describe in Section 4.3.


Figure 4-1: Intuitive proof of the lower bound in equation 4.7 (see text). The KL-divergence acts like squared Euclidean distance, and the Pythagorean Theorem holds under special circumstances. Q is the query, P is a distribution in the database, and C is the e-ITC of the set containing P. P* is the I-projection of Q onto the set containing P. On the right, D(C||P) <= Re, where Re is the e-radius, by the minimax definition of C.

In order to search for the nearest element to a query efficiently, we need to bound the KL-divergence to a set of elements from beneath by a quantity which only depends upon the e-ITC of that set. That way, we can use the knowledge gleaned from a single comparison to avoid individual comparisons to each member of the set. We approach such a lower bound by examining the left side of Fig. 4-1. Here we consider a query distribution Q and an arbitrary distribution P in a set which has C as its e-ITC. As a stepping stone to the lower bound, we briefly define the I-projection of a distribution Q onto a space E as

P* = argmin_{P in E} D(P||Q).    (4.1)


It is well known that one can use intuition about the squared Euclidean distance to appreciate the properties of the KL-divergence; and in fact, in the case of the I-projection P* of Q onto E, we have in some cases a version of the familiar Pythagorean Theorem [26]. In this case we have for all P in E that

D(P||Q) >= D(P*||Q) + D(P||P*).    (4.2)

And in the case that

P* = \lambda P_1 + (1 - \lambda) P_2    (4.3)

for distributions P_1, P_2 in E with 0 < \lambda < 1, then we call P* an algebraic inner point and we have equality in equation 4.2. Unfortunately it is this condition which we cannot verify easily as it depends upon both the query (which we will label Q) and the structure of E which is determined by the database. Interestingly, we do get the equality version when we take E as a linear family, like the families with which the m-ITC is concerned. Regardless, we will continue with the derivation and demonstrate its use in this application. Assuming equality in equation 4.2 and applying it twice yields

D(P||Q) = D(P*||Q) + D(P||P*),    (4.4)
D(C||Q) = D(P*||Q) + D(C||P*),    (4.5)

where we are free to select P in E as an arbitrary database element and C as the e-ITC. Equation 4.4 corresponds to triangle QPP* while equation 4.5 corresponds to triangle QCP* in Fig. 4-1. If we subtract the two equations above and re-arrange, we find

D(P||Q) = D(C||Q) + D(P||P*) - D(C||P*).    (4.6)


But since the KL-divergence is non-negative, and since the e-radius Re is a uniform upper bound on the KL-divergence from the e-ITC to any P in E, we have

D(P||Q) >= D(C||Q) - Re.    (4.7)

Here we see that the m-ITC would do little better in guaranteeing this particular bound. While it would ensure that we had equality in equation 4.2, it could not bound the last term in equation 4.6 because the order of the arguments is reversed. We can get an intuitive, nonrigorous view of the same lower bound by again borrowing notions from squared Euclidean distance. This pictorial reprise of equation 4.7 can lend valuable insight to the tightness of the bound and its dependence on each of the two terms. For this discussion we refer to the right side of Fig. 4-1. The minimax definition tells us that D(C||P) <= Re. We consider the case in which this is equality and sweep out an arc centered at C with radius Re from the base of the triangle counter-clockwise. We take the point where a line segment from Q is tangent to this arc as a vertex of a right triangle with hypotenuse of length D(C||Q). The leg which is normal to the arc has length Re by construction, and by the Pythagorean Theorem the other leg of this triangle, which originates from Q, has length D(C||Q) - Re. We can use the length of this leg to visualize the lower bound, and by inspection we see that it will always be exceeded by the length of the line segment originating from Q and terminating further along the arc at P. This segment has length D(P||Q) and is indeed the quantity we seek to bound from below.

4.3 Shape Retrieval Experiment

In this section we apply the e-ITC and the lower bound in equation 4.7 to represent distributions arising from shapes.


Since the lower bound guarantees that we only discard elements that cannot be nearest neighbors, the accuracy of retrieval is as good as an exhaustive search. While we know from the theory that the e-ITC yields a smaller worst-case KL-divergence, we now present an experiment to test if this translates into a tighter bound and more efficient queries. We tackle a shape retrieval problem, using shape distributions [31] as our signature. To form a shape distribution from a 3D shape, we uniformly sample pairs of points from the surface of the shape and compute the distance between these random points, building a histogram of these random distances. To account for changes in scale, we independently scale each histogram so that the maximum distance is always the same. For our dissimilarity measure, we use KL-divergence, so the nearest neighbor P to a query distribution Q is

P = argmin_{P'} D(P'||Q).    (4.8)

For data, we use the Princeton Shape Database [47] which consists of over 1800 triangulated 3D models from over 160 classes including people, animals, buildings, and vehicles. To test the efficiency, we again compare the e-ITC to the uniformly weighted, normalized geometric mean. Using the convexity of E we can generalize the lower bound in equation 4.7 to work for the geometric mean by replacing the e-radius with max_i D(C||P_i) for our different C. We take the base classification accompanying the database to define our clusters, and then compute the e-ITC and geometric means of each cluster. When we consider a novel query model on a leave-one-out basis, we search for the nearest neighbor utilizing the lower bound and disregarding unnecessary comparisons. For each query, we measure the number of comparisons required to find the nearest neighbor.
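To make the pipeline concrete, here is a rough sketch (not the authors' code) of the two pieces involved: forming a D2-style shape distribution from a triangulated mesh, and a nearest-neighbor search that uses the lower bound of equation 4.7 to skip whole clusters. The sampling size, bin count, and exact normalization are assumptions on my part, and the bound is only exact under the equality conditions discussed in Section 4.2.

```python
import numpy as np

def shape_distribution(vertices, faces, n_pairs=100000, n_bins=64, rng=None):
    """Histogram of distances between random surface points of a triangulated mesh,
    rescaled so the largest sampled distance maps to one (a scale normalization)."""
    rng = np.random.default_rng(rng)
    tri = vertices[faces]                                    # (F, 3, 3) triangle corners
    areas = 0.5 * np.linalg.norm(np.cross(tri[:, 1] - tri[:, 0],
                                          tri[:, 2] - tri[:, 0]), axis=1)
    def sample(n):
        idx = rng.choice(len(faces), size=n, p=areas / areas.sum())
        r1, r2 = rng.random(n), rng.random(n)
        a, b, c = tri[idx, 0], tri[idx, 1], tri[idx, 2]
        s = np.sqrt(r1)                                      # uniform point on a triangle
        return (1 - s)[:, None] * a + (s * (1 - r2))[:, None] * b + (s * r2)[:, None] * c
    d = np.linalg.norm(sample(n_pairs) - sample(n_pairs), axis=1)
    hist, _ = np.histogram(d / d.max(), bins=n_bins, range=(0.0, 1.0))
    return hist / hist.sum()

def pruned_nn(query, clusters, kl):
    """Nearest neighbor under D(P'||Q) using D(P||Q) >= D(C||Q) - Re to skip clusters.
    clusters: iterable of (center, e_radius, members); kl(p, q) computes D(p||q)."""
    best, best_d = None, np.inf
    for center, radius, members in clusters:
        if kl(center, query) - radius >= best_d:             # whole cluster cannot win
            continue
        for p in members:
            d = kl(p, query)
            if d < best_d:
                best, best_d = p, d
    return best, best_d
```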


Figure 4-2: The speed-up factor versus an exhaustive search when using the e-ITC as a function of each class in the shape database.


Figure 4-3: The relative percent of additional prunings which the e-ITC achieves beyond the geometric center, again for each class number.


Fig. 4-2 and Fig. 4-3 show the results of our experiment. In Fig. 4-2, we see the speed-up factor that the e-ITC achieves over an exhaustive search. Averaged over all probes in all classes, this speed-up factor is approximately 2.6; the geometric mean achieved an average speed-up of about 1.9. And in Fig. 4-3, we compare the e-ITC to the geometric mean and see that for some classes, the e-ITC allows us to discard nearly twice as many unworthy candidates as the geometric mean. For no class of probes did the geometric mean prune more than the e-ITC, and when averaged over all probes in all classes, the e-ITC discarded over 30% more elements than did the geometric mean.

4.4 Retrieval with JS-divergence

In this section we mimic the experiments of the previous chapter. Instead of using the KL-divergence to determine nearest neighbors, we return to the square-root of the JS-divergence, and we again use the triangle inequality to guarantee no decrease in accuracy. Using metric trees with the e-center, geometric mean, and m-center, we can compare the efficiencies of each representative and the overall speedup. In Fig. 4-4, we see the speedup relative to an exhaustive search when using the e-ITC; on average, the speedup factor is 1.53. In Fig. 4-5 we compare the e-ITC to the geometric mean, much as we compared the m-ITC to the arithmetic mean in the last chapter. We claim that this is the natural comparison because the exponential family consists of weighted geometric means. Here the geometric mean searches on average 7.24% more of the database than the e-ITC. Lastly in Fig. 4-6 we compare the two centers against each other. Here the e-ITC comes up short, searching on average an additional 14.89% of the database. In the next section we try to explain this result.


Figure 4-4: Speedup factor for each class resulting from using e-ITC over an exhaustive search


Figure 4-5: Excess searches performed using geometric mean relative to e-ITC as proportion of total database.


Figure 4-6: Excess searches performed using e-ITC relative to m-ITC as proportion of total database.


4.5 Comparing the m- and e-ITCs

Since both centers have found successful application to retrieval, it is reasonable to explore their relationship. Since the arguments in their respective minimax criteria are reversed, it is not immediately clear that a meaningful comparison could be made with KL-divergence alone. Hence, we resort to the chi-square distance from the previous chapter as an arbiter (though we could just as well use JS-divergence). The comparison we make next is simple. Returning to the texture retrieval dataset from the previous chapter, we use the same set memberships and calculate an m-ITC and an e-ITC for each set. Then for each representative and each set we determine what the maximum chi-square distance between an element of that set and the representative is. The results appear in Fig. 4-7 as the ratio between this chi-square "radius" of the e-ITC and the m-ITC. Since the numbers are greater than one with the exception of only two out of 61 classes, it is safe to conclude in this setting that the m-ITC forms tighter clusters. This result helps explain the superior performance of the m-center in the previous section. In retrospect, one could attribute this to the m-ITC's global optimality property (cf. Section 2.2.1) which the e-ITC may not share.
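The comparison plotted in Fig. 4-7 is straightforward to reproduce in outline: for each cluster, take the largest chi-square distance from the cluster's members to each center and form the ratio. The snippet below is a minimal sketch of mine; the two centers are taken as given, since computing the e-ITC and m-ITC is described elsewhere in the dissertation.

```python
import numpy as np

def chi2(p, q):
    """Chi-square statistic of equation 3.2."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    s = p + q
    m = s > 0
    return float(np.sum((p[m] - q[m]) ** 2 / s[m]))

def chi2_radius_ratio(members, e_center, m_center):
    """Ratio of the chi-square 'radius' of the e-ITC to that of the m-ITC;
    values above one indicate the m-ITC forms the tighter cluster."""
    r_e = max(chi2(p, e_center) for p in members)
    r_m = max(chi2(p, m_center) for p in members)
    return r_e / r_m
```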


Figure 4-7: The ratio of the maximal chi-square distance from each center to all of the elements in a class


CHAPTER 5
TRACKING

5.1 Introduction

In the previous chapters, we considered the problem of efficient retrieval. In the case of retrieval, where a uniform upper bound is important, one measures how well a representative does by focusing on how it handles the most distant members. This property is why the minimax representatives are well-suited to retrieval. In this chapter, we explore the question of whether the same can be said for tracking. We first present several encouraging signs that in fact it may be, and then we go on to consider an experiment to test the performance of a tracker built around the m-ITC. But first we set the context of the tracking problem in probabilistic terms.

5.2 Background -- Particle Filters

The tracking problem consists of estimating and maintaining a hidden variable (usually position, sometimes pose or a more complicated state) from a sequence of observations. Commonly, instead of simply keeping one guess as to the present state and updating that guess at each new observation, one stores and updates an entire probability distribution on the state space; then, when pressed to give a single, concrete estimate of the state, one uses some statistic (e.g., mean or mode) of that distribution. This approach is embodied famously in the Kalman filter, where the probability distribution on the state space is restricted to be a Gaussian, and the trajectory of the state (or simply the motion of the object) is assumed to follow linear dynamics so that the Gaussian at one time step may propagate to another Gaussian at the next.


When this assumption is too limiting -- possibly because background clutter creates multiple modes in the distribution on the state space, or because of complex dynamics, or both -- researchers often turn to a particle filter to track an object [48]. Unlike the Kalman filter, particle filters do not require that the probability distribution on the state space be a Gaussian. To gain this additional representative power, a particle filter stores a set of samples from the state space with a weight for each sample describing its probability. The goal then is to update these sets when new observations arrive. One can update from a time step t-1 to a time step t in three steps [48]:

1. Given a sample set {s_{t-1}^1, ..., s_{t-1}^n} and its associated weights {pi_{t-1}^1, ..., pi_{t-1}^n}, randomly sample with replacement a new set {~s_t^1, ..., ~s_t^n}.

2. To arrive at the new s_t^i, randomly propagate each ~s_t^i by assigning s_t^i to the value of x according to a probability distribution (i.e., the motion model) p_m(x | ~s_t^i).

3. Adjust the weights according to the likelihood of observing the data z_t given that the true state is s_t^i,

pi_t^i = (1/Z) p_d(z_t | s_t^i),    (5.1)

where Z is a normalization factor.

In this work we restrict our attention to a two dimensional state space consisting only of an object's position. Also we exclusively consider motion models p_m in which

p_m(x|y) = p_m(x + x_0 | y + y_0) = G(x - y),    (5.2)

where G is a Gaussian. That is, p_m is a "shift-invariant" model in which the probability of the displacement from a position in one time step to a position in the next is independent of that starting position. Furthermore, we take that probability of a displacement as determined by a Gaussian.
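A compact sketch of one update of this filter, under the two-dimensional, shift-invariant Gaussian motion model just described, appears below. It is illustrative only: the observation likelihood observe(x) is assumed to be supplied by the caller (for example, a template-matching likelihood such as the one used later in this chapter), and the zero-mean displacement is an assumption; a drift term could be added to the motion model.

```python
import numpy as np

def particle_filter_step(samples, weights, observe, motion_cov, rng=None):
    """One update of the three-step particle filter sketched above.
    samples: (n, 2) positions; weights: (n,) probabilities summing to one;
    observe(x): likelihood p_d(z_t | x) for the current frame at position x;
    motion_cov: 2x2 covariance of the shift-invariant Gaussian motion model."""
    rng = np.random.default_rng(rng)
    n = len(samples)
    # 1. resample with replacement according to the old weights
    picks = rng.choice(n, size=n, p=weights)
    resampled = samples[picks]
    # 2. propagate each sample with the Gaussian motion model p_m(x | s) = G(x - s)
    new_samples = resampled + rng.multivariate_normal(np.zeros(2), motion_cov, size=n)
    # 3. reweight by the observation likelihood and normalize (equation 5.1)
    new_weights = np.array([observe(x) for x in new_samples])
    new_weights /= new_weights.sum()
    return new_samples, new_weights
```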


5.3 Problem Statement

In this chapter we consider the following scenario: We have several objects, and for each object we know its distinct probabilistic motion model. We want to build a single particle filter with one motion model which can track any of the objects. This is reminiscent of the problem of designing a universal code that can efficiently encode any of a number of distinct sources, and that similarity suggests that this is a promising application for an information theoretic center. Related to this problem is the case in which one single object undergoes distinct "phases" of motion, each of which has a distinct motion model. An example of this is a car that moves in one fashion when it drives straight and in a completely different fashion when turning. This work does not explore such multi-phase tracking. For this related problem of multi-phase motion, there are certainly more complicated motion models suited to the problem [49]. And for the problem we focus on in this chapter, one could also imagine refining the motion model as observations arrive, until, once in possession of a preponderance of evidence, one finally settles on the most likely component. But all of these require some sort of on-line learning; in contrast, the approach we present offers a single, simple, fixed prior which can be directly incorporated into the basic particle filter.

5.4 Motivation

5.4.1 Binary State Space

To motivate the use of an ITC, we begin with a toy example in which we "track" the value of a binary variable. Our caricature of a tracker is based on a particle filter with N particles. In this example, we make a further, highly simplifying assumption: The observation model is perfect and clutter-free. That is, the likelihood p_d in equation 5.1 has perfect information.


This means that any particles which propagate to the incorrect state receive a weight of zero, and the particles (if any) which propagate to the correct state share the entire probability mass among themselves. Under these assumptions, the event that the tracker fails (and henceforth never recovers) at each time step is an independent, identically distributed Bernoulli random variable with probability

p_fail = p (1 - q)^N + (1 - p) q^N,    (5.3)

where p is the probability the tracker takes on state 0 and q is the probability that a particle evolves under its motion model to state 0. Similarly, 1 - p is the probability the tracker takes on state 1 and 1 - q is the probability that a particle evolves to state 1. What the equation above says is that our tracker will fail if and only if all N particles choose wrongly. Now the interesting thing about this example is that a motion model in which q != p can outperform one in which q = p. Specifically, by differentiating equation 5.3 with respect to q, we find that p_fail takes on a minimum when

q = [ ((1 - p)/p)^{1/(N-1)} + 1 ]^{-1}.    (5.4)

As a concrete example, we take the case with p = .1 and N = 10; here the optimal value for q is .4393. When we find the expected number of trials till the tracker fails in each case (simply p_fail^{-1}), we find that if we take a motion model with q = p, the tracker goes for an average of 29 steps, but if we take the optimal value of q, the tracker continues for 1825 steps. Here we see reminders of how the ITCs give more weight to the extraordinary situations than other representatives. In this case it is justified to under-weight the most likely case because even having a single particle arrive at the correct location is as good as having all N particles arrive.
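The numbers quoted above are easy to reproduce. The short script below (mine, not the dissertation's) evaluates equations 5.3 and 5.4 for p = 0.1 and N = 10 and prints the optimal q together with the expected number of steps before failure in the two cases.

```python
def p_fail(p, q, N):
    """Equation 5.3: probability that every particle picks the wrong state."""
    return p * (1 - q) ** N + (1 - p) * q ** N

def optimal_q(p, N):
    """Equation 5.4: the q that minimizes p_fail."""
    return 1.0 / (((1 - p) / p) ** (1.0 / (N - 1)) + 1.0)

p, N = 0.1, 10
q_star = optimal_q(p, N)
print(q_star)                        # ~0.4393
print(1 / p_fail(p, p, N))           # ~29 steps when q = p
print(1 / p_fail(p, q_star, N))      # on the order of 1.8e3 steps with the optimal q
```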


5.4.2 Self-information Loss

For more evidence suggesting that the ITC might be well-suited, we consider the following analysis [50]. Suppose we try to predict a value x with distribution p(x). But instead of picking a single value, we specify another distribution q(x) which defines our confidence for each value of x. Now depending on the value x that occurs, we pay a penalty based on how much confidence we had in that value; if we had a great deal of confidence (q(x) close to one), we pay a small penalty, and if we did not give much credence to the value, we pay a larger penalty. The self-information loss function is a common choice for this penalty. According to this function, if the value x occurs, we would incur a penalty of -log q(x). If we examine the expected value of our loss, we find

E_x[-log q(x)] = KL(p, q) + H(p).    (5.5)

Returning to our problem statement, if we are faced with a host of distributions and want to find a single q to minimize the expected loss in equation 5.5 over the set of distributions, we begin to approach something like the m-ITC.

5.5 Experiment -- One Tracker to Rule Them All?

5.5.1 Preliminaries

To test how well the m-ITC incorporates a set of motion models into one tracker, we designed the following experiment. Given a sequence of images, we first estimated the true motion by hand, measuring the position of the object of interest at key frames. We then fit a Gaussian to the set of displacements, taking the mean of the Gaussian as the average velocity.

Data. A single frame from the 74-frame sequence appears in Fig. 5-1. In this sequence we track the head of the walker, which has nearly constant velocity in the x-direction and slight periodic motion in the y-direction as she steps.


Figure 5-1: Frame from test sequence


The mean velocity in the x-direction was -3.7280 pixels/frame with a marginal standard deviation of 1.15; in the y-direction the average velocity was -0.1862 with standard deviation 2.41. For our observation model, we simply use a template of the head from the initial frame and compare it to a given region, finding the mean squared error (MSE) of the gray-levels. The likelihood is then just exp(-MSE/(2*sigma^2)). We initialize all trackers to the true state at the initial frame.

Motion models. From this single image sequence, we can hallucinate numerous image sequences which consist of rotated versions of the original sequence. We know the true motion models for all of these novel sequences since they are just the original motion model rotated by the same amount. To examine the performance of one motion model applied to a different image sequence, one need only consider the angle disparity between the true underlying motion model of the image sequence and the motion model utilized by the tracker. In Fig. 5-2 we report the performance in average time-till-failure as a function of angle disparity. Here the number of particles is fixed to 10. Since in this experiment and subsequent ones in this chapter, we define a tracker as having irrevocably failed when all of its particles are at a distance greater than 40 pixels from the location, we see that even the most hopeless of trackers will succeed in "tracking" the subject for six frames -- just long enough for all of its particles to flee from the correct state at an average relative velocity of 7.5 pixels/frame. This observation lets us calculate a lower bound on the performance of a tracker with a given angle disparity from the true motion: Pessimistically assuming a completely uninformative observation model, one can calculate the time required for the centroid of the particles to exceed a distance of D = 40 pixels from the true state as

t = D / (2 r sin(theta/2)),    (5.6)


Figure 5-2: Average time till failure as a function of angle disparity between the true motion and the tracker's motion model


where r = 3.7326 is the speed at which centroid and true state each move and theta is the angle disparity. The dashed line in Fig. 5-2 represents this curve, capped at the maximum number of frames (74). One should note that since all of the motion models are rotated versions of each other, the H(P) term in equation 5.5 is a constant. Hence the q which minimizes the maximum expected self-information loss over a set of p's is in fact the m-ITC.

Performance of mixtures. Because we will take the m-ITC of several of these motion models, it is also of interest how mixtures perform. We consider a mixture of the correct motion model and a motion model pi radians rotated (in the opposite direction), which essentially contributes nothing to the tracking. In Fig. 5-3 we again plot the time-till-failure for trackers with 10 and 20 particles as a function of the weight in the mixture of the correct model. To derive a lower bound, this time we represent the proportion of the probability near the true state at time t as r_t and the remainder as w_t. Further we say that at each time step, a fraction alpha r_t of the probability, where alpha is the weight on the correct motion model, moves to or remains in the correct state as a result of those particles being driven by the correct motion model, and the rest moves away from the correct state. Next we model the step in the particle filter algorithm where we adjust the weights. We assume that a particle away from the true state receives a weight that is c < 1 times the weight received by a particle near the true state. If there were no clutter in the scene, this number would be zero and we would return to the assumption in Section 5.4.1. Now, by pessimistically assuming that particles which move away from the correct state have exceedingly small chance (i.e., zero) of rejoining the true state randomly, we can derive the following from an initial r_0 = 1:


Figure 5-3: Average time till failure as a function of the weight on the correct motion model; for 10 and 20 particles


r_t = alpha^t / Z,    (5.7)
w_t = (1 - alpha) \sum_{i=0}^{t-1} c^{t-i} alpha^i / Z,    (5.8)

where Z is a normalizing constant. And by taking a lower bound of w_t > (1 - alpha) c alpha^{t-1} / Z, we can derive that r_t < alpha / (alpha + (1 - alpha) c). Finally, by taking this upper bound on r_t as a Bernoulli random variable as we did in Section 5.4.1, we can get a lower bound on the expected time-till-failure as 1/(1 - r_t), plus an adjustment for the six free frames required to drift out of the 40 pixel range. This is the lower bound shown in Fig. 5-3. To calculate c, we randomly sampled the background at a distance greater than 40 pixels from the true state and averaged the response of the observation model, yielding c = 0.1886.

5.5.2 m-ITC

Again we compare the performance of the m-ITC to the arithmetic mean. This time we form mixtures of several motion models of varying angles, taking weights either as uniform (for the arithmetic mean) or as determined by the m-ITC. In the first case we take a total of 12 motion models with angle disparities of {+-5pi/20, +-4pi/20, +-3pi/20, +-2pi/20, +-pi/20, 0, pi}. On this mixture, the m-ITC assigns weights of .31 to each of the motion models at +-pi/4 and the remaining .37 to the motion model at pi. We tested these trackers when the true motion model has orientation zero, pi/4, and pi, respectively, and reported their average times-till-failure in Table 5-1. Indeed, as we might expect from a minimax representative, the m-ITC registers the best worst-case performance; but nevertheless it is not an impressive performance. The second set of motion models was slightly less extreme in variation, {0, +-pi/64, +-pi/32, +-pi/4}. To these components, the m-ITC split its probability evenly between the motion models at disparities +-pi/4.


In this case, the m-ITC had a better showing overall, and still had the best worst-case performance. The results are shown in Table 5-2.

Table 5-1: Average time-till-failure of the tracker based on the arithmetic mean (AM) and on the m-ITC (800 particles)

    True angle    m-ITC    AM
    0             43       74
    pi/4          23       46
    pi            16       7

Table 5-2: Average time-till-failure of the tracker based on the arithmetic mean (AM) and on the m-ITC (400 particles) with second set of models

    True angle    m-ITC    AM
    0             72       74
    pi/4          44       37

5.6 Conclusion

While there is reason to believe that a minimax representative would serve well in combining several motion models in tracking, the precise circumstances when this might be beneficial are difficult to determine. Examining Fig. 5-2 and Fig. 5-3, it seems that if there is too little weight or too great a disparity, a tracker is doomed from the beginning. So while the m-ITC will not perform ideally under all situations, it still retains its expected best worst-case performance.


CHAPTER 6
CONCLUSION

6.1 Limitations

The central thrust of this work has been the claim that despite many computer vision researchers' instinctive suspicion of minimax methods, given the right application, they can be useful. However, those skeptical researchers' instincts are often well-founded: The main issue one must be aware of regarding the representatives presented in this work is their sensitivity to outliers. One must carefully consider his data, particularly the extreme elements, because those are precisely the elements with the most influence on these representatives. In addition to data, one must also consider how one's application defines successful results. If one's application can tolerate small deviations in the "normal cases," and successful behavior is defined by good results in extreme cases, then a minimax representative might be appropriate. This is precisely the case in the retrieval problem where a uniform upper bound on the dispersion of a subset is the criterion on which successful indexing is judged. Such a criterion disregards whether the innermost members are especially close to the representative or not. Despite some initial suggestions, this did not turn out to be the case in the tracking domain in the presence of clutter. There it seems the deviations in the "normal cases" did have a significant effect on performance.

6.2 Summary

After characterizing two minimax representatives with firm groundings in information theory, we have shown how they can be utilized to speed the retrieval of textures, images, shapes, or any object so represented by a probability distribution.


Their power in this application comes from the fact that they form tight clusters, allowing for more precise localization and efficient pruning than other common representatives. While the tracking results did not bear as much fruit, we still believe that it is a promising avenue for such a representative if the problem is properly formulated.

6.3 Future Work

This topic touches upon a myriad of areas including information theory, information geometry, and learning. Csiszar and others have characterized the expectation-maximization algorithm in terms of divergence minimization, and we believe that incorporating the ITCs into some EM-style algorithm would be very interesting. Also of interest are its connections to AdaBoost and other online learning algorithms. But for all of these avenues, the main challenge remains of verifying that the data and measurement of success are appropriate fits to these representatives.


REFERENCES

[1] T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley & Sons, New York, NY, 1991.
[2] J. Lin, "Divergence measures based on the Shannon entropy," IEEE Trans. Inform. Theory, vol. 37, no. 1, pp. 145-151, Mar. 1991.
[3] P. T. Fletcher, C. Lu, and S. Joshi, "Statistics of shape via principal geodesic analysis on Lie groups," IEEE Trans. Med. Imag., vol. 23, no. 8, pp. 995-1005, Aug. 2004.
[4] E. Klassen, A. Srivastava, W. Mio, and S. H. Joshi, "Analysis of planar shapes using geodesic paths on shape spaces," IEEE Trans. Pattern Anal. Machine Intell., vol. 26, no. 3, pp. 372-383, Mar. 2004.
[5] S.-I. Amari, Methods of Information Geometry, American Mathematical Society, Providence, RI, 2000.
[6] S.-I. Amari, "Information geometry on hierarchy of probability distributions," IEEE Trans. Inform. Theory, vol. 47, no. 5, pp. 1707-1711, Jul. 2001.
[7] B. Pelletier, "Informative barycentres in statistics," Annals of Institute of Statistical Mathematics, to appear.
[8] Z. Wang and B. C. Vemuri, "An affine invariant tensor dissimilarity measure and its applications to tensor-valued image segmentation," in Proc. IEEE Conf. Computer Vision and Pattern Recognition, Washington, DC, Jun./Jul. 2004, vol. 1, pp. 228-233.
[9] D. P. Huttenlocher, G. A. Klanderman, and W. A. Rucklidge, "Comparing images using the Hausdorff distance," IEEE Trans. Pattern Anal. Machine Intell., vol. 15, no. 9, pp. 850-863, Sep. 1993.
[10] J. Ho, K.-C. Lee, M.-H. Yang, and D. Kriegman, "Visual tracking using learned linear subspaces," in Proc. IEEE Conf. Computer Vision and Pattern Recognition, Washington, DC, Jun./Jul. 2004, pp. 228-233.
[11] R. I. Hartley and F. Schaffalitzky, "L-infinity minimization in geometric reconstruction problems," in Proc. IEEE Conf. Computer Vision and Pattern Recognition, Washington, DC, Jun./Jul. 2004, pp. 504-509.


[12] S. C. Zhu, Y. N. Wu, and D. Mumford, "Minimax entropy principle and its application to texture modeling," Neural Computation, vol. 9, no. 8, pp. 1627-1660, Nov. 1997.
[13] C. Liu, S. C. Zhu, and H.-Y. Shum, "Learning inhomogeneous Gibbs model of faces by minimax entropy," in Proc. Int'l Conf. Computer Vision, Vancouver, Canada, Jul. 2001, pp. 281-287.
[14] I. Csiszar, "Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems," Annals of Statistics, vol. 19, no. 4, pp. 2032-2066, Dec. 1991.
[15] D. P. Bertsekas, Nonlinear Programming, Athena Scientific, Belmont, MA, 1999.
[16] N. Merhav and M. Feder, "A strong version of the redundancy-capacity theorem of universal coding," IEEE Trans. Inform. Theory, vol. 41, no. 3, pp. 714-722, May 1995.
[17] I. Csiszar, "I-divergence geometry of probability distributions and minimization problems," Annals of Probability, vol. 3, no. 1, pp. 146-158, Jan. 1975.
[18] R. Sibson, "Information radius," Z. Wahrscheinlichkeitstheorie verw. Geb., vol. 14, no. 1, pp. 149-160, Jan. 1969.
[19] N. Jardine and R. Sibson, Mathematical Taxonomy, John Wiley & Sons, London, UK, 1971.
[20] A. O. Hero, B. Ma, O. Michel, and J. Gorman, "Applications of entropic spanning graphs," IEEE Signal Processing Mag., vol. 19, no. 5, pp. 85-95, Sep. 2002.
[21] Y. He, A. B. Hamza, and H. Krim, "A generalized divergence measure for robust image registration," IEEE Trans. Signal Processing, vol. 51, no. 5, pp. 1211-1220, May 2003.
[22] D. M. Endres and J. E. Schindelin, "A new metric for probability distributions," IEEE Trans. Inform. Theory, vol. 49, no. 7, pp. 1858-1860, Jul. 2003.
[23] F. Topsøe, "Some inequalities for information divergence and related measures of discrimination," IEEE Trans. Inform. Theory, vol. 46, no. 4, pp. 1602-1609, Jan. 2000.
[24] J. Burbea and C. R. Rao, "On the convexity of some divergence measures based on entropy functions," IEEE Trans. Inform. Theory, vol. 28, no. 3, pp. 489-495, May 1982.


[25] R. G. Gallager, Information Theory and Reliable Communication, John Wiley & Sons, New York, NY, 1968.
[26] I. Csiszar and J. G. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, Academic Press, Inc., New York, NY, 1981.
[27] L. D. Davisson and A. Leon-Garcia, "A source matching approach to finding minimax codes," IEEE Trans. Inform. Theory, vol. 26, no. 2, pp. 166-174, Mar. 1980.
[28] B. Y. Ryabko, "Comments on 'A source matching approach to finding minimax codes'," IEEE Trans. Inform. Theory, vol. 27, no. 6, pp. 780-781, Nov. 1981.
[29] Y. Freund and R. E. Schapire, "A decision-theoretic generalization of on-line learning and an application to boosting," J. Computer and System Sciences, vol. 55, no. 1, pp. 119-139, Aug. 1997.
[30] J. Kivinen and M. K. Warmuth, "Boosting as entropy projection," in Proceedings of the Twelfth Annual Conference on Computational Learning Theory, Santa Cruz, CA, Jul. 1999, pp. 134-144.
[31] R. Osada, T. Funkhouser, B. Chazelle, and D. Dobkin, "Shape distributions," ACM Trans. Graphics, vol. 21, no. 4, pp. 807-832, Oct. 2002.
[32] S. Gordon, J. Goldberger, and H. Greenspan, "Applying the information bottleneck principle to unsupervised clustering of discrete and continuous image representations," in Proc. Int'l Conf. Computer Vision, Nice, France, Oct. 2003, pp. 370-396.
[33] C. Carson, S. Belongie, H. Greenspan, and J. Malik, "Blobworld: image segmentation using expectation-maximization and its application to image querying," IEEE Trans. Pattern Anal. Machine Intell., vol. 24, no. 8, pp. 1026-1038, Aug. 2002.
[34] Y. Rubner, C. Tomasi, and L. Guibas, "A metric for distributions with applications to image databases," in Proc. Int'l Conf. Computer Vision, Bombay, India, Jan. 1998, pp. 59-66.
[35] J. Puzicha, J. M. Buhmann, Y. Rubner, and C. Tomasi, "Empirical evaluation of dissimilarity measures for color and texture," in Proc. Int'l Conf. Computer Vision, Kerkyra, Greece, Sep. 1999, pp. 1165-1172.
[36] M. N. Do and M. Vetterli, "Wavelet-based texture retrieval using generalized Gaussian density and Kullback-Leibler distance," IEEE Trans. Image Processing, vol. 11, no. 2, pp. 146-158, Feb. 2002.


[37] M. Varma and A. Zisserman, "Texture classification: Are filter banks necessary?," in Proc. IEEE Conf. Computer Vision and Pattern Recognition, Madison, WI, Jun. 2003, pp. 691-698.
[38] S. M. Omohundro, "Bumptrees for efficient function, constraint, and classification learning," in Advances in Neural Information Processing Systems, Denver, CO, Nov. 1990, vol. 3, pp. 693-699.
[39] P. N. Yianilos, "Data structures and algorithms for nearest neighbor search in general metric spaces," in Proc. ACM-SIAM Symp. on Discrete Algorithms, Austin, TX, Jan. 1993, pp. 311-321.
[40] J. K. Uhlmann, "Satisfying general proximity/similarity queries with metric trees," Information Processing Letters, vol. 40, no. 4, pp. 175-179, Nov. 1991.
[41] A. Moore, "The anchors hierarchy: Using the triangle inequality to survive high-dimensional data," in Proc. on Uncertainty in Artificial Intelligence, Stanford, CA, Jun./Jul. 2000, pp. 397-405.
[42] M. Varma and A. Zisserman, "Classifying images of materials: Achieving viewpoint and illumination independence," in Proc. European Conf. Computer Vision, Copenhagen, Denmark, May/Jun. 2002, pp. 255-271.
[43] T. Leung and J. Malik, "Recognizing surfaces using three-dimensional textons," in Proc. Int'l Conf. Computer Vision, Kerkyra, Greece, Sep. 1999, pp. 1010-1017.
[44] K. J. Dana, B. van Ginneken, S. K. Nayar, and J. J. Koenderink, "Reflectance and texture of real-world surfaces," ACM Trans. Graphics, vol. 18, no. 1, pp. 1-34, Jan. 1999.
[45] E. Levina and P. Bickel, "The earth mover's distance is the Mallows distance: some insights from statistics," in Proc. Int'l Conf. Computer Vision, Vancouver, Canada, Jul. 2001, pp. 251-256.
[46] N. Vasconcelos, "On the complexity of probabilistic image retrieval," in Proc. Int'l Conf. Computer Vision, Vancouver, Canada, Jul. 2001, pp. 400-407.
[47] P. Shilane, P. Min, M. Kazhdan, and T. Funkhouser, "The Princeton shape benchmark," in Shape Modeling International, Genova, Italy, Jun. 2004, pp. 167-178.
[48] M. Isard and A. Blake, "Condensation -- conditional density propagation for visual tracking," Int'l J. of Computer Vision, vol. 29, no. 1, pp. 5-28, Jan. 1998.
[49] M. Isard and A. Blake, "A mixed-state condensation tracker with automatic model-switching," in Proc. Int'l Conf. Computer Vision, Bombay, India, Jan. 1998, pp. 107-112.


[50] N. Merhav and M. Feder, "Universal prediction," IEEE Trans. Inform. Theory, vol. 44, no. 6, pp. 2124-2147, Oct. 1998.


BIOGRAPHICAL SKETCH

A native Floridian, Eric Spellman grew up on Florida's Space Coast, graduating from Satellite High School in 1998. Thereafter he attended the University of Florida, receiving his Bachelor of Science in mathematics in 2000, his Master of Engineering in computer information science and engineering in 2001, and, under the supervision of Baba C. Vemuri, his Doctor of Philosophy in the same in 2005. After graduating he will return to the Space Coast with his wife Kayla and daughter Sophia to work for Harris Corporation.


Permanent Link: http://ufdc.ufl.edu/UFE0011388/00001

Material Information

Title: Fusing Probability Distributions with Information Theoretic Centers and Its Application to Data Retrieval
Physical Description: Mixed Material
Copyright Date: 2008

Record Information

Source Institution: University of Florida
Holding Location: University of Florida
Rights Management: All rights reserved by the source institution and holding location.
System ID: UFE0011388:00001














FUSING PROBABILITY DISTRIBUTIONS WITH INFORMATION
THEORETIC CENTERS AND ITS APPLICATION TO DATA RETRIEVAL
















By

ERIC SPELLMAN


A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA


2005

































Copyright 2005

by

Eric Spellman















I dedicate this work to my dearest Kayla with whom I have already learned

how to be a Doctor of Philosophy.















ACKNOWLEDGMENTS

For his supportive guidance during my graduate career, I thank Dr. Baba C.

Vemuri, my doctoral advisor. He taught me the field, offered me an interesting

problem to explore, pushed me to publish (in spite of my terminal procrastination),

and tried his best to instill in me the intangible secrets to a productive academic

career.

Also the other members of my committee have helped me greatly in my career

at the University of Florida, and I thank them all: Dr. Brett Presnell delivered the

first lecture I attended as an undergraduate; and I am glad he could attend the last

lecture I gave as a doctoral candidate. I have also benefitted from and appreciated

Dr. Anand Rangarajan's lectures, professional advice, and philosophical discussions.

While I did not have the pleasure of attending Dr. Arunava Banerjee's or Dr. Jeff

Ho's classes, I have appreciated their insights and their examples as successful early

researchers. I would also like to thank Dr. Murali Rao for stimulating debates,

Dr. Shun-Ichi Amari for proposing the idea of the e-center and proofs of the related

theorems, and numerous anonymous reviewers.

My professional debts extend beyond the faculty however to my fellow

comrades-in-research. With them, I have muttered all manner of things about

the aforementioned group in the surest confidence that my mutterings would not be

betrayed. Dr. Jundong Liu, Dr. Zhizhou Wang, Tim McGraw, Fei Wang, Santosh

Kodipaka, Nick Lord, Bing Jian, Vinh Nghiem, and Evren Ozarslan all deserve

thanks.

Also deserving are the Department staff members whose hard work keeps

this place afloat and Ron Smith for designing a word-processing template without









which the process of writing a dissertation might itself require a Ph.D. For the

permission to reproduce copyrighted material within this dissertation, I thank the

IEEE (Chapter 3) and Springer-Verlag (Chapter 4). For data I thank the people

behind the Yale Face Database images from which I used in Fig. 2-5 and Fig. 2-8

and Michael Black for tracking data. And for the financial support which made

this work possible, I acknowledge the University of Florida's Stephen C. O'Connell

Presidential Fellowship, NIH grant RO1 NS42075, and travel grants from the

Computer and Information Science and Engineering department, the Graduate

Student Council, and the IEEE.

And finally, most importantly, I thank my family. I thank my mother-in-law

Donna Lea for all of her help these past few weeks and Neil, Abra, and Peter for

letting us take her away for that time. I thank my mother and father for everything

and my brother, too. And I of course thank my dearest Kayla and my loudest,

most obstinate, and sweetest Sophia.















TABLE OF CONTENTS
page

ACKNOWLEDGMENTS ................... ...... iv

LIST OF TABLES ................... .......... viii

LIST OF FIGURES ..................... ......... ix

ABSTRACT ...................... ............. xi

CHAPTER

1 INTRODUCTION ........................... 1

1.1 Information Theory .......................... 2
1.2 Alternative Representatives ......... ........ ... 3
1.3 Minimax Approaches ......................... 4
1.4 Outline of Remainder ........... ............. 5

2 THEORETICAL FOUNDATION ........... ........... 7

2.1 Preliminary-Euclidean Information Center ............ 7
2.2 Mixture Information Center of Probability Distributions ..... 10
2.2.1 Global Optimality .................. .. 13
2.3 Exponential Center of Probability Distributions . ... 13
2.4 Illustrations and Intuition .................. ..... 16
2.4.1 Gaussians .................. ........ .. 19
2.4.2 Normalized Gray-level Histograms . . . . . . . . . . . . . 19
2.4.3 The e-Center of Gaussians ................ 23
2.4.4 e-ITC of Histograms .................. ..... 24
2.5 Previous W ork .................. .......... .. 27
2.5.1 Jensen-Shannon Divergence ... . . 27
2.5.2 Mutual Information .................. .... 28
2.5.3 Channel Capacity . . . . . . . . . . . . . . . . . . . . . . 29
2.5.4 Boosting .. ... .. .. .. .. ... .. .. .. ... .. .. 30

3 RETRIEVAL WITH m-CENTER .................. ..... 31

3.1 Introduction .................. ........... .. 31
3.2 Experimental Design .................. ..... .. 34
3.2.1 Review of the Texture Retrieval System . .... 34
3.2.2 Improving Efficiency .................. ..... 37
3.3 Results and Discussion .................. ... .. 37









3.3.1 Comparison to Pre-existing Efficiency Scheme . . . . . . 41
3.4 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42

4 RETRIEVAL WITH e-CENTER . . . . . . . . . . . . . . . . . . . . . 43

4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
4.2 Lower Bound . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
4.3 Shape Retrieval Experiment . . . . . . . . . . . . . . . . . . . . 47
4.4 Retrieval with JS-divergence . . . . . . . . . . . . . . . . . . . . 51
4.5 Comparing the m- and e-ITCs . . . . . . . . . . . . . . . . . . . 55

5 TRACKING . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57

5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
5.2 Background -- Particle Filters . . . . . . . . . . . . . . . . . . . 57
5.3 Problem Statement . . . . . . . . . . . . . . . . . . . . . . . . . 59
5.4 Motivation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
5.4.1 Binary State Space . . . . . . . . . . . . . . . . . . . . . 59
5.4.2 Self-information Loss . . . . . . . . . . . . . . . . . . . . 61
5.5 Experiment -- One Tracker to Rule Them All? . . . . . . . . . 61
5.5.1 Preliminaries . . . . . . . . . . . . . . . . . . . . . . . . 61
5.5.2 m-ITC . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
5.6 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68

6 CONCLUSION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69

6.1 Limitations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
6.2 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
6.3 Future Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70

REFERENCES . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71

BIOGRAPHICAL SKETCH . . . . . . . . . . . . . . . . . . . . . . . . . . 76















LIST OF TABLES
Table page

5-1 Average time-till-failure of the tracker based on the arithmetic mean
(AM) and on the m-ITC (800 particles) ............... ..68

5-2 Average time-till-failure of the tracker based on the arithmetic mean
(AM) and on the m-ITC (400 particles) with second set of models .68















LIST OF FIGURES
Figure page

2-1 Center is denoted by o and supports are denoted by x. . . 9

2-2 The ensemble of 50 Gaussians with means evenly spaced on the in-
terval [-30,30] and sigma = 5 . . . . . . . . . . . . . . . . . . . . . 16

2-3 The components from Fig. 2-2 scaled by their weights in the m-ITC. 17

2-4 The m-ITC (solid) and arithmetic mean (AM, dashed) of the ensem-
ble of Gaussians shown in Fig. 2-2 ................. 18

2-5 Seven faces of one person under different expressions with an eighth
face from someone else. Above each image is the weight pi which
the ITC assigns to the distribution arising from that image. . 20

2-6 The normalized gray-level histograms of the faces from Fig. 2-5. Above
each distribution is the KL-divergence from that distribution to
the m-ITC. Parentheses indicate that the value is equal to the KL-
radius of the set. Note that as predicted by theory, the dis-
tributions which have maximum KL-divergence are the very ones
which received non-zero weights in the m-ITC . ...... 21

2-7 In the left column, we can see that the arithmetic mean (solid, lower
left) resembles the distribution arising from the first face more closely
than the m-ITC (solid, upper left) does. In the right column, we
see the opposite: The m-ITC (upper right) more closely resembles
the eighth distribution than does the arithmetic mean (lower right). 22

2-8 Eight images of faces which yield normalized gray level histograms.
We choose an extraordinary distribution for number eight to con-
trast how the representative captures variation within a class. The
number above each face weighs the corresponding distribution in
the e-ITC. ............... ............ .. 25

2-9 KL(C, Pi) for each distribution, for C equal to the e-ITC and geo-
metric mean, respectively. The horizontal bar represents the value
of the e-radius. ............... .......... .. 26

3-1 Using the triangle inequality to prune ............. .. 33

3-2 Examples images from the CUReT database . ...... 35









3-3 On average for probes from each texture class, the speed-up relative
to an exhaustive search achieved by the metric tree with the m-
ITC as the representative .................. .. 38

3-4 The excess comparisons performed by the arithmetic mean relative
to the m-ITC within each texture class as a proportion of the total
database. .................. ........... 39

3-5 The excess comparisons performed by the best medoid relative to the
m-ITC within each texture class as a proportion of the total database 40

4-1 Intuitive proof of the lower bound in equation 4.7 (see text). The KL-
divergence acts like squared Euclidean distance, and the Pythagorean
Theorem holds under special circumstances. Q is the query, P is a
distribution in the database, and C is the e-ITC of the set contain-
ing P. P* is the I-projection of Q onto the set containing P. On
the right, D(C| P) < Re, where Re is the e-radius, by the minimax
definition of C. .................. .. ...... 45

4-2 The speed-up factor versus an exhaustive search when using the e-
ITC as a function of each class in the shape database. ...... ..49

4-3 The relative percent of additional prunings which the e-ITC achieves
beyond the geometric center, again for each class number. . 50

4-4 Speedup factor for each class resulting from using e-ITC over an ex-
haustive search .................. ........ .. .. 52

4-5 Excess searches performed using geometric mean relative to e-ITC as
proportion of total database. .................. .... 53

4-6 Excess searches performed using e-ITC relative to m-ITC as propor-
tion of total database. .................. ..... 54

4-7 The ratio of the maximal X2 distance from each center to all of the
elements in a class .................. ........ .. 56

5-1 Frame from test sequence .................. ... .. 62

5-2 Average time till failure as a function of angle disparity between the
true motion and the tracker's motion model . . ..... 64

5-3 Average time till failure as a function of the weight on the correct
motion model; for 10 and 20 particles ................ .. 66















Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy

FUSING PROBABILITY DISTRIBUTIONS WITH INFORMATION
THEORETIC CENTERS AND ITS APPLICATION TO DATA RETRIEVAL

By

Eric Spellman

August 2005

Chair: Baba C. Vemuri
Major Department: Computer and Information Science and Engineering

This work presents two representations for a collection of probability distribu-

tions or densities dubbed information theoretic centers (ITCs). Like the common

arithmetic mean, the first new center is a convex combination of its constituent

densities in the mixture family. Analogously, the second ITC is a weighted geomet-

ric mean of densities in the exponential family. In both cases, the weights in the

combinations vary as one changes the distributions. These centers minimize the

maximum Kullback-Leibler divergence from each distribution in their collections

to themselves and exhibit an equi-divergence property, lying equally far from most

elements of their collections. The properties of these centers have been established

in information theory through the study of channel-capacity and universal codes;

drawing on this rich theoretical basis, this work applies them to the problems of

indexing for content-based retrieval and to tracking.

Many existing techniques in image retrieval cast the problem in terms of

probability distributions. That is, these techniques represent each image in the

database-as well as incoming query images-as probability distributions, thus

reducing the retrieval problem to one of finding a nearest probability distribution









under some dissimilarity measure. This work presents an indexing scheme for such

techniques wherein an ITC stands in for a subset of distributions in the database.

If the search finds that a query lies sufficiently far from such an ITC, the search

can safely disregard (i.e., without fear of reducing accuracy) the associated subset

of probability distributions without further consideration, thus speeding search.

Often in tracking, one represents knowledge about the expected motion of

the object of interest by a probability distribution on its next position. This work

considers the case in which one must specify a tracker capable of tracking any

one of several objects, each with different probability distributions governing its

motion. Related is the case in which one object undergoes different phases of

motion, each of which can be modeled independently (e.g., a car driving straight

vs. turning). In this case, an ITC can fuse these different distributions into one,

creating one motion model to handle any of the several objects.















CHAPTER 1
INTRODUCTION

Given a set of objects, one commonly wishes to represent the entire set with

one object. In this work, we address this concern for the case in which the objects

are probability distributions. Particularly, we present a novel representative

for a set of distributions whose behavior we can describe in the language of

information theory. For contrast, let us first consider the familiar arithmetic

mean: The arithmetic mean is a uniformly weighted convex combination of a set

of objects (e.g., distributions) which minimizes the sum of squared Euclidean

distances from objects in turn to itself. However, later sections contain examples

of applications in which such a representative is not ideal. As an alternative

we describe a representative which minimizes the maximal Kullback-Leibler

divergence from itself to the set of objects. The central idea of this work is that

such a minimax representation is better than more commonly used representatives

(e.g., the arithmetic mean) in some computer vision problems. In exploring this

thesis, we present the theoretical properties of this representative and results of

experiments using it.

We examine two applications in particular, comparing the minimax represen-

tation to the more common arithmetic mean (and other representatives). These

applications are indexing collections of images (or textures or shapes) and choosing

a motion prior under uncertainty for tracking. In the first of these applications,

we follow a promising avenue of work in using a probability distribution as the

signature of a given object to be indexed. Then using an established data struc-

ture, the representative can fuse several signatures into one, thus making searches

more efficient. In the tracking application, we consider the case in which one has









some uncertainty as to the motion model that governs the object to be tracked.

Specifically, the governing model will be drawn from a known family of models. We

suggest using a minimax representative to construct a single prior distribution to

describe the expected motion of an object in a way that fuses all of the models in

the family and yields one model with the best worst-case performance.

1.1 Information Theory

As discussed in Chapter 2, the minimax representative can be characterized

in terms of the Kullback-Leibler divergence and the Jensen-Shannon divergence.

Hence, a brief review of these concepts is in order.

The Kullback-Leibler (KL) divergence [1] (also known as the relative entropy)

between two distributions p and q is defined as


$$KL(p, q) = \sum_i p_i \log \frac{p_i}{q_i}. \qquad (1.1)$$

It is convex in p, non-negative (though not necessarily finite), and equals zero if

and only if p = q. In information theory it has an interpretation in terms of the

length of encoded messages from a source which emits symbols according to a

probability distribution. While the familiar Shannon entropy gives a lower bound

on the average length per symbol a code can achieve, the KL-divergence between

p and q gives the penalty (in length per symbol) incurred by encoding a source

with distribution p under the assumption it really has distribution q; this penalty is

commonly called redundancy.
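To make the definition concrete, the following minimal sketch (an illustration added here, not part of the original text) evaluates equation 1.1 for two small discrete distributions; the particular values of p and q are arbitrary.

```python
import numpy as np

def kl_divergence(p, q):
    """KL(p, q) = sum_i p_i * log(p_i / q_i), with 0*log(0/q_i) treated as 0."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                      # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Two illustrative distributions on three symbols.
p = [0.5, 0.4, 0.1]
q = [0.3, 0.3, 0.4]

print(kl_divergence(p, q))  # > 0 because p != q
print(kl_divergence(p, p))  # 0, as the text notes
```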

To illustrate this, consider the Morse code, designed to send messages in

English. The Morse code encodes the letter "E" with a single dot and the letter

"Q" with a sequence of four dots and dashes. Because "E" is used frequently in

English and "Q" seldom, this makes for efficient transmission. However if one

wanted to use the Morse code to send messages in Chinese pinyin, which might

use "Q" more frequently, he would find the code less efficient. If we assume









contrafactually that the Morse code is optimal for English, this difference in

efficiency is the redundancy.

Also playing a role in this work is the Jensen-Shannon (JS) divergence. It is

defined between two distributions p and q as


$$JS(p, q) = \alpha\, KL\big(p,\ \alpha p + (1-\alpha) q\big) + (1-\alpha)\, KL\big(q,\ \alpha p + (1-\alpha) q\big), \qquad (1.2)$$

where $\alpha \in (0, 1)$ is a fixed parameter [2]; we will also consider its straightforward

generalization to n distributions.
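The n-distribution generalization mentioned above can be written as the entropy of the weighted mixture minus the weighted entropies of the components. The sketch below is an illustrative computation of mine (the histograms and weights are arbitrary), not code from the experiments.

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return float(-np.sum(p[nz] * np.log(p[nz])))

def js_divergence(dists, weights):
    """Generalized JS: H(sum_i w_i P_i) - sum_i w_i H(P_i)."""
    dists = np.asarray(dists, dtype=float)
    w = np.asarray(weights, dtype=float)
    mixture = w @ dists                      # convex combination of the rows
    return entropy(mixture) - float(np.sum(w * [entropy(d) for d in dists]))

# Three illustrative distributions on four symbols, uniform weights.
P = [[0.70, 0.10, 0.10, 0.10],
     [0.10, 0.70, 0.10, 0.10],
     [0.25, 0.25, 0.25, 0.25]]
print(js_divergence(P, [1/3, 1/3, 1/3]))
```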

1.2 Alternative Representatives

Also of interest is the body of work which computes averages of sets of objects

using non-Euclidean distances since the representative presented in this work

plays a similar role. One example of this appears in computing averages on a

manifold of shapes [3, 4] by generalizing the minimization characterization of the

arithmetic mean away from the squared Euclidean distance and to the geodesic

distance. Linking manifolds on one hand and distributions on the other is the field

of information geometry [5]. Using notions from information geometry one can

find the mean on a manifold of parameterized distributions by using the geodesic

distances derived from the Fisher information metric.

In this work, the representative does not minimize a function of this geodesic

distance, but rather the maximum KL-divergence. Furthermore, the representa-

tives here are restricted to simple manifolds of distributions-namely the family of

weighted arithmetic means (i.e., convex combinations) and normalized, weighted

geometric means (sometimes referred to as the "exponential family") of the con-

stituent distributions. These are simple yet interesting families of distributions and

can accommodate non-parametric representations. These two manifolds are dual in

the sense of information geometry [6], and so as one might expect, the represen-

tatives have a similar dual relationship. Pelletier forms a barrycenter based in the









KL-divergence on each of these manifolds [7]. That barycenter, in the spirit of the

arithmetic mean and in contrast to the representative in this work, minimizes a

sum of KL-divergences. Another "mean-like" representative on the family of Gaus-

sian distributions seeks to minimize the sum of squared J-divergences (also known

as symmetrized KL) [8]. The key difference between most of these approaches and

this work is that this work seeks to present a representative which minimizes the

maximum of KL-divergences to the objects in a set.

1.3 Minimax Approaches

Ingrained in the computer vision culture is a heightened awareness of noise

in data. This is a defensive trait which has evolved over the millennia to allow

researchers to survive in a harsh environment full of imperfect sensors. With this

justified concern in mind, one naturally asks if minimizing a maximum distance

will be doomed by over-sensitivity to noise. As with many such concerns, this

depends on the application, and we argue that an approach which minimizes a

max-based function is well-suited to the applications in this work. But to show

that such a set of appropriate applications is non-empty, consider the successful

work of Huttenlocher et al. as a proof of concept [9]: They seek to minimize the

Hausdorff distance (H) between two point sets (A, B). As one can see, minimizing

the Hausdorff distance,


$$H(A, B) = \max\big(h(A, B),\ h(B, A)\big),$$

$$h(A, B) = \max_{a \in A}\ \min_{b \in B}\ \operatorname{dist}(a, b),$$

minimizes a max-based function. Another example appears in the work of Ho et

al. for tracking. In Ho et al. [10] at each frame they find a linear subspace which

minimizes the maximal distance to a set of previous observations. Additionally,

Hartley and Schaffalitzky minimize a max-based cost function to do geometric

reconstruction [11]. Cognizant that the method is sensitive to outliers, they









recommend using it on data from which the outliers have been removed. These examples

demonstrate that one cannot dismiss a priori a method as overly-sensitive to noise

just because it minimizes a max-based measure.
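For concreteness, here is a small sketch (my own, not taken from [9]) of the directed and symmetric Hausdorff distance between two finite point sets, with dist(a, b) taken to be the Euclidean distance:

```python
import numpy as np

def directed_hausdorff(A, B):
    """h(A, B) = max over a in A of min over b in B of ||a - b||."""
    A, B = np.asarray(A, float), np.asarray(B, float)
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)  # |A| x |B| distances
    return float(d.min(axis=1).max())

def hausdorff(A, B):
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))

A = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
B = [[0.1, 0.1], [1.0, 1.0]]
print(hausdorff(A, B))   # a max of mins: the single worst-matched point dominates
```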

Closer in spirit to the approach in this work is work using the minimax en-

tropy principle. Zhu et al. [12] and Liu et al. [13] use this principle to learn models

of textures and faces. The first part of this principle (the well-known maximum

entropy or minimum prejudice principle) seeks to learn a distribution which, given

the constraint that some of its statistics fit sample statistics, maximizes entropy.

Csiszar has shown that this process is related to an entropy projection [14]. The

rationale for selecting a distribution with maximal entropy is that it is random

or simple except for the parts explained by the data. Coupled with this is the

minimum entropy principle which further seeks to impose fidelity on the model.

To satisfy this principle, Zhu et al. choose a set of statistics (constraints) to which

a maximal entropy model must adhere such that its entropy is minimized. By

minimizing the entropy, they show that this minimizes the KL-divergence between

the true distribution and the learned model. For a set S of constraints on statistics,

they summarize the approach as


$$S^* = \arg\min_{S}\ \max_{p \in \Omega_S}\ \operatorname{entropy}(p), \qquad (1.3)$$

where $\Omega_S$ is the set of all probability distributions which satisfy the constraints

in S. Then with the optimal $S^*$, one need only find the $p \in \Omega_{S^*}$ with maximal

entropy.

1.4 Outline of Remainder

In the next chapters, we rigorously define the minimax representatives and

present theoretical results on their properties. We also connect one of these to

its alternative and better-known identity in the information theory results for

channel capacity. Thereafter follow two chapters on using the representatives to









index databases for the sake of making retrieval more efficient. In Chapter 3 we

present a texture retrieval experiment; in this experiment the accuracy of the

retrieval is determined by the Jensen-Shannon divergence, the square root of which

is a true metric. Later in Chapter 4 we present an experiment in shape retrieval

where the dissimilarity measure is the KL-divergence. We propose a search method

which happens to work in the particular case of this data set, but in general has

no guarantee that it will not degrade in accuracy. The second area in which we

demonstrate the utility of a minimax representative is tracking. In Chapter 5 we

present experiments in which the representative stands in for an unknown motion

model. Lastly we end with some concluding points and thoughts for future work.














CHAPTER 2
THEORETICAL FOUNDATION

In this chapter we define the minimax representatives and enumerate their

properties. First, we present a casual, motivating treatment in Euclidean space to

give intuition to the idea. Then come the section's central results-the ITC for the

mixture family and the ITC on the dual manifold, the exponential family; after

defining them, we present several illustrations to lend intuition to their behavior;

and then finally we show their well-established interpretation in terms of channel

capacity.

2.1 Preliminary-Euclidean Information Center

Let $S = \{f_1, \ldots, f_n\}$ be a set of n points in the Euclidean space $\mathbb{R}^m$.

Throughout our development we will consider the maximum dispersion of the

members of a set S about a point f,


$$D_E(f, S) = \max_i \|f - f_i\|^2. \qquad (2.1)$$

We will look for the point $f_c$ that minimizes $D_E(f, S)$ and call it the center.

Definition 1 The center of S is defined by


$$f_c(S) = \arg\min_f D_E(f, S). \qquad (2.2)$$
The following properties are easily proved.

Theorem 1 The center of S is unique and is given by a convex combination of the

elements of S.

$$f_c = \sum_{i=1}^{n} p_i f_i, \qquad (2.3)$$

where $0 \le p_i \le 1$ and $\sum_i p_i = 1$. We call a point $f_i$ for which $p_i > 0$ a support.









Theorem 2 Let $f_c$ be the center of S. Then, for $F = \{i : p_i > 0\}$, the set of indices
called the supports, we have an equi-distance property:

$$\|f_i - f_c\|^2 = r^2, \quad \text{if } i \in F, \qquad (2.4)$$

$$\|f_i - f_c\|^2 < r^2, \quad \text{otherwise}, \qquad (2.5)$$

where $r^2$ is the square of the radius of the sphere coinciding with the supports and
centered at the center.
Now that we have characterized the center first by its minimax property
(Definition 1) and then its equi-distance property (Theorem 2), we can give yet

another characterization, this one useful computationally. Define the simplex $\Delta$ of
probability distributions on n symbols,

$$\Delta = \Big\{ p = (p_1, \ldots, p_n) : 0 \le p_i, \ \sum_i p_i = 1 \Big\}.$$

We define a function of p on $\Delta$ by

$$D_E(p, S) = \sum_i p_i \|f_i\|^2 - \Big\|\sum_i p_i f_i\Big\|^2, \qquad (2.6)$$

and now use it to find the center.
Theorem 3 $D_E(p, S)$ is strictly concave in p, and the center of S is given by

$$f_c = \sum_i \hat{p}_i f_i, \qquad (2.7)$$

where

$$\hat{p} = \arg\max_p D_E(p, S) \qquad (2.8)$$

is the unique maximal point of $D_E(p, S)$.

An example of the center is given in Fig. 2-1. In general, the support set
consists of a relatively sparse subset of points. The points that are most extraordinary are given high weights.
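Because $D_E(p, S)$ is concave on the simplex, very simple schemes suffice numerically. The sketch below is my own illustration (not a method described in this text): a Frank-Wolfe style update that repeatedly shifts weight toward the vertex of the simplex corresponding to the point currently farthest from the candidate center, which is exactly the support-favoring behavior just described.

```python
import numpy as np

def euclidean_center(points, iters=2000):
    """Minimax (smallest enclosing ball) center via Frank-Wolfe on D_E(p, S)."""
    F = np.asarray(points, dtype=float)          # n x d
    n = len(F)
    p = np.full(n, 1.0 / n)                      # start with uniform weights
    for k in range(1, iters + 1):
        c = p @ F                                # current center, eq. (2.7)
        # The gradient of D_E w.r.t. p_i differs from ||f_i - c||^2 only by a
        # constant, so the best simplex vertex is simply the farthest point.
        i = int(np.argmax(np.sum((F - c) ** 2, axis=1)))
        gamma = 2.0 / (k + 2)                    # standard Frank-Wolfe step size
        p = (1 - gamma) * p
        p[i] += gamma
    return p @ F, p

pts = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 3.0], [1.0, 1.0]])
center, weights = euclidean_center(pts)
print(center)        # close to the circumcenter of the three outer points
print(weights)       # the interior point (1, 1) receives (near) zero weight
```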


















































Figure 2-1: Center is denoted by o and supports are denoted by x.









2.2 Mixture Information Center of Probability Distributions

We now move to the heart of the results. While the following derivations

use densities, they could just as easily work in the discrete case. Let f(x) be a

probability density. Given a set S of n linearly independent densities $f_1, \ldots, f_n$, we

consider the space M consisting of their mixtures,


$$M = \Big\{ f = \sum_i p_i f_i \ :\ 0 \le p_i, \ \sum_i p_i = 1 \Big\}. \qquad (2.9)$$

The dimension m of M satisfies $m \le n - 1$.

M is a dually flat manifold equipped with a Riemannian metric, and a pair of

dual affine connections [5].

The KL divergence from $f_1(x)$ to $f_2(x)$ is defined by

$$KL(f_1, f_2) = \int f_1(x) \log \frac{f_1(x)}{f_2(x)}\, dx, \qquad (2.10)$$

which plays the role of the squared distance in Euclidean space. The KL-
divergence from S to f is defined by

$$KL(S, f) = \max_i KL(f_i, f). \qquad (2.11)$$

Let us define the mixture information center (m-ITC) of S.

Definition 2 The m-center of S is defined by

$$f_m(S) = \arg\min_f KL(S, f). \qquad (2.12)$$

In order to analyze properties of the m-center, we define the m-sphere of

radius r centered at f by the set


$$S_m(f, r) = \{ f' : KL(f', f) \le r^2 \}. \qquad (2.13)$$









In order to obtain an analytical solution of the m-center, we remark that the

negative entropy

$$\varphi(f) = \int f(x) \log f(x)\, dx \qquad (2.14)$$

is a strictly convex function of $f \in M$. This is the dual potential function in M
whose second derivatives give the Fisher information matrix [5].
Given a point f in M,


$$f = \sum_i p_i f_i, \qquad p_i \ge 0, \quad \sum_i p_i = 1, \qquad (2.15)$$

and using $\varphi$ above, we write the Jensen-Shannon (JS) divergence [2] of f with
respect to S as

$$D_m(f, S) = -\varphi(f) + \sum_i p_i\, \varphi(f_i). \qquad (2.16)$$

When we regard $D_m(f, S)$ as a function of $p = (p_1, \ldots, p_n)^T \in \Delta$, we denote it by
$D_m(p, S)$. It is easy to see

$$D_m(f, S) \ \text{is strictly concave in } p \in \Delta, \qquad (2.17)$$

$$D_m(f, S) \ \text{is invariant under permutation of } \{f_i\}, \qquad (2.18)$$

$$D_m(f, S) = 0 \ \text{if } p \text{ is an extreme point of simplex } \Delta. \qquad (2.19)$$

Hence from equations 2.17 and 2.19, $D_m(f, S)$ has a unique maximum in $\Delta$. In
terms of the KL-divergence, we have

$$D_m(p, S) = \sum_i p_i\, KL(f_i, f). \qquad (2.20)$$

Theorem 4 The m-center of S is given by the unique maximizer of $D_m(f, S)$, that
is,

$$f_m = \sum_i \hat{p}_i f_i, \qquad (2.21)$$

$$\hat{p} = \arg\max_p D_m(p, S). \qquad (2.22)$$

Moreover, for supports for which $p_i > 0$, $KL(f_i, f_m) = r^2$, and for non-supports for
which $p_i = 0$, $KL(f_i, f_m) < r^2$, where $r^2 = \max_i KL(f_i, f_m)$.
By using the Lagrange multiplier $\lambda$ for the constraint $\sum_i p_i = 1$, we calculate

$$\frac{\partial}{\partial p_i}\Big\{ D_m(p, S) - \lambda\Big(\sum_i p_i - 1\Big) \Big\} = 0. \qquad (2.23)$$

From the definition of $D_m$ and because of

$$\frac{\partial \varphi(f)}{\partial p_i} = \frac{\partial}{\partial p_i} \int \Big(\sum_k p_k f_k\Big) \log \Big(\sum_k p_k f_k\Big)\, dx = \int f_i \log f\, dx + 1, \qquad (2.24)$$

equation 2.23 is rewritten as

$$-\int f_i \log f\, dx - 1 + \varphi(f_i) - \lambda = 0 \qquad (2.25)$$

when $p_i > 0$, and then as

$$KL(f_i, f) = \lambda + 1. \qquad (2.26)$$

However, because of the constraints $p_j \ge 0$, this is not necessarily satisfied for some
$f_j$ for which $p_j = 0$. Hence, at the extreme point $f_m(S)$, we have

$$KL(f_i, f) = r^2 \qquad (2.27)$$

for supports $f_i$ ($p_i > 0$), but for non-supports ($p_i = 0$),

$$KL(f_j, f) < r^2. \qquad (2.28)$$

This distinction occurs because $\hat{p}$ maximizes $\sum_i p_i KL(f_i, f)$, and if we assume for a
non-support density $f_j$ that $KL(f_j, f) > r^2$, we arrive at a contradiction.
We remark that numerical optimization of $D_m$ can be accomplished efficiently

(e.g., with a Newton scheme [15]), because it is concave in p.
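As one concrete possibility for discrete distributions (a sketch of mine, not the Newton scheme of [15]), the channel-capacity interpretation developed in Section 2.5.3 suggests a Blahut-Arimoto style multiplicative update on the weights:

```python
import numpy as np

def kl(p, q):
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def m_itc(dists, iters=500):
    """Weights p maximizing D_m(p, S) = sum_i p_i KL(f_i, sum_j p_j f_j)."""
    F = np.asarray(dists, dtype=float)        # n x m, each row a distribution
    n = len(F)
    p = np.full(n, 1.0 / n)
    for _ in range(iters):
        mix = p @ F                           # candidate m-center f = sum_i p_i f_i
        d = np.array([kl(f, mix) for f in F]) # KL(f_i, f) for every component
        p = p * np.exp(d)                     # multiplicative re-weighting
        p /= p.sum()                          # renormalize onto the simplex
    return p, p @ F

# Three small histograms; the third is deliberately "extraordinary".
F = np.array([[0.45, 0.45, 0.05, 0.05],
              [0.40, 0.50, 0.05, 0.05],
              [0.05, 0.05, 0.45, 0.45]])
p, center = m_itc(F)
print(np.round(p, 3))                         # weight concentrates on distinct members
print([round(kl(f, center), 3) for f in F])   # supports approach a common maximal value
```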









2.2.1 Global Optimality

Lastly, it should be noted that while it appears that the results so far suggest

that the m-ITC is the minimax representative over merely the simplex of mix-

tures, a result by Merhav and Feder [16] (based on work by Csiszar [17]) shows

that the m-ITC is indeed the global minimax representative over all probability

distributions.

To argue so, consider a distribution g which a skeptic puts forward as the

minimizer of equation 2.12. Next, we consider the I-projection of g onto the set of

mixtures M defined as

$$\tilde{g} = \arg\min_{f \in M} KL(f, g). \qquad (2.29)$$

And from the properties of I-projections [17], we know for all $f \in M$ that

$$KL(f, g) \ge KL(f, \tilde{g}) + KL(\tilde{g}, g) \ge KL(f, \tilde{g}), \qquad (2.30)$$

since the last term of the middle expression, $KL(\tilde{g}, g)$, is nonnegative. This tells us that the mixture

$\tilde{g}$ performs at least as well as g; so we may restrict our minimization to the set of

mixtures, knowing we will find the global minimizer.

2.3 Exponential Center of Probability Distributions

Given n densities $f_1(x), \ldots, f_n(x) > 0$, we define the exponential family of densities

E,


$$E = \Big\{ f : f(x) = \exp\Big\{ \sum_i p_i \log f_i(x) - \psi(p) \Big\}, \ 0 \le p_i, \ \sum_i p_i = 1 \Big\}, \qquad (2.31)$$

instead of the mixture family M. The dimension m of E satisfies $m \le n - 1$. E

is also a dually flat Riemannian manifold. E and M are "dual" [5], and we can

establish similar structures in E.

The potential function $\psi(p)$ is convex, and is given by

$$\psi(p) = \log \int \exp\Big\{ \sum_i p_i \log f_i(x) \Big\}\, dx. \qquad (2.32)$$









The potential $\psi(p)$ is the cumulant generating function, and is connected with the
negative entropy $\varphi$ by the Legendre transformation. It is called the free energy in
statistical physics. Its second derivatives give the Fisher information matrix in this
coordinate system.

An e-sphere (exponential sphere) in E centered at f is the set of points

$$S_e(f, r) = \{ f' : KL(f', f) \le r^2, \ f' \in E \}, \qquad (2.33)$$

where r is the e-radius.

Definition 3 The e-center of S is defined by

$$f_e(S) = \arg\min_f KL(f, S), \qquad (2.34)$$

where

$$KL(f, S) = \max_i KL(f, f_i). \qquad (2.35)$$

We further define the JS-like divergence in E,

$$D_e(p, S) = -\psi(p) + \sum_i p_i\, \psi(f_i). \qquad (2.36)$$

Because of

$$\psi(f_i) = \log \int \exp(\log f_i)\, dx = 0, \qquad (2.37)$$

we have

$$D_e(p, S) = -\psi(p). \qquad (2.38)$$

The function $D_e = -\psi$ is strictly concave, and has a unique maximum. It is an
interesting exercise to show for $f = \exp\{ \sum_i p_i \log f_i - \psi(p) \}$,

$$-\psi(p) = \sum_i p_i\, KL(f, f_i). \qquad (2.39)$$


Analogous to the case of M, we can prove the following.








Theorem 5 The e-center of S is unique and given by the maximizer of $D_e(p, S)$,

$$f_e(S) = \exp\Big\{ \sum_i \hat{p}_i \log f_i - \psi(\hat{p}) \Big\}, \qquad (2.40)$$

$$\hat{p} = \arg\max_p D_e(p, S). \qquad (2.41)$$

Moreover,

$$KL(f_e, f_i) = r^2, \quad \text{for supporting } f_i \ (p_i \neq 0), \qquad (2.42)$$

$$KL(f_e, f_i) \le r^2, \quad \text{for non-supporting } f_i \ (p_i = 0), \qquad (2.43)$$

where

$$r^2 = \max_i KL(f_e, f_i). \qquad (2.44)$$

We calculate the derivative of $-\psi(p) - \lambda\big(\sum_i p_i - 1\big)$, and put

$$\frac{\partial}{\partial p_i}\Big\{ -\psi(p) - \lambda\Big(\sum_i p_i - 1\Big) \Big\} = 0. \qquad (2.45)$$

For

$$f = \exp\Big\{ \sum_i p_i \log f_i(x) - \psi(p) \Big\}, \qquad (2.46)$$

we have

$$\frac{\partial \psi(p)}{\partial p_i} = \frac{\partial}{\partial p_i} \log \int \exp\Big\{ \sum_j p_j \log f_j(x) \Big\}\, dx = \int f(x) \log f_i(x)\, dx. \qquad (2.47)$$

Hence, (2.45) becomes

$$\int f(x) \log f_i(x)\, dx = \text{const}, \qquad (2.48)$$

and hence

$$KL(f, f_i) = \text{const} \qquad (2.49)$$










for $f_i$ with $p_i > 0$. For $p_i = 0$, $KL(f, f_i)$ is not larger than $r^2$, because of (2.39).

Figure 2-2: The ensemble of 50 Gaussians with means evenly spaced on the interval [-30, 30] and σ = 5
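For discrete distributions the potential ψ(p) of equation 2.32 is a finite log-sum, so the concave function D_e(p, S) = −ψ(p) can be maximized directly over the simplex. The sketch below is an illustration of mine using an off-the-shelf constrained solver, not a procedure taken from this text.

```python
import numpy as np
from scipy.optimize import minimize

def e_itc(dists):
    """e-center of discrete distributions: maximize -psi(p) over the simplex."""
    F = np.asarray(dists, dtype=float)                 # n x m, rows are histograms
    logF = np.log(F)
    n = len(F)

    def psi(p):                                        # eq. 2.32, discrete case
        return float(np.log(np.sum(np.exp(p @ logF))))

    res = minimize(psi,                                # minimizing psi = maximizing -psi
                   x0=np.full(n, 1.0 / n),
                   method="SLSQP",
                   bounds=[(0.0, 1.0)] * n,
                   constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1.0}])
    p = res.x
    center = np.exp(p @ logF - psi(p))                 # normalized weighted geometric mean
    return p, center

F = np.array([[0.45, 0.45, 0.05, 0.05],
              [0.40, 0.50, 0.05, 0.05],
              [0.05, 0.05, 0.45, 0.45]])
p, center = e_itc(F)
kl_from_center = [float(np.sum(center * np.log(center / f))) for f in F]
print(np.round(p, 3), np.round(kl_from_center, 3))    # supports attain the common e-radius
```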

2.4 Illustrations and Intuition

In this section we convey some intuition regarding the behavior of the ITCs.

Particularly, we show examples in which the ITCs choose a relatively sparse subset

of densities to be support densities (assigning zero as weights to the other densities

in the set); also we find that the ITC tends to assign disproportionately high

weights to members of the set that are most "extraordinary" (i.e., those that are

most distinct from the rest of the set).


























Figure 2-3: The components from Fig. 2-2 scaled by their weights in the m-ITC.

























Figure 2-4: The m-ITC (solid) and arithmetic mean (AM, dashed) of the ensemble of Gaussians shown in Fig. 2-2









2.4.1 Gaussians

First, we examine a synthetic example in which we analytically specify a set

of densities and then numerically compute the m-ITC of that set. We construct a

set of 50 one-dimensional Gaussian distributions with means evenly spaced on the

interval [-30, 30] and with σ = 5. Fig. 2-2 shows all of the densities in this set.

When we compute the m-ITC, we see both of the properties mentioned above: Out

of the 50 densities specified, only eight become support densities (sparsity). And

additionally, the densities which receive the largest weight in the m-ITC are outer-

most densities with means at -30 and 30 (highlighting "boundary" elements).

Fig. 2-3 shows the eight support densities scaled according to the weight which

the m-ITC assigns them. For the sake of comparison, Fig. 2-4 shows the m-ITC

compared with the arithmetic mean.

2.4.2 Normalized Gray-level Histograms

Next we consider an example with distributions arising from image data.

Fig. 2-5 shows eight images from the Yale Face Database. The first seven images

are of the same person under different facial expressions while the last (eighth)

image is of a different person. We consider each image's normalized histogram of

gray-level intensities as a distribution and show them in Fig. 2-6.

When we take the m-ITC of these distributions, we again notice the sparsity

and the favortism of the boundary elements. The numbers above each face in

Fig. 2-5 are the weights that the distribution arising from that face received in the

m-ITC. Again, we find that only three out of the eight distributions are support

distributions. In the previous example, we could concretely see how the m-ITC

favored densities on the boundary of the set because the densities had a

clear geometric relationship among themselves, with their means lying on a line.

In this example the notion of a boundary is not quite so obvious; yet, we think that





















Figure 2-5: Seven faces of one person under different expressions with an eighth face from someone else. Above each image is the weight p_i which the ITC assigns to the distribution arising from that image. (Three of the eight weights are non-zero: 0.32519, 0.2267, and 0.4481.)























Figure 2-6: The normalized gray-level histograms of the faces from Fig. 2-5. Above each distribution is the KL-divergence from that distribution to the m-ITC. Parentheses indicate that the value is equal to the KL-radius of the set. Note that, as predicted by theory, the distributions which have maximum KL-divergence are the very ones which received non-zero weights in the m-ITC.


Figure 2-7: In the left column, we can see that the arithmetic mean (solid, lower left) resembles the distribution arising from the first face more closely than the m-ITC (solid, upper left) does. In the right column, we see the opposite: the m-ITC (upper right) more closely resembles the eighth distribution than does the arithmetic mean (lower right).









if one examines the eighth image and the eighth distribution, one can qualitatively

agree that it is the most "extraordinary."

Returning briefly to Fig. 2-6 we examine the KL-divergences between each

distribution and the m-ITC. We report these values above each distribution and

indicate with parentheses the maximal values. Since the three distributions with

maximal KL-divergence to the ITC are exactly the three support distributions, this

example unsurprisingly complies with Theorem 4.

Finally, we again compare the m-ITC and the arithmetic mean in Fig. 2-7,

but this time we overlay in turn the first and eighth distributions. Examining the

left column of the figure, we see that the arithmetic mean more closely resembles

the first distribution than does the m-ITC. The KL-divergences bear this out with

the KL-divergence from the first distribution to the arithmetic mean being 0.0461

which compares to 0.1827 to the m-ITC. Conversely, when we do a similar com-

parison in the right column of Fig. 2-7 for the eighth (extraordinary) distribution,

we find that the m-ITC most resembles it. Again, the KL-divergences quantify this

observation: Whereas the KL-divergence from the eighth distribution to the m-ITC

is (again) 0.1827, we find the KL-divergence to the arithmetic mean to be 0.5524.

This result falls in line with what we would expect from Theorem 4 and suggests a

more refined bit of intuition: The m-ITC better represents extraordinary distribu-

tions, but sometimes at the expense of the more common-looking distributions. Yet

overall, that trade-off yields a minimized maximum KL-divergence.

2.4.3 The e-Center of Gaussians

We consider a very simple example consisting of multivariate Gaussian

distributions with unit covariance matrix,


$$f_i(x) = c\, \exp\Big\{ -\tfrac{1}{2}\| x - \mu_i \|^2 \Big\}, \qquad (2.50)$$









where $x, \mu_i \in \mathbb{R}^m$. We have

$$\sum_i p_i \log f_i(x) = -\frac{1}{2}\Big\| x - \sum_i p_i \mu_i \Big\|^2 + \log c + \text{const}, \qquad (2.51)$$

where the constant does not depend on x, so that after normalization

$$f(x) = c\, \exp\Big\{ -\frac{1}{2}\Big\| x - \sum_i p_i \mu_i \Big\|^2 \Big\}. \qquad (2.52)$$

Hence, E too consists of Gaussian distributions.

This case is special because E is not only dually flat but also Euclidean where
the Fisher information matrix is the identity matrix. The KL-divergence is

$$KL(f_\mu, f_{\mu'}) = \frac{1}{2}\| \mu - \mu' \|^2, \qquad (2.53)$$

given by a half of the squared Euclidean distance. Hence, the e-center in E of $\{f_i\}$

is the same as the center of the points $\{\mu_i\}$ in the Euclidean space.

When m = 1, that is x is univariate, $\{\mu_i\}$ is a set of points on the real line.

When $\mu_1 < \mu_2 < \cdots < \mu_n$, it is easy to see that the center is the average of the

two extremes, $\mu_1$ and $\mu_n$. This is different from the m-center of $\{f_i\}$ which is not a
single Gaussian but rather a mixture.
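A small numerical sketch of the univariate observation above (mine; the means simply reuse the [-30, 30] grid of the earlier Gaussian example): for unit-variance Gaussians the e-center's mean is the minimax Euclidean center of the means, which in one dimension is the midpoint of the two extremes.

```python
import numpy as np

def e_center_mean_1d(means):
    """e-center of unit-variance 1-D Gaussians: midpoint of the extreme means."""
    means = np.asarray(means, dtype=float)
    return 0.5 * (means.min() + means.max())

mu = np.linspace(-30, 30, 50)          # the means used in the Gaussian example
center_mean = e_center_mean_1d(mu)
# worst-case KL(f_e, f_i) = 0.5 * (mu_i - center)^2, minimized by the midpoint
worst = 0.5 * np.max((mu - center_mean) ** 2)
print(center_mean, worst)              # 0.0 and 0.5 * 30^2 = 450.0
```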

2.4.4 e-ITC of Histograms

We return again to the histograms of Section 2.4.2. As with the m-ITC, this
example illustrates how the e-ITC gives up some representative power for the

elements with small variability for the sake of better representing the elements

with large variability. Fig. 2-8 is an extreme case of this, chosen to make this

effect starkly clear: Here again we have the same seven images of the same person

under slight variations along with an eighth image of a different person. After

representing each image by its global gray-level histogram, we compute the

uniformly weighted, normalized geometric mean and the e-ITC.

It is also worth noting that the e-ITC only selects three support distributions

out of a possible eight, exemplifying the sparsity tendency mentioned in the


















Figure 2-8: Eight images of faces which yield normalized gray-level histograms. We choose an extraordinary distribution for number eight to contrast how the representative captures variation within a class. The number above each face weighs the corresponding distribution in the e-ITC. (Three of the eight weights are non-zero: 0.21224, 0.18607, and 0.60169.)
























Figure 2-9: KL(C, P_i) for each distribution, for C equal to the e-ITC and geometric mean, respectively. The horizontal bar represents the value of the e-radius.









previous section. By now examining Fig. 2-9, we can see that KL(C, Pi) is equal

to the e-radius (indicated by the horizontal bar) for the three support distributions

(i = 1, 7, 8) and is less for the others. This illustrates the equi-divergence property

stated previously.

In Figs. 2-8 and 2-9, the worst-case KL-divergence from the geometric mean

is 2.5 times larger than the worst-case from the e-ITC. Of course, this better worst-

case performance comes at the price of the e-ITC's larger distance to the other

seven distributions; but it is our thesis that in some applications we are eager to

make this trade.

2.5 Previous Work

2.5.1 Jensen-Shannon Divergence

First introduced by Lin [2], the Jensen-Shannon divergence has appeared in

several contexts. Sibson referred to it as the information radius [18]. While his

moniker is tantalizingly similar to the previously mentioned notions of m-/e-radii,

he strictly uses it to refer to the divergence, not the optimal value. Jardine and he

later used it as a discriminatory measure for classification [19].

Others have also used the JS-divergence and its variations as a dissimilarity

measure-for image registration and retrieval applications [20, 21], and in the

retrieval experiment of Chapter 3, this work will follow suit. That experiment also

makes use of the important fact that the square root of the JS-divergence (in the

case when its parameter is fixed to ½) is a metric [22].

Topsoe provides another take on the JS-divergence, calling a scaled, special

case of it the capacitory discrimination [23]. This name hints at the next, perhaps

most important interpretation of the JS-divergence, namely that as a measurement

of mutual information. This alternative identity is widely known. (Cf. Burbea and

Rao [24] as just one example.) And this understanding can help illuminate the

context of the m-center within information theory.









2.5.2 Mutual Information

But the obvious question arises-the mutual information between what? First,

the mutual information between two random variables X and Y with distributions

Px and Py is defined as

$$MI(X; Y) = H(Y) - H(Y|X), \qquad (2.54)$$

where $H(Y) = -\sum_y P_Y(y) \log P_Y(y)$ is Shannon's entropy and the conditional entropy is

$$H(Y|X) = -\sum_x \sum_y P_{XY}(x, y) \log P_{Y|X}(y|x). \qquad (2.55)$$

To see the connection to the m-center and JS-divergence, let us first identify

the random variable X above as a random index into a set of random variables

$\{Y_1, \ldots, Y_n\}$ with probability distributions $\{P_1, \ldots, P_n\}$. Note that X merely

takes on a number from one to n. If we now consider Y as a random variable

whose value y results from first sampling X = i and then sampling $Y_i = y$, we

can certainly begin to appreciate that these concocted random variables X and Y

are dependent. That is, learning the value that X takes will let you guess more

accurately what value Y will take. Conversely, learning the value of Y hints at

which distribution Y that value was drawn from and hence the value that X took.

Returning to equation 2.54 and paying particular attention to the definition

of the conditional entropy term, we can use our definitions of X and Y to get an

expression for their joint distribution,


$$P_{XY}(i, y) = P_X(i)\, P(y|i). \qquad (2.56)$$

Then by observing that $P(y|i) = P_i(y)$, plugging this into equation 2.54, and

pulling Px(i) outside of the summations with respect to y, we have


$$MI(X; Y) = H(Y) - \sum_i P_X(i)\, H(P_i). \qquad (2.57)$$









And this is precisely the Jensen-Shannon divergence $JS(P_1, \ldots, P_n)$ with co-

efficients $(P_X(1), \ldots, P_X(n))$. So when we evaluate the JS-divergence with a

particular choice of those coefficients, we directly specify a distribution for both

the random variable X and indirectly specify a distribution (a mixture) for the

random variable Y and evaluate the mutual information between them. And when

we maximize the JS-divergence with respect to those same coefficients, we specify

the distributions X and Y which have maximum mutual information.
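This identity is easy to check numerically. The short sketch below (my own; the channel rows P_i and coefficients P_X are arbitrary) computes MI(X; Y) from the joint distribution and the JS-divergence of equation 2.57, and the two agree up to floating-point error.

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, float)
    nz = p > 0
    return float(-np.sum(p[nz] * np.log(p[nz])))

# Conditional distributions P_i(y) and input weights P_X(i); both are illustrative.
P = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])
px = np.array([0.5, 0.3, 0.2])

# Mutual information from the joint P_XY(i, y) = P_X(i) P_i(y).
joint = px[:, None] * P
py = joint.sum(axis=0)
mi = float(np.sum(joint * np.log(joint / (px[:, None] * py[None, :]))))

# JS-divergence with the same coefficients, equation 2.57.
js = entropy(py) - float(np.sum(px * np.array([entropy(row) for row in P])))

print(mi, js)   # identical up to floating-point error
```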

There are several equivalent definitions for mutual information, and by

considering a different one we can gain insight into the selection of support

distributions in the m-center. The mirror image of equation 2.54 gives us


$$MI(X; Y) = H(X) - H(X|Y). \qquad (2.58)$$


Starting from here and for convenience letting $f_y(i) = P(i|y)$, we can derive that


$$MI(X; Y) = H(X) - E_y\big[ H(f_y) \big]. \qquad (2.59)$$


fy(i) describes the contributions at a value y of each of the distributions which

make up the mixture. By maximizing MI(X; Y) we minimize the expected value

(over y) of the entropy of this distribution. This means that on the average

we encourage as few as possible contributors of probability to a location; this

is particularly the case at locations y with high marginal probability. Acting

as a regularization term of sorts, to prevent the process from assigning all the

probability to one component (thus driving the right-hand term to zero), is the first

term which encourages uniform weights. Maximizing the whole expression means

that we balance the impulses for few contributors and for uniform weights.

2.5.3 Channel Capacity

Of interest to information theory from its inception has been the question of

how much information can one reliably transmit over a channel.









In this case, one interprets the ensemble of distributions $\{P_1, \ldots, P_n\}$ making

up the conditional probability $P(y|i) = P_i(y)$ as the channel. That is, given an in-

put symbol i, the channel will transmit an output symbol y with probability Pi(y).

Given such a channel (discrete and memoryless since each symbol is independent

of the symbols before and after), one will achieve different rates of transmission de-

pending on the distribution over the source symbols Px. These rates are precisely

the mutual information between X and Y, and to find the maximum capacity of

the channel, one picks the Px yielding a maximum value. In this context, many

of the results from this section have been developed and presented quite clearly in

texts by Gallager [25, Section 4.2] and Csiszar [26, Section 2.3].

Related also is the field of universal coding. As reviewed earlier, choosing

an appropriate code depends upon the characteristics of the source which will be

encoded (e.g., Morse code for English). Universal coding concerns itself with the

problem of selecting a code suitable for any source out of a family-i.e., a code

which will have certain universal performance characteristics across the family.

Although this work initially developed in ignorance of that field, the key result in

this field which this work touches upon is the Redundancy-Capacity Theorem. This

theorem-independently proven in several places [27, 28] and later strengthened

[16]-states that the best code for which one can hope will have redundancy

equal to the capacity (or maximum transmission rate [23]) of a certain channel

which takes input from the parameters of the family and output as symbols to be

encoded.

2.5.4 Boosting

Surprisingly, we also find a connection in the online learning literature wherein

the AdaBoost [29] learning algorithm is recast as entropy projection [30]. This

angle, along with its extensions to general Bregman divergences, is an interesting

avenue for future work.















CHAPTER 3
RETRIEVAL WITH m-CENTER

3.1 Introduction

There are two key components in most retrieval systems-the signature

stored for an object and the (dis)similarity measure used to find a closest match.

These two choices completely determine accuracy in a nearest neighbor-type system

if one is willing to endure a potentially exhaustive search of the database. However,

because databases can be quite large, comparing each query image against every

element in the database is obviously undesirable.

In this chapter and the next we focus on speeding up retrieval without com-

promising accuracy in the case in which the object and query signatures are

probability distributions. In a variety of domains, researchers have achieved impres-

sive results utilizing such signatures in conjunction with a variety of appropriate

dissimilarity measures. This includes work in shape-based retrieval [31], image

retrieval [32, 33], and texture retrieval [34, 35, 36, 37]. (The last of which we review

in great detail in Section 3.2.) For any such retrieval system which uses a distri-

bution and a metric, we present a way to speed up queries while guaranteeing no

drop in accuracy by representing a cluster of distributions with an optimally close

representative which we call the m-ITC.

Tomes of research have concentrated on speeding up nearest neighbor searches

in non-Euclidean metric spaces. We build on this work by refining it to better suit

the case in which the metric is on probability distributions. In low-dimensional

Euclidean spaces, the familiar k-d tree and R*-tree can index point sets handily.

But in spaces with a non-Euclidean metric, one must resort to other techniques.

These include ball trees [38], vantage point trees [39], and metric trees [40, 41].









Our system utilizes a metric tree, but our main contribution is picking a single

object to represent a set. Picking a single object to describe a set of objects is one

of the most common ways to condense a large amount of data. The most obvious

way to accomplish this (when possible) is to compute the arithmetic mean or the

centroid. To contrast with the properties of our choice (the m-ITC) we point out

that the arithmetic mean minimizes the sum of squared distances from it to the

elements of its set. In the case in which the data are known (or forced) to lie on a

manifold, it is useful to pick an intrinsic mean which has the mean's minimization

property but which also lies on a manifold containing the data. This has been

explored for manifolds of shape [4, 3] and of parameterized probability densities [5].

Metric trees. To see how our representative fits into the metric tree framework,

a brief review of metric trees helps. Given a set of points and a metric (which by

definition satisfies positive-definiteness, symmetry, and the triangle inequality), a

leaf of a metric tree indexes some subset of points $\{p_i\}$ and contains two fields-a

center c and a radius r-which satisfy $d(c, p_i) \le r$ for all $p_i$, where d is the metric.

Proceeding hierarchically, an interior node indexes all of the points indexed by its

children and also ensures that its fields satisfy the constraint above. Hence using

the triangle inequality, for any subtree, one can find a lower bound on the distance

from a query point q to the entire set of points contained in that subtree


$$d(q, p_i) \ge d(q, c) - r, \qquad (3.1)$$

as illustrated in Fig. 3-1. And during a nearest neighbor search, one can recursively

use this lower bound (costing only one comparison) to prune out subtrees which

contain points too distant from the query.
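The pruning rule of equation 3.1 is compact in code. The sketch below is a simplified, single-level illustration of mine (one (center, radius) pair per cluster) rather than the full recursive metric tree; a cluster is skipped whenever its lower bound already exceeds the best distance found so far.

```python
import math

def nearest_neighbor(query, clusters, dist):
    """clusters: list of (center, radius, points). Returns (best_point, best_dist)."""
    best, best_d = None, math.inf
    for center, radius, points in clusters:
        d_qc = dist(query, center)
        if d_qc - radius >= best_d:        # eq. 3.1: every member is at least this far
            continue                       # prune the whole cluster with one comparison
        for p in points:
            d = dist(query, p)
            if d < best_d:
                best, best_d = p, d
    return best, best_d

# Tiny 1-D example with Euclidean distance; the clusters are illustrative.
dist = lambda a, b: abs(a - b)
clusters = [
    (1.0, 1.0, [0.5, 1.0, 2.0]),           # (center, radius, member points)
    (10.0, 2.0, [8.5, 10.0, 11.5]),
]
print(nearest_neighbor(2.4, clusters, dist))   # second cluster is pruned entirely
```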

Importance of picking a center. Returning to the choice of the center, we see

that if the radius r in equation 3.1 is large, then the lower bound is not very tight

and subsequently, pruning will not be very efficient. On the other hand, a center










Figure 3-1: Using the triangle inequality to prune

which yields a small radius will likely lead to more pruning and efficient searches. If

we reexamine Fig. 3-1, we see that we can decrease r by moving the center toward

the point pi in the figure. This is precisely how the m-ITC behaves. We claim that

the m-ITC yields tighter clusters than the commonly used centroid and the best

medoid, allowing more pruning and better efficiency because it respects the natural
metrics used for probability distributions. We show theoretically that under the

KL-divergence, the m-ITC uniquely yields the smallest radius for a set of points.

And we demonstrate that it also performs well under other metrics.

In the following sections we first define our m-ITC and enumerate its prop-

erties. Then in Section 3.2 we review a pre-existing texture classification system

[37] which utilizes probability distributions as signatures and discuss how we build
our efficiency experiment atop this system. Then in Section 3.3 we present results

showing that the m-ITC, when incorporated in a metric tree, can improve query
efficiency; also we compare how much improvement the m-ITC achieves relative to









other representatives when each is placed in a metric tree. Lastly, we summarize

our contributions.

3.2 Experimental Design

The central claim of this chapter is that the m-ITC tightly represents a set of

distributions under typical metrics; and thus, that this better representation allows

for more efficient retrieval. With that goal in mind, we compare the m-ITC against

the centroid and the best medoid-two commonly used representatives-in order to

find which scheme retrieves the nearest neighbor with the fewest comparisons. The

centroid is the common arithmetic mean, and the best medoid of a set {pi} in this

case is the distribution $\hat{p}$ satisfying


$$\hat{p} = \arg\min_{p' \in \{p_i\}}\ \max_j KL(p_j, p').$$

Notice that we restrict ourselves to the pre-existing set of distributions instead of

finding the best convex combination which is the m-ITC.

In the experiment, each representative will serve as the center of the nodes

of a metric tree on the database. We control for the topology of the metric tree,

keeping it unchanged for each representative. Since the experiment will examine

efficiency of retrieval, not accuracy, we will choose the same signature as Varma

and Zisserman [37], and this along with the metric will completely determine

accuracy, allowing us to exclusively examine efficiency.

3.2.1 Review of the Texture Retrieval System

The texture retrieval system of Varma and Zisserman [37] [42] builds on an

earlier system [43]. It uses a probability distribution as a signature to represent

each image. A query image is matched to a database image by nearest neighbor

search based on the X2 dissimilarity between distributions. Although the system

contains a method for increasing efficiency, we do not implement it and defer

discussion of it to the next section.













Figure 3-2: Example images from the CUReT database


Texton dictionary. Before computing any image's signature, this system

requires that we first construct a texton dictionary. To construct this dictionary, we

first extract from each pixel in each training image a feature describing the texture

in its neighborhood. This feature can be a vector of filter responses at that pixel

[42] or simply a vector of intensities in the neighborhood about that pixel [37]. (We

choose the latter approach.) After clustering this ensemble of features, we take a

small set of cluster centers for the dictionary, 610 centers in total.

Computing a signature. To compute the signature of an image, one first finds

for each pixel the closest texton in the dictionary and then retains the label of

that texton. At this stage, one can imagine an image transformed from having

intensities at each pixel to having indices into the texton dictionary at each pixel,

as would result from vector quantization. In the next step one simply histograms

these labels in the same way that one would form a global gray-level histogram.

This normalized histogram on texton labels is the signature of an image.
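A condensed sketch of this signature computation (my own illustration; the neighborhood size, dictionary size, and random image below are placeholders rather than the actual parameters of [37]): each pixel's neighborhood vector is assigned to its nearest texton, and the labels are histogrammed.

```python
import numpy as np

def texton_signature(image, textons, patch=3):
    """Normalized histogram of nearest-texton labels over all patches of an image."""
    H, W = image.shape
    r = patch // 2
    feats = []
    for y in range(r, H - r):
        for x in range(r, W - r):
            feats.append(image[y - r:y + r + 1, x - r:x + r + 1].ravel())
    feats = np.asarray(feats, dtype=float)                  # one feature per pixel
    # nearest texton (Euclidean) for every feature vector
    d2 = ((feats[:, None, :] - textons[None, :, :]) ** 2).sum(axis=2)
    labels = d2.argmin(axis=1)
    hist = np.bincount(labels, minlength=len(textons)).astype(float)
    return hist / hist.sum()                                # the image's signature

rng = np.random.default_rng(0)
textons = rng.random((8, 9))            # stand-in dictionary of 8 textons (3x3 patches)
image = rng.random((32, 32))            # stand-in gray-scale image
sig = texton_signature(image, textons)
print(sig.round(3), sig.sum())          # a probability distribution over texton labels
```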

Data. Again as in [37], we use the Columbia-Utrecht Reflectance and Texture

(CUReT) Database [44] for texture data. This database contains 61 varieties of









texture with each texture imaged under various (sometimes extreme) illumination

and viewing angles. Fig. 3-2 illustrates the extreme intra-class variability and the

inter-class similarity which make this database challenging for texture retrieval

systems. In this figure each row contains realizations from the same texture class

with each column corresponding to different viewing and illumination angles.

Selecting 92 images from each of the 61 texture classes, we randomly partition each

of the classes into 46 training images, which make up the database, and 46 test

images, which make up the queries. Preprocessing consists of conversion to gray

scale and mean and variance normalization.

Dissimilarity measures. Varma and Zisserman [37] measured dissimilarity

between a query q and a database element p (both distributions) using the X2

significance test,

$$\chi^2(p, q) = \sum_i \frac{(p_i - q_i)^2}{p_i + q_i}, \qquad (3.2)$$
and returned the nearest neighbor under this dissimilarity.

In our work, we require a metric, so we take the square root of equation 3.2

[23]. Note that since the square root is a monotonic function, this does not alter

the choice of nearest neighbor and therefore maintains the accuracy of the system.

Additionally we use another metric [22], the square root of the Jensen-Shannon

divergence between two distributions:

$$JS_{1/2}(p, q) = H\Big(\frac{p+q}{2}\Big) - \frac{1}{2}H(p) - \frac{1}{2}H(q) = \frac{1}{2}KL\Big(p, \frac{p+q}{2}\Big) + \frac{1}{2}KL\Big(q, \frac{p+q}{2}\Big).$$

Note that in contrast to equation 2.16, we now take only two distributions and fix

the mixture parameter to ½. Unlike the first metric, this choice affects accuracy,

but in our experiments, the change in retrieval accuracy from the original system

did not exceed one percentage point. This is not surprising, since the JS has served

well in registration and retrieval applications in the past [20].
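For reference, a brief sketch (mine) of the two metrics as used here: the square root of the χ² statistic of equation 3.2 and the square root of the JS-divergence with the mixture parameter fixed to ½.

```python
import numpy as np

def entropy(p):
    nz = p > 0
    return float(-np.sum(p[nz] * np.log(p[nz])))

def chi2_metric(p, q):
    """sqrt of chi^2(p, q) = sum_i (p_i - q_i)^2 / (p_i + q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    s = p + q
    nz = s > 0
    return float(np.sqrt(np.sum((p[nz] - q[nz]) ** 2 / s[nz])))

def js_metric(p, q):
    """sqrt of JS with mixture parameter 1/2."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    return float(np.sqrt(entropy(m) - 0.5 * entropy(p) - 0.5 * entropy(q)))

p = np.array([0.5, 0.3, 0.2, 0.0])
q = np.array([0.25, 0.25, 0.25, 0.25])
print(chi2_metric(p, q), js_metric(p, q))
```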









3.2.2 Improving Efficiency

After selecting the signature and dissimilarity measure (in our a case metric),

we have fixed the accuracy of the nearest neighbor search; so we can now turn our

attention to improving the efficiency of the search.

We construct a metric tree on the elements of the database to improve

efficiency. And in our experiment we will vary the representative used in the nodes

of the metric tree, finding which representative prunes most. We hold constant the

topology of the metric tree, determining each element's membership in a tree node

based on the texture variety from which it came. Specifically, each of the 61 texture

varieties has a corresponding node, and each of these nodes contains the 46 elements

arising from the realizations of that texture. Given this fixed node membership, we

can construct the appropriate representative for each node.

Then, given each representative's version of the metric tree, we perform

searches and count the number of comparisons required to find the nearest neigh-

bor.

3.3 Results and Discussion

The results with each dissimilarity measure were very similar, and theoretical

results [22] bear this observed similarity out by showing that the JS-divergence and

χ² are asymptotically related. Below we report the results for the JS-divergence.

Fig. 3-3 shows the speed-ups which the metric tree with m-ITC achieve for

each texture class. These data relate the total number of comparisons required

for an average query using the metric tree to the number of comparisons in an

exhaustive search. On average, the m-ITC discards 68.9% of the database, yielding

a factor of 3.2 improvement in efficiency.

It should not surprise that the indexing out-performs an exhaustive search,

so we now consider what happens when we vary the representative in the metric

tree. On average, the arithmetic mean discards 47.1% of the database, resulting






















Figure 3-3: On average for probes from each texture class, the speed-up relative to an exhaustive search achieved by the metric tree with the m-ITC as the representative

























a-D I
0.35
I I IT I I T T


0.3 l I I I I
iT8I iT
iF' '
0 r I TT^ ^ n n 1 i i i iT


o 0.25 i 1 .
Z I 1 1 1 T I III TI

,, I 1 1 ii i
0.2 i
aU)I I- II

i I
SII I
0.15 I I I I I I
0.1 I II


z1 1 II II 1 I I II

0 I I 5

5 10 15 20 25 30 35 40 45 50 55 60
Texture class number

Figure 3-4: The excess comparisons performed by the arithmetic mean relative to
the m-ITC within each texture class as a proportion of the total database















Figure 3-5: The excess comparisons performed by the best medoid relative to the m-ITC within each texture class as a proportion of the total database


in a speed-up factor of 1.9. Fig. 3-4 plots the excess proportion of the database

which the arithmetic mean searches relative to the m-ITC for each probe. The

box and whiskers respectively plot the median, quartiles, and range of the data

for all the probes within a class. On average, the metric tree with the arithmetic

mean explores roughly an additional 21% of the total database relative to the metric tree

with the m-ITC. In only 2.0% of the queries did the m-ITC fail to out-perform the

arithmetic mean. Since the proportion of excess comparisons is a positive value for

98.0% of the probes, we can conclude that the m-ITC almost always offers some

improvement and occasionally avoids searching more than a third of the database.









Fig. 3-5 shows a similar plot for the best medoid. Again, the box and whiskers

respectively plot the median, quartiles, and range of the data for all the probes

within a class. On average, the metric tree with the best medoid explores an

additional 22.1% of the total database relative to the metric tree with the m-ITC.

Here the m-ITC improves even more than it did over the arithmetic mean; in only

0.1% of the queries did the m-ITC fail to out-perform the best medoid, never once

doing more poorly.

3.3.1 Comparison to Pre-existing Efficiency Scheme

Varma and Zisserman propose a different approach to increase the efficiency of

queries [37]. They decrease the size of the database in a greedy fashion: Initially,

each texture class in the database has 46 models (one arising from each training

image). Then for each class, they discard the model whose absence impairs retrieval

performance the least on a subset of the training images. And this is repeated till

the number of models is suitably small and the estimated accuracy is suitably high.

While this method performed well in practice-achieving comparable accuracy

to an exhaustive search and reducing the average number of models per class from

46 to eight or nine, it has several potential shortcomings. The first is this method's

computational expense: Although the model selection process occurs off-line and

time is not critical, the number of times one must validate models against the

entire training subset scales quadratically in the number of models. Additionally

and more importantly, the model selection procedure depends upon the subset of

the training data used to validate it. It offers no guarantee of its accuracy relative

to an exhaustive search which utilizes all the known data.

In contrast, our method can compute the m-ITC efficiently, and more impor-

tantly we guarantee that the accuracy of the more efficient search is identical to the

accuracy that an exhaustive search would achieve. Although it must be noted that









the method of Varma and Zisserman performed fewer comparisons, we believe that

building a multi-level metric tree will bridge this gap.

Lastly, the two methods can co-exist simultaneously: Since the pre-existing

approach focuses on reducing the size of the database while ours indexes the

database (solving an orthogonal problem), nothing stops us from taking the smaller

database resulting from their method and performing our indexing atop it for

further improvement.

3.4 Conclusion

Our goal was to select the best single representative for a class of probability

distributions. We chose the m-ITC which minimizes the maximum KL-divergence

from each distribution in the class to it; and when we placed it in the nodes

of a metric tree, it allowed us to prune more efficiently. Experimentally, we

demonstrated significant speed-ups over exhaustive search in a state-of-the-art

texture retrieval system on the CUReT database. The metric tree approach to

nearest neighbor searches guarantees accuracy identical to an exhaustive search of

the database. Additionally, we showed that the m-ITC outperforms the arithmetic

mean and the best medoid when these other representatives are used analogously.

Probability distributions are a popular choice for retrieval in many domains,

and as the retrieval databases grow large, there will be a need to condense many

distributions into one representative. We have shown that the m-ITC is a useful

choice for such a representative with well-behaved theoretical properties and

empirically superior results.














CHAPTER 4
RETRIEVAL WITH e-CENTER

4.1 Introduction

In the course of designing a retrieval system, one must usually consider at least

three broad elements:

1. a signature that will represent each element, allowing for compact storage and

fast comparisons,

2. a (dis)similarity measure that will discriminate between a pair of signatures

that are close and a pair that are far from each other, and

3. an indexing structure or search strategy that will allow for efficient, non-

exhaustive queries.

The first two of these elements mostly determine the accuracy of a system's

retrieval results. The focus of this chapter, like the last, is on the third point.

A great deal of work has been done on retrieval systems that utilize a prob-

ability distribution as a signature. This work has covered a variety of domains

including shape [31], texture [34], [45], [35], [36], [37], and general images [32], [33].

Of these, some have used the Kullback-Leibler (KL) divergence [1] as a dissimilarity

measure [35], [36], [32].

The KL-divergence has many nice theoretical properties, particularly its

relationship to maximum-likelihood estimation [46]. However, in spite of this,

it is not a metric. This makes it challenging to construct an indexing structure

which respects the divergence. Many basic methods exist to speed up search

in Euclidean space including k-d trees and R*-trees. And there are even some

methods for general metric spaces such as ball trees [38], vantage point trees

[39], and metric trees [40]. Yet little work has been done on efficiently finding









exact nearest neighbors under KL-divergence. In this chapter, we present a novel

means of speeding nearest neighbor search (and hence retrieval) in a database of

probability distributions when the nearest neighbor is defined as the element that

minimizes the KL-divergence to the query. This approach does have a significant

drawback, though one which does not impair it on this particular dataset: in general, it

cannot guarantee retrieval accuracy equal to that of an exhaustive search.

The basic idea is a common one in computer science and reminiscent of the

last chapter: We represent a set of elements by one representative. During a search,

we compare the query object against the representative, and if the representative

is sufficiently far from the query, we discard the entire set that corresponds to it

without further comparisons. Our contribution lies in selecting this representative

in an optimal fashion; ideally we would like to determine the circumstances under

which we may discard the set without fear of accidentally discarding the nearest

neighbor, but this cannot always be guaranteed. For this application, we will utilize

the exponential information theoretic center (e-ITC).

In the remaining sections we first derive the expression upon which the system

makes pruning decisions. Thereafter we present the experiment showing increased

efficiency in retrieval over an exhaustive search and over the uniformly weighted

geometric mean, a reasonable alternate representative. Finally, we return to

the texture retrieval example of the previous chapter to compare the m- and e-

centers, evaluating which forms the tighter clusters as measured by the χ² metric

(equation 3.2).

4.2 Lower Bound

In this section we attempt to derive a lower bound on the KL-divergence from

a database element to a query which depends upon the element only through

its e-ITC. This lower bound guides the pruning and results in the subsequent

increased search efficiency which we describe in Section 4.3.










Figure 4-1: Intuitive proof of the lower bound in equation 4.7 (see text). The KL-divergence acts like squared Euclidean distance, and the Pythagorean Theorem holds under special circumstances. Q is the query, P is a distribution in the database, and C is the e-ITC of the set containing P. P* is the I-projection of Q onto the set containing P. On the right, D(C||P) ≤ Re, where Re is the e-radius, by the minimax definition of C.

In order to search for the nearest element to a query efficiently, we need to

bound the KL-divergence to a set of elements from beneath by a quantity which

only depends upon the e-ITC of that set. That way, we can use the knowledge

gleaned from a single comparison to avoid individual comparisons to each member

of the set.

We approach such a lower bound by examining the left side of Fig. 4-1. Here

we consider a query distribution Q and an arbitrary distribution P in a set which

has C as its e-ITC. As a stepping stone to the lower bound, we briefly define the
I-projection of a distribution Q onto a space E as

P* = argmin_{P ∈ E} D(P||Q).    (4.1)

It is well known that one can use intuition about the squared Euclidean distance

to appreciate the properties of the KL-divergence; and in fact, in the case of the









I-projection P* of Q onto E, we have in some cases a version of the familiar

Pythagorean Theorem [26]. In this case we have for all P ∈ E that


D(P||Q) ≥ D(P*||Q) + D(P||P*).    (4.2)

And in the case that

P* = αP1 + (1 − α)P2    (4.3)

for distributions P1, P2 ∈ E with 0 < α < 1, we call P* an algebraic inner
point and we have equality in equation 4.2.

Unfortunately it is this condition which we cannot verify easily as it depends

upon both the query (which we will label Q) and the structure of E which is

determined by the database. Interestingly we do get the equality version when

we take E as a linear family, like the families with which the m-ITC is concerned.

Regardless, we will continue with the derivation and demonstrate its use in this

application.

Assuming equality in equation 4.2 and applying it twice yields,


D(P||Q) = D(P*||Q) + D(P||P*)    (4.4)

D(C||Q) = D(P*||Q) + D(C||P*),    (4.5)

where we are free to select P ∈ E as an arbitrary database element and C as the
e-ITC. Equation 4.4 corresponds to the triangle QPP* while equation 4.5 corresponds to
the triangle QCP* in Fig. 4-1.

If we subtract the two equations above and re-arrange, we find,


D(P||Q) = D(C||Q) + D(P||P*) − D(C||P*).    (4.6)









But since the KL-divergence is non-negative, and since the e-radius Re is a uniform
upper bound on the KL-divergence from the e-ITC to any P ∈ E, we have

D(P||Q) ≥ D(C||Q) − Re.    (4.7)

Here we see that the m-ITC would do little better in guaranteeing this particular

bound. While it would ensure that we had equality in equation 4.2, it could not

bound the last term in equation 4.6 because the order of the arguments is reversed.
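To make the pruning rule concrete, the following sketch (in Python, with names of our own
choosing; an illustration under our assumptions, not code from this work) shows how the bound
in equation 4.7 lets a single comparison against a cluster's e-ITC rule out every member of
that cluster. As noted above, because equality in equation 4.2 is not guaranteed in general,
this pruning may in principle discard the true nearest neighbor.

import numpy as np

def kl(p, q):
    # Discrete KL-divergence D(p||q); assumes strictly positive histograms.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

def nearest_with_pruning(query, clusters):
    # clusters: list of (center, radius, members), where center is the cluster's
    # e-ITC, radius is its e-radius Re, and members are the stored distributions.
    # Returns the member minimizing D(P||query), skipping a whole cluster when
    # the bound D(C||query) - Re of equation 4.7 cannot beat the best so far.
    best, best_div = None, float("inf")
    for center, radius, members in clusters:
        if kl(center, query) - radius >= best_div:
            continue                     # bound says no member of this cluster can win
        for p in members:
            d = kl(p, query)
            if d < best_div:
                best, best_div = p, d
    return best, best_div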

We can get an intuitive, nonrigorous view of the same lower bound by again

borrowing notions from squared Euclidean distance. This pictorial reprise of

equation 4.7 can lend valuable insight to the tightness of the bound and its

dependence on each of the two terms. For this discussion we refer to the right side

of Fig. 4-1.

The minimax definition tells us that D(C||P) ≤ Re. We consider the case in

which this is equality and sweep out an arc centered at C with radius Re from the

base of the triangle counter-clockwise. We take the point where a line segment from

Q is tangent to this arc as a vertex of a right triangle with hypotenuse of length

D(C||Q). The leg which is normal to the arc has length Re by construction, and by

the Pythagorean Theorem the other leg of this triangle, which originates from Q,

has length D(C||Q) − Re. We can use the length of this leg to visualize the lower

bound, and by inspection we see that it will always be exceeded by the length of

the line segment originating from Q and terminating further along the arc at P.

This segment has length D(P||Q) and is indeed the quantity we seek to bound

from below.

4.3 Shape Retrieval Experiment

In this section we apply the e-ITC and the lower bound in equation 4.7 to

represent distributions arising from shapes. Since the lower bound guarantees that









we only discard elements that cannot be nearest neighbors, the accuracy of retrieval

is as good as an exhaustive search.

While we know from the theory that the e-ITC yields a smaller worst-case KL-

divergence, we now present an experiment to test if this translates into a tighter

bound and more efficient queries. We tackle a shape retrieval problem, using shape

distributions [31] as our signature. To form a shape distribution from a 3D shape,

we uniformly sample pairs of points from the surface of the shape and compute

the distance between these random points, building a histogram of these random

distances. To account for changes in scale, we independently scale each histogram

so that the maximum distance is always the same. For our dissimilarity measure,

we use KL-divergence, so the nearest neighbor P to a query distribution Q is


P = argmin_{P'} D(P'||Q).    (4.8)
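As a rough illustration of this signature (a sketch under our assumptions, not the exact
implementation used here), the Python fragment below builds such a shape distribution from
points already sampled uniformly on the surface; the pair count and bin count are arbitrary
choices of ours.

import numpy as np

def shape_distribution(surface_points, n_pairs=100000, n_bins=64, rng=None):
    # Histogram of distances between randomly chosen pairs of surface points,
    # rescaled so that the maximum sampled distance is always the same (here 1.0).
    rng = np.random.default_rng() if rng is None else rng
    pts = np.asarray(surface_points, dtype=float)
    i = rng.integers(0, len(pts), size=n_pairs)
    j = rng.integers(0, len(pts), size=n_pairs)
    d = np.linalg.norm(pts[i] - pts[j], axis=1)
    d /= d.max()
    hist, _ = np.histogram(d, bins=n_bins, range=(0.0, 1.0))
    hist = hist.astype(float) + 1e-12   # keep bins positive for KL-divergence
    return hist / hist.sum()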


For data, we use the Princeton Shape Database [47] which consists of over 1800

triangulated 3D models from over 160 classes including people, animals, buildings,

and vehicles.

To test the efficiency, we again compare the e-ITC to the uniformly weighted,

normalized geometric mean. Using the convexity of E we can generalize the lower

bound in equation 4.7 to work for the geometric mean by replacing the e-radius

with max_i D(C||P_i) for our different C.
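For reference, a minimal sketch (ours) of the alternate representative: the uniformly
weighted, normalized geometric mean of a cluster, together with the radius max_i D(C||P_i)
that substitutes for the e-radius in the bound above.

import numpy as np

def geometric_mean_center(members):
    # Uniformly weighted, normalized geometric mean of discrete distributions
    # (each row of `members` is strictly positive and sums to one).
    logs = np.log(np.asarray(members, dtype=float))
    g = np.exp(logs.mean(axis=0))
    return g / g.sum()

def center_radius(center, members):
    # max_i D(C||P_i): the uniform radius used in place of the e-radius.
    c = np.asarray(center, dtype=float)
    return max(float(np.sum(c * np.log(c / np.asarray(p, dtype=float))))
               for p in members)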

We take the base classification accompanying the database to define our

clusters, and then compute the e-ITC and geometric means of each cluster.

When we consider a novel query model (on a leave-one-out basis), we search

for the nearest neighbor utilizing the lower bound and disregarding unnecessary

comparisons. For each query, we measure the number of comparisons required to

find the nearest neighbor.




























Figure 4-2: The speed-up factor versus an exhaustive search when using the e-ITC
as a function of each class in the shape database.




















































Figure 4-3: The relative percent of additional prunings which the e-ITC achieves
beyond the geometric center, again for each class number.









Fig. 4-2 and Fig. 4-3 show the results of our experiment. In Fig. 4-2, we

see the speed-up factor that the e-ITC achieves over an exhaustive search. Aver-

aged over all probes in all classes, this speed-up factor is approximately 2.6; the

geometric mean achieved an average speed-up of about 1.9.

And in Fig. 4-3, we compare the e-ITC to the geometric mean and see that

for some classes, the e-ITC allows us to discard nearly twice as many unworthy

candidates as the geometric mean. For no class of probes did the geometric mean

prune more than the e-ITC, and when averaged over all probes in all classes, the

e-ITC discarded over 30% more elements than did the geometric mean.

4.4 Retrieval with JS-divergence

In this section we mimic the experiments of the previous chapter. Instead of

using the KL-divergence to determine nearest neighbors, we return to the square-

root of the JS-divergence, and we again use the triangle inequality to guarantee no

decrease in accuracy.
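The dissimilarity used here is the square root of the JS-divergence; a small sketch of that
quantity for discrete histograms (our own helper, assuming strictly positive bins):

import numpy as np

def sqrt_js(p, q):
    # Square root of the Jensen-Shannon divergence; this quantity is a metric,
    # so the triangle inequality gives pruning with no loss of accuracy.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    js = 0.5 * np.sum(p * np.log(p / m)) + 0.5 * np.sum(q * np.log(q / m))
    return float(np.sqrt(js))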

Using metric trees with the e-center, geometric mean, and m-center, we can

compare the efficiencies of each representative and the overall speedup. In Fig. 4-4,

we see the speedup relative to an exhaustive search when using the e-ITC; on

average, the speedup factor is 1.53. In Fig. 4-5 we compare the e-ITC to the

geometric mean, much as we compared the m-ITC to the arithmetic mean in the

last chapter. We claim that this is the natural comparison because the exponential

family consists of weighted geometric means. Here the geometric mean searches on

average 7.2% more of the database than the e-ITC. Lastly, in Fig. 4-6 we compare

the two centers against each other. Here the e-ITC comes up short, searching on

average an additional 14% of the database. In the next section we try to explain

this result.



















































Figure 4-4: Speedup factor for each class resulting from using the e-ITC over an exhaustive search















































































Figure 4-5: Excess searches performed using the geometric mean relative to the e-ITC as a proportion of the total database
























Figure 4-6: Excess searches performed using the e-ITC relative to the m-ITC as a proportion of the total database









4.5 Comparing the m- and e-ITCs

Since both centers have found successful application to retrieval, it is reason-

able to explore their relationship. Because the arguments in their respective minimax

criteria are reversed, it is not immediately clear that a meaningful comparison

could be made with KL-divergence alone. Hence, we resort to the χ² distance from

the previous chapter as an arbiter (though we could just as well use JS-divergence).

The comparison we make next is simple. Returning to the texture retrieval

dataset from the previous chapter, we use the same set memberships and calculate

an m-ITC and an e-ITC for each set. Then for each representative and each set we

determine the maximum χ² distance between an element of that set and the

representative.
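A sketch of that comparison (we assume the common symmetric form of the χ² distance here;
the thesis' equation 3.2 may differ by a constant factor, which would cancel in the ratio):

import numpy as np

def chi2(p, q):
    # Symmetric chi-square distance (assumed form of equation 3.2).
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum((p - q) ** 2 / (p + q)))

def radius_ratio(e_center, m_center, members):
    # Ratio of the "chi-square radius" of the e-ITC to that of the m-ITC
    # over the members of one class (the quantity plotted in Fig. 4-7).
    r_e = max(chi2(e_center, p) for p in members)
    r_m = max(chi2(m_center, p) for p in members)
    return r_e / r_m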

The results appear in Fig. 4-7 as the ratio between this "χ² radius" of the

e-ITC and the m-ITC. Since the numbers are greater than one with the exception

of only two out of 61 classes, it is safe to conclude in this setting that the m-ITC

forms tighter clusters. This result helps explain the superior performance of the

m-center in the previous section.

In retrospect, one could attribute this to the m-ITC's global optimality

property (cf. Section 2.2.1) which the e-ITC may not share.




















































Figure 4-7: The ratio of the maximal χ² distance from each center to all of the elements in a class (e-ITC radius over m-ITC radius)

















CHAPTER 5
TRACKING

5.1 Introduction

In the previous chapters, we considered the problem of efficient retrieval. In

the case of retrieval, where a uniform upper bound is important, one measures how

well a representative does by focusing on how it handles the most distant members.

This property is why the minimax representatives are well-suited to retrieval. In

this chapter, we explore the question of whether the same can be said for tracking.

We first present several encouraging signs that in fact it may be, and then we go

on to consider an experiment to test the performance of a tracker built around the

m-ITC. But first we set the context of the tracking problem in probabilistic terms.

5.2 Background-Particle Filters

The tracking problem consists of estimating and maintaining a hidden variable

(usually position, sometimes pose or a more complicated state) from a sequence

of observations. Commonly, instead of simply keeping one guess as to the present

state and updating that guess at each new observation, one stores and updates

an entire probability distribution on the state space; then, when pressed to give a

single, concrete estimate of the state, one uses some statistic (e.g., mean or mode)

of that distribution.

This approach is embodied famously in the Kalman filter, where the probabil-

ity distribution on the state space is restricted to be a Gaussian, and the trajectory

of the state (or simply the motion of the object) is assumed to follow linear dynam-

ics so that the Gaussian at one time step may propagate to another Gaussian at

the next.









When this assumption is too limiting-possibly because background clutter

creates multiple modes in the distribution on the state space or because of complex

dynamics or both-researchers often turn to a particle filter to track an object [48].

Unlike the Kalman filter, particle filters do not require that the probability dis-

tribution on the state space be a Gaussian. To gain this additional representative

power, a particle filter stores a set of samples from the state space with a weight for

each sample describing its probability. The goal then is to update these sets when

new observations arrive. One can update from a time step t − 1 to a time step t in

three steps [48]:

1. Given a sample set {s_1^{(t-1)}, ..., s_N^{(t-1)}} and its associated weights
{π_1^{(t-1)}, ..., π_N^{(t-1)}}, randomly sample (with replacement) a new set
{s'_1^{(t)}, ..., s'_N^{(t)}}.

2. To arrive at the new s_n^{(t)}, randomly propagate each s'_n^{(t)} by assigning s_n^{(t)} to
a value of x drawn according to a probability distribution (i.e., the motion model)
p_m(x | s'_n^{(t)}).

3. Adjust the weights according to the likelihood of observing the data z^{(t)} given
that the true state is s_n^{(t)}:

π_n^{(t)} = (1/Z) p_d(z^{(t)} | s_n^{(t)}),    (5.1)

where Z is a normalization factor.
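A minimal sketch of one such update (generic motion and observation models passed in as
callables; this is the textbook scheme under our own naming, not code from the thesis):

import numpy as np

def particle_filter_step(samples, weights, z, motion_sample, likelihood, rng):
    # One update of the basic particle filter.
    # samples: (N, d) array of states; weights: (N,) probabilities summing to 1;
    # motion_sample(x, rng) draws a successor state from p_m(. | x);
    # likelihood(z, x) evaluates p_d(z | x).
    n = len(samples)
    idx = rng.choice(n, size=n, replace=True, p=weights)                   # step 1: resample
    propagated = np.array([motion_sample(samples[i], rng) for i in idx])   # step 2: propagate
    w = np.array([likelihood(z, x) for x in propagated])                   # step 3: reweight
    return propagated, w / w.sum()                                         # normalize by Z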

In this work we restrict our attention to a two dimensional state space

consisting only of an object's position. Also we exclusively consider motion models

pm in which

p_m(x | y) = p_m(x + x_0 | y + x_0) = G(x − y),    (5.2)

where G is a Gaussian. That is, p_m is a "shift-invariant" model in which the









next is independent of that starting position. Furthermore, we take that probability

of a displacement as determined by a Gaussian.
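Under these restrictions the motion model reduces to adding a Gaussian displacement to the
current position. A sketch of such a model (the mean and covariance shown here are
placeholders, not the values estimated later) compatible with the particle filter sketch above:

import numpy as np

def make_gaussian_motion(mean_velocity, cov):
    # Shift-invariant motion model of equation 5.2: the displacement x - y is
    # drawn from a Gaussian G, independently of the starting position y.
    mean_velocity = np.asarray(mean_velocity, dtype=float)
    cov = np.asarray(cov, dtype=float)
    def motion_sample(x, rng):
        return np.asarray(x, dtype=float) + rng.multivariate_normal(mean_velocity, cov)
    return motion_sample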

5.3 Problem Statement

In this chapter we consider the following scenario: We have several objects,

and for each object we know its distinct probabilistic motion model. We want to

build a single particle filter with one motion model which can track any of the

objects. This is reminiscent of the problem of designing a universal code that can

efficiently encode any of a number of distinct sources, and that similarity suggests

that this is a promising application for an information theoretic center.

Related to this problem is the case in which one single object undergoes

distinct phases of motion, each of which has a distinct motion model. An

example of this is a car that moves in one fashion when it drives straight and in a

completely different fashion when turning. This work does not explore such multi-

phase tracking. For this related problem of multi-phase motion, there are certainly

more complicated motion models suited to the problem [49]. And for the problem

we focus on in this chapter, one could also imagine refining the motion model as

observations arrive, until, once in possession of a preponderance of evidence, one

finally settles on the most likely component. But all of these require some sort of

on-line learning; in contrast, the approach we present offers a single, simple,

fixed prior which can be directly incorporated into the basic particle filter.

5.4 Motivation

5.4.1 Binary State Space

To motivate the use of an ITC, we begin with a toy example in which we

"track" the value of a binary variable. Our caricature of a tracker is based on

a particle filter with N particles. In this example, we make a further, highly

simplifying assumption: The observation model is perfect and clutter-free. That

is, the likelihood p_d in equation 5.1 has perfect information. This means that any









particles which propagate to the incorrect state receive a weight of zero, and the

particles (if any) which propagate to the correct state share the entire probability

mass among themselves.

Under these assumptions, the event that the tracker fails (and henceforth never

recovers) at each time step is an independent, identically distributed Bernoulli

random variable with probability


P_fail = p(1 − q)^N + (1 − p)q^N,    (5.3)

where p is the probability the tracker takes on state 0 and q is the probability

that a particle evolves under its motion model to state 0. Similarly, 1 − p is the

probability the tracker takes on state 1 and 1 − q is the probability that a particle

evolves to state 1. What the equation above says is that our tracker will fail if and

only if all N particles choose wrongly.

Now the interesting thing about this example is that a motion model in

which q / p can outperform one in which q = p. Specifically by differentiating

equation 5.3 with respect to q, we find that Pfail takes on a minimum when


q ( 1 (5.4)

As a concrete example, we take the case with p = .1 and N = 10; here the optimal

value for q is .4393. When we find the expected number of trials till the tracker

fails in each case (simply pfA), we find that if we take a motion model with q p,

the tracker goes for an average of 29 steps, but if we take the optimal value of q,

the tracker continues for 1825 steps.
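These numbers can be checked directly from equations 5.3 and 5.4 (a small sketch of ours;
29 and 1825 are rounded values of 1/P_fail):

def p_fail(p, q, n):
    # Equation 5.3: probability that every one of the n particles chooses wrongly.
    return p * (1 - q) ** n + (1 - p) * q ** n

p, n = 0.1, 10
q_opt = 1.0 / (1.0 + ((1 - p) / p) ** (1.0 / (n - 1)))   # equation 5.4
print(q_opt)                     # about 0.4393
print(1 / p_fail(p, p, n))       # about 29 steps with q = p
print(1 / p_fail(p, q_opt, n))   # about 1825 steps with the optimal q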

Here we see reminders of how the ITCs give more weight to the extraordinary

situations than other representatives. In this case it is justified to under-weight the

most likely case because even having a single particle arrive at the correct location

is as good as having all N particles arrive.









5.4.2 Self-information Loss

For more evidence suggesting that the ITC might be well-suited, we consider

the following analysis [50]. Suppose we try to predict a value x with distribution

p(x). But instead of picking a single value, we specify another distribution q(x)

which defines our confidence for each value of x.

Now, depending on the value x that occurs, we pay a penalty based on how
much confidence we had in that value; if we had a great deal of confidence (q(x)
close to one), we pay a small penalty, and if we did not give much credence to the
value, we pay a larger penalty. The self-information loss function is a common
choice for this penalty. According to this function, if the value x occurs, we would
incur a penalty of −log q(x). If we examine the expected value of our loss, we find

E[−log q(x)] = KL(p, q) + H(p).    (5.5)
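The identity in equation 5.5 follows directly from splitting the logarithm (a one-line
derivation, written here in LaTeX):

\[
\mathbb{E}_p\left[-\log q(x)\right]
  = \sum_x p(x)\log\frac{p(x)}{q(x)} + \sum_x p(x)\log\frac{1}{p(x)}
  = \mathrm{KL}(p,q) + H(p).
\]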


Returning to our problem statement, if we are faced with a host of distribu-

tions and want to find a single q to minimize the expected loss in equation 5.5 over

the set of distributions, we begin to approach something like the m-ITC.

5.5 Experiment-One Tracker to Rule Them All?

5.5.1 Preliminaries

To test how well the m-ITC incorporates a set of motion models into one

tracker, we designed the following experiment. Given a sequence of images, we first

estimated the true motion by hand, measuring the position of the object of interest

at key frames. We then fit a Gaussian to the set of displacements, taking the mean

of the Gaussian as the average velocity.

Data. A single frame from the 74-frame sequence appears in Fig. 5-1. In this

sequence we track the head of the walker which has nearly constant velocity in

the x-direction and slight periodic motion in the y-direction (as she steps). The

mean velocity in the x-direction was -3.7280 pixels/frame with a marginal standard


















































Figure 5-1: Frame from test sequence









deviation of 1.15; in the y-direction the average velocity was -0.1862 with

standard deviation 2.41.

For our observation model, we simply use a template of the head from the

initial frame and compare it to a given region, finding the mean squared error

(MSE) of the gray levels. The likelihood is then exp(−MSE). We initialize all

trackers to the true state at the initial frame.
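A sketch of such an observation model (our own helper; we assume intensities scaled to
[0, 1] so that exp(−MSE) stays in a usable range, and a simple (column, row) state layout):

import numpy as np

def make_template_likelihood(template):
    # Observation model: exp(-MSE) between the stored template and the image
    # patch at state x; intensities are assumed to be scaled to [0, 1].
    template = np.asarray(template, dtype=float)
    h, w = template.shape
    def likelihood(frame, x):
        c, r = int(round(x[0])), int(round(x[1]))     # state x = (column, row)
        patch = np.asarray(frame, dtype=float)[r:r + h, c:c + w]
        if patch.shape != template.shape:             # state off the image
            return 1e-12
        return float(np.exp(-np.mean((patch - template) ** 2)))
    return likelihood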

Motion models. From this single image sequence, we can hallucinate numerous

image sequences which consist of rotated versions of the original sequence. We

know the true motion models for all of these novel sequences since they are just the

original motion model rotated by the same amount.

To examine the performance of one motion model applied to a different image

sequence, one need only consider the angle disparity between the true underlying

motion model of the image sequence and the motion model utilized by the tracker.

In Fig. 5-2 we report the performance in average time-till-failure as a function of

angle disparity. (Here the number of particles is fixed to 10.)

Since in this experiment (and subsequent ones in this chapter), we define

a tracker as having irrevocably failed when all of its particles are at a distance

greater than 40 pixels from the location, we see that even the most hopeless of

trackers will succeed in tracking the subject for six frames, just long enough

for all of its particles to flee from the correct state at an average relative velocity

of 7.5 pixels/frame. This observation lets us calculate a lower bound on the

performance of a tracker with a given angle disparity from the true motion:

Pessimistically assuming a completely uninformative observation model, one can

calculate the time required for the centroid of the particles to exceed a distance of

D = 40 pixels from the true state as


t = D / (2r sin(θ/2)),    (5.6)









































Figure 5-2: Average time till failure as a function of the angle disparity between the true motion and the tracker's motion model











where r = 3.7326 is the speed at which the centroid and the true state each move and θ is

the angle disparity. The dashed line in Fig. 5-2 represents this curve, capped at the

maximum number of frames 74.
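Evaluating equation 5.6 at the values quoted here reproduces that dashed curve (a two-line
check of ours):

import numpy as np

def drift_time_lower_bound(theta, r=3.7326, d=40.0, max_frames=74):
    # Equation 5.6: frames until the particle centroid is d pixels from the
    # true state when the motion model is off by angle theta (radians).
    denom = 2.0 * r * np.sin(abs(theta) / 2.0)
    return max_frames if denom == 0.0 else min(max_frames, d / denom)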

One should note that since all of the motion models are rotated versions

of each other, the H(P) term in equation 5.5 is a constant. Hence the q which

minimizes the maximum expected self-information loss over a set of p's is in fact

the m-ITC.

Performance of mixtures. Because we will take the m-ITC of several of these

motion models, it is also of interest how mixtures perform. We consider a mixture

of the correct motion model and a motion model π radians rotated in the opposite

direction, which essentially contributes nothing to the tracking. In Fig. 5-3 we

again plot the time-till-failure for trackers with 10 and 20 particles as a function of

the weight in the mixture of the correct model.

To derive a lower bound, this time we represent the proportion of the proba-

bility near the true state at time t as r_t and the remainder as w_t. Further, we say

that at each time step, a·r_t of the probability moves to (or remains in) the correct

state (as a result of those particles being driven by the correct motion model) and

the rest moves away from the correct state. Next we model the step in the particle

filter algorithm where we adjust the weights. We assume that a particle away from

the true state receives a weight that is c < 1 times the weight received by a particle

near the true state. If there were no clutter in the scene, this number would be zero

and we would return to the assumption in Section 5.4.1. Now, by pessimistically

assuming that particles which move away from the correct state have exceedingly

small chance (i.e., zero) of rejoining the true state randomly we can derive the







































Figure 5-3: Average time till failure as a function of the weight on the correct motion model, for 10 and 20 particles









following from an initial r_0 = 1,

r_t = a^t / Z,    (5.7)

w_t = (1 − a) Σ_{i=0}^{t−1} c^{t−i} a^i / Z,    (5.8)

where Z is a normalizing constant. By taking the lower bound w_t ≥ (1 − a) c a^{t−1} / Z,
we can derive that r_t ≤ a / (a + (1 − a) c). Finally, by taking this upper bound on r_t as the
parameter of a Bernoulli random variable as we did in Section 5.4.1, we can get a lower bound
on the expected time-till-failure, plus an adjustment for the six free frames
required to drift out of the 40 pixel range. This is the lower bound shown in Fig. 5-3.
To calculate c, we randomly sampled the background at a distance greater than
40 pixels from the true state, and averaged the response of the observation model,
yielding c ≈ 0.1886.

5.5.2 m-ITC

Again we compare the performance of the m-ITC to the arithmetic mean. This

time we form mixtures of several motion models of varying angles, taking weights

either as uniform (for the arithmetic mean) or as determined by the m-ITC. In the

first case we take a total of 12 motion models with angle disparities of


{±5π/20, ±4π/20, ±3π/20, ±2π/20, ±π/20, 0, π}.


On this mixture, the m-ITC assigns weights of .31 to each of the motion models at

±π/4 and the remaining .37 to the motion model at π.

We tested these trackers when the true motion model has orientation zero,

π/4, and π, respectively, and report their average times-till-failure in Table 5-1.

Indeed as we might expect from a minimax representative, the m-ITC registers the

best worst-case performance; but nevertheless it is not an impressive performance.

The second set of motion models was slightly less extreme in variation,

{0, ±π/64, ±π/32, ±π/4}. To these components, the m-ITC split its probability









evenly between the motion models at disparities ±π/4. In this case, the m-ITC had

a better showing overall, and still had the best worst case performance. The results

are shown in Table 5-2.

Table 5-1: Average time-till-failure of the tracker based on the arithmetic mean
(AM) and on the m-ITC (800 particles)

True angle   m-ITC   AM
0            43      74
π/4          23      46
π            16      7



Table 5-2: Average time-till-failure of the tracker based on the arithmetic mean
(AM) and on the m-ITC (400 particles) with second set of models

True angle   m-ITC   AM
0            72      74
π/4          44      37



5.6 Conclusion

While there is reason to believe that a minimax representative would serve

well in combining several motion models in tracking, the precise circumstances

when this might be beneficial are difficult to determine. Examining Fig. 5-2 and

Fig. 5-3 it seems that if there is too little weight or too great a disparity, a tracker

is doomed from the beginning. So while the m-ITC will not perform ideally under

all situations, it still retains its expected best worst-case performance.















CHAPTER 6
CONCLUSION

6.1 Limitations

The central thrust of this work has been the claim that despite many computer

vision researchers' instinctive suspicion of minimax methods, given the right

application, they can be useful. However those skeptical researchers' instincts

are often well-founded: The main issue one must be aware of regarding the

representatives presented in this work is their sensitivity to outliers. One must

carefully consider one's data, particularly the extreme elements, because those are

precisely the elements with the most influence on these representatives.

In addition to data, one must also consider how one's application defines

successful results. If one's application can tolerate small deviations in the "typical

cases," and successful behavior is defined by good results in extreme cases, then

a minimax representative might be appropriate. This is precisely the case in the

retrieval problem where a uniform upper bound on the dispersion of a subset is

the criterion on which successful indexing is judged. Such a criterion disregards

whether the innermost members are especially close to the representative or not.

Despite some initial suggestions, this did not turn out to be the case in the tracking

domain (in the presence of clutter). There it seems the deviations in the "normal

cases" did have a significant effect on performance.

6.2 Summary

After characterizing two minimax representatives, with firm groundings

in information theory, we have shown how they can be utilized to speed the

retrieval of textures, images, shapes, or any object so represented by a probability

distribution. Their power in this application comes from the fact that they form









tight clusters, allowing for more precise localization and efficient pruning than other

common representatives. While the tracking results did not bear as much fruit, we

still believe that it is a promising avenue for such a representative if the problem is

properly formulated.

6.3 Future Work

This topic touches upon a myriad of areas including information theory,

information geometry, and learning. Csiszar and others have characterized the

expectation-maximization algorithm in terms of divergence minimization; and

we believe that incorporating the ITCs into some EM-style algorithm would be

very interesting. Also of interest are their connections to AdaBoost and other online

learning algorithms. But for all of these avenues, the main challenge remains that of

verifying that the data and measurement of success are appropriate fits to these

representatives.















REFERENCES


[1] T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley &
Sons, New York, NY, 1991.

[2] J. Lin, "Divergence measures based on the Shannon entropy," IEEE Trans.
Inform. Theory, vol. 37, no. 1, pp. 145-151, Mar. 1991.

[3] P. T. Fletcher, C. Lu, and S. Joshi, "Statistics of shape via principal geodesic
analysis on Lie groups," IEEE Trans. Med. Imag., vol. 23, no. 8, pp. 995-1005,
Aug. 2004.

[4] E. Klassen, A. Srivastava, W. Mio, and S. H. Joshi, "Analysis of planar shapes
using geodesic paths on shape spaces," IEEE Trans. Pattern Anal. Machine
Intell., vol. 26, no. 3, pp. 372-383, Mar. 2004.

[5] S.-I. Amari, Methods of Information Geometry, American Mathematical
Society, Providence, RI, 2000.

[6] S.-I. Amari, "Information geometry on hierarchy of probability distributions,"
IEEE Trans. Inform. Theory, vol. 47, no. 5, pp. 1707-1711, Jul. 2001.

[7] B. Pelletier, "Informative barycentres in statistics," Annals of the Institute of
Statistical Mathematics, to appear.

[8] Z. Wang and B. C. Vemuri, "An affine invariant tensor dissimilarity measure
and its applications to tensor-valued image segmentation," in Proc. IEEE
Conf. Computer Vision and Pattern Recognition, Washington, DC, Jun./Jul.
2004, vol. 1, pp. 228-233.

[9] D. P. Huttenlocher, G. A. Klanderman, and W. A. Rucklidge, "Comparing
images using the Hausdorff distance," IEEE Trans. Pattern Anal. Machine
Intell., vol. 15, no. 9, pp. 850-863, Sep. 1993.

[10] J. Ho, K.-C. Lee, M.-H. Yang, and D. Kriegman, "Visual tracking using
learned linear subspaces," in Proc. IEEE Conf. Computer Vision and Pattern
Recognition, Washington, DC, Jun./Jul. 2004, pp. 228-233.

[11] R. I. Hartley and F. Schaffalitzky, "L∞ minimization in geometric recon-
struction problems," in Proc. IEEE Conf. Computer Vision and Pattern
Recognition, Washington, DC, Jun./Jul. 2004, pp. 504-509.









[12] S. C. Zhu, Y. N. Wu, and D. Mumford, "Minimax entropy principle and
its application to texture modeling," Neural Computation, vol. 9, no. 8, pp.
1627-1660, Nov. 1997.

[13] C. Liu, S. C. Zhu, and H.-Y. Shum, "Learning inhomogeneous Gibbs model of
faces by minimax entropy," in Proc. Int'l Conf. Computer Vision, Vancover,
Canada, Jul. 2001, pp. 281-287.

[14] I. Csiszar, "Why least squares and maximum entropy? An axiomatic approach
to inference for linear inverse problems," Annals of Statistics, vol. 19, no. 4,
pp. 2032-2066, Dec. 1991.

[15] D. P. Bertsekas, Nonlinear Programming, Athena Scientific, Belmont, MA,
1999.

[16] N. Merhav and M. Feder, "A strong version of the redundancy-capacity
theorem of universal coding," IEEE Trans. Inform. Theory, vol. 41, no. 3, pp.
714-722, May 1995.

[17] I. Csiszar, "I-divergence geometry of probability distributions and mini-
mization problems," Annals of Probability, vol. 3, no. 1, pp. 146-158, Jan.
1975.

[18] R. Sibson, "Information radius," Z. Wahrscheinlichkeitstheorie verw. Geb.,
vol. 14, no. 1, pp. 149-160, Jan. 1969.

[19] N. Jardine and R. Sibson, Mathematical Taxonomy, John Wiley & Sons,
London, UK, 1971.

[20] A. O. Hero, B. Ma, O. Michel, and J. Gorman, "Applications of entropic
spanning graphs," IEEE Signal Processing Mag., vol. 19, no. 5, pp. 85-95, Sep.
2002.

[21] Y. He, A. B. Hamza, and H. Krim, "A generalized divergence measure for
robust image registration," IEEE Trans. Signal Processing, vol. 51, no. 5, pp.
1211-1220, May 2003.

[22] D. M. Endres and J. E. Schindelin, "A new metric for probability distri-
butions," IEEE Trans. Inform. Theory, vol. 49, no. 7, pp. 1858-1860, Jul.
2003.

[23] F. Topsøe, "Some inequalities for information divergence and related measures
of discrimination," IEEE Trans. Inform. Theory, vol. 46, no. 4, pp. 1602-1609,
Jan. 2000.

[24] J. Burbea and C. R. Rao, "On the convexity of some divergence measures
based on entropy functions," IEEE Trans. Inform. Theory, vol. 28, no. 3, pp.
489-495, May 1982.









[25] R. G. Gallager, Information Theory and Reliable Communication, John Wiley
& Sons, New York, NY, 1968.

[26] I. Csiszar and J. G. Körner, Information Theory: Coding Theorems for
Discrete Memoryless Systems, Academic Press, Inc., New York, NY, 1981.

[27] L. D. Davisson and A. Leon-Garcia, "A source matching approach to finding
minimax codes," IEEE Trans. Inform. Theory, vol. 26, no. 2, pp. 166-174,
Mar. 1980.

[28] B. Y. Ryabko, "Comments on 'A source matching approach to finding
minimax codes'," IEEE Trans. Inform. Theory, vol. 27, no. 6, pp. 780-781,
Nov. 1981.

[29] Y. Freund and R. E. Schapire, "A decision-theoretic generalization of on-line
learning and an application to boosting," J. Computer and System Sciences,
vol. 55, no. 1, pp. 119-139, Aug. 1997.

[30] J. Kivinen and M. K. Warmuth, "Boosting as entropy projection," in
Proceedings of the Twelfth Annual Conference on Computational Learning
Theory, Santa Cruz, CA, Jul. 1999, pp. 134-144.

[31] R. Osada, T. Funkhouser, B. Chazelle, and D. Dobkin, "Shape distributions,"
ACM Trans. Graphics, vol. 21, no. 4, pp. 807-832, Oct. 2002.

[32] S. Gordon, J. Goldberger, and H. Greenspan, "Applying the information
bottleneck principle to unsupervised clustering of discrete and continuous
image representations," in Proc. Int'l Conf. Computer Vision, Nice, France,
Oct. 2003, pp. 370-396.

[33] C. Carson, S. Belongie, H. Greenspan, and J. Malik, "Blobworld: image
segmentation using expectation-maximization and its application to image
querying," IEEE Trans. Pattern Anal. Machine Intell., vol. 24, no. 8, pp.
1026-1038, Aug. 2002.

[34] Y. Rubner, C. Tomasi, and L. Guibas, "A metric for distributions with
applications to image databases," in Proc. Int'l Conf. Computer Vision,
Bombay, India, Jan. 1998, pp. 59-66.

[35] J. Puzicha, J. M. Buhmann, Y. Rubner, and C. Tomasi, "Empirical evaluation
of dissimilarity measures for color and texture," in Proc. Int'l Conf. Computer
Vision, Kerkyra, Greece, Sep. 1999, pp. 1165-1172.

[36] M. N. Do and M. Vetterli, "Wavelet-based texture retrieval using generalized
Gaussian density and Kullback-Leibler distance," IEEE Trans. Image
Processing, vol. 11, no. 2, pp. 146-158, Feb. 2002.









[37] M. Varma and A. Zisserman, "Texture classification: Are filter banks
necessary?," in Proc. IEEE Conf. Computer Vision and Pattern Recognition,
Madison, WI, Jun. 2003, pp. 691-698.

[38] S. M. Omohundro, "Bumptrees for efficient function, constraint, and classifica-
tion learning," in Advances in Neural Information Processing Systems, Denver,
CO, Nov. 1990, vol. 3, pp. 693-699.

[39] P. N. Yianilos, "Data structures and algorithms for nearest neighbor search in
general metric spaces," in Proc. ACM-SIAM Symp. on Discrete Algorithms,
Austin, TX, Jan. 1993, pp. 311-321.

[40] J. K. Uhlmann, "Satisfying general proximity/similarity queries with metric
trees," Information Processing Letters, vol. 40, no. 4, pp. 175-179, Nov. 1991.

[41] A. Moore, "The anchors hierarchy: Using the triangle inequality to survive
high-dimensional data," in Proc. Conf. on Uncertainty in Artificial Intelligence,
Stanford, CA, Jun./Jul. 2000, pp. 397-405.

[42] M. Varma and A. Zisserman, "Classifying images of materials: Achieving
viewpoint and illumination independence," in Proc. European Conf. Computer
Vision, Copenhagen, Denmark, May/Jun. 2002, pp. 255-271.

[43] T. Leung and J. Malik, "Recognizing surfaces using three-dimensional
textons," in Proc. Int'l Conf. Computer Vision, Kerkyra, Greece, Sep. 1999,
pp. 1010-1017.

[44] K. J. Dana, B. van Ginneken, S. K. Nayar, and J. J. Koenderink, "Reflectance
and texture of real-world surfaces," ACM Trans. Graphics, vol. 18, no. 1, pp.
1-34, Jan. 1999.

[45] E. Levina and P. Bickel, "The earth mover's distance is the Mallows distance:
some insights from statistics," in Proc. Int'l Conf. Computer Vision, Vancouver,
Canada, Jul. 2001, pp. 251-256.

[46] N. Vasconcelos, "On the complexity of probabilistic image retrieval," in Proc.
Int'l Conf. Computer Vision, Vancouver, Canada, Jul. 2001, pp. 400-407.

[47] P. Shilane, P. Min, M. Kazhdan, and T. Funkhouser, "The Princeton shape
benchmark," in Shape Modeling International, Genova, Italy, Jun. 2004, pp.
167-178.

[48] M. Isard and A. Blake, "Condensation-conditional density propagation for
visual tracking," Int'l J. of Computer Vision, vol. 29, no. 1, pp. 5-28, Jan.
1998.

[49] M. Isard and A. Blake, "A mixed-state condensation tracker with automatic
model-switching," in Proc. Int'l Conf. Computer Vision, Bombay, India, Jan.
1998, pp. 107-112.








[50] N. Merhav and M. Feder, "Universal prediction," IEEE Trans. Inform.
Theory, vol. 44, no. 6, pp. 2124-2147, Oct. 1998.















BIOGRAPHICAL SKETCH

A native Floridian, Eric Spellman grew up on Florida's Space Coast, gradu-

ating from Satellite High School in 1998. Thereafter he attended the University of

Florida, receiving his Bachelor of Science in mathematics in 2000, his Master of En-

gineering in computer information science and engineering in 2001, and, under the

supervision of Baba C. Vemuri, his Doctor of Philosophy in the same in 2005. After

graduating he will return to the Space Coast with his wife Kayla and daughter

Sophia to work for Harris Corporation.