
Information Fusion and Sparsity Promotion Using Choquet Integrals

Permanent Link: http://ufdc.ufl.edu/UFE0021658/00001

Material Information

Title: Information Fusion and Sparsity Promotion Using Choquet Integrals
Physical Description: 1 online resource (243 p.)
Language: english
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2008

Subjects

Subjects / Keywords: choquet, fuzzy, gibbs, integral, logistic, measures, regression, sampler, slice
Computer and Information Science and Engineering -- Dissertations, Academic -- UF
Genre: Computer Engineering thesis, Ph.D.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, territorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

Notes

Abstract: Our study addressed problems encountered in combining information from multiple sources, and novel methods for learning the parameters of information-aggregation operators are proposed. In practical pattern-classification applications, multiple algorithms are often developed for the same classification problem. Each algorithm produces confidence values by which each new sample may be classified, and we would like to aggregate these confidences to produce the best possible confidence for the given sample; this is a particular instance of information fusion. In addition to learning aggregation parameters that assign the best confidence to a given sample, we would also like the aggregation operators to use only a subset of the algorithm confidences while achieving the same level of performance as the full set, since using fewer algorithms lowers the cost of applications. Choquet integrals are nonlinear operators, based on fuzzy measures, that can represent a wide variety of aggregation operators. Previous research has demonstrated the utility of Choquet integrals for this problem compared with other methods such as neural networks and Bayesian approaches. However, one of the novel results of this research is that the learned measures can be very sensitive to the choice of desired outputs. In response to this problem, we propose an alternative training methodology based on Minimum Classification Error (MCE) training that does not require desired outputs. A limitation of this method is that it depends on a constrained type of fuzzy measure, the Sugeno measure, so there is a need for additional approaches to learning unconstrained fuzzy measures that are more computationally attractive and provide more robust performance. We propose an approach to learning unconstrained fuzzy measures that relies on Markov chain Monte Carlo (MCMC) sampling methods; the use of such methods for learning measures for Choquet-integral fusion is completely novel. In addition, we propose imposing sparsity-promoting Bayesian prior distributions on the measure parameters during sampling as a way of selecting subsets of the algorithms for inclusion in the aggregation, an approach that is entirely new for learning fuzzy measures.
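
Note: The abstract describes Choquet-integral fusion only in prose. The short sketch below illustrates how a discrete Choquet integral aggregates algorithm confidences with respect to a fuzzy measure; it is a minimal illustration, not the dissertation's implementation, and the source names and measure values are assumptions made for the example.

from typing import Dict, FrozenSet

def choquet_integral(h: Dict[str, float], g: Dict[FrozenSet[str], float]) -> float:
    """Discrete Choquet integral of confidences h (source -> value in [0, 1])
    with respect to a fuzzy measure g (frozenset of sources -> measure value),
    where g[frozenset()] == 0.0 and g over the full set of sources == 1.0."""
    order = sorted(h, key=h.get)        # sources by increasing confidence
    total, prev = 0.0, 0.0
    for i, src in enumerate(order):
        subset = frozenset(order[i:])   # sources whose confidence is >= h[src]
        total += (h[src] - prev) * g[subset]
        prev = h[src]
    return total

# Illustrative three-source example (measure values are assumed, not from the thesis).
g = {
    frozenset(): 0.0,
    frozenset({"d1"}): 0.3, frozenset({"d2"}): 0.4, frozenset({"d3"}): 0.2,
    frozenset({"d1", "d2"}): 0.8, frozenset({"d1", "d3"}): 0.5, frozenset({"d2", "d3"}): 0.6,
    frozenset({"d1", "d2", "d3"}): 1.0,
}
h = {"d1": 0.9, "d2": 0.6, "d3": 0.7}
print(choquet_integral(h, g))  # fused confidence; 0.71 for these values

When the measure g is additive, the integral reduces to a weighted average of the confidences; non-additive measures let it model interactions among sources, which is what makes Choquet-integral fusion more expressive than a simple linear combination.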
General Note: In the series University of Florida Digital Collections.
General Note: Includes vita.
Bibliography: Includes bibliographical references.
Source of Description: Description based on online resource; title from PDF title page.
Source of Description: This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Thesis: Thesis (Ph.D.)--University of Florida, 2008.
Local: Adviser: Gader, Paul D.

Record Information

Source Institution: UFRGP
Rights Management: Applicable rights reserved.
Classification: lcc - LD1780 2008
System ID: UFE0021658:00001


F20101206_AAAVPL mendezvazquez_a_Page_066.tif
5118e4738988f3028eed40f8d058c6ec
36a825507475d77823bb1a1a910fa1a33df0b619
347 F20101206_AAAWMF mendezvazquez_a_Page_243.txt
d4308d23125bba317e64da504603ac59
758eff4ee4dd99c700598c5adac4443b9b45472d
33813 F20101206_AAAVCB mendezvazquez_a_Page_156.jpg
d751c1051c5a1307be421c113db94cfc
4b6283e8d8020f43fe9d70ae24362e3ddd0c6484
4477 F20101206_AAAWZQ mendezvazquez_a_Page_192thm.jpg
23b1669be3425335722d33ca0b27c03f
ae378bb61efb7a1230ca5a938854ae99c5129385
1351 F20101206_AAAUST mendezvazquez_a_Page_169.txt
7ef12ce24b24b9d282f4f9d93ee172b1
53f9768a67c9ee6cef18a3711213d1b81c7dc332
F20101206_AAAVPM mendezvazquez_a_Page_067.tif
f4a88368ba90557e2a38a6dccc0f9766
834fcee760f837305b4ac4e71961351e584945a0
3356096 F20101206_AAAWMG mendezvazquez_a.pdf
a95171b9254d65cdc2ffd22ace257c6a
b90fc615b6228e8bddf0b1eaa815d19513091e83
67999 F20101206_AAAVCC mendezvazquez_a_Page_157.jpg
bb99f1b700b07946ce77db86d2da88d8
6703660cadb73a0d1dd0d216293da78e2d2304b5
17981 F20101206_AAAWZR mendezvazquez_a_Page_192.QC.jpg
f276784249ee539cdb118a9e63f12345
976c524d71b03defdb4eb02f1329077a136121f7
1051960 F20101206_AAAUSU mendezvazquez_a_Page_095.jp2
f534ad189e517bf9e7f7d66c3748a6a3
1068fe9d3d3fb613369482eab9cd3ff2c5d2000d
F20101206_AAAVPN mendezvazquez_a_Page_069.tif
d4515e8091c2d054ed905b3a9a6ffb59
0c4594a97e9ad35cd6a7fbdfb10a48b6d1416df3
10536 F20101206_AAAWMH mendezvazquez_a_Page_195.QC.jpg
cdbc84387244290d5e2021820a389bf1
8f4a80dcf1d7fd0817c4d7d8d8efb0845eefbfe0
45115 F20101206_AAAVCD mendezvazquez_a_Page_158.jpg
cce889f5261408fe4003d2fb615aec6e
96d35fa9d6d351cf871d871a67fddc9478a0a797
4385 F20101206_AAAWZS mendezvazquez_a_Page_193thm.jpg
9649a0403dc8c4057de328065eb3a5eb
372dd08f0fe517a2ffe3ba8e63f1c86089b32251
68186 F20101206_AAAUSV mendezvazquez_a_Page_125.jpg
4a295501eea8e3578698d96eaaa0fa35
612a4e0c41b415086668cca649782b2414b0b899
F20101206_AAAVPO mendezvazquez_a_Page_070.tif
41684b889f7e0b0b8e50533319caf153
3458e490c7573f9ca384a7eae093461307982cbf
4569 F20101206_AAAWMI mendezvazquez_a_Page_215thm.jpg
49312d4e7a4e6ca5995c72a692b1ce49
3b487415c8d63fb250b64333e202048f06a644ed
64870 F20101206_AAAVCE mendezvazquez_a_Page_161.jpg
a5deaf0493bf17fa4537ed7fed4f0355
5c1960e6d366ccaf292f1eff62f05ed59289865c
17801 F20101206_AAAWZT mendezvazquez_a_Page_193.QC.jpg
9b4be23129d339e9f46dadd83efe5e5f
a1d14420113653e1749a56d201f2deab742e244e
F20101206_AAAUSW mendezvazquez_a_Page_105.tif
28df71ed54e567427bce7569e00a4d50
57849634a170bc789a76d8c00ae1d1f81bd84a26
F20101206_AAAVPP mendezvazquez_a_Page_071.tif
aece327e107804954a90ae1a2c3c3918
e41824e7c3468a33e39ab43a56407c7c3c5f83fd
5250 F20101206_AAAWMJ mendezvazquez_a_Page_191thm.jpg
079b8dcee7a38da6f212cb5ebb566edd
5a6bfba57ae1f03a8cc605e70cdd3920c9790c4c
43551 F20101206_AAAVCF mendezvazquez_a_Page_162.jpg
7ce6663fcc812bee3f89a1618cfc39b4
95f46d40696f5548a2715aa6c3c77a7cc759a772
18088 F20101206_AAAWZU mendezvazquez_a_Page_194.QC.jpg
d506559615cc382ccc177a47ec6ef667
89f9feaea7271923b56e4105ebb6fdcb309302bb
1387 F20101206_AAAUSX mendezvazquez_a_Page_121.txt
8f90d344ac58f359e1b322f77429dd14
5019e72cd12f5dd9e11114680ada6afa862346f2
F20101206_AAAVPQ mendezvazquez_a_Page_072.tif
5ac3774bb325189b1813f33e3a2e4c1c
6539cd3afc82941265b670357016901337639e8b
35116 F20101206_AAAWMK mendezvazquez_a_Page_071.QC.jpg
bfc0283ebe2d0ab1f987bde918984914
95ffa7d32d90e0de4d45b9730230f9be0ad0832b
69991 F20101206_AAAVCG mendezvazquez_a_Page_163.jpg
82423dc9ba4b0ce63c9fadeedd6802b5
ecfb0bc5804288dd62dc851d834108d6a7d1fdc7
4495 F20101206_AAAWZV mendezvazquez_a_Page_196thm.jpg
b1933e945f4706ac3d1fb56f36e7357b
7b31da0a5a25f160d15b1c276855a246094c13c7
F20101206_AAAUSY mendezvazquez_a_Page_011.tif
32fadbc0c07b78f0dc3551a468b1afe8
158a3a42c786b11349cde0303a5e34d9eaff89f3
F20101206_AAAVPR mendezvazquez_a_Page_073.tif
2f07137b5218312be54caf9686942832
8623a4dd03aaf265ce3f54ed6079113dbba7c8c4
18552 F20101206_AAAWML mendezvazquez_a_Page_123.QC.jpg
a7b8bff2ee0e6e77d4e0d67777d3bc31
6d4f45702f6d823a097920e7532f975d67549efd
62926 F20101206_AAAVCH mendezvazquez_a_Page_164.jpg
bbd8125eb79e2f1a12ddac93513cbca4
031272a205fc13deafb26c72819d423ac6470b76
18604 F20101206_AAAWZW mendezvazquez_a_Page_196.QC.jpg
ec247eb0db9bf90a3422b8ef873b5945
6ee93d083ed20c336d5ea1b609af79f9c3a0a640
72153 F20101206_AAAUSZ mendezvazquez_a_Page_107.jpg
54104371e845bfb55e3455c9e20afb78
796b776974ecdc8276f717ca4fceec343673b60b
F20101206_AAAVPS mendezvazquez_a_Page_074.tif
7e6ab55050b5dd810e7a6633d1327071
ed65c37c455fd2bb99b2660de1f15cfff3862a95
7572 F20101206_AAAWMM mendezvazquez_a_Page_069thm.jpg
0d21e6f8c088de89d6b00f2770d77769
91d393f5227a1c32079963484b5f9564c0eee29b
58365 F20101206_AAAVCI mendezvazquez_a_Page_165.jpg
1225ddd910f1cb743708a3dcfc3958c6
3d2a8ea493d7d28cd83be1c1875d2068e551775c
18729 F20101206_AAAWZX mendezvazquez_a_Page_197.QC.jpg
8a3d1593d8685fbf0eaf13af4af01a03
5c1046c938e869a6515d65d52c1ea06a9290f463
F20101206_AAAVPT mendezvazquez_a_Page_075.tif
18607772c2da3e8e5b4bab3280e59876
bc7833d9b24bd8fa9d615d3d1d9ab079a0e76298
28754 F20101206_AAAWMN mendezvazquez_a_Page_062.QC.jpg
bc499d852d1b704e874f1a48add370e1
8b1aae359c08135f5b55eb9c5153a6be4352b331
62210 F20101206_AAAVCJ mendezvazquez_a_Page_166.jpg
1b2247965c5f59cf2052ff9a9d75b488
bc4f329c373c2deb4f33c878ad6bd3742064aa4c
4888 F20101206_AAAWZY mendezvazquez_a_Page_198thm.jpg
9e7c599f5beff1fa09af05cf9287e026
5d68b261355f5809038db590b6663e37b8b50b14
F20101206_AAAVPU mendezvazquez_a_Page_076.tif
7327579fa453a05b5f13d98de646fbc7
2b0cf18bac78b49aeebb28800b6c2cdbec6c972d
6194 F20101206_AAAWMO mendezvazquez_a_Page_189thm.jpg
2d100f390257cd0600cb28f113a4ebeb
316f5effdc23872c1e182c132c79c789c4ba5c79
58850 F20101206_AAAVCK mendezvazquez_a_Page_167.jpg
653a873606b1b9d9e252732ef88e75d2
3617f9deae4d01c7d556f2d0555f640f196c3c30
4738 F20101206_AAAWZZ mendezvazquez_a_Page_199thm.jpg
05ed1bdeaea67b9b5263aae8bfa4a420
122f78706e8ffc0814767d0ecf2d5ace08ba23b5
F20101206_AAAVPV mendezvazquez_a_Page_077.tif
406fd8346fabbab7f8aa5b150065753c
a1c1891451f1e7d46c7e715b03af0b5b713f55cc
2615 F20101206_AAAWMP mendezvazquez_a_Page_226thm.jpg
1d06116ca04acfe3b04f5ded1e19f36d
b10e60a808f37dd0a738e893e8ac71fcaa8e616a
73022 F20101206_AAAVCL mendezvazquez_a_Page_168.jpg
d76a44ff0bb7276257f06a8061382f7e
1b938903e12a20d782013ef523ed88575155ff90
F20101206_AAAVPW mendezvazquez_a_Page_078.tif
b681e6a51510d0e79532b31343c65ba6
4001d264736b4d1cddcbfc1dedb27f457eceec9c
6039 F20101206_AAAWMQ mendezvazquez_a_Page_088thm.jpg
25ab771798e66820de24679fae75fad4
88d1e8a80c69321aee7b6d6d8a63c80de38b68ec
64441 F20101206_AAAVCM mendezvazquez_a_Page_169.jpg
55f169428b4d206e723de9e779e20fd5
001dff78d5f7cb0b175fae5435e5381655435b7e
F20101206_AAAVPX mendezvazquez_a_Page_079.tif
d5b0983dc5cef8b2e539738c51ad596e
b846a802da508ea58f8d8a93782fedcd71ddf7ba
36507 F20101206_AAAWMR mendezvazquez_a_Page_013.QC.jpg
b5f3bb34b6c1c945529cc34819d95bea
e150bb6b7a9829fa14491bb49ad21afb7dc6fbcf
73037 F20101206_AAAVCN mendezvazquez_a_Page_170.jpg
49563fab082784e703d0a938e2bed2d0
e914ba0b94b8f7f9ae6e764d21b960c6dc6a256a
F20101206_AAAVPY mendezvazquez_a_Page_080.tif
60e1c04c61caeb988e12f57c838c684d
d805eb6da4c18a893632af0ee97826130e99061c
7430 F20101206_AAAWMS mendezvazquez_a_Page_113thm.jpg
e73e99cd2461384ef5125a9dbecbb5ac
ad9bd10c27e60f40031c3396a5e14a79ffab0b5b
62413 F20101206_AAAVCO mendezvazquez_a_Page_172.jpg
1fa89695c35eb6c99c6ba12a570b48e1
5db416336bf7d3b0154c6c847018c8d7313ca131
F20101206_AAAVPZ mendezvazquez_a_Page_081.tif
4b467bc7d390b62de2ee847707e24486
4e9a288c4fd087cf68118adfd2fc87c8605f91d2
18847 F20101206_AAAWMT mendezvazquez_a_Page_165.QC.jpg
644b8501b11e332841a9b859e0f31734
d2483c49f64929265e849b70fc4470949768323d
63120 F20101206_AAAVCP mendezvazquez_a_Page_173.jpg
cbe937d939bc66c84912f4d0e61c9d8b
8db7d03da5d4dab66ca3f3f4327a2af7a3bcc196
7026 F20101206_AAAWMU mendezvazquez_a_Page_230thm.jpg
35900aa475e115c2a60c745c1df9080f
29066813730a55d536a1d79ecceb6314a7d17359
72477 F20101206_AAAUYA mendezvazquez_a_Page_022.jpg
2e66430e5a279910cab26e091fdf1778
b28bdbe7243022c7b820de1fb143330a9dd3a659
52447 F20101206_AAAVCQ mendezvazquez_a_Page_174.jpg
a71bf977adaf498569c8f2706cc6cde7
4ba0b3524784ee4a7d20bcb462a3390b4849fbfe
32001 F20101206_AAAWMV mendezvazquez_a_Page_025.QC.jpg
aea983f051eb68ec92b15638dc964e11
a0762d52e75dbe12b5b006df658b107845305bb1
100376 F20101206_AAAUYB mendezvazquez_a_Page_023.jpg
df2375fe2a7e0c23999a04b8504de248
355d11d83d03f57686e9474583e882b6b1d6d764
61232 F20101206_AAAVCR mendezvazquez_a_Page_175.jpg
104e128dc62a29e880082f38a33b9283
0810b66385440cf3af26515f42999c2d6dc61025
7051 F20101206_AAAWMW mendezvazquez_a_Page_048thm.jpg
b7b5479dcea1322ab9b65463c738cdc6
32bcf31f219c2cb986c2bc0e78e971667d6a0a75
82473 F20101206_AAAUYC mendezvazquez_a_Page_024.jpg
e6cb504870234ac57a35e932cc13507a
62187856b15b89cfbf05637066b58db8c8d135e7
93131 F20101206_AAAVCS mendezvazquez_a_Page_176.jpg
51c6a497d91f7f52accb163d67c7f6f2
7664d1c580761937019e75c0ecd52f6a31b270f4
7219 F20101206_AAAWMX mendezvazquez_a_Page_112thm.jpg
55244c95bdbf772474184e6755f3261d
f103c90d0867836667e1c5f62f36148921e0fd3e
100788 F20101206_AAAUYD mendezvazquez_a_Page_025.jpg
da2c9e3d4f5fbaeffd3d70cc0bb0dd5c
5bd576e7e8353099caead835b785d01a1419614b
109219 F20101206_AAAVCT mendezvazquez_a_Page_177.jpg
77310d7b2600ed3c95f7f70178c1044f
0ef06539f3cd3a951c61e70aaf7c0cce4f787d81
7629 F20101206_AAAWMY mendezvazquez_a_Page_132thm.jpg
a2eb294b1932eea91c5420743da4be2c
bd828ce6be80017e65a06378ce03cab6b19b3845
106188 F20101206_AAAUYE mendezvazquez_a_Page_026.jpg
d0aba85b7f8ef26caf5c74320da310ce
2eb253291aa4377c7166d36ecf5f88a9402b7b99
102883 F20101206_AAAVCU mendezvazquez_a_Page_178.jpg
e4e0efe418c529fb7a10e1a5aa22c6e1
16fe832f1de34ee3b30af4a0c14161db80c830dc
7265 F20101206_AAAWMZ mendezvazquez_a_Page_101thm.jpg
acef4afee5cf3f68db4fe8c6239a931a
aa3e20fe8b1c72c45270e65debd9c1e848c8f5bd
89282 F20101206_AAAUYF mendezvazquez_a_Page_027.jpg
21ed38a483bbba5ceb1cec645d8f35f0
00d9f2d014b3d32be5e75af2b0a8876b012b76f4
104487 F20101206_AAAVCV mendezvazquez_a_Page_179.jpg
46615c0fdb348865d3e3f4ec9c20c04b
fc04c608a1f49f6df0960207e9f6eb6653cf67f1
101295 F20101206_AAAUYG mendezvazquez_a_Page_028.jpg
d4395f8ee012d7d364535bed560ad63c
aa3499135f1b80261395300e13332177f103b297
103008 F20101206_AAAVCW mendezvazquez_a_Page_181.jpg
b1ae7e459697f912d0f10fc72488ce13
565df7238fcfd01c7614e360947b2d6fe875f1f9
82401 F20101206_AAAUYH mendezvazquez_a_Page_029.jpg
7b495cb56a883fe653a2efdf0c87ac92
d61adadd20aca9c70458e591a07b91ed7458a645
106444 F20101206_AAAVCX mendezvazquez_a_Page_182.jpg
1d80ad9ac53be02671ceac0ece518d1e
71811c4a49ba7e7250cacb391fdf625affca4515
F20101206_AAAVVA mendezvazquez_a_Page_230.tif
2608b2da6a504cd30bcd9852245529b8
0a0a6271376f40bc795b5b19557ab50df37b711c
93564 F20101206_AAAUYI mendezvazquez_a_Page_030.jpg
83d9e95122c36bdd7ef220b65f8ed5f7
03d25de04b2d15aae7f19a0c1fe2688f4286741f
F20101206_AAAVVB mendezvazquez_a_Page_231.tif
814a2d37a4007d2e197cfee33929a3a8
8fe52968f1436460b34acc90c78155a918eae06e
92079 F20101206_AAAUYJ mendezvazquez_a_Page_031.jpg
4409bc18b84d3b0d7b0b6ec992b0fef6
a1b4b30ff184f131eed383e50120f6d98e277a8f
39214 F20101206_AAAVCY mendezvazquez_a_Page_183.jpg
38d0921dfbe037e2df86b6166ab4ab41
749e18c18f115d5501409412b7825c30ce7427cc
F20101206_AAAVVC mendezvazquez_a_Page_232.tif
eaba49272b2cad3b92fc1b97ea577519
9f78233b0fa39613421d854e7eaa8f68aeb2a4c5
90234 F20101206_AAAUYK mendezvazquez_a_Page_034.jpg
d7768d159f95d4b1e5e92584b3202a18
777018d36355bd8bfb5b37a974b6ab2496095099
51674 F20101206_AAAVCZ mendezvazquez_a_Page_185.jpg
10d8ab339576374bf6afc3360c3fb2bd
f100d42a851415e092e5147d10f2d3dd917afcb8
F20101206_AAAVVD mendezvazquez_a_Page_234.tif
6ad39d84cd54aca6645437238c6a8191
70e2378bb4a7a75921a4fdfdb600a6c2d5193bb4
97259 F20101206_AAAUYL mendezvazquez_a_Page_035.jpg
3df826fce5313b81f25cc13b8f199f8f
989d7c7649a5b241cd98cf99991e0f96e51a91ff
F20101206_AAAVVE mendezvazquez_a_Page_235.tif
6f631a8e435ed0e9b3a6aa2ad83b4d60
57264a64e1b18b8b35348d4da9ba6ca8ae3727e3
90308 F20101206_AAAUYM mendezvazquez_a_Page_036.jpg
a0abd771eeec15cfb602f5775f7fdd6f
76fa230eb560635ecf6b487074defaa159bc1b59
F20101206_AAAVVF mendezvazquez_a_Page_237.tif
2e0f35c213ee995dbc0fcfc263a4c202
87247b13e736dd33864f9d9dcac5753bd91b8b7a
72337 F20101206_AAAUYN mendezvazquez_a_Page_037.jpg
da9752766c99104494bbb9a62fb22d08
11acc8aba21f1d944c3dd831542c0626791e186e
F20101206_AAAVVG mendezvazquez_a_Page_239.tif
1bf123e69bbf3435e81ddad71c305c70
e9180c75e54a08f23badf303006bccdfb773931e
65359 F20101206_AAAUYO mendezvazquez_a_Page_039.jpg
c5b51abfa2f09d38d12351be9b7e9096
6d4f87fb19812d7961570082de46bfc3bbe2e2c7
F20101206_AAAVVH mendezvazquez_a_Page_240.tif
eedc40ea067a0a258906fa2a4bdd8f1c
5bfb80c6e1cbfeb665cf265c41e26680937a388d
7697 F20101206_AAAWSA mendezvazquez_a_Page_052thm.jpg
de7543fd5b92d4d169a65118940c10ed
f306b3c94b18bc4b44aec6ed1ca6e2e227bffa29
F20101206_AAAVVI mendezvazquez_a_Page_241.tif
12cf16d2abe0adcbb2ce246f0fbb8e6c
6fec3c76ab7b13ab52aeaec7b064b7ef6e24ba02
5393 F20101206_AAAWSB mendezvazquez_a_Page_053thm.jpg
3f7331b1acbd664b9ebffb5a8b220f39
c55637c06a89d1bfb891fba4702da296f934254a
77669 F20101206_AAAUYP mendezvazquez_a_Page_043.jpg
7e14412231e080e99ace539ad5153620
624cd9e2f9921631b7cb6ba04754ca61b1a6ba08
F20101206_AAAVVJ mendezvazquez_a_Page_242.tif
8fd8921589d25754759bf9263aeb41be
c355861352d54daf4240a68f6b4950cd1738fb5f
18677 F20101206_AAAWSC mendezvazquez_a_Page_053.QC.jpg
b8630917136538b92ee3c87077feef02
3bbceb1105965a254942b406a5a4130a42775fd5
69259 F20101206_AAAUYQ mendezvazquez_a_Page_045.jpg
c08a28491ef0f211be2ae061593826c4
e6745270d4d689b46067aed9195bdc373b25a05d
F20101206_AAAVVK mendezvazquez_a_Page_243.tif
b7996660673aca955d884955309118e9
297ed6a61faafe8f7f6de712abe4a51ea7997134
6041 F20101206_AAAWSD mendezvazquez_a_Page_054thm.jpg
f02b7107ff61f8384d8f82325af47e99
32c1fb26bc247152d9854906adbaec79952d3c63
66021 F20101206_AAAUYR mendezvazquez_a_Page_046.jpg
0f925b4c00cbbc09e861916d7523dcb5
1b30573995a53aa40fbe359f141cc77060d6ce65
26834 F20101206_AAAWSE mendezvazquez_a_Page_055.QC.jpg
c9b8aa057675e59fd8c4175288bc6694
75930b824910c6e719522361c94e4c90b7f2e537
55256 F20101206_AAAVIA mendezvazquez_a_Page_087.jp2
eca4e737f500d1b6be9de85de09ead9a
626918a6e476ab8086e2297213e1cf743229fe92
93097 F20101206_AAAUYS mendezvazquez_a_Page_047.jpg
f6a1dd5539318ed78a295f250eafe726
10cabc40e2808968796f165800e616a3183b3eb4
8042 F20101206_AAAVVL mendezvazquez_a_Page_001.pro
6fb8eefdcdae75ca0da03d98cb129380
17494d2b50d5e259edf6225ed858a7d21820a011
7666 F20101206_AAAWSF mendezvazquez_a_Page_057thm.jpg
bfe037085f9449a8e7611140609a2f3c
6d48c960296aa95133ba232cd1036b3350bf7fd1
709265 F20101206_AAAVIB mendezvazquez_a_Page_088.jp2
c454e8250f42b0aa89a93cedc1859b0d
cb5a6c1ccce4735161582565971232151e7f5d46
84964 F20101206_AAAUYT mendezvazquez_a_Page_048.jpg
a05be9831b9fb92fc4e35b0a000f26d0
84f3cd96a688ecf1bf3a75c0f4923fb7b149e26c
1038 F20101206_AAAVVM mendezvazquez_a_Page_002.pro
c51189c70ce155e39156f52bc43c4ae8
f93a476746f4503aeb9b3d1ddc8895c53088b133
29260 F20101206_AAAWSG mendezvazquez_a_Page_057.QC.jpg
759ac7ab193312f67ea8df7ada718384
7798b7bfbafdc439bdf739df62530e42b290f889
874487 F20101206_AAAVIC mendezvazquez_a_Page_089.jp2
65552e3b19f5665fbd951330e09b8572
8af9a94b015cc760424b39b7301fa3fa4ae39ba3
81813 F20101206_AAAUYU mendezvazquez_a_Page_049.jpg
7f1551f632f4b2c141d3303582031436
455dd321a8e37c357286cbf8b1fc99f4bd3c7b7f
6022 F20101206_AAAVVN mendezvazquez_a_Page_003.pro
58f6ba63a29e0d517aa43d9b24a3c94b
b5b0bdc86002c22224105680cb8cf3517c77e125
63928 F20101206_AAAVID mendezvazquez_a_Page_090.jp2
53ce617b1d25cba73b492413d03ff01c
52d345a5fb0a436c3fb29ba75672b8f0edec9e93
68582 F20101206_AAAUYV mendezvazquez_a_Page_050.jpg
e8bc78924709f20c0047dee74913df83
b7be335456f2d16dc96e4e43a76f90653195beae
19202 F20101206_AAAVVO mendezvazquez_a_Page_004.pro
e1b6255cbe03e6de118cf255a3f6a90e
6f25c40a5c4201fbbd575c0b96c28dd699825285
6392 F20101206_AAAWSH mendezvazquez_a_Page_058thm.jpg
fe00f6610a0c6cca473dfe45f6b2cc58
38dd5e1ab14beacfaa3c17968883c98cc46ab501
F20101206_AAAVIE mendezvazquez_a_Page_092.jp2
60fae227d40fccfdab873363c624f362
c2185b0d663955e4a05ca316f63f7c9e1750a28b
55960 F20101206_AAAUYW mendezvazquez_a_Page_053.jpg
49b57bcd3a8c2bb2497bbf3caa0971bb
d5d3bf9589caa46cce5929ce08930bc8d2e68635
65384 F20101206_AAAVVP mendezvazquez_a_Page_005.pro
06329510d80661352fc5dc01268ffad7
0447f6a0a8fb7c6916cac311751cdfa1d8e70e71
22864 F20101206_AAAWSI mendezvazquez_a_Page_058.QC.jpg
fc9bbf0e11efa2e50a28ed155c4336ce
ea9e0c9808287b63edbeee27de909dfc6c8b01c8
1051940 F20101206_AAAVIF mendezvazquez_a_Page_093.jp2
286596caceb57ac6620b6024d432628d
d91210142f1c17dc5238ebc0f670123a400e5dbb
81494 F20101206_AAAUYX mendezvazquez_a_Page_055.jpg
ac46620c8ac7b372869755b41afe66d8
b55b5f8485840b8fecb8f5edc329d7a9b1f9308e
64465 F20101206_AAAVVQ mendezvazquez_a_Page_006.pro
33ab52d346e529bf9b3fa33b4ad8e678
6d656529ab8b7c3123f4e8a28d921360e5e5a871
24282 F20101206_AAAWSJ mendezvazquez_a_Page_060.QC.jpg
b7c900ec08d7a388fee79d1503ae0219
b6304bc68aeb67aa6678c221039d9619d69cb04a
60432 F20101206_AAAVIG mendezvazquez_a_Page_094.jp2
51ba1595c336f946850c7e23c1daa500
5c68c7657f82a325d6f8c909612b4ee9ad7f7a3a
91818 F20101206_AAAUYY mendezvazquez_a_Page_056.jpg
b845261b8c0aa361d1f3167b4c33ccb9
4b24a8ed4fd059ab947751c234c51df79933d6d2
76063 F20101206_AAAVVR mendezvazquez_a_Page_008.pro
b4f48c749b6b66e39bb43b8dd3def940
cc847ed18a6136e5f2bfd14752c5c5d7f0e42214
7313 F20101206_AAAWSK mendezvazquez_a_Page_061thm.jpg
3b7cec6f0dcc0f8cb0f28f634c38f014
90a0d004cd1cea2b32e15f3a4ea578ce91094855
1051950 F20101206_AAAVIH mendezvazquez_a_Page_096.jp2
72f68fe532ec1b64575ab634bdb20f27
0e3f3f853af6632d0f8b9d52b693f25c3bbd1db4
71071 F20101206_AAAUYZ mendezvazquez_a_Page_058.jpg
9a241eab6720c7d31e7f9f2a2f7a965c
91c84d73065fda25d656a83b84d2b6dfee665825
1693 F20101206_AAAWFA mendezvazquez_a_Page_039.txt
4f8f86a2ffe12531ae81f7c769ae2c07
de1096765b1d1ce5aee2b1374fe9158601e6479e
12501 F20101206_AAAVVS mendezvazquez_a_Page_009.pro
99632046f6ff8cdeebb3c7ed15fde5d3
36ca840130cb145c3d7eafdc4ec4c29dc1898f01
26554 F20101206_AAAWSL mendezvazquez_a_Page_061.QC.jpg
6e36b8319d4e3c7f8eddf83dc6d8117c
166288e38d43fb3348334ff666aa42ed1ff49321
27065 F20101206_AAAVII mendezvazquez_a_Page_097.jp2
dedb506168571b2b489f54d5de7d150b
f2578a5c9172d1b725aabea9d732446aab3df141
1814 F20101206_AAAWFB mendezvazquez_a_Page_040.txt
bdb324fd3c30096f3926a0378d5f27e6
899a99bb4e8d9ce6ae38098e44c0d7cddd642d87
65451 F20101206_AAAVVT mendezvazquez_a_Page_010.pro
bf2aeef43c596821721f1dd599466335
3026a0cc6181ebc0ce87e6d8481d5eb45d20a905
7551 F20101206_AAAWSM mendezvazquez_a_Page_062thm.jpg
ef07ac1017c182412024cd3c1922c191
1838e082a1d482b90559c2fcb78f19075bb85d80
741796 F20101206_AAAVIJ mendezvazquez_a_Page_098.jp2
90f963740dbf776bb9708dd3572524cc
4c0a06c63b72968d2e9033e4467b2d9d7992fd80
1881 F20101206_AAAWFC mendezvazquez_a_Page_041.txt
865ce037142ba55f44e505a00adc1dd8
19d4449fe15eeebf6f73590f41d84c75686591dc
61437 F20101206_AAAVVU mendezvazquez_a_Page_011.pro
321d9f0d91eb65ba021a15299c7341f7
12387020bf5dd1b819a537805709305b652e4545
8057 F20101206_AAAWSN mendezvazquez_a_Page_063thm.jpg
0871dda32ac4a406022c7a2d298781de
391c4288815c4ed7ca0d0d6b552729515a673aaf
109200 F20101206_AAAVIK mendezvazquez_a_Page_099.jp2
4a13f1a3ff7a1083025f4c16c07201b7
3ed1d8281fd27e79fac050410012ba6ed6be88c9
1737 F20101206_AAAWFD mendezvazquez_a_Page_042.txt
5cc58ed7e367ec995ef60b04a9d4aa61
b981b58e1ecae53c0deef6943f5f36dd7bf82cda
8777 F20101206_AAAVVV mendezvazquez_a_Page_012.pro
1239470d4e71b29e1a61ad405ed1cf12
f5cc9da064d495058e8948c0f34c697bef31c2e2
33530 F20101206_AAAWSO mendezvazquez_a_Page_063.QC.jpg
6c59d1ce0b4cdc314fbe9e610425ed54
2e72298603a27e32c75a4f7491b29414a6d2ee50
109991 F20101206_AAAVIL mendezvazquez_a_Page_100.jp2
edac2628f822a8c7ab91e773a092ef71
6836f80dba2754926fbc94d7e30d06242b290e06
1895 F20101206_AAAWFE mendezvazquez_a_Page_043.txt
f4c5c6d3f981ffa9ae623a5acd6effcd
5865cfd90ca54acc3f463f21dccf59cd5a4334bf
64375 F20101206_AAAVVW mendezvazquez_a_Page_013.pro
1de26300f2259e77b28b1a2e6a9f3345
963745c6991320e59e6f904ecded4ecebf4bd9ca
33211 F20101206_AAAWSP mendezvazquez_a_Page_064.QC.jpg
54e5c3ce958bc2cb783bdaa782715afd
14902ab1a8260feba0adaa562f09fab758912503
66844 F20101206_AAAVIM mendezvazquez_a_Page_102.jp2
44f29ed2f53c5ba14aac92cd8b0084e0
05c39e8f4edd652f35624cb7f54342911b889421
1978 F20101206_AAAWFF mendezvazquez_a_Page_044.txt
b26ebbfbc33d1db3dfc6c2cd41aafd86
f7117972a18da677767ea8d6f6cd90e269c57ecd
67358 F20101206_AAAVVX mendezvazquez_a_Page_014.pro
f032b184a3475fc2e8a7c6cc1e7117b3
9fc829403d6eed56d0ef26e0185eb316cb376eff
7894 F20101206_AAAWSQ mendezvazquez_a_Page_065thm.jpg
93278dec16639f8a7cd2df085bb93c40
4d2f6abc4a96f9df489ca6e791250edfa886d760
972386 F20101206_AAAVIN mendezvazquez_a_Page_103.jp2
fcb464e51552e0781b78dc1c3b3128d7
97eeb2bfebc1c525d9ad5d172d2a7f39239da59c
1952 F20101206_AAAWFG mendezvazquez_a_Page_045.txt
ba354b05ed12fd8877c6cd63ae9b5de6
b02d8689e5d367b277e7719b3781d536297bd953
66639 F20101206_AAAVVY mendezvazquez_a_Page_015.pro
c68b9ee8618663d9d7e969d26209c0bc
20c5d416b2ee2475d42a9f60082fe8213e8863a4
33884 F20101206_AAAWSR mendezvazquez_a_Page_065.QC.jpg
34e1af1b16f17d18e03cd8d32cbd41f6
a56cbae6eee1c260c9b4fa120b319cb1182d7c1d
72207 F20101206_AAAVIO mendezvazquez_a_Page_105.jp2
ab1f4386d2f91eaefaf1f5aba96f47f7
dbdfdc51776287787fd675076bd9dc82e4d17b35
1762 F20101206_AAAWFH mendezvazquez_a_Page_046.txt
9d153c2898185c7b6ff0bb74f8e55189
a09507fb65d50eb400fd3654b5a3a9e9ef993ab7
6890 F20101206_AAAVVZ mendezvazquez_a_Page_016.pro
1bc85f75deca061b6b924e066af26d84
8d5ccf9d75db95eba8e20ffc3f1a0ec2c91c50ca
33314 F20101206_AAAXCA mendezvazquez_a_Page_234.QC.jpg
413520079027baad9dfc1f28132646a5
5722f579f09862cb3b4dd6158744b68f45049d2d
7979 F20101206_AAAWSS mendezvazquez_a_Page_066thm.jpg
7d9aef9612874a5eedfec5b1cc453e55
af2675f268c9beaa1a3f86faca1d9fa9ad644a1c
817503 F20101206_AAAVIP mendezvazquez_a_Page_106.jp2
a18722fffd3ee9b0982eb5ab5c0b88ae
92e3264ccb59f90fb69721648158b18a3aca2367
2149 F20101206_AAAWFI mendezvazquez_a_Page_047.txt
ad0b282e36b9a9385e3f88cd55fa44ad
331bcc4b6c8b9aa322abb6ca759dfb6beb930446
7234 F20101206_AAAXCB mendezvazquez_a_Page_235thm.jpg
c561a784f7c2cb31d22045c9a8154fe5
cf415a9460ff3213dce946bfdcd797aa682bd009
34110 F20101206_AAAWST mendezvazquez_a_Page_066.QC.jpg
24e7a2dcd42032966de85558b4058a18
bb72f1379cdbf9acb466058f673e1c85e6bb7e59
744118 F20101206_AAAVIQ mendezvazquez_a_Page_107.jp2
7175dfb981d5ab97afc9f03e1d763512
57499d8218d8b1432a1686c012e9b41179e7f435
2150 F20101206_AAAWFJ mendezvazquez_a_Page_048.txt
552b69044a64de1cd52940d481461bc0
601f063db00ad02caae926e4cf0b321da23cf3e0
31765 F20101206_AAAXCC mendezvazquez_a_Page_235.QC.jpg
eec09acef792dc0fb340e37d4d11aada
f70401a2f74f1a47ede9ad135bfa55813be9b3df
7209 F20101206_AAAWSU mendezvazquez_a_Page_067thm.jpg
1c8bdca3c7a4e73915bbfd61f8f22123
0bb2bc9bc30fe6eb975e272c295031e86ecd7f66
53269 F20101206_AAAVIR mendezvazquez_a_Page_108.jp2
99afeab5201f6f7d44cef544382d3a3a
c1edf6286caf68b533610e6f5837c110f3af8d4c
2106 F20101206_AAAWFK mendezvazquez_a_Page_049.txt
a8fff980d476de7bca1d79290c5720d3
9923edb2f66da0c30c89230c7732afb20a9ffb99
7441 F20101206_AAAXCD mendezvazquez_a_Page_236thm.jpg
6df03dfdc2324543a49e20890d048ba9
4592f8af418b1a1b86f4355eebe2ca940223f972
7476 F20101206_AAAWSV mendezvazquez_a_Page_068thm.jpg
5cecb2ef37c2a953d76ab4f4fdeaafed
874c1ce961e7ccad19334beec3c7b251dbabd2a7
1051438 F20101206_AAAVIS mendezvazquez_a_Page_109.jp2
6ef8e8d40cf32bd6bbbeb935a2c6fc66
536a52d44a2f7984a33c8683f33a6066648faffa
1322 F20101206_AAAWFL mendezvazquez_a_Page_050.txt
2fae7fa9ea6670fa800d53b53191b28a
41851689b7d5dfecb23684a7dae85c81d41ba83e
29927 F20101206_AAAXCE mendezvazquez_a_Page_236.QC.jpg
0283a82dcac8364dab0220af94e6948a
0420aa8bc196324efd107346e3b290b48dadffd8
31136 F20101206_AAAWSW mendezvazquez_a_Page_068.QC.jpg
18fc151ddfd5404035d86a85a20803f7
eb7a756346438791ed545142950ee92a3652124c
745055 F20101206_AAAVIT mendezvazquez_a_Page_110.jp2
7e19a7bd7be9c17e02e0fc2b962a0867
300ca919e0a7a670bb27c8cb1e7ce5a9f0af557c
1774 F20101206_AAAWFM mendezvazquez_a_Page_051.txt
19d40340d5bf36fdd76da0a9256d13f9
b52d06eec7b8ee6060747e7c566f78ed460104ed
7492 F20101206_AAAXCF mendezvazquez_a_Page_237thm.jpg
68f0ad98b94e80b146cfaae6f5cc0305
2f433a152deed09c47d7663c20b426a4c0c8ebd2
30607 F20101206_AAAWSX mendezvazquez_a_Page_069.QC.jpg
e811ca17c724acbbf73fa1b6972d8b15
681d3bc038ebf2b35063389d4044ebf3f1ddc7a5
848268 F20101206_AAAVIU mendezvazquez_a_Page_111.jp2
b3a2e3180fcb71a74b6d5f8cfe05e2f7
d4a3749fd9981406ab097e475e24c460758f7269
1864 F20101206_AAAWFN mendezvazquez_a_Page_052.txt
cd5a3327958d342fa14a8ea256b1e8db
e838008794c070fc4d9a20c7e6c3ad8422f61e68
28853 F20101206_AAAXCG mendezvazquez_a_Page_237.QC.jpg
59c04ba10af580bd96f865e1700da5dd
a66c2b6c74b1cbbdd0de5d6a53dc7d8eb1175125
31864 F20101206_AAAWSY mendezvazquez_a_Page_070.QC.jpg
40a28acd4935dff039d733f2c48f3972
52ff11500761a9abc0b66956cd83ba5f03ae07eb
928527 F20101206_AAAVIV mendezvazquez_a_Page_112.jp2
c42d07b4dc5339b6ace1796fd52333f3
836542fec3d349ba50d061b672ec699dafc99e46
1406 F20101206_AAAWFO mendezvazquez_a_Page_053.txt
80e1fd0d322d7b6ed039c512d977fc01
23c7fc47b2071eaa7347619b906b588926d01ea9
8148 F20101206_AAAXCH mendezvazquez_a_Page_238thm.jpg
08062b3aa6573f1ae10b4b3dca28081c
b45686341f45b765c871d840f54731411e24cb1a
8322 F20101206_AAAWSZ mendezvazquez_a_Page_071thm.jpg
071f02ddfc0d380cdf211cf6d02dd859
7d2ee671c85d576ffa4daa50d837d8f80268ecb3
1051945 F20101206_AAAVIW mendezvazquez_a_Page_113.jp2
3dc857a874c85ca0b3adbe1f04588a09
e0973ff9ed200edfdc35af406d3235c86b905823
1381 F20101206_AAAWFP mendezvazquez_a_Page_054.txt
8dea2c2baa853a5c9aceb0c80f0bd347
238174a78d3c4bd83109b0337d77b6b2396dd776
34220 F20101206_AAAXCI mendezvazquez_a_Page_238.QC.jpg
cf37ed8dc5e5bb41733a87f51bfcc7a8
ca7501874dd17563aa0db961b9f27bfe7d0c6f17
7348 F20101206_AAAXCJ mendezvazquez_a_Page_239thm.jpg
6273bfedd3fcbc98e23b7c5e8bc50eb7
69e1676e6dcd166d837923ee3111960a1e8aff6f
1051978 F20101206_AAAVIX mendezvazquez_a_Page_114.jp2
3a8009618ef6e77d71f118bca331b721
48942af37b35a7cf7fe546b04379d778a6a58d94
2025 F20101206_AAAWFQ mendezvazquez_a_Page_056.txt
43dc42ee3b45e89f179398da1046a391
6e8e023a7262483f2cc1b94bea65c6fd13701e8e
33040 F20101206_AAAXCK mendezvazquez_a_Page_239.QC.jpg
508dd070c950635cc473279be13627f8
5a1ae68f75f1486bd07179b9869fae796a3531e6
71990 F20101206_AAAVIY mendezvazquez_a_Page_115.jp2
f07134fac778ec5602f421e95648a7ff
1c2b0325e93ffd2392a189dad0a41f0cd1ba8e35
1856 F20101206_AAAWFR mendezvazquez_a_Page_057.txt
54acdfe0a0377912759ea7993f147a53
ebea3bfbf717092b4c6c16ce9ad2e6b304143451
7696 F20101206_AAAXCL mendezvazquez_a_Page_240thm.jpg
cb50d57e6ab6134c6e0d1c7697f6865c
6ba29ba1187d6bae4cdec9c92330b2d2831f4e20
53516 F20101206_AAAVIZ mendezvazquez_a_Page_116.jp2
c02af5b3bba410e796506938aea958fc
0a1b7336106d2eaf8411498fa88ac1773e51f289
1637 F20101206_AAAWFS mendezvazquez_a_Page_058.txt
fa2d7aecfa832d6f8c61dce748121ceb
2d9d6a768cf03b5dd59ac63fa2953636758e0cf9
33380 F20101206_AAAXCM mendezvazquez_a_Page_240.QC.jpg
f836f50246995cf9e54b8c50925fd376
810bcf921359128b8f8ee4e3c09bbb030096620a
1691 F20101206_AAAWFT mendezvazquez_a_Page_059.txt
70a4ff7095577fd1870642dcd6e10a22
4de32be60cc7e5276897537c5d0f2932f208bbd0
7049 F20101206_AAAXCN mendezvazquez_a_Page_242thm.jpg
615a1d04df2d253f0202dd9cbc8e9914
0f458c59cef493dc1c8dc5ae81b07105aee2906e
1666 F20101206_AAAWFU mendezvazquez_a_Page_060.txt
e4a183693c5b1ce1f81c5dd8e914034f
6f68761a12b0cc76f5ce5666d05fd00c03a8e3a1
F20101206_AAAURA mendezvazquez_a_Page_062.tif
f3995996ff063659abdf8adf529b8c3f
f8624635da032bf0b4eaa4c85b3fe12525a75c90
31706 F20101206_AAAXCO mendezvazquez_a_Page_242.QC.jpg
3790fcd047e610c664c8cf658738c4fe
d87c5b26f49d37ee758f46c8d2ed8759152d247d
1826 F20101206_AAAWFV mendezvazquez_a_Page_061.txt
9e211dd5069de45099e725dc2fe06775
d875e4280df09dbd6afc828534deeea9f36db978
F20101206_AAAURB mendezvazquez_a_Page_150.tif
f4dee4d137c30a239ff4576a78055015
399cf19d84cfa71a29e01516e64396d290eae442
1482 F20101206_AAAXCP mendezvazquez_a_Page_243thm.jpg
54cb51b30e49a3171547b9ba1a5981e2
e6210efb13fffdadff82ad1e6d06381a3c9b3a0b
1790 F20101206_AAAWFW mendezvazquez_a_Page_062.txt
5fbfa8eb83ac77f16cfdc7b4ed13bab2
4e6763b8cf2e60901d73cc6db56dbca373a814e6
40019 F20101206_AAAURC mendezvazquez_a_Page_160.jpg
bc9bd9f0c1e39333ccea931073661b1a
a734f6cfc1242d0d5d8940e6938020113f9b296e
6130 F20101206_AAAXCQ mendezvazquez_a_Page_243.QC.jpg
3c8ce7264c83c1670d9f37c37eebbbe3
1f25f40dd9d8ff974f57311b82490f3eb4b3a8fa
2172 F20101206_AAAWFX mendezvazquez_a_Page_063.txt
41a2733183dae1abcf2e085b521d54b4
b718d795d2b15ccca1ac4cb27993e01eade4c406
12126 F20101206_AAAWYA mendezvazquez_a_Page_160.QC.jpg
a1016a010ce37c9df747cd8968530509
4598c2556fa3ae0c1cc372a400ad66f11df7b72b
106524 F20101206_AAAURD mendezvazquez_a_Page_233.jp2
67b78e68be484efcee381fb04b029860
42d72145fe2f72bd0d06d2f81c9c5d8aecc9a746
2159 F20101206_AAAWFY mendezvazquez_a_Page_064.txt
b7ebbd67bf4d8a902fc1a7142da11558
2e304d017ec106e22a2d33fbb43656aa00f61a73
4851 F20101206_AAAWYB mendezvazquez_a_Page_161thm.jpg
97f3ebba5abe1e16d44715f9a6017f3b
379374a142b62a8d74f3edac0522ae7ba6349c28
2156 F20101206_AAAURE mendezvazquez_a_Page_026.txt
76066e8885ed0bf29438202f61bf7741
20eeedfaa5551e1c175371391533fc618c5878d7
2160 F20101206_AAAWFZ mendezvazquez_a_Page_065.txt
66f33874ef1213a43908620f409b4feb
c2e43b684904f15ae77b176acf235d6a34dc473b
18285 F20101206_AAAWYC mendezvazquez_a_Page_161.QC.jpg
774fa387092defb80d67fda1caef6dc8
64dd7f4434003668cd575c1ea47821fbf423453f
11836 F20101206_AAAURF mendezvazquez_a_Page_152.QC.jpg
e6305392066f233cce39fadbd271a412
9f6d4a0584bed0cef663084becf438a883f6d456
5969 F20101206_AAAWYD mendezvazquez_a_Page_163thm.jpg
6f1057bbf207fa9502ad944608fbdc26
86fe691a4249c49203d25035e254aa86046fda96
92 F20101206_AAAURG mendezvazquez_a_Page_002.txt
19cb449070d652eedf2dd71802316cde
67f474c12a2df14f082227d30386321ac10457e1
22681 F20101206_AAAWYE mendezvazquez_a_Page_163.QC.jpg
1ea51435473724f6d57ac154aa3d558a
2f1ab5ce7610c609b4e78ddfa1107ef0ad8180d2
3355 F20101206_AAAURH mendezvazquez_a_Page_158thm.jpg
44d71323ed6baba31f3fad51ede1b076
e64885c8c20b8444d8ac75f93d0bf0894a2a4c72
F20101206_AAAVOA mendezvazquez_a_Page_025.tif
224facf03ec4de18c54e2c51a53d6fb2
e50059d3e79cc2e1efe85b0f3cd04103b0655e53
5646 F20101206_AAAWYF mendezvazquez_a_Page_164thm.jpg
c236453b8fc9a26799b1cfb1bdb4550a
46e822900f71789ccbf66832914084481e45d06b
F20101206_AAAVOB mendezvazquez_a_Page_026.tif
ec19cc236beda0cff39c01556adef564
cff6390bec81ff5a08a9c3505b4ea9c36b0cd4c2
20510 F20101206_AAAWYG mendezvazquez_a_Page_164.QC.jpg
a038fd89c382128cde7917f3a6274175
05cd256a5bd251a5e9f6b61cb5d7b31df4c3fc18
5270 F20101206_AAAURI mendezvazquez_a_Page_223thm.jpg
811c94fadf22040c0ecb3232543e6eba
16cbe39007bf3c9b6cc78ccae5cdb5c5b142e99b
F20101206_AAAVOC mendezvazquez_a_Page_027.tif
a6c56868a3ba9c19688e1a93e2874b3a
65ee0c5518e1e132bee6d530ff8cd7e60dba44a1
5339 F20101206_AAAWYH mendezvazquez_a_Page_165thm.jpg
5739a9a2c3ecd6d0bceb9c6ec5c317f9
5133044a5ae67d0c66ad0b33cfd7d4f5fbae3076
7350 F20101206_AAAURJ mendezvazquez_a_Page_092thm.jpg
c36c5594ebf26cde001e05b1ea07c8b8
21dcad4dfbf7ae6acfe55da65ca159a616c51139
F20101206_AAAVOD mendezvazquez_a_Page_028.tif
68512d87b7dcc734b575ff8baa344663
55251f43baad820bf69c3e959dcf12f47d270566
5652 F20101206_AAAWYI mendezvazquez_a_Page_166thm.jpg
03aab29ce611333e543a1cbef1205298
daabb64a311a185b08cc7fe852c1f848313af525
69721 F20101206_AAAURK mendezvazquez_a_Page_198.jpg
d153d3b410a93c869176994937905be4
5a7a301b55f8784d644e273bff4afedd5354125b
19711 F20101206_AAAWYJ mendezvazquez_a_Page_166.QC.jpg
6ada33f5a4c1549fc7fe1b07328b9ad9
8e8d829610e9f5da47ae6e01a8229cb2f7c023c1
75052 F20101206_AAAURL mendezvazquez_a_Page_042.jpg
32bc529076aee0f122a2beef2e15bfd1
e9c6c76b3a88975f5244968f97f199a9460b9408
F20101206_AAAVOE mendezvazquez_a_Page_029.tif
0d795ce8dd8de73403c9a52897eeadcb
9505fa56d65986c066e2d3a336d9169213c4902a
5780 F20101206_AAAWYK mendezvazquez_a_Page_168thm.jpg
ef86d6db00ccbd821e4aa3c5e1832907
7d1b228cc4f2c15eadecdd41b473a355964b1e21
34183 F20101206_AAAURM mendezvazquez_a_Page_076.QC.jpg
20fb293161a6e158d4095290213133a1
96795809473b67b0eb7d40be0ae0410d8200e557
F20101206_AAAVOF mendezvazquez_a_Page_030.tif
b6b2dd0b983008124ee2d78af0c1c877
b48856bf2039e7468b3a0d0a48440d8704d00f34
5930 F20101206_AAAWYL mendezvazquez_a_Page_169thm.jpg
9c93d3072e91093ba4b44a0b40e0e2b6
6922461a0d84c5a222d79fb2e45ebff5c883ae8e
6611 F20101206_AAAURN mendezvazquez_a_Page_059thm.jpg
1c4fefa8d8537af4aaa714a6e3a0a329
9d4de5558abed591d1bac2fd06d1b1adfd8a0ea0
F20101206_AAAVOG mendezvazquez_a_Page_031.tif
8432fd4b01e2c949d5f446e99f885cf1
740e76d6cf3203bd515b1726ca3e99f22b83ade5
495 F20101206_AAAWLA mendezvazquez_a_Page_209.txt
312e30e6aa4858ab0ea57b27db732329
88ea0d9cc3f4c2d9483876f54d8c88b0b1d7413b
21508 F20101206_AAAWYM mendezvazquez_a_Page_169.QC.jpg
d9d007b9b32f20a7d9ec95e28d590303
392ab028f18ac1a9e6a52f3dcd6ee066b128f586
1127 F20101206_AAAURO mendezvazquez_a_Page_142.txt
d4e8f4a592b45ed4e71cdec8778bfac2
d3462f3a546465858a1a4a498ef61ffc4016a2ec
F20101206_AAAVOH mendezvazquez_a_Page_032.tif
d305be5c863131ecdc885d46601ba4c5
a80b6fd54509d8108258f4d062cabc151c243279
995 F20101206_AAAWLB mendezvazquez_a_Page_210.txt
6ed62a61f2964b0e8e24a541a913ef25
120829100ab339fab0dce2cf2694a9b6562bee4c
21141 F20101206_AAAURP mendezvazquez_a_Page_228.pro
1f670a5410ded91767c65ef7d9d5595a
3a5f7bc12b1b7e331876330d1c19f7443fcdb784
F20101206_AAAVOI mendezvazquez_a_Page_033.tif
b742542683e4120d4b8aa48a187233e2
f92238151fa3beac85b3bebb4413c534623a57f8
1360 F20101206_AAAWLC mendezvazquez_a_Page_211.txt
54de1a72062fb627db340618196e83ae
0c4588fde9a994c306359128d38c749017759be5
6120 F20101206_AAAWYN mendezvazquez_a_Page_170thm.jpg
63e0b0ec39bfc82a533efe16a1c83307
3090d57834705a43d667545f964ff8e44d664cba
49675 F20101206_AAAURQ mendezvazquez_a_Page_028.pro
b638066d3bcd817be4947e74be6e0833
f0d11d758d170214bf6bfb567d18073271ebd50a
F20101206_AAAVOJ mendezvazquez_a_Page_034.tif
4fa022f9040ea3c8d4e7668c4df40f8f
cac33ab5b30fe3633f52b95cb9e317edcfafdf07
1631 F20101206_AAAWLD mendezvazquez_a_Page_212.txt
c98dafae374b689fccfbad9bc2625f99
feb9ba09fbc537bb19c450feb6371ccc2b0a14ba
22710 F20101206_AAAWYO mendezvazquez_a_Page_170.QC.jpg
4c38f7f111d4c39fdaf1750f24112954
de66e0a246b6074a9a00958232a90cfd820a76ec
1051981 F20101206_AAAURR mendezvazquez_a_Page_028.jp2
c026542abbccdd1ef10dcd072dcb7d85
6e47a5bc82980d7ac544e10ae3de8bc8f6f6ee7f
F20101206_AAAVOK mendezvazquez_a_Page_035.tif
b670d1ebd687730cd5826f0378f2cd8b
bb255b2158015d264039b47d325166ec5f02e3c2
1234 F20101206_AAAWLE mendezvazquez_a_Page_213.txt
4bf63d763d8f41c1f626a414e6d10c3d
db5d871876e0baf0df8295544333619482485a27
51107 F20101206_AAAVBA mendezvazquez_a_Page_121.jpg
a5f42c5d58c1424054c8be55d6d489a2
b6d1879cf13a3d3cefe4f8d72974b03f8e21d23b
21548 F20101206_AAAWYP mendezvazquez_a_Page_171.QC.jpg
aa3d366ddaeb2d2acac81aaf30dca20d
b169594d6b58bfb72ee159a94fedc92ae91cbbc0
F20101206_AAAURS mendezvazquez_a_Page_064.tif
13359cb28116face7bfb4f2fba4ab192
d10b914a64a97367457330f69aa77fa10ad56636
F20101206_AAAVOL mendezvazquez_a_Page_036.tif
6819c27abb3ddc0c27857e97ea8774ca
6edef1b0f3a18dfc3a4cc80999d4e5d982c2c457
1675 F20101206_AAAWLF mendezvazquez_a_Page_214.txt
4fbe0ab051e15793f6580e97d06e32f5
5083aa16476c574daca8e3944b84fe97e75e660b
64880 F20101206_AAAVBB mendezvazquez_a_Page_122.jpg
02c276b7d1dd7b0fd545ec508e973f3b
64109045f21c520796e1fded684f36d9b9d68020
5667 F20101206_AAAWYQ mendezvazquez_a_Page_172thm.jpg
a2ac144dbb9acd59b731b70fe77213bc
4995ec1d9b51171f96f6ad117e01c08906e046bf
87546 F20101206_AAAURT mendezvazquez_a_Page_062.jpg
55b2fa6c801cfb23f887bbcad8ed50f2
081eb70c6f24ca9a1fdd6340d46ec9f1f20008b0
F20101206_AAAVOM mendezvazquez_a_Page_037.tif
d06842fd91eb2a66dcc363fab82154df
78c9c991abb8860cd06f9c96c636d86a9bad62a0
754 F20101206_AAAWLG mendezvazquez_a_Page_216.txt
ebaa21f10a85ae657f057dfb0a4d8a97
669d6e735b7a5754a6b4e65356d35d5e2b2c21c1
51984 F20101206_AAAVBC mendezvazquez_a_Page_123.jpg
a2e5fc6de6ca889a0f6ecc3d616f8148
0f4abdd7f56253cf6674019baafdbce90016e75a
19744 F20101206_AAAWYR mendezvazquez_a_Page_172.QC.jpg
3cacf4c45097d1bfa97c3b81df5a5d0f
e882c70560899af07a2d91433faed335a9d6a663
92410 F20101206_AAAURU mendezvazquez_a_Page_052.jpg
0bd3a4365ff184a3e72834f5c29d3d7d
a3637aa5ee8382067a723985c8574367ed742064
F20101206_AAAVON mendezvazquez_a_Page_038.tif
cc566285ba2c2b718b8ea6aa1d8e6c74
ae9095b4dc115cedb5e11ffe09b8b4c22b758f43
986 F20101206_AAAWLH mendezvazquez_a_Page_217.txt
275146abdbfe28b7bbb1d150f222c197
5b2f4264722571b7b057001e058d7e092081a372
75274 F20101206_AAAVBD mendezvazquez_a_Page_124.jpg
b27a2e729a7f1922d408b580bc33c4d6
47660d7b3ca1ddd4562229a5b81b05735588945f
6152 F20101206_AAAWYS mendezvazquez_a_Page_173thm.jpg
027995584b0f255ae087dc7a00eb64e5
9237e8ae43c3950a65f31914b75053bc95285e50
F20101206_AAAURV mendezvazquez_a_Page_152.tif
b59b877c890569c547689b7f8f59400e
160984f41509dec4b232b0ae027ea6e455c2b83b
F20101206_AAAVOO mendezvazquez_a_Page_039.tif
0fc5f0193b75c3cf00b6c4eefeed9388
bdc224a69404fdbc6b28978a9f0d67dfeda2cdea
702 F20101206_AAAWLI mendezvazquez_a_Page_218.txt
609b8759a5570ae9ef76682d3c25859f
fa88afa1f47ec33a44e77355a321cfab3d534cbd
53252 F20101206_AAAVBE mendezvazquez_a_Page_127.jpg
2013e69eabed2f2d61555eaaacda9ae6
17714f5c69d47280f50063370400f0bdb4a33a91
20950 F20101206_AAAWYT mendezvazquez_a_Page_173.QC.jpg
3d52d658cbae10fb69b26520bace074f
42babd65148f5c648c013d844bb9c696f639a2f0
757189 F20101206_AAAURW mendezvazquez_a_Page_022.jp2
5932718d9c8d687090a1fa5560b6a79f
d31dddc29c09823ab7abb18987a28a6f7543be94
F20101206_AAAVOP mendezvazquez_a_Page_040.tif
032d99fea0c55ee2b858e19acd61757f
771b4929ba59fcbaf449ab16fea2bfd5f584d019
2151 F20101206_AAAWLJ mendezvazquez_a_Page_219.txt
0d9b2821aca19a68ced214dd7033f498
5fd2335b0e1298f401018608e2dc34d94a70299d
47295 F20101206_AAAVBF mendezvazquez_a_Page_129.jpg
a4323b0cb2d4fc020044ec3f85dd4f0f
897f4cbe225214f94a8d39fd16387a8c0b7bee7f
4984 F20101206_AAAWYU mendezvazquez_a_Page_174thm.jpg
987a55e6bb9476f99b1dd372852caaf9
4eda9213a99b7b0a6b5811684f5bdd76b7b9737d
F20101206_AAAURX mendezvazquez_a_Page_133.tif
5186a0b74bbb239aaaf9f61a44cb6d95
691d5fb6c7ccc635add0dd6364ea0e686db68465
F20101206_AAAVOQ mendezvazquez_a_Page_041.tif
d7839f664acb334259a048e0dee94349
e025236f20f5547b722475c4000e60e861a73392
275 F20101206_AAAWLK mendezvazquez_a_Page_220.txt
17eed4faf60381ae8c6cfda3103f75b4
2860a849c075fda6940df5423b033cd1c7381ef4
70487 F20101206_AAAVBG mendezvazquez_a_Page_130.jpg
080b5ccf269efc1fdc3ada5269854ee8
71973be9573b38c44681df678b8b89df67a534a7
19706 F20101206_AAAWYV mendezvazquez_a_Page_175.QC.jpg
58471267907c7c0ab7b9858fe4b6fbae
af6c6112d75a4ffbb2c46fcc2989d44816a2bd1c
29267 F20101206_AAAURY mendezvazquez_a_Page_080.jpg
04918d10b3927e726ba72f3319e1665e
39d4015ab7ec6ece4134314bee5c25758d2e78b8
F20101206_AAAVOR mendezvazquez_a_Page_042.tif
336f5f096a6ca1302a4e65d8a87fbc11
cc09a72e824db55e838240855415ea9c248d8a78
1680 F20101206_AAAWLL mendezvazquez_a_Page_221.txt
bf3ddbaff14a9631393d8b52c38718fa
c11660d6331e3605132ae83370b20aa252e61b0f
103921 F20101206_AAAVBH mendezvazquez_a_Page_131.jpg
02ea9510021fd5bae232d2a4e2496859
f6a82520f56eb30d6577e75330b9684e49960c4e
7530 F20101206_AAAWYW mendezvazquez_a_Page_176thm.jpg
59576852fa656c77108a25cf60f971fd
821b202358114666401dd654c3bad09dcf9bed90
21191 F20101206_AAAURZ mendezvazquez_a_Page_206.QC.jpg
a834437a07d39949a1662042636a3cd0
6ba58448981ed1915915b374382bb70ea5f8be7d
F20101206_AAAVOS mendezvazquez_a_Page_043.tif
1cbbef42b60c5643b2ce52eefb5a515a
73ae1fcb44f8d56c888e02f7b1f91494fb33d8c6
898 F20101206_AAAWLM mendezvazquez_a_Page_222.txt
0ead32bb448a1d169080d29f376e81b7
551dd6db54e23b6a167ef01bc5f26a16bc056595
100591 F20101206_AAAVBI mendezvazquez_a_Page_132.jpg
45100781182d1600179067e6cd3db197
231bbe53114ef3c4433061974e90a527bfc5ed36
28785 F20101206_AAAWYX mendezvazquez_a_Page_176.QC.jpg
5515df9e9c7e8782a1481684d3b12ef0
af315d9946d3b6a2e2ac0b9841af5789184d6e1f
F20101206_AAAVOT mendezvazquez_a_Page_044.tif
e2c5b28a65d01400180c1ffe7a13acd8
ba7db976330b8bad7360d834c82197d39638641f
1408 F20101206_AAAWLN mendezvazquez_a_Page_223.txt
fa4eabd13ab0713571df0ed6f7fb374c
476a74734de01d93d24506007fa7b0fb6aa3078b
99841 F20101206_AAAVBJ mendezvazquez_a_Page_133.jpg
1a148c79d6bd70eb1dd001e50f6fe2ff
9a01304a868cc2f8b51a15a7b404c74b387bd1c6
8319 F20101206_AAAWYY mendezvazquez_a_Page_177thm.jpg
1ff0fb7a9c0ae0cc39701d7416835d63
7455ca5bbd1b5b436bead7b0e865a0d2eb847e28
F20101206_AAAVOU mendezvazquez_a_Page_045.tif
1d964009f9ba2276a1b421cb81b036ec
a6480d4036ed0a3c36661016b5c240f27203e8a9
172 F20101206_AAAWLO mendezvazquez_a_Page_224.txt
def1ac81a0aa381159075aea3b74b7aa
913c37cf1de5ea987ba840c042abc7b99c623459
106800 F20101206_AAAVBK mendezvazquez_a_Page_134.jpg
7715e42c09b8c2a6fa2393d894def636
ed1d1e1a12de0505c4c6131ec98603daf02efcdc
7493 F20101206_AAAWYZ mendezvazquez_a_Page_178thm.jpg
b439323481fc1fd83c12270a669c5bcc
1cf3feb5d81dbd32275ab187d309967b1090cc05
F20101206_AAAVOV mendezvazquez_a_Page_046.tif
416e518e47d986015dfd9c7638a5da98
88d5c9efb379e9a2469d9cc17ed66624677a51c6
2437 F20101206_AAAWLP mendezvazquez_a_Page_225.txt
ac9ed2b1b9250e7aeafe7322274d1b8d
5a07a6c63cb002dd0397f14cae37828b1db70f49
106648 F20101206_AAAVBL mendezvazquez_a_Page_135.jpg
3979b276ac5c54c34de9bb6a9185b50e
52fb3868f1742ede9639e6cb70c1510ab3cafa3c
F20101206_AAAVOW mendezvazquez_a_Page_048.tif
56eb7b2132a8a7039a2c0eb1004654dd
d42e63cc3f87c1e780ba95c725c3d651401dbd0d
725 F20101206_AAAWLQ mendezvazquez_a_Page_226.txt
6e2436ae4453fb2046c1b51bb0ecaa18
3b88219b6685e64743569977a0287637b9b6efd3
50838 F20101206_AAAVBM mendezvazquez_a_Page_136.jpg
39b7030b174edc0de9f9a84f89d46d4a
0b5bf9d27a980bd43c1d77eed6176c0a05db6961
F20101206_AAAVOX mendezvazquez_a_Page_049.tif
b49cded17912c004bf47af4096217ddd
73a54176562ee0ab16933e783ee9bfcd334253de
2155 F20101206_AAAWLR mendezvazquez_a_Page_227.txt
626cea867e3f8a99a5bd801f68cc1891
aa19e14f5e452be608d663b29af1b67b345517d4
64075 F20101206_AAAVBN mendezvazquez_a_Page_137.jpg
c1ddb0eb001ab861972ea56ac7dec089
85315fc3deb6837bd9a211115da5795e361753be
F20101206_AAAVOY mendezvazquez_a_Page_050.tif
d82540435500d0b8ec97f32bbd27863e
7e52f3bb658f5cc915c6afcbe698466ee3312948
835 F20101206_AAAWLS mendezvazquez_a_Page_228.txt
d4cba305e5388c0dfc6a129af5f5b7cd
2f7919f1433b8f7009a54db357551b9bb7047786
56953 F20101206_AAAVBO mendezvazquez_a_Page_138.jpg
041fca2415de5cfd1596f473a9e7bff8
b58ef90e648896024d41348270120c2939b9d1e6
F20101206_AAAVOZ mendezvazquez_a_Page_051.tif
c9ba10c0d24beaebf05bcf19dcc77f3c
3cff99649584f6f4a2b1260280b583d6a4be142a
2014 F20101206_AAAWLT mendezvazquez_a_Page_229.txt
b35bae2694d797fc387511e3fcc880e8
2f3750ca35acc5514ba7d14ac700060c11e8824d
43753 F20101206_AAAVBP mendezvazquez_a_Page_139.jpg
ce2d5e879ea216bdc43e4eb20f479c75
65bfd7f173bcc092a18b85b66e5527f807976e26
2089 F20101206_AAAWLU mendezvazquez_a_Page_231.txt
ad97d6b2879764e25e22a217c3b09d4d
28bf34f9dcb009f555523f09669ad4204843ddbb
6924 F20101206_AAAUXA mendezvazquez_a_Page_241thm.jpg
4f535fd859b76ed4512fe585385d8042
6986d2885a2de1fa5df1554b58b5988c2d66719f
42731 F20101206_AAAVBQ mendezvazquez_a_Page_140.jpg
5e13269c40dd1b30200ceb11b7d07ac0
5a764e5c8fd5a81512c1789e8a9f3120354a569e
2044 F20101206_AAAWLV mendezvazquez_a_Page_232.txt
1d22392355bb3c189a88a250a9eacf26
f695170ae946e438d6a7f935004006b975d7c988
26925 F20101206_AAAUXB mendezvazquez_a_Page_120.QC.jpg
561ff04ba89a5ad42b886fe61684be97
eb66eb2fb70d99e238d80b1ea2b0589d0b434f96
58668 F20101206_AAAVBR mendezvazquez_a_Page_141.jpg
a0be9ab1bfc3b988035056081212c7a3
da199d98d27acae9e70b4f57cdae573894b70f96
F20101206_AAAWLW mendezvazquez_a_Page_233.txt
480da97ea45a5798725d32c108fef1ed
4866338990e997bf8f979a1244a99544700fbf4f
308291 F20101206_AAAUXC mendezvazquez_a_Page_204.jp2
55ecadfe19d03f9b258952a62cb3ea0a
84f56128bdb7067bf85869f65face9e585d9806a
60067 F20101206_AAAVBS mendezvazquez_a_Page_142.jpg
17d780f3bd00ed7c29bc4d9ca6c5a5fc
6b6da4530dc859932bc22db4263c8d191d51bd3b
2216 F20101206_AAAWLX mendezvazquez_a_Page_234.txt
e17345f6395653ebea8831046fdfc3aa
4ef915d0933a77837535cd6348b05a04ebb88d7b
287143 F20101206_AAAUXD UFE0021658_00001.mets FULL
527de9461092971d5af79d49e10f2730
25476ee73730cc5e1e7abbbae34f861a3ebdf3d7
70774 F20101206_AAAVBT mendezvazquez_a_Page_145.jpg
46e6691207426217c39a4840aeb6b1b1
bd476385ba74cabc6e516252e1ea91ab0188530d
2235 F20101206_AAAWLY mendezvazquez_a_Page_235.txt
0223ec2bbddff1e19ed7fe20a56d604a
52d2835afaf9b6b50196bf56beb1b080c01e4b54
31693 F20101206_AAAVBU mendezvazquez_a_Page_147.jpg
cbcb2e0ea31529f9732061da0bb52d31
8f83f1337f8e2f3b54e13c74e92e61c1a35eea98
2154 F20101206_AAAWLZ mendezvazquez_a_Page_236.txt
7db39ff67fc4371600c4adaf89f375b5
b80f7caad48b776d9ebacb190fb63a3f3b753b05
38280 F20101206_AAAVBV mendezvazquez_a_Page_148.jpg
9a7c6d3c184e025514d973f58ac8b133
fd65c9b763c989b5d3daadf3a0801d3682d49f3d
26616 F20101206_AAAUXG mendezvazquez_a_Page_001.jpg
2173c08f4fc5031880b36e8b3d4ca77f
2eb5bb6fd46b6dbac8a04e6c494fee2846c9138e
66798 F20101206_AAAVBW mendezvazquez_a_Page_149.jpg
db460b3ba4aca62ad79552ef30796fe3
bb2d8558b6ebcb987a4812d1de244f6f8b987364
4502 F20101206_AAAUXH mendezvazquez_a_Page_002.jpg
ac23f07dc32e654bcdde8c9d67c0d432
6f185144ef75ba74024cb003e81a9ca1ba2c996b
F20101206_AAAVUA mendezvazquez_a_Page_199.tif
5d01785a0e38324089394d4d6382e600
e9ba731a20d495e9284830266ddf47eccc6e3922
13196 F20101206_AAAUXI mendezvazquez_a_Page_003.jpg
c8f0a22cbb3454670f4dd7be28e6ec6a
9a6d97df61062e50b5dfcf9175ece19a4dbfa7a4
58902 F20101206_AAAVBX mendezvazquez_a_Page_151.jpg
7b2a7da3a4299f1542075caf2c3aba20
137d3b4c369f0855a3bb0e607f87dc58d8856fbd
F20101206_AAAVUB mendezvazquez_a_Page_200.tif
170a105d138696220c9d017bc48cd61b
7a813a02201ea5eaf2a3c0426276bac381381628
43966 F20101206_AAAUXJ mendezvazquez_a_Page_004.jpg
832d6d5b4f1f4723754ed730de256738
1e69aff73c4f189149919846cf2616ef1ecba08a
39011 F20101206_AAAVBY mendezvazquez_a_Page_152.jpg
2a84b17b0cb71070723baba56ccdd7e5
f8141fce12998ad45a79d5a9dad6772dd3f9db1a
F20101206_AAAVUC mendezvazquez_a_Page_201.tif
4c5e03e95f4e47d45dd8134df59ed026
098953eeb330578b2a95fb8deb54466ca91f4b7c
117802 F20101206_AAAUXK mendezvazquez_a_Page_005.jpg
88b962760e61c350681870ec17e9af1e
ae98bb06c0bbc8f7bd75fdb011db5e077f638a17
18338 F20101206_AAAVBZ mendezvazquez_a_Page_153.jpg
c2c775a731cec183ab80782c4c11cb1a
76065eaa5a391c2e5f5f96460a853b3e596d426e
F20101206_AAAVUD mendezvazquez_a_Page_202.tif
2f5c4afa0cda7cc0486215965470ac86
7c1979fb9c4f2e92d081afe77866d46bff439f61
143731 F20101206_AAAUXL mendezvazquez_a_Page_006.jpg
f386dfd165feaa12f9f8127e1c7dbd43
12168a423ceef9743573fe8392ea6b9e025623f8
F20101206_AAAVUE mendezvazquez_a_Page_204.tif
e36f44d7ea3e31e31b1254228fad0b56
348da92492af811b29a397f31be630afe326c80b
149180 F20101206_AAAUXM mendezvazquez_a_Page_007.jpg
9aa7ae8c395f0eb27f0f930639e7e75a
9dea81d4cd0b1be17b8afeb2034579c5d7df7c64
F20101206_AAAVUF mendezvazquez_a_Page_205.tif
0469d12ddc7d1003bf4875ef7db8c877
6d86730934c1d9a90052fc9585e2d235b6c8c914
137674 F20101206_AAAUXN mendezvazquez_a_Page_008.jpg
e2924580b11b35673fb0c0f9519ecf2b
edd10019a93d2aba8698cbe2bfc3e248b8ab2924
F20101206_AAAVUG mendezvazquez_a_Page_206.tif
49c74f2b1b1b2647da5a3703e729fd48
949e05e0ee7d79b16cc6b1bf3af60ec85fdfc7ad
F20101206_AAAVUH mendezvazquez_a_Page_208.tif
1db5e7b505a7baf80bb1dc88fc080fca
729af496b9ae2af785bb418e074a32cbc5c40650
7673 F20101206_AAAWRA mendezvazquez_a_Page_034thm.jpg
df1b05e08615997aa3f8febeb3377a00
e598ca3975c96c22fe594242bdef677b8260cac2
43828 F20101206_AAAUXO mendezvazquez_a_Page_009.jpg
9a7deb722905379d795fdb584b1d1ed7
9fe2f2cdab873832e9c153d8183fdbf08a17a472
F20101206_AAAVUI mendezvazquez_a_Page_209.tif
e2db156b96cd4b0088aadaa0efd10cae
f9269c6802c79f8326dbde751ae45cddc55d3a89
29275 F20101206_AAAWRB mendezvazquez_a_Page_034.QC.jpg
612adbb7260edb1696d53b04d095bae5
0356f63d3e6709631ce45283564b03de4854f0e5
139976 F20101206_AAAUXP mendezvazquez_a_Page_010.jpg
ce465ce8480b8fce8d19b02ce4f6083f
56a15cd0b5cf55bea58b29a3d6a7141b4d6c62f2
F20101206_AAAVUJ mendezvazquez_a_Page_210.tif
c7969d26a7fce107a14eaf9f6193b0e0
d092690202f26104a12ce9f02b336e9062f033a5
7564 F20101206_AAAWRC mendezvazquez_a_Page_035thm.jpg
7489801f3996d6871a0e75dbd6c205a5
64ed3ce3c525bce4f85756be6b9fee35a1f18075
132417 F20101206_AAAUXQ mendezvazquez_a_Page_011.jpg
fede095c80da560a9eb4743cb70ca708
d1bc005ff00439286dc53be4f998ee48af055ae6
30216 F20101206_AAAWRD mendezvazquez_a_Page_035.QC.jpg
33a73b212045f21f9413ba6de132e039
0203d9c8efc01976f35308f85f9322d69c71e560
24256 F20101206_AAAUXR mendezvazquez_a_Page_012.jpg
46afe99b920eb8db5f6042c9babdbc91
b7450846e3375ad0a0ed12bf814dbe9a7343011b
F20101206_AAAVUK mendezvazquez_a_Page_211.tif
e498b874ba0639f6e90b6ad9174bd3d4
6fead19fdc0de8d3a12c941fa93e87b491f90509
F20101206_AAAWRE mendezvazquez_a_Page_036thm.jpg
82f86f4bb4531ec44ca76b04df753187
dd288e2f0d174708a285c4e31d59ed85711021bf
894344 F20101206_AAAVHA mendezvazquez_a_Page_061.jp2
553f4549ab617de5fde8186c6ac1cdea
19989a6e2f1e747d515543a1354677fe978e2138
118667 F20101206_AAAUXS mendezvazquez_a_Page_013.jpg
c5fef6dfcefc62207668dde5be308051
d96da10314a5f8616f1ecf9ee3fbd16edf06954d
F20101206_AAAVUL mendezvazquez_a_Page_212.tif
7e2316cbf9e1ab4effb70a0a7f978949
89bfe5e7b578ea7ace66834c64d47268720a4579
27951 F20101206_AAAWRF mendezvazquez_a_Page_036.QC.jpg
408181e8d3af95d9745ee36a51621954
5c394541804dda45c3d39858d2037b35f2cf9a35
968362 F20101206_AAAVHB mendezvazquez_a_Page_062.jp2
a87130d9a18e75ddf0cad3cd9e6a0ced
c8aa1012d97981d9bcd0445b934227bdf7020b54
145610 F20101206_AAAUXT mendezvazquez_a_Page_014.jpg
793f6c1134c201c9bb1dd9743f3aac11
54be53076cd493c061f356989326c00ad77ae930
F20101206_AAAVUM mendezvazquez_a_Page_213.tif
39c67323ebaee092eea9aa627e305f4c
22cd5b72896a51d1e85ec67235fc955cd856d1df
1051963 F20101206_AAAVHC mendezvazquez_a_Page_063.jp2
da32726342e69e7102f9c6c45bd6284f
2eb8171d8560a85af2b2b364342d3bac1a96f4fe
144643 F20101206_AAAUXU mendezvazquez_a_Page_015.jpg
dfc0728dfe2d8e4421e95f135447735f
9806f3f8b19e67a63ff2368b6170cebcb221109a
F20101206_AAAVUN mendezvazquez_a_Page_214.tif
bf3dc12a80b1170d5de778bac2ca4f50462f283a
1511 F20101206_AAAWJO mendezvazquez_a_Page_168.txt
2e8dd26c7131215db734bebe83c6a0a9
d0b883dfd2e366455e9a5284075e3fac64cd11e4
3824 F20101206_AAAWWZ mendezvazquez_a_Page_140thm.jpg
8a7cfa22fbdd9054466fcbf0e5705837
998372ebe2d98b932997d0808723ef92e79e6ff4
117140 F20101206_AAAVMV mendezvazquez_a_Page_235.jp2
2ac9a2c86b07044fad9937dffca96a3e
1ba1a0343ca0219f17db264c0cfbbc12c88c678e
1777 F20101206_AAAWJP mendezvazquez_a_Page_170.txt
b49f5586499f9c6b26710d92dd061147
fb9c6a3064476a80b8dd6cd450bcbb5abd4d3965
111812 F20101206_AAAVMW mendezvazquez_a_Page_236.jp2
a1de7634ffb120a5f76b135c87bd5e8f
5e5ce5aea2145952bf6aa5b096d96593b46978ce
1817 F20101206_AAAWJQ mendezvazquez_a_Page_171.txt
d63cfe36ba1807c50850c1b121c059cd
70f6cf6cf6719c2d24f2042d4162e6cf6c52e6b5
115252 F20101206_AAAVMX mendezvazquez_a_Page_237.jp2
1fb22caee59fc2e0d094a227e84ca57a
ea8e7d3ca152e9fd0056ea1719df767a738173f7
1761 F20101206_AAAWJR mendezvazquez_a_Page_172.txt
5def23b80121ea356c425ce66b1009ba
be8438ca047b54bcbaa548200f63a3a2e585f350
124677 F20101206_AAAVMY mendezvazquez_a_Page_238.jp2
72561a39eca4174c25ba6105f06013c4
2a12275d3fbf5452cb5e8c5eadb44d68642ffa43
1249 F20101206_AAAWJS mendezvazquez_a_Page_173.txt
55f27f3072a21ee8002c6ec2b97ad200
cf0da9bb79b851a4acf7d0a8cbb2aa771797f870
125702 F20101206_AAAVMZ mendezvazquez_a_Page_239.jp2
f562919b9a93fbfaa956504490dbb6bd
5b36f689c46cfacc08051fc2476206b85a8fb2ec
1476 F20101206_AAAWJT mendezvazquez_a_Page_174.txt
818c0c1651ab1bedb23fbe42d4936bcc
3d79aa275a96e482433376e893b7f5c7121d235e
1538 F20101206_AAAWJU mendezvazquez_a_Page_175.txt
b86a0588e138a0149e2c8749b3de516f
a3b90fd3ce035e490a7df0318076b8385c87b7ed
36768 F20101206_AAAUVA mendezvazquez_a_Page_085.pro
9ccc8800e01e6e100656d5e2a9e61c17
fe675f4894e17845732cfad6826b816e9d01f8c3
2103 F20101206_AAAWJV mendezvazquez_a_Page_176.txt
eb85e96dd3c99da16bdecdb5a77826fe
f36167dd8ddcc72640c6b8ad22e9807ec80575f0
21579 F20101206_AAAUVB mendezvazquez_a_Page_050.QC.jpg
57bf7e9dcae519e191286ef0c26afd74
fab39e0ac5ea8664d8e2a28c7fae4f483559c87a
2224 F20101206_AAAWJW mendezvazquez_a_Page_177.txt
a74154c48840e6b88cc7ab1f0226e36e
1f930ab958a5a67f3d9b32590774f8c68be6af49
46686 F20101206_AAAUVC mendezvazquez_a_Page_034.pro
d7d5de04fffbf730aa05b2b19cb2a7b7
127f4506ac00783a2bb6276c0154e115915438c8
2015 F20101206_AAAWJX mendezvazquez_a_Page_178.txt
5d4cdb92b9d431141d2cf9203f670975
621f5c5fecb248d3cc89f9bf6968c79de794356e
2640 F20101206_AAAUVD mendezvazquez_a_Page_147thm.jpg
6a34693456c29df0a3b8133489a25d87
3686f5dff69b3cb242d8a085c11197454720fb66
2050 F20101206_AAAWJY mendezvazquez_a_Page_179.txt
59cc57198784aed33389e0244792fa01
81cf552ad3cd3c20816274d56b0a473a1f7f93ac
111089 F20101206_AAAUVE mendezvazquez_a_Page_101.jp2
c919250095d153b51d80cb00cedcf623
d39576a2b9636cffe86be56803cccd8d3ffae4e7
1916 F20101206_AAAWJZ mendezvazquez_a_Page_180.txt
30eb7ac9adcda8abd36ff14bb54771d7
56f06b2d530cafbbaf7fc91dfc5862d09630d1da
313042 F20101206_AAAUVF mendezvazquez_a_Page_208.jp2
282475191117cc7ebe5d10078e9d2356
ebf162e356ea922934b0eab9c31ee15947b53bf8
3071 F20101206_AAAUVG mendezvazquez_a_Page_101.txt
9a0e292289369bdf2bc95cce7b034c52
febcae639df2570109a3fd1f4340396a669af04a
68658 F20101206_AAAUVH mendezvazquez_a_Page_033.jpg
dfcd331a8ea8fc06fcd0b08f72ac1ee7
d257708e27b3f82c0fce92fceb5c1dd0a5f60324
F20101206_AAAVSA mendezvazquez_a_Page_137.tif
bd144a2ad37bda781ef4b1d6b18191d5
7a457040ada853c569b9464700b59879e65472bd
587 F20101206_AAAUVI mendezvazquez_a_Page_158.txt
a777af94dc3830360e5104cac2211a3e
27ff4763af2a0e248a29d1b01004950d1865126c
F20101206_AAAVSB mendezvazquez_a_Page_138.tif
4a11bf4dafdce899f27cc86e28e07cdf
ac2b3be514a9aa3099d888f4d84a282a31d59ab0
115335 F20101206_AAAUVJ mendezvazquez_a_Page_096.jpg
aab9107c074963d3553ae4480b690fa5
ad19297f8c7948112ed53c5026a7649e23336b12
F20101206_AAAVSC mendezvazquez_a_Page_139.tif
e7d8417d1ee4834466b85422101659eb
183781114eb878aab8e28fc1789b212b1bc0c066
F20101206_AAAUVK mendezvazquez_a_Page_203.tif
3fffece123900170328ea71edeb62e65
d2a97e7076ddb75e1e8243969e23b02d5f609fcc
F20101206_AAAVSD mendezvazquez_a_Page_140.tif
334bf4428ac4b7e1d4c910261f574eb7
660e10dc4650be659637e56c73078d320551324f
F20101206_AAAUVL mendezvazquez_a_Page_164.tif
1fc8f3d7c13ae1dd0c3d0590463d4764
4aed9a3562c6061be6f15e6b19a2c1366cd94340
F20101206_AAAVSE mendezvazquez_a_Page_141.tif
9374d509652b94d679cedbc4e2182034
2216f363538574169824987dd50851204292455e
F20101206_AAAVSF mendezvazquez_a_Page_142.tif
2b6ddd250d98a9bb8a4a933ebd86651e
cbd0b5bb0989c5e3a54e473486c6403e7ae444fe
62715 F20101206_AAAUVM mendezvazquez_a_Page_038.jpg
f1b893922176ea0461e8bd0dc35a3af1
790b9075473219876ecb43f79cc02dd2f7084b3e
F20101206_AAAVSG mendezvazquez_a_Page_144.tif
12591568e5587e0c7a041676846883bc
96ed79fb664c6d12e5da8adcf9ad33fe8e970f9e
1037 F20101206_AAAUVN mendezvazquez_a_Page_157.txt
320e33d3a161efc805126a91e1dfc63b
59c1930a333bc61ac7f7002ff79672f282cc4d5b
F20101206_AAAVSH mendezvazquez_a_Page_145.tif
ebcb5c36763b86821d119f2906551c68
c7556edb2cb28245f0f7bdf3b1050813d2bfe1ff
17632 F20101206_AAAWPA mendezvazquez_a_Page_174.QC.jpg
62a941f4b07f81a1a6f4e8e367bfdd91
3696f8346f97c42d9210d3f6f4c572b098dbb8ac
34992 F20101206_AAAUVO mendezvazquez_a_Page_208.jpg
9641767c35df35ad5bca90ae1a370876
11ea220b4a58e68d456b3b754d3cfdd369e71ef4
13239 F20101206_AAAWPB mendezvazquez_a_Page_017.QC.jpg
4dcddf9f23ceb03832039482937e51b8
559b6e9ff9fe62449984973eb3b6aff554d67acc
3331 F20101206_AAAUVP mendezvazquez_a_Page_195thm.jpg
835045a520fb02c078f8d13f2417a975
685de9292fe657081a9f86955a5aa97b2ba0c4df
F20101206_AAAVSI mendezvazquez_a_Page_146.tif
cdf4a322ef7783487b985d0baa51d27f
a7e5d3455299146a3abc2a04777a7d719aeed79c
8625 F20101206_AAAWPC mendezvazquez_a_Page_010thm.jpg
3b09b2ecb62fc90bd0f575219128b320
2d86269a41e6f5bccb13c0dd7abfccb81c7a561b
635227 F20101206_AAAUVQ mendezvazquez_a_Page_166.jp2
251da9f5ecffc5df9d2a3899ad5d483a
f393d64fab1c3a7c40122fcfb2599f531e3018b6
F20101206_AAAVSJ mendezvazquez_a_Page_147.tif
3fdafa8827c34a0c3afa05377a9ed5c9
8c7f833aea1878bbfc2fb1c840d9d2de728f3fe4
3452 F20101206_AAAWPD mendezvazquez_a_Page_162thm.jpg
8c0f59bbd0c4c2a011dd19c66caf8a93
b086e46dfe12dc687101be89b10a53e4c5829540
4506 F20101206_AAAUVR mendezvazquez_a_Page_156thm.jpg
5e01280b8d43d79ea3ec48ee233cf0c2
6fcb5b97d4396ab17ff93d7019521e0fab57b60c
F20101206_AAAVSK mendezvazquez_a_Page_148.tif
3fe4876164005ec16eb179888c12e2af
367155151027c3db65c89635a1e95495912ef1be
25105 F20101206_AAAVFA mendezvazquez_a_Page_001.jp2
a3f59a00c61c3b00e8c3babc05a69900
5a17ff8ad56342f3843a4db0fa59ecbbd8fa5784
1885 F20101206_AAAUVS mendezvazquez_a_Page_230.txt
896573bcf2d3e65fba06887b71677551
ee2ba5e66f90d4d3ef4eb459103af8d85755863d
F20101206_AAAVSL mendezvazquez_a_Page_149.tif
5c3f6b1888366cf97dc44675d2e98140
39948e792af8c6f5f869b05d2f8153ba9c90b7de
20298 F20101206_AAAWPE mendezvazquez_a_Page_054.QC.jpg
5033daf84268a2c8953c06769a5a7e9b
d3fb12d3cee34cd52bfa5ba4add70f60bf4014b1
16393 F20101206_AAAVFB mendezvazquez_a_Page_003.jp2
5fcea835161f328354f6255209437435
ae2cb2e1429f12488146ddda3edd071c7df8f226
46987 F20101206_AAAUVT mendezvazquez_a_Page_047.pro
a5baa5059b5a652864a51800f25f67fc
bc8868accb2e83b1cea0997aaf7ee4c10701867c
F20101206_AAAVSM mendezvazquez_a_Page_151.tif
45f3100bb3d3266263bdb1f525604a6e
c69037a27fdbe854f8e51718d4c9b329b65435df
5529 F20101206_AAAWPF mendezvazquez_a_Page_175thm.jpg
d7d3a4ca26b02a86abdb694dac640a2d
f990624048cf8fd95b81ac16a71ff8a303ae39a8
44532 F20101206_AAAVFC mendezvazquez_a_Page_004.jp2
089e4a63ac53743720ff17cfbfda27d0
8b2f9f937a10177045f5f5ed046e614883d3b320
628448 F20101206_AAAUVU mendezvazquez_a_Page_169.jp2
7124bd141ffe6691334bcf3a33a517cc
47b89a1f3befbf046cdabcfe43393df3182225ae
F20101206_AAAVSN mendezvazquez_a_Page_153.tif
d6a2ae181101131a3e93f97717e84047
0b3602d064077eca8d39f865136e34af5d891afe
6014 F20101206_AAAWPG mendezvazquez_a_Page_171thm.jpg
55b0b1aacf227c61c9573dccb0974294
74a4363dfc9c4bd642f820fbf6c97cd9022a960a
1051970 F20101206_AAAVFD mendezvazquez_a_Page_005.jp2
ff005daa8c979a8c0d9496d1039a3fb8
78d671f44fcfe72ef36e1b587288ba22de9cc135
21143 F20101206_AAAUVV mendezvazquez_a_Page_188.QC.jpg
e5f1a474a68afe79cbcc361fc5974e25
dc01de394b6982676d78b84f0244f34116fb8a6e
F20101206_AAAVSO mendezvazquez_a_Page_155.tif
5322540659dadb2e0145016869c4b5f1
204a26976a1d5ae01f39c5cf12c6321c593c439d
26382 F20101206_AAAWPH mendezvazquez_a_Page_049.QC.jpg
141e1334ebfb99082a7274084fa61b09
ae1eb7cf09207243e80e78bd1c79712069486c54
F20101206_AAAVFE mendezvazquez_a_Page_006.jp2
155e50eafeae92c5b8e6e6193112c035
d7f00b7f690ffbddfccdf5e016f25989e00938c3
4449 F20101206_AAAUVW mendezvazquez_a_Page_141thm.jpg
3749f292fb4cec9c536f1b4873d46e94
86bdb64233b6a3906c3ac3db162be5eb097d1f7d
F20101206_AAAVSP mendezvazquez_a_Page_156.tif
7e2fba0189ecfa3ed7031b45deb4f16e
db4964e13e1f144a9368c9406c8846e5c6fcae7e
3404 F20101206_AAAWPI mendezvazquez_a_Page_148thm.jpg
bfe19b4dc9bb7c3fa49962be69839c94
e095b55a07b870d69af12f9921a36f82377fa745
1051975 F20101206_AAAVFF mendezvazquez_a_Page_007.jp2
03b28d2e7aecc345ca88a384dca8e63b
b5fba6d409c3ca03fe106bd8229b1038039e01d4
7485 F20101206_AAAUVX mendezvazquez_a_Page_020thm.jpg
8d179f7dbbac185b37d464fa18ad4943
a471f4fdc47946781be1de503c1698c1e2686b5c
F20101206_AAAVSQ mendezvazquez_a_Page_157.tif
88bf3938252c0584ad84a48cbc4d14ef
2ed057a9ff576f1e6c37b5c611903280227cc296
5890 F20101206_AAAWPJ mendezvazquez_a_Page_032thm.jpg
49152011f4fb338e22a68d49b1aa640e
9e53f4244ce22968c897b8551ef5f496e44ba122
F20101206_AAAVFG mendezvazquez_a_Page_008.jp2
0c7838bdc5450507cc9bd73152525c78
bf87a68fe6c4722eed85de6ce9e33453018fff0a
2169 F20101206_AAAUVY mendezvazquez_a_Page_207.txt
aab9026a17bdec44385f39be388a938c
d229e7e50885863d5b8573e1dff13924015a1de9
F20101206_AAAVSR mendezvazquez_a_Page_158.tif
0ff1c9c4cdadfed6aaeedd6493bdd826
8d61bc352dbc6f1691a67492db43bc2e26cc8f73
F20101206_AAAWPK mendezvazquez_a_Page_005.QC.jpg
6ecb31425da74556ed75e6c902d71bfa
5c2ff2f7d0ed0cdb421e7e6236b161e362c76784
697457 F20101206_AAAVFH mendezvazquez_a_Page_009.jp2
ade525967060788b47c8dafc7417b18b
b9a71c9abb5c7c34c129ad6053573011b4109ca8
21821 F20101206_AAAUVZ mendezvazquez_a_Page_033.QC.jpg
ece0b190b02018e608a82902091e2f8a
d42ce7d97e5c54e3fe4ec7e0d49d385f3e3ce050
15396 F20101206_AAAWCA mendezvazquez_a_Page_195.pro
2321ff15b917a3a7ea645bf8b625ec0c
e61be60d7cc980d72504cf8ea849e1f9a93351d0
F20101206_AAAVSS mendezvazquez_a_Page_159.tif
c75748c3a1c33f95c60d23935b3275f4
ddc8534bbbd883f309176ea7ee2de4a27c19180c
372948 F20101206_AAAWPL UFE0021658_00001.xml
895eea235681ad9ba5ee8bedd75c36eb
70209580fe6221b122a504674ea80622001a51af
F20101206_AAAVFI mendezvazquez_a_Page_011.jp2
609961225645b8b0584b5caafe389f46
97e6935e851c97836ee96ee2d41a3be2c9eab178
20447 F20101206_AAAWCB mendezvazquez_a_Page_196.pro
62ee7f7d316381367acf839c93317802
518209bac42c9ae918e8da2ce16f13b4356daadc
F20101206_AAAVST mendezvazquez_a_Page_160.tif
0aff7d8a869e5ebdb3ba78414af03991
df7ad1d56a7ccbc1c8c8b72c8cfffbe6a242f46b
8219 F20101206_AAAWPM mendezvazquez_a_Page_001.QC.jpg
dd9ce3788fb12d71331bc29ba5dcdacc
f3afc7dafb7abf3cb4d7b8b974ab736b64f6267f
387058 F20101206_AAAVFJ mendezvazquez_a_Page_012.jp2
cca05f34789189dd7f84b37a2f1ab97c
35fb1de51c0401d96b293a3e6c89ee7c883bad13
31235 F20101206_AAAWCC mendezvazquez_a_Page_198.pro
b4a4668859e2f205e7cff7c4ba99f88f
0741c56f48bee9122dae814b28016943bc72be6c
F20101206_AAAVSU mendezvazquez_a_Page_161.tif
32361bd794dffd725fbf80066d02212f
d4404b6166f84bf04bcc2adf4f1c9fa3f8852a4a
510 F20101206_AAAWPN mendezvazquez_a_Page_002thm.jpg
f04ec789fe479cad4f6df31ac83f58b8
f05466e175df38047bd93ea072943606b618968c
F20101206_AAAVFK mendezvazquez_a_Page_013.jp2
792af13a972e303750dd87502a01e5e5
19b966712c478f20e390442c273114b28b3ab1ad
25048 F20101206_AAAWCD mendezvazquez_a_Page_199.pro
3cc15ea87731587bd6b7387141ab17ee
b5e1628a96922d2b1e89e02ab4da1355fe6074ae
F20101206_AAAVSV mendezvazquez_a_Page_163.tif
d9287738d9be3e43f277ea7e3403e1db
72e37f0c38447db6248f251a4a42f6d0d16c8ab7
1592 F20101206_AAAWPO mendezvazquez_a_Page_002.QC.jpg
a9bf479e8fdb91ebcdefa9062860986a
e2657dfa110040609b46166e2da062ee07bfb4b8
F20101206_AAAVFL mendezvazquez_a_Page_014.jp2
912430983361ddc6c9720732fb71ce7a
d65e2c51d60b748876832f4f39876b3339037db8
13845 F20101206_AAAWCE mendezvazquez_a_Page_200.pro
b86a8b6da8a2a585cc6d5f7986e89a1e
7540e98441e89b3538aeb013cad89f61036b60a0
F20101206_AAAVSW mendezvazquez_a_Page_165.tif
7fea475e07e73f4f6ee9ef99628e7756
1aed5ae846f13fd370a903549dc374bc12c25459
1293 F20101206_AAAWPP mendezvazquez_a_Page_003thm.jpg
cc1201a65098da843b32ef490bc76790
70cb8242393aeeafa8897086776bebd617fabe71
1051977 F20101206_AAAVFM mendezvazquez_a_Page_015.jp2
9ec03b19f448c97126d10f70617111ad
ecda39cef0efc4adc6fd7f8c021426112842abc1
21397 F20101206_AAAWCF mendezvazquez_a_Page_201.pro
f7c01efd8e64a3eddaeaf82b1fd503ad
500700be7f9b38e9d92841d28247f4f58ed5972c
F20101206_AAAVSX mendezvazquez_a_Page_166.tif
74c14104bfebc22eba298cb01d612c73
3e49c20aa492e21d7bde26dd499a1fd3da663bde
4642 F20101206_AAAWPQ mendezvazquez_a_Page_003.QC.jpg
b8496c64e1585a96aa74430752893420
0bee496a22e8a10e58f6914dda0830db0581ed3a
321387 F20101206_AAAVFN mendezvazquez_a_Page_016.jp2
9b4d8c95d1ed43f2a0b95fc3485d8a2a
6abb4b5f075e7ccc8ffb00282a89fb0692b44ffc
40007 F20101206_AAAWCG mendezvazquez_a_Page_203.pro
557f70f1ee41e0167dcc99be617af56c
b34804a7c2853d2c146dc66d484b9e39937c1f3d
F20101206_AAAVSY mendezvazquez_a_Page_167.tif
8b5cdd2b47d44f090d085dd675614816
873302cdbc76aba3f18f8b5e4cfe8dca4a6d6913
3254 F20101206_AAAWPR mendezvazquez_a_Page_004thm.jpg
f2f468224667731823cd9c6645a0c8fe
99fbde96c2509c1bb1ad6c95fa151d6875bd848c
42439 F20101206_AAAVFO mendezvazquez_a_Page_017.jp2
72b9b03341bb0dc42ed57fa3f4feb324
aa25fc1473b342ab5360100cda8573e9c1d045e5
10843 F20101206_AAAWCH mendezvazquez_a_Page_204.pro
82f1157fe51791c88d4bb529d42c8a95
9a118fdcee84b2cfcb1612b4741c5229f7597230
F20101206_AAAVSZ mendezvazquez_a_Page_168.tif
194f2066ac7020e5cf07164ef9936474
9cb1a4d2a290bfe60a9a117530f4733bdb7ef9da
13829 F20101206_AAAWPS mendezvazquez_a_Page_004.QC.jpg
6a942ee83652b69adf63b34827a09e09
f585654853cec8a0b102beab37808ffe23145826
104460 F20101206_AAAVFP mendezvazquez_a_Page_018.jp2
89fff0388a81d32ed34f14c71693ee71
f2952d0c0a57bdb2f0fd030eced483d1406d5b1b
10372 F20101206_AAAWCI mendezvazquez_a_Page_205.pro
20f8b4f69a51bfa18359fb7f5fc440b5
816a768f4f538391bbee32ef918976af3ed0fbab
6605 F20101206_AAAWPT mendezvazquez_a_Page_005thm.jpg
8b564bb92264b83c52122af06a7b27ce
25a253328864054b407a1e4ae3aff43b5abc5e3a
50077 F20101206_AAAVFQ mendezvazquez_a_Page_019.jp2
a3d4a5d92b48143d1c0f9400ab90d061
3154425ece26f9f21a6ee8dac04fa6aedcf3edc5
17459 F20101206_AAAWCJ mendezvazquez_a_Page_206.pro
7ef18df76cc241eea7c82cea42b82e6b
f8ba60c4cc110f1f28a41e5f053f8c00163cc4c2
7785 F20101206_AAAWPU mendezvazquez_a_Page_006thm.jpg
9d0461bd84f5b58487215ad67d1e4116
41df9d1cf5f205ac16fab4ed7bf10a4f56380d39
1051968 F20101206_AAAVFR mendezvazquez_a_Page_020.jp2
598cc91d78949f4a9c203fd66c3a367f
9f692bb3cc669c23beb1b89d3e3d46f1ffe7c5cc
40516 F20101206_AAAWCK mendezvazquez_a_Page_207.pro
855a7e64620d74dfec17ca84f187fa8d
6cfee241977df9ea7e3bd05a90bf259598d80260
37131 F20101206_AAAWPV mendezvazquez_a_Page_006.QC.jpg
397fe624893e78d6378c2758a751f267
9b333e55562e7d1a5591d4f175d2943307a2603a
90932 F20101206_AAAVFS mendezvazquez_a_Page_021.jp2
688de73ba40cedae43138a3c7167fc74
e937c21d97394aa47b67fe120e1b42d83a6d74c0
10053 F20101206_AAAWCL mendezvazquez_a_Page_208.pro
17f368af1b6ac44aac294973566cf6d1
613a6f8ba21da909baa256c016524599adb386ea
8083 F20101206_AAAWPW mendezvazquez_a_Page_007thm.jpg
ac12119a65faec0fd42c670f5bf40736
7958643e4a72dc0171ff4474d5679b9e8a34d078
1043877 F20101206_AAAVFT mendezvazquez_a_Page_023.jp2
1bb33a49d178e114c862e8285efcc01d
302bdb61d61a81d0a748010b56ff53188f0cff79
8910 F20101206_AAAWCM mendezvazquez_a_Page_209.pro
071c33159947b9a8bc6b06990050de2c
ea182ddc3eecf3122eaaea8a7d62190526c7632a
40077 F20101206_AAAWPX mendezvazquez_a_Page_007.QC.jpg
624d3d49d7d9eaa4f2075a31deef7a65
7a13f83f9604ace24b1e9e7ef6837f88109bc5af
867813 F20101206_AAAVFU mendezvazquez_a_Page_024.jp2
bfe13df31707ceed9cb6cb1400f63fcb
805b3f0e25a79f1e4798fcd63d0474f46b1551ae
21972 F20101206_AAAWCN mendezvazquez_a_Page_210.pro
de7018d2cba38312a0d43e52108c94fc
1140fa8ef7536d208f970f81ffdd9ba7da63a6a7
2909 F20101206_AAAWPY mendezvazquez_a_Page_009thm.jpg
4582e4526dfbf8d1c95e279a5d4b5e72
70f27a6f325a31eb124fb340885988894c3f5885
1051979 F20101206_AAAVFV mendezvazquez_a_Page_025.jp2
f38b9d807f558781886e3f2ee06925a9
67876e1b6f6e472115e208d271529e386b692325
25614 F20101206_AAAWCO mendezvazquez_a_Page_211.pro
bad0aa152852e29a07aa4b642e2d1c93
b5667d2773d00a8b1b7ea6916645cdc4b17f4563
12977 F20101206_AAAWPZ mendezvazquez_a_Page_009.QC.jpg
e6807607a58786045a25a9e3b6623f98
c0de52b8cd0305a9e63c6cbbecab420d59a4caec
1051969 F20101206_AAAVFW mendezvazquez_a_Page_026.jp2
a01c0f3409734782be7c3794fa45ba6d
e933f2c505da8ff74018af497ed5fe0ab2c5da4a
32461 F20101206_AAAWCP mendezvazquez_a_Page_212.pro
b563283a6acfeaecb5434a2906334de5
9e14bda902fe31042f0d525a0283ce5097198881
951126 F20101206_AAAVFX mendezvazquez_a_Page_027.jp2
eeeaf0061fdf7e1c0041780c2d496209
7c5ad5d34d273a860d8107e15c2c1f4cd898be5c
7455 F20101206_AAAVYA mendezvazquez_a_Page_080.pro
72b4895169ac5e3d18079073d6b10544
53947bf83a9596c3c83b14a3c4446159347c3bb0
22577 F20101206_AAAWCQ mendezvazquez_a_Page_213.pro
d0a8838c5dc0a19d03116cfc1d41fa75
0dda14038a9ac05e8c7f4396e5cc9251c216f160
904910 F20101206_AAAVFY mendezvazquez_a_Page_029.jp2
0e2592c3971bed95260f5d0dd8af36de
ef7c0a13ba5513a6ba047ce5d04ebbc50deed4eb
14721 F20101206_AAAVYB mendezvazquez_a_Page_081.pro
98c436a70a4d0a631247b21bd50e0dd3
50f7099b58091e3f8b5ff087256e8c78e73d31d9
25225 F20101206_AAAWCR mendezvazquez_a_Page_214.pro
3658eec8a6d30d6cb89ecba224a5855d
3536e0e29e1ed0fd0e9458a936075f99e10aa851
969388 F20101206_AAAVFZ mendezvazquez_a_Page_030.jp2
ef497ede6e09fbeb7a312d2c9c9260e8
1ffbe778e41df4181d3e2eefb13fcd1b327bcca4
2573 F20101206_AAAVYC mendezvazquez_a_Page_082.pro
1bc5b106920a15eb73d74576be5dfd0d
30ee99122655ecd47b75adb0afdef4cbfcb8af78
12734 F20101206_AAAWCS mendezvazquez_a_Page_215.pro
174777056e30942f45ff7d9c67d9d5d6
eb8543a5f759d7fd19dcf72f6a6d619295d3982b
10303 F20101206_AAAVYD mendezvazquez_a_Page_083.pro
5cc3e3398a862ba314fada4d479c747a
f9b06315ea2aa3b2788d0c559182e3fa31290c84
13267 F20101206_AAAWCT mendezvazquez_a_Page_216.pro
d540db4f69b50ba3405e3cb761f8021b
90ab7ecbc2e281c9e8cb18bc611ae117909df73b
11294 F20101206_AAAVYE mendezvazquez_a_Page_084.pro
62328fc3caedca95e7c713d952d25415
e44e48d28f9c18b4986d66937fd775f2d5205c81
15291 F20101206_AAAWCU mendezvazquez_a_Page_217.pro
1eabf01d714660511eb7c5d6186f4fd3
2e7532d5c20c05234a6bbe0b26917c7ac6e9c49e
38701 F20101206_AAAVYF mendezvazquez_a_Page_086.pro
4f10af10dd2dfc85acbf41e1e73eda49
b9e7fd4aba2fa19302ec3c16fb065541cccaccc5
15318 F20101206_AAAWCV mendezvazquez_a_Page_218.pro
fcc37986a7cc758845b3f5e5574363a1
00329a5f2ea7d3e1dab29560938780c1697f85cb
26008 F20101206_AAAVYG mendezvazquez_a_Page_087.pro
87fa7e468c6a8e71fcdb92ab03844c4f
5ed9fe6b9165905d476d03243c567824dd81d83c
52524 F20101206_AAAWCW mendezvazquez_a_Page_219.pro
5e97fcdaa0f07fcb5786fc633b22b157
229faec20855e527d5cf50e799bc8446d08d8d1d
31028 F20101206_AAAVYH mendezvazquez_a_Page_088.pro
420e20e0ed55c8074d8204baf71bb91a
eea9c4835f5c9dd376949d041d721fe8392200ce
6381 F20101206_AAAWVA mendezvazquez_a_Page_106thm.jpg
0788eeae5bbcc077099f5469be409873
d8e26f62aa3f0b0b23f0c094d0142ebf017d4017
37517 F20101206_AAAVYI mendezvazquez_a_Page_089.pro
23a6531845c380cb069c01bfe02ed3bb
77b250d69e0b1a23e2c26bfeefc31e9a49616eaa
6876 F20101206_AAAWCX mendezvazquez_a_Page_220.pro
6dc6921dda8828b1b51370df07bfbc94
168ba1157fd76aee12c799a5f7c0f6de9de922d5
24441 F20101206_AAAWVB mendezvazquez_a_Page_106.QC.jpg
7c5f78101e5a5764df838fd7a0e4ba76
a29919217895bc2ccacf96a6b14d225dcdf64ef5
29292 F20101206_AAAVYJ mendezvazquez_a_Page_090.pro
1ce124e78396eec15653f7eb063e0753
3291139175aa90f59e96383fb26d9fe24f8d671a
28815 F20101206_AAAWCY mendezvazquez_a_Page_221.pro
28079c5ee588c51ed28b339b3e0fefea
68e4455c44b8fa98137ec07b204c9a311abcce21
6140 F20101206_AAAWVC mendezvazquez_a_Page_107thm.jpg
a1ffd6c1d29d1ee20f5d27c9155eea9e
fdcd507240ca04bc19d3948a532c99e95fc65d1e
32820 F20101206_AAAVYK mendezvazquez_a_Page_091.pro
949c02e8573d479201960a8a381713da
0cb0358372b832efdf7f70606e43113dfbd47a8f
16579 F20101206_AAAWCZ mendezvazquez_a_Page_222.pro
20a2e04d75ae627d9d5811c30722c95c
4171fd401784b50e4f598f2ce4d23bf9a5282c60
5381 F20101206_AAAWVD mendezvazquez_a_Page_108thm.jpg
3c9ee7f1ccddf4571003aa676067d3ce
f784b14849889556377f3d7b8ec4079bdcc06da6
F20101206_AAAVLA mendezvazquez_a_Page_178.jp2
50d257c4b4763c867a918d56663ef416
61107adcb5c6003b103e14de18b6e89df593450a
48631 F20101206_AAAVYL mendezvazquez_a_Page_092.pro
3113b82de3f12303947eb10400af7519
4474588b229adc9c770308683a25ac8d60cc1f28
17764 F20101206_AAAWVE mendezvazquez_a_Page_108.QC.jpg
95d9d072909e388d18b9fb939e3fb656
9f3322e2e146c4021dfee6d4e8f12095415e3f9d
54879 F20101206_AAAVYM mendezvazquez_a_Page_093.pro
949021a8ec73f2d332afc17dfb5e8a37
f24fffdc840dd18fc42f9bb7a1ab22ced2108e38
6045 F20101206_AAAWVF mendezvazquez_a_Page_110thm.jpg
9355432d0aed4e2115f4a4612bbde685
744a946c0b1b6689dbd22690bdcf0c47295501b8
25533 F20101206_AAAVYN mendezvazquez_a_Page_094.pro
6ec5c7e795784d52b1269109fa954738
df88184de708e69e3128d38ed46f018f497f7cea
23146 F20101206_AAAWVG mendezvazquez_a_Page_110.QC.jpg
0b89e0b21f4f735169e52c7cba4a3594
71c62a4b53a48a6087cc8f970ecd3921decb03a7
F20101206_AAAVLB mendezvazquez_a_Page_179.jp2
48902f9614cc1190d34dba190116d7a4
6ea7908f1a5ddb573c30c2047d69396f1d2ea7e6
6706 F20101206_AAAWVH mendezvazquez_a_Page_111thm.jpg
d581acfc93d1bb6227ea8a0b14c674fe
717dd5513c8c824541b272fa92141bf3240baff2
1051926 F20101206_AAAVLC mendezvazquez_a_Page_180.jp2
55fbc19b4db2b2eca777bcf7b1d7b96c
6c03410d4cb5c6808f881b704c1264d5fcdddcd3
48668 F20101206_AAAVYO mendezvazquez_a_Page_095.pro
b98c5ea6db71ecf0f19512b33b93dc63
68c959539208cee282d41e4b63d7ffb57a6c68dc
25917 F20101206_AAAWVI mendezvazquez_a_Page_111.QC.jpg
1fcb2d9016da942988e014fbf2a7999b
0e304c18f9e355323271c5a39173a95c61217e09
F20101206_AAAVLD mendezvazquez_a_Page_181.jp2
a324204819347570d0fca9a8df73b18b
81325ce4066d3a27e171f72a3e27a5b3f052731e
57948 F20101206_AAAVYP mendezvazquez_a_Page_096.pro
663b30cb99764f4f234ded6bc093aeeb
1f1d0536ca9cf4385f9f540045e7e0c0ed548619
27843 F20101206_AAAWVJ mendezvazquez_a_Page_112.QC.jpg
61cabeca8530b94e67392550ae2d1623
218db82408ef695a12bd108d89de24f60a2c48c6
F20101206_AAAVLE mendezvazquez_a_Page_182.jp2
92a12cb95406ab8246c07e8c5995f779
3cd01fd84c3a8381c8ff7c654d2ae05f252b9461
17315 F20101206_AAAVYQ mendezvazquez_a_Page_098.pro
24825377af0aa913a54109d3f4e05cf4
5597ec2fce2b635cda164d1e2733b8d76e3bc327
62130 F20101206_AAAVLF mendezvazquez_a_Page_184.jp2
7afdbc631e8f402b861f70a829b556eb
22f9f07cdf3999e99e980444796ada6e76aecee6
56931 F20101206_AAAVYR mendezvazquez_a_Page_099.pro
428f541f97a74bc613717b2ec920d7a3
4f8c7680941c4cd77aaad03aa2ee5c3dcf23bc2a
31448 F20101206_AAAWVK mendezvazquez_a_Page_113.QC.jpg
7e29fbe0a2beb523c8002b123a122248
c3e06d0692dbda272305f7b624553c9541d5b795
60398 F20101206_AAAVLG mendezvazquez_a_Page_186.jp2
89e1eaedc68468c4c20e31a63d1aea72
cf3f14a6cf9b0b7e7304ab9a423ab1e542627653
1724 F20101206_AAAWIA mendezvazquez_a_Page_124.txt
597dae976cff9d9c237aa69e46d961c8
f68ce7bcbdd40fbee04f5935d3ce07fe8b60f00f
61058 F20101206_AAAVYS mendezvazquez_a_Page_100.pro
742301956a244598b59c1d9a00fd7ea2
65b73a000d5274ba367adea9cd0cec63f5ad4aee
32427 F20101206_AAAWVL mendezvazquez_a_Page_114.QC.jpg
12903ba460b82ce33b9f65161e6eb793
9db26f6fbaade8b63c189d1e011c5c9ac9d135fa
59360 F20101206_AAAVLH mendezvazquez_a_Page_187.jp2
6aac2080cbcc1dfa66ecfd5e8a82e7b9
89fd3e82abb8e29d833712801bee164c62d6101b
F20101206_AAAWIB mendezvazquez_a_Page_125.txt
ec4bad9e8d596d65ae397e2e3990c9e8
73682a99ef994f2ee73a5d5c5c4672c97c6f6226
57688 F20101206_AAAVYT mendezvazquez_a_Page_101.pro
7e2e5473b106e312fc613ead37999cc4
eb544c82ccd9adb4ab48a6b0a9ba8aeaae354064
F20101206_AAAWVM mendezvazquez_a_Page_115.QC.jpg
2caf4b4578aa3a36d99293827be343d1
3a8f1594161612f5ceb4496ebd165a0eb794eef8
77491 F20101206_AAAVLI mendezvazquez_a_Page_188.jp2
79b6072d0e015dad42d60ab72475190c
509a9c4a88c765fbb26e302858fa58252d1b7c22
1879 F20101206_AAAWIC mendezvazquez_a_Page_126.txt
d4496970bba6e666ec0c9231e3c998d9
13034b7a1cf3a762ce8f1181336c2456bbb2b743
27674 F20101206_AAAVYU mendezvazquez_a_Page_102.pro
9db859538d17c458e127b89f2b1261af
9d0b1bb2bcb410aae35c531d7ca42e79c0b4722a
14935 F20101206_AAAWVN mendezvazquez_a_Page_116.QC.jpg
53da930bc52fe6fabff7e127b7619ba8
95cc2be9bdb473bed104af943099685f669cb8b6
84644 F20101206_AAAVLJ mendezvazquez_a_Page_189.jp2
efedebeeb5cfaea4df6a6dcae51d5846
ea02b8449f6633469b125d2682ecdbb8cf5af82a
1026 F20101206_AAAWID mendezvazquez_a_Page_127.txt
0196bdfd2933a3d554c8f6e5dbddcf70
8ca8d78fd3eaec31b95b57faa5dd59610bdb9b30
43274 F20101206_AAAVYV mendezvazquez_a_Page_103.pro
be27044c392039069c426a8e09daad46
5b8c7a2c366368581ab8715802437cf7522dc9be
3184 F20101206_AAAWVO mendezvazquez_a_Page_117thm.jpg
c29fdf4a6c0c98ec7ef5453bf430b35c
eddbf6131ff4c19e7425e2e06b56fb3e5fe354a2
66982 F20101206_AAAVLK mendezvazquez_a_Page_190.jp2
9022ae3bbb0051fe3811dd380a926d22
ad328e9dac4f7ec2ac78e38c7109a9e2c3d0d7a8
2087 F20101206_AAAWIE mendezvazquez_a_Page_128.txt
88d45fb57bae13737c11abaa070608f8
fa8d771bb6742c8f97c41068b76c90aeacbcc6b0
38345 F20101206_AAAVYW mendezvazquez_a_Page_104.pro
330a7ca0e2c5158d0dc05396d258034f
be55cda0e4c21eee247ba7201a167ca35423d7b5
2791 F20101206_AAAWVP mendezvazquez_a_Page_118thm.jpg
c8ec23404ac5c1b3435fcea0c1f9a823
32a7d2d3ed558a8df5cf456508c3f82561fc918d
F20101206_AAAVLL mendezvazquez_a_Page_191.jp2
6174d749a5e5db211bdf5e68ff65a9e5
7bf61d421ad7e94f6ec4338f5144f96c66c4a7c3
1751 F20101206_AAAWIF mendezvazquez_a_Page_129.txt
e69412a442538e24100c1907700c0827
d915344fab9988531d7d203b2a4db9b2218c12f1
37935 F20101206_AAAVYX mendezvazquez_a_Page_105.pro
f3952db28782635e72e26ac19cb92ea3
8b7546667e5d8b6d901be75c201010d563c5cb18
9512 F20101206_AAAWVQ mendezvazquez_a_Page_118.QC.jpg
228a0a2d6a0ffeabd41ba6d324dfb2c6
d7c100f03521c938ecb338b9f7bc734e3c5ec1d8
F20101206_AAAVLM mendezvazquez_a_Page_192.jp2
2554b6c0d6a871c8acea49e500e20cda
e7d5606e0086e2c16c301b5431a77715b47d609e
1510 F20101206_AAAWIG mendezvazquez_a_Page_130.txt
4d6e13af07f50cd4435d6db7d51e5333
a23d49c9a05db805585e21855fdda319bb785e9a
35627 F20101206_AAAVYY mendezvazquez_a_Page_106.pro
122b32b82cab7761d684ec28ac496732
b981c4024d8b759bad22b04fda0ad00a4dbbfb85
3550 F20101206_AAAWVR mendezvazquez_a_Page_119thm.jpg
c1503525c9169f2ce433b41ae25980f4
02a4c0acf10a0687bc03d708ebfa9c3f35da13db
F20101206_AAAVLN mendezvazquez_a_Page_193.jp2
67cc3df62089774642b5b078e5dec2f8
a2db95a389721a5c8e0c6df429c1ccde5e4daae5
2048 F20101206_AAAWIH mendezvazquez_a_Page_131.txt
1b9d4ef672eb49c1936e28a99555e94c
0345a992fedd32cb6e6a74e33d8fa11e72a744e1
33959 F20101206_AAAVYZ mendezvazquez_a_Page_107.pro
2fe4ccdc036af075ea91bca086890ce1
e970b6db064e02bf5d256546dbcfff5ee5020ea3
12032 F20101206_AAAWVS mendezvazquez_a_Page_119.QC.jpg
fe3e6219e99ffcd8d56fb5e8fc5732dc
c418cb2531da3aa3bdc2ed2b10838225f099ffe9
F20101206_AAAVLO mendezvazquez_a_Page_194.jp2
e1ee32c3dccee7b6d23fcc286a6e5390
0d83a24bdfe6eca7997b9d18f32462bad021d17b
1987 F20101206_AAAWII mendezvazquez_a_Page_132.txt
2f96dc091ccbdb78c676e32afa8ef8ba
1f8467e6f905361a5ff95d2bb2747f3e877980e3
6818 F20101206_AAAWVT mendezvazquez_a_Page_120thm.jpg
4c4eff6c9a9a8fe3a903794c0e26a347
270269ea04e2c63c84f07f132c39626f7139b62d
547778 F20101206_AAAVLP mendezvazquez_a_Page_195.jp2
3621f6b79d3d0731010c7da218bbb5b3
5957e8b06422359e0a11fe149f4935790788792c
2077 F20101206_AAAWIJ mendezvazquez_a_Page_133.txt
fd1e932648063f3e3463e2173435a5dc
942fd9242c0ebd35e6f31f7096ca19c428f35460
16409 F20101206_AAAWVU mendezvazquez_a_Page_121.QC.jpg
ae5c293be3e3892ab000b630c575a177
a13b17e4ca797205592c9eeb8daeb8698de01114
1051946 F20101206_AAAVLQ mendezvazquez_a_Page_196.jp2
2dbc8ba36c1e8cb7bb8c61a5a67edff0
0ad25a34d3b026b68e2c24a80146604d70ec4e65
2095 F20101206_AAAWIK mendezvazquez_a_Page_134.txt
99657b245fac5a97e95991e01a3f1250
71328cd4ba445126490b106f93449a8b1f29da3d
6077 F20101206_AAAWVV mendezvazquez_a_Page_122thm.jpg
38724a64735736c106afd531d1fa3d68
357b02e42568e5d0726dee07b66e3f8177d30587
1051894 F20101206_AAAVLR mendezvazquez_a_Page_197.jp2
b994ff34835d4bb422df7934b37234dd
e4d5bce0b0d68e9985e9220f5758cdfdf093878a
2100 F20101206_AAAWIL mendezvazquez_a_Page_135.txt
e3c3f622950057efd06530beb83526cd
0adf7ba890ece73854e86384265f393e2af8aa3b
20336 F20101206_AAAWVW mendezvazquez_a_Page_122.QC.jpg
36d4117bc29636b379e5a30c918901dc
b6846ebb5d31b437bed8f8a358c1d45dfbcbdbf9
F20101206_AAAVLS mendezvazquez_a_Page_198.jp2
ed01e40f517f92b0023401f5256dd8d0
636bd513c6c2a3725b40e22f0daab1341f29c921
887 F20101206_AAAWIM mendezvazquez_a_Page_136.txt
3804913d84e94b859caa0ccd5d657588
f7687f312a2855bb60778890edd7bcbe7f243887
23614 F20101206_AAAWVX mendezvazquez_a_Page_124.QC.jpg
bd8842d24dcec8b0b32427e16709700e
0631f1f6dfb740a25c9cc392e1334e18466d54df
1051939 F20101206_AAAVLT mendezvazquez_a_Page_199.jp2
3c6d93e7b1645ac4ca46a5a673044227
6c478e6048e6cf36c7c316f97899d7808afbe98b
1268 F20101206_AAAWIN mendezvazquez_a_Page_137.txt
85dbae2e5908de089845be3566f9a997
5f768ae11119defd988cb4c739dfdf69ccf98a48
6138 F20101206_AAAWVY mendezvazquez_a_Page_125thm.jpg
3724303316da9b1746beeee20ac27e57
0407db9753a5317d3f13c613e88377a221957bc9
1051942 F20101206_AAAVLU mendezvazquez_a_Page_201.jp2
d7e8952cb40a21f0fcb86378de7aad18
f1d544afd426963bb08311f99a8df70889c9eac5
1109 F20101206_AAAWIO mendezvazquez_a_Page_138.txt
e9fa4341f246d8549f2e7834ed42cb57
88c1b0755e9dcba23b8b42e7e7be0929d8f32a8c
21519 F20101206_AAAWVZ mendezvazquez_a_Page_125.QC.jpg
ca8a38148c197c78d4845c7ca2de8b7e
c968e5681e7bd08cc7cb40d4c868aca3ec269bfb
410400 F20101206_AAAVLV mendezvazquez_a_Page_202.jp2
a09463f1f44b796cb89354854e8bd61c
6ae332ba6509c408a0d7cef9efc2e12c47cd6311
734 F20101206_AAAWIP mendezvazquez_a_Page_139.txt
413043dffe2e44546f4417f26c2d52d8
f01009cbf0a6b4b8492ad8bfb418cd7cb2feb5b1
296515 F20101206_AAAVLW mendezvazquez_a_Page_205.jp2
bc918bed41dbf571f8bdf9962a577276
376c60ac2cb654e5533f99142f6d4730ce07521e
731 F20101206_AAAWIQ mendezvazquez_a_Page_140.txt
84267da1c1968e994d620b5d87a5abed
725fa5cdcf310535e29693461658a6adbf9d0f96
863000 F20101206_AAAVLX mendezvazquez_a_Page_206.jp2
43ebecf3bd9e280129ce087f824e0f31
e3ba29428947e175a026a01246db2708bd040622
853 F20101206_AAAWIR mendezvazquez_a_Page_141.txt
2fae62230b8c50b8a1888f0589174545
728b0b8629b4437c8531fdec2f2062a44fba5209
295742 F20101206_AAAVLY mendezvazquez_a_Page_209.jp2
ae80f8b2a7406bb1c563c0fbc1bc4426
d750fc805b1cebda636e3fe0ff3542f80df845fe
1981 F20101206_AAAWIS mendezvazquez_a_Page_143.txt
7ba56c19a7aef68f656f944ee5e9b82e
928209f291e75ede37a92c91b9a0ac975e74a2f5
783093 F20101206_AAAVLZ mendezvazquez_a_Page_210.jp2
2c12830fda02b4b3ca9fb8df3ec18427
9c6c39a6642653eba7052f00e3a5524379f0efaa
1616 F20101206_AAAWIT mendezvazquez_a_Page_145.txt
5101183a95c54786a607d520eda5547d
594f710360c75f11e1d5070287b8fe8fcd800a27
F20101206_AAAWIU mendezvazquez_a_Page_146.txt
a1ff28eda2397e32a4982a98be3d74e6
6123655cd77edaabd8ca07547a6bfacffa3a1e3e
28731 F20101206_AAAUUA mendezvazquez_a_Page_038.pro
6593690020f8343de9c08713a25b2610
d4ad50747b6a52df5a2028950ae756ebce19d07c
493 F20101206_AAAWIV mendezvazquez_a_Page_147.txt
6097c4ac931d04d4942a929b930a7e75
d3ab292bc9a2edc1440e7b7b9ae556132a5431e3
16923 F20101206_AAAUUB mendezvazquez_a_Page_203.QC.jpg
2f533deef719e5612142100ffa087e46
e009af3fd8368dc77e6c4b09d49435e25145ed55
351 F20101206_AAAWIW mendezvazquez_a_Page_148.txt
5024435c6f25d4c8048db9ba4e8b0453
cc5b8d72facffd325c482f8d1c9ff66e9eb0e2df
37321 F20101206_AAAUUC mendezvazquez_a_Page_189.pro
9518465826fe74cc10789fe265adeff1
03bb1f7d7bb6ab07bbcc9a17b8f7e0e0c6b603a4
F20101206_AAAWIX mendezvazquez_a_Page_149.txt
4c1503878a90fe0ac77a1e0d9c691f59
da9f74ad9d1c94cd966dd164a9e6e57f1c2cc2db
1909 F20101206_AAAUUD mendezvazquez_a_Page_001thm.jpg
d91f3c2c949ea17a850463389e9f8ee8
7580ee956fe880351c7d6398cc7ecc6e36f4d0a6
1328 F20101206_AAAWIY mendezvazquez_a_Page_150.txt
abbd6e6371dd41a3d8f91cc409b867a1
f25239d140590e7e0af2808b3dd6e7e575203a04
F20101206_AAAUUE mendezvazquez_a_Page_238.tif
34ff72b4f28a6767226f4819c7bc5b80
6db99c699ad640d57c8776ebec9d347361426750
1244 F20101206_AAAWIZ mendezvazquez_a_Page_151.txt
1da5867428340c05368ff611a8dff609
ffeab7630a2d9c849dbeadee05822b7e93c5c266
18950 F20101206_AAAUUF mendezvazquez_a_Page_016.jpg
f415e7b6e91f12eb62d088dc83c205a8
e7f1db6358f87345ce234ed17f28483f18a33731
21792 F20101206_AAAUUG mendezvazquez_a_Page_088.QC.jpg
627b550143f2fc4e875ae6e840f208cc
f518f42d7d9895b8b8430a7fc9f8f011d61ff9d7
707860 F20101206_AAAUUH mendezvazquez_a_Page_171.jp2
ca19e276f39142959fdb46e7bd1d6884
2ac15c8d978fad8f84587028c84ab7ec36edd0ae
F20101206_AAAVRA mendezvazquez_a_Page_109.tif
93259cd918678a019cdf37cf9dccd109
1667306e76bbb3c2e8b04d8156f9e19414023396
F20101206_AAAUUI mendezvazquez_a_Page_128.tif
250e32562a0199b95a7c1a1de0b63d41
39fbab0f7e300511f0bff95d165c90172e8e5756
F20101206_AAAVRB mendezvazquez_a_Page_110.tif
636a321c18d6eb13a15f8e098d65b1db
e2b585cf479c6ade869d89bc590c0c6a176f4197
91563 F20101206_AAAUUJ mendezvazquez_a_Page_044.jpg
0325e4af52ecdb70ddba3112d280bd4d
367a06774a8d083142e0d791614d082e503aa5b6
F20101206_AAAVRC mendezvazquez_a_Page_111.tif
0a653e612b1261540cb9f2bac559b406
34b2d8e839fde5b5ce6e6d00a8f99a05690cd9de
F20101206_AAAUUK mendezvazquez_a_Page_047.tif
31dc4ebc65a37978aa74af08803f596f
c16b10ba4fe9e9f649c7b7323b19cf57a81e9583
F20101206_AAAVRD mendezvazquez_a_Page_112.tif
7407eb292b47da8b8b8c573fbc0cc4c8
d513965e22b048655a43f35fdc156fbf47dfaeb0
F20101206_AAAVRE mendezvazquez_a_Page_113.tif
5d0dad67569985d7caaaeb05da08401f
92e6289e65ed274016a9bd4c3bc186ad4c9518ba
3531 F20101206_AAAUUL mendezvazquez_a_Page_208thm.jpg
719e9f835bcb43c8b6b13798cbf65926
86a08e1145a04f23f6097e6b50a08d3f7f223b54
F20101206_AAAVRF mendezvazquez_a_Page_114.tif
a09f118c2fe60194179deace04f2f32b
53e0e88ae54f43ec84ea14dd70a335f45f58a427
F20101206_AAAUUM mendezvazquez_a_Page_177.jp2
eb154d8c00558bb359719d9865f700ed
45947081b872c819a367682931ac118cafc03f6a
F20101206_AAAVRG mendezvazquez_a_Page_115.tif
614e6d07babda580865d3ddb4391c8fa
76b801a5d422caa13ab3cc5e36f2d23dde1af40d
20351 F20101206_AAAWOA mendezvazquez_a_Page_167.QC.jpg
d4472ffc2900bcc90ad72c2b7726fbfe
12ba6c16fd738c99a5bffc28c98e49c0ba9b4759
20993 F20101206_AAAUUN mendezvazquez_a_Page_190.QC.jpg
ca650fc30d22901decc6b3df6ee4ca0c
5bea43f498dbbd7f361508bd34328cce9142e3af
5564 F20101206_AAAWOB mendezvazquez_a_Page_115thm.jpg
a98d7b46748297002fae4dab28bd755b
e721c49418395e70acceec98659de038b5adcf4e
73889 F20101206_AAAUUO mendezvazquez_a_Page_126.jpg
13b617b2d5116f2e6451133ee942290f
68ad3a632d86da509c73be62dd2285348bd77ee4
F20101206_AAAVRH mendezvazquez_a_Page_116.tif
90d94b5c18d06e635896b4ad575e1eb3
ffaf8d81ebfedfd53dea23e24b050af6b95b6314
10493 F20101206_AAAWOC mendezvazquez_a_Page_154.QC.jpg
8c460e7d6e79b337850254b1f836f249
e064e02143cfcaa2c7b0f270710bf5739376d12f
F20101206_AAAUUP mendezvazquez_a_Page_236.tif
74986fab5952b8d6d9ee5e69fb57fb17
38b83765161666523ea01310ea95dbe6cc05a0b8
F20101206_AAAVRI mendezvazquez_a_Page_117.tif
e5c3092138d62b0e19179a7b57019f82
57cb9874312a44afde146b8d0e796614bccdd5a7
11308 F20101206_AAAUUQ mendezvazquez_a_Page_097.pro
9590bedb99e2648e5e9879c736dc41ff
82260a39c9f0f537f95e07366033a67ec33e5ef6
F20101206_AAAVRJ mendezvazquez_a_Page_118.tif
efe0cc064b9195b8d8242939dcd37ff0
25f9f868820b6223fcd722c316ae6d7de9b3c809
73335 F20101206_AAAUUR mendezvazquez_a_Page_091.jp2
b647643a5cbe0be4c212b6776670de6a
83bd7a8bf8b79133c2f45674ea7980a554a2c4eb
F20101206_AAAVRK mendezvazquez_a_Page_119.tif
9797bdf467f3b5a4a3db10b18fec6063
7ccfc0550d3b7bb22dc7ce8beded647b4e64a38f
33370 F20101206_AAAWOD mendezvazquez_a_Page_135.QC.jpg
0fccc763962a8717df61b92718998e73
1c1e85283aaa0fe2f7a7ce7643285cd19bf0788d
61294 F20101206_AAAVEA mendezvazquez_a_Page_217.jpg
e2c3f737ee24a4b57255f54254f92089
a364691290c2edd519767ed4d5d20914675afc00
8194 F20101206_AAAUUS mendezvazquez_a_Page_026thm.jpg
977543153e8edc26644931727142d03e
4bb943c96d17fceadf3fefc5f61cba0ce8c5dbac
F20101206_AAAVRL mendezvazquez_a_Page_120.tif
f93d69a94969924a1b959fe9a2f10728
50608fd11c9036c1ff9b95eea48c9537fb09e6a3
26988 F20101206_AAAWOE mendezvazquez_a_Page_024.QC.jpg
b78266d3e00ae6ac5b9f1c52cc4439bd
aa16f78c07a21c15fc0d678b6a839dc0b77b900c
53963 F20101206_AAAVEB mendezvazquez_a_Page_218.jpg
db7bbc894224b78edc2f74c80e072ae9
3709030f363602dd273717aa365ef67ef4c486b0
82104 F20101206_AAAUUT mendezvazquez_a_Page_189.jpg
200eb9a8d413fe3e3f0a3d82a7a7894f
2a9b8835a35d593d2223ab1d16ea581657ac0bc1
F20101206_AAAVRM mendezvazquez_a_Page_121.tif
66dd593f68cbe2a480b603da242381c8
dd3e42eeee320a7455f6e9d95a1102785efaf419
F20101206_AAAWOF mendezvazquez_a_Page_180thm.jpg
8d035e1e20ed842d44f30830cacd2bf6
81264079802fe75335a91c65050eedf87d161616
103697 F20101206_AAAVEC mendezvazquez_a_Page_219.jpg
34d51413a990f74d9f26b7046e7774c3
e45f638bc4a94f307f9a3f67641ccc899f2ced19
54003 F20101206_AAAUUU mendezvazquez_a_Page_185.jp2
67b67054bcabc4cffe7d3124cf2c90c9
a9e4f1d6f2859f917b3d070292c1fd6f94d8d68b
F20101206_AAAVRN mendezvazquez_a_Page_122.tif
84488da391397ee0e07f347b4dbc4eb2
e6e75cff673a50a9ecbebc17676c7f26752d5bd8
3589 F20101206_AAAWOG mendezvazquez_a_Page_222thm.jpg
60d2cd007ab01472b5c74a10b97424cd
7e8cac3d244f4e267f76d47e464746fba90e936e
14815 F20101206_AAAVED mendezvazquez_a_Page_220.jpg
62bdc877c2ccd278e2ef6e1aca41f103
4ff348543189f209185c818fd86410ce9f226bd3
34152 F20101206_AAAUUV mendezvazquez_a_Page_154.jpg
db82f646c124a00511b9715d03ef8124
79109f138923d02b33c3f46787729e9b213bff15
F20101206_AAAVRO mendezvazquez_a_Page_123.tif
3f1729285d796f9264de271d653a1511
45a6c57c868d0e8e3dd7fd4fcfcdf7a14ee36279
4840 F20101206_AAAWOH mendezvazquez_a_Page_157thm.jpg
2fdaeeb04caedf8fdef444af7a660c00
83d0f0f15d32de3acc562d9e196848246aa3ca5c
62592 F20101206_AAAVEE mendezvazquez_a_Page_221.jpg
a632836bc1a4c36346bcc2a021668ca5
86c3f947c0d1a23ce6785b8183091be24b43b8e5
4425 F20101206_AAAUUW mendezvazquez_a_Page_205thm.jpg
eee51a4ac89e0f6682d10a2ec2971348
bda852a24b6a4b12e6efcc75611a3097e47e7df9
F20101206_AAAVRP mendezvazquez_a_Page_124.tif
8f0b58552a10827c3365aecaa2605f6c
bc7d9ea87f0e32309193a1a7045ba8c8f0e0f4d5
8368 F20101206_AAAWOI mendezvazquez_a_Page_011thm.jpg
f904a1dd079f4eac79d23b9cde797fd6
b4212e12595402db733ca96de9b4b9bde9eecd15
38086 F20101206_AAAVEF mendezvazquez_a_Page_222.jpg
f41b7934161e256945ee1df2552bca79
5f537e2a3e2fcba0ab0d6577f114e1d52678c0bd
771373 F20101206_AAAUUX mendezvazquez_a_Page_060.jp2
021ae29bc82167ecea681912931a8569
ebf8715319f1c954e97f7db50abb31a1cef1867b
F20101206_AAAVRQ mendezvazquez_a_Page_125.tif
c11903e12f8ab9f9d49d436734f18a5e
ed5c156145b6ff50cdf1d68e825d389069bad60e
7900 F20101206_AAAWOJ mendezvazquez_a_Page_093thm.jpg
964819b32c37a0d20bb34c07815805a3
f0fdd4cd2a19b9d109107bcd2ea1287c86038432
68521 F20101206_AAAVEG mendezvazquez_a_Page_223.jpg
e07b4ff1ab9ee09b9162c74e7342d1a8
bc474aaa0a2af2bc91e5f4337452a3dc418545fe
29046 F20101206_AAAUUY mendezvazquez_a_Page_232.QC.jpg
f78b6b57a74fcbc7b4e11fefdcfb25db
99b33fcd9e67874f5027de058bf67aabe0048a77
F20101206_AAAVRR mendezvazquez_a_Page_126.tif
13f132a478fe2a97c98b4b5fe17e8bd9
5aff30f729f324bb6d6de0921d986c64b134d167
15264 F20101206_AAAWOK mendezvazquez_a_Page_082.QC.jpg
940b2127f1ab5784c758218d3d6d3f33
ea5a423c649028afbf1fe0481b1aa5b79b36f5c6
42171 F20101206_AAAVEH mendezvazquez_a_Page_224.jpg
c76631628862df006c66474e893456d4
0ad1950ad6868107604097747f0e42a6e0d1fd2c
5697 F20101206_AAAUUZ mendezvazquez_a_Page_033thm.jpg
81d5f0892c74861e8174b17ebd86ec59
5059d4a32004da3a0ea17c1ddb3a0c7bb20c6b81
35060 F20101206_AAAWBA mendezvazquez_a_Page_168.pro
0efbb591c26f2cac20228af7a4a4642b
afd99d2cfe122091793a54a9e8bc97fe0aa23527
F20101206_AAAVRS mendezvazquez_a_Page_127.tif
19b2289dbc6cbba5ca86f0c6c82a9b91
a09e0667dae95a302f444e9a4ab741010c0905fd
29654 F20101206_AAAWOL mendezvazquez_a_Page_052.QC.jpg
cd77ad1ed5de4af2c4d5cda0fe5d1f81
6105ab090fd86257957bbe3f81f4a8482a66c218
114881 F20101206_AAAVEI mendezvazquez_a_Page_225.jpg
7e0cd4c9b9a636b21d70a958ad6b8077
22c6ac3cf891de2b3cd0248b135424234f546bfa
30317 F20101206_AAAWBB mendezvazquez_a_Page_169.pro
91b7f7dc3c2821ee7ae9abf0e7a6ffb8
bbf16a03a92c4996c1b5f20782be0e5b14c031c7
F20101206_AAAVRT mendezvazquez_a_Page_129.tif
7810c1371ec8d0a63651b0e5e93175bf
bd52319f3a4a3a6a74c32e5274546ad67bc5e221
28716 F20101206_AAAWOM mendezvazquez_a_Page_056.QC.jpg
b1b311e56131ebfe132161853fde3eeb
cbbd498c6ef065948bf1a784160f1f923d5d66b8
39023 F20101206_AAAVEJ mendezvazquez_a_Page_226.jpg
fbebbf1b82dcd1bfb9eb09b9f263f32f
9d9b8024a349b3b51a195aa9f1aa43f481fb7463
34791 F20101206_AAAWBC mendezvazquez_a_Page_170.pro
0de63c8dcd06f740ae9485e10e63057e
77d1ced0d6812510023a826a9b8b7ae0b07c92b7
F20101206_AAAVRU mendezvazquez_a_Page_130.tif
fa2fe5dab8e5b006e9aba0b53e91b778
0fbd280de7bacd89f51e7c9877b52685c0748643
24931 F20101206_AAAWON mendezvazquez_a_Page_086.QC.jpg
9140cbe088d444f9c32480d2bd136835
eb995764f3db8dac0b454d3729c0a196bf0d8109
88219 F20101206_AAAVEK mendezvazquez_a_Page_227.jpg
5a7d70df2ecbc4250ed6b906fe4d630d
5b53376fdf9e333580cf9ee55b6c3095db98faa6
33207 F20101206_AAAWBD mendezvazquez_a_Page_171.pro
f4605f3192734331c2cb029d2f61acb5
84e628fc13a4227cf42cbc4297a89243d9106f2f
F20101206_AAAVRV mendezvazquez_a_Page_131.tif
9bb8a0e77ec125603cf593eb4d8614fc
bb4697ccf02c49050a13fd8bfc821c0b3739d5cf
28513 F20101206_AAAWOO mendezvazquez_a_Page_233.QC.jpg
c212fb6744481cb5d8a514029daec981
f1b4a51a352f0937e3098a9d453b51636a63e2c5
41453 F20101206_AAAVEL mendezvazquez_a_Page_228.jpg
79af34e637a8f8bc673a5f0cb60dcf45
38efcbeeea9ff9a71483f1808eec6f5f74547f88
29401 F20101206_AAAWBE mendezvazquez_a_Page_172.pro
5948e4914a36f7d382f9bb0d44d34b8a
5f9a412bd18035de4b9780db14403a914b4292f7
F20101206_AAAVRW mendezvazquez_a_Page_132.tif
91b076e9a079086a7ef4bf8aa7ef570a
5add29eeb7c8813ddcc568bcb476e9e620afc2d1
18236 F20101206_AAAWOP mendezvazquez_a_Page_149.QC.jpg
727ee096011044c1fcf19e1dfc27f24c
c49f3d4452d1b7f7b4b0a78b7567c92d65844157
95816 F20101206_AAAVEM mendezvazquez_a_Page_229.jpg
356d01853f70cf698799f49e0c9ca826
308407f685a8797973905424573ca62db4d9d626
28402 F20101206_AAAWBF mendezvazquez_a_Page_173.pro
b761f1821aa652068924a5e6e3f89eda
b62be3ac39d9a3b7fb34b22da8b499f4668b7462
F20101206_AAAVRX mendezvazquez_a_Page_134.tif
52f6ae86533d5878d4990f224a5f9ef2
1598433b6847b82e0294407a38009ee945a41e9f
7127 F20101206_AAAWOQ mendezvazquez_a_Page_055thm.jpg
2f1d740adcd2778a2a27cc4b45857676
00b47ac0f494d35eb9c60b7a82c03d13d1c4fcb9
93887 F20101206_AAAVEN mendezvazquez_a_Page_230.jpg
c672693cc6f4064514b3022afc912e98
db7b2950e2cb1f200c7dbe20ed3c3587eae4eb9e
25291 F20101206_AAAWBG mendezvazquez_a_Page_174.pro
c550f9b3dcd21c84c0ee8916e8a61846
06d4943b06440c557720abd6de4b32f9b5836135
F20101206_AAAVRY mendezvazquez_a_Page_135.tif
5a5d373a3b6bef7e4444916d986404e8
de32f54125d7cdd70b90e81663ddef385f5e90f7
4549 F20101206_AAAWOR mendezvazquez_a_Page_213thm.jpg
d924ad4d9e6407176ecf74b39ef48726
2a741679eb0e264d92c76bc0f0da0788fc70ea48
97941 F20101206_AAAVEO mendezvazquez_a_Page_231.jpg
1334560972d6b707ddfae5844bdc4cd3
e17b262904ec0a048a86f048cc24d6450997faef
31747 F20101206_AAAWBH mendezvazquez_a_Page_175.pro
a1e2c5e9395736496025efd532733baf
46525fa098c0861c4e808b014898f5437a864266
F20101206_AAAVRZ mendezvazquez_a_Page_136.tif
f8ead3fbaac3541e147c2b92b5eef71d
41c387d8cefd58de8603f5b1c581ecea2830656b
19690 F20101206_AAAWOS mendezvazquez_a_Page_198.QC.jpg
3feb5236857f272224c1f7136cb7de5e
5bfebded405737453a1119f94625da2892a787d1
90135 F20101206_AAAVEP mendezvazquez_a_Page_233.jpg
ec8dd4d31392faef20a89fdfe892e379
7cb8d4c6f6725f2aefcbd45673c5bf41df8826f2
44824 F20101206_AAAWBI mendezvazquez_a_Page_176.pro
aec128a7ca5d76121d72ace6847428ae
f9eec479dd9a15ba5010854d6329c801d3ab1b49
21388 F20101206_AAAWOT mendezvazquez_a_Page_191.QC.jpg
7ecc2c3158ca12573123f0ab642f6cd9
7445a2c391e9371fcd69f6570850486a4c443439
111875 F20101206_AAAVEQ mendezvazquez_a_Page_234.jpg
7cca2071fb342c486c18722cb62de038
11e5850fbc257ed69182829adbe154a7e60ebaa3
53648 F20101206_AAAWBJ mendezvazquez_a_Page_177.pro
a7e4cb0922bd46adcdd891de33053f25
c46b31af849f4b2c97555c00a3a6bfe056b1e5cf
21505 F20101206_AAAWOU mendezvazquez_a_Page_032.QC.jpg
bc591f75305d9fbcdf47d3f9ff1d754c
0a22f6a913e49a3949ec60bb36672dd3bf070a34
103241 F20101206_AAAVER mendezvazquez_a_Page_235.jpg
6fd61b35cfd64d4b1607657e9f85ef8e
eb52d5a69aaa6c309ec8a31873d7240a51bea522
50478 F20101206_AAAWBK mendezvazquez_a_Page_178.pro
e53bd8c1b99d693e0c9b48d8886a9e58
5e656ff54f6c34f23538db5a9269644b6863ddf3
8236 F20101206_AAAWOV mendezvazquez_a_Page_182thm.jpg
b6dd7289d48351877010fe2cb503ea4b
e9a816f7de304f4f0a13b87b1b567f48e16937fd
98685 F20101206_AAAVES mendezvazquez_a_Page_236.jpg
c1cb72977820c63a1ae5f2be043442c6
97757143c189774d34bd10ed1b3173944e917fac
51596 F20101206_AAAWBL mendezvazquez_a_Page_179.pro
f29a00d5e91b7c59fcf41db3ce5c7fa3
cc3250af0c0b7560596f3ca270fe3194b40ad5e1
4845 F20101206_AAAWOW mendezvazquez_a_Page_197thm.jpg
e95e710cc02e19985ff7fbd0c32706b3
a37a9de3a303a21771de720d1c929650db7bca77
99961 F20101206_AAAVET mendezvazquez_a_Page_237.jpg
16639053e0ec4d6c75fd079829b18132
d76ae2eb651f9eb7ca222d22b21a6888f12bdf6d
48303 F20101206_AAAWBM mendezvazquez_a_Page_180.pro
533fce478f92f0124bc2cc0dada4a057
d33700c1dac4c7a9f717d49e34c92917ac0c516e
11942 F20101206_AAAWOX mendezvazquez_a_Page_204.QC.jpg
af5c4312b2f3f29a7242ec1c1943ecde
d3880849faba6814c51be7b35d23a58b54698937
114513 F20101206_AAAVEU mendezvazquez_a_Page_238.jpg
635c32c8c06563d2f3a46702a832fba1
d1f0e7cff59534e0c6da2fce3ad7596219b1ced3
F20101206_AAAWBN mendezvazquez_a_Page_181.pro
963498e7f2bb01f2884c9db4dbc55029
1c3f4464f7e8721bb8ead80b0293bcd47d03f69b
4970 F20101206_AAAWOY mendezvazquez_a_Page_121thm.jpg
3389b25f717efa5ec3d959cb7f3ab3e8
fae261c99b279e89b6f5889ce82f6938e9e29a5d
111968 F20101206_AAAVEV mendezvazquez_a_Page_239.jpg
688a8fb39923fa67b347b6a17a114ec4
e184c240889690938b40d24ef5139192a747d129
52181 F20101206_AAAWBO mendezvazquez_a_Page_182.pro
f355f2f8019a755200812d3414ace871
ec0805045e60483efe4636e2f21016ee466fdc81
7038 F20101206_AAAWOZ mendezvazquez_a_Page_018thm.jpg
a7dc543092094ccc238c69d2b1d1d1b5
a2fd024a3e89b4b08d69d6c6fadbdb853b53cf18
116517 F20101206_AAAVEW mendezvazquez_a_Page_240.jpg
7caa6fc00f7b3017f2843c55eb2ebbd1
56c500a8e5522a12eded25093da04c3c00452bbe
18114 F20101206_AAAWBP mendezvazquez_a_Page_183.pro
ec30ac43793ea7d78df7368188dee336
33616eac61f25b30cf3e24b296a7b9b2350ed2c8
102925 F20101206_AAAVEX mendezvazquez_a_Page_241.jpg
f9a2e3459df7a6848be13200eb1e66c2
d6184e8ea320d4ce036606f3cfbf6ab6ef1a22f1
29251 F20101206_AAAVXA mendezvazquez_a_Page_050.pro
e89676704a87a5fd745edaaff1c9f82c
c23194e95faf993035e2f685c60a3e84bbc22ccc
26071 F20101206_AAAWBQ mendezvazquez_a_Page_184.pro
55ac6806c1f47fa2681ea01716ce3975
2c9c1594b070f6f7acabae352231e582e4deda81
103994 F20101206_AAAVEY mendezvazquez_a_Page_242.jpg
ad2fdcf322b90f85452abcf967ee15af
596c02f342c5e2629de16789aaea04a108420677
39682 F20101206_AAAVXB mendezvazquez_a_Page_051.pro
9f8bc8e121a2458ef54b4bbcdbaa2b05
abcae7ecd050bc2317e93712d7d6fff646a99f8e
22653 F20101206_AAAWBR mendezvazquez_a_Page_185.pro
ec13cfbcbdd701fe97e04bd284718346
9701a47e8766022129aa28136f58e1e8b837c8a3
18358 F20101206_AAAVEZ mendezvazquez_a_Page_243.jpg
6fa40e62c354a1d379500a5cc07b738a
eb6ce07b668ac527d3a91813fb033c0a58f6d495
45086 F20101206_AAAVXC mendezvazquez_a_Page_052.pro
19869157526942eca7605563b9572b13
8686e008218b0097a9ac5b5ff060a8dade6d32d0
25845 F20101206_AAAWBS mendezvazquez_a_Page_186.pro
41402c561321de519c7eb0976ef9d5fc
c11ccdac5f78bbe8e63260a59dd6c5e8252159cc
27293 F20101206_AAAVXD mendezvazquez_a_Page_053.pro
2b387d4d5c4f06f66321dabb8f358099
7514fbd7d33b0e04708e9c7ca8107c0c1f63558e
25606 F20101206_AAAWBT mendezvazquez_a_Page_187.pro
1826bd42526a56828d8304dc01646d3e
f22859ef8e3022c027a8bc822d402d5341de10d2
28274 F20101206_AAAVXE mendezvazquez_a_Page_054.pro
bd62650462b53d2265893b5c7f840918
44636316a162039edd2f773e286f6ae4f043a6e6
34901 F20101206_AAAWBU mendezvazquez_a_Page_188.pro
a01e18e218e220ba1b353fad57c21738
04b47f557cb858f440fd5ea77b02668b26d418d9
45174 F20101206_AAAVXF mendezvazquez_a_Page_056.pro
e055a51b210da569046247d5eee66d12
bd40db0b257da9ddd8eba0555cef8d93c77c99f6
26360 F20101206_AAAWBV mendezvazquez_a_Page_190.pro
d7be1b2c4fdea154edd2f139834369ff
dc2c7701a7a8202c7c3d54eba4add94e09e5090e







INFORMATION FUSION AND SPARSITY PROMOTION USING CHOQUET
INTEGRALS

By

ANDRES MENDEZ-VAZQUEZ


A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2008


© 2008 Andres Mendez-Vazquez


To my Parents because they were there when I was alone.

To my Sister because she loves me without questions.

To my Professors because they answer my questions.

To my Friends because they were patient with me.

Thank you









ACKNOWLEDGMENTS

I would like to thank Dr. Paul Gader, Dr. Gerhard Ritter, Dr. Joe Wilson, Dr. Manuel Bermudez and Dr. Mark Schmaltz for their patience and understanding. I would also like to thank Dr. Jim Keller for his support for my first journal paper. I would also like to thank Ashish Myles, Maria Velezmoro, Pedro B. Morales, II Park, Alexandra Camacho, Arturo Camacho, Bruno Maciel, Ernest Hall, C('1i -ii in Campbell, Ganesan Ramachandran, Xhuei Hue and Art Barnes for their advice and friendship. Finally, I would like to thank my co-workers at the lab Alina Zare, Raazia Mazhar, Ryan Busser, Gyeongyong Heo, Jeremy Bolton, John McElroy, Sean Matthews, Seniha Esen Yuksel and Xuping Zhang for their understanding.









TABLE OF CONTENTS

ACKNOWLEDGMENTS
LIST OF TABLES
LIST OF FIGURES
LIST OF ABBREVIATIONS
ABSTRACT

CHAPTER

1 LITERATURE REVIEW

    1.1 Probability Theory
        1.1.1 Sample Spaces
        1.1.2 Kolmogorov Axioms
        1.1.3 Probability Distribution
        1.1.4 Independence and Conditional Probability
        1.1.5 Moments
        1.1.6 Bayes' Rule
        1.1.7 Likelihood Functions
        1.1.8 Prior Distributions and Conjugate Priors
    1.2 Monte Carlo Simulations
        1.2.1 Expectation Maximization
        1.2.2 Markov Chains
        1.2.3 Metropolis-Hastings Algorithm
        1.2.4 Gibbs Sampler
        1.2.5 Relation of the Gibbs Sampler with Expectation Maximization
        1.2.6 The Slice Sampler
    1.3 Sparsity Promoting Distributions
        1.3.1 Introduction
        1.3.2 Sparsity Promotion in Cost Functions: Least Absolute Shrinkage and Selection Operator
        1.3.3 Extending the LASSO to a Bayesian Hierarchical Model
    1.4 Logistic Classification
        1.4.1 Logistic Distribution
        1.4.2 The Binary Classification Problem
        1.4.3 Motivations for Using the Sigmoid
        1.4.4 Logistic Regression
    1.5 Minimum Classification Error
        1.5.1 Dissimilarity Measures
        1.5.2 Loss Functions
    1.6 Information Theory
        1.6.1 Information Entropy
        1.6.2 Mutual Information
        1.6.3 Fano's Inequality
        1.6.4 Feature Selection Using Mutual Information
    1.7 Fuzzy Measures
        1.7.1 Evidence Measures
        1.7.2 Possibility Measures
        1.7.3 K-Additive Measures
        1.7.4 Sugeno Measures
    1.8 Fuzzy Integrals
        1.8.1 Sugeno Integral
        1.8.2 The Continuous Choquet Integral Under a Fuzzy Measure
        1.8.3 From the Continuous Case to the Discrete Case
        1.8.4 Examples of Discrete Choquet Integral
            1.8.4.1 Choquet integral with respect to belief and plausibility measures
            1.8.4.2 OWA operators as Choquet integrals
            1.8.4.3 Choquet integral with respect to k-additive measures
            1.8.4.4 Choquet integral with respect to the Sugeno measure
        1.8.5 Choquet Integral as a General Aggregation Rule
        1.8.6 Relation Between Plausibility and Belief Choquet Integrals
        1.8.7 Shapley Index
    1.9 Information Fusion
        1.9.1 Data Level, Feature Level and Decision Level Fusion
        1.9.2 The Non-Gaussian Nature of Decision Level Fusion
        1.9.3 Bayesian Fusion Models
            1.9.3.1 Bayesian hierarchical models
            1.9.3.2 Bayesian network models
        1.9.4 Non-Bayesian Fusion Models
            1.9.4.1 Neural network models
            1.9.4.2 Kernel models
            1.9.4.3 Genetic algorithm models
            1.9.4.4 Fuzzy set models
    1.10 Previous Applications of the Choquet Integral
        1.10.1 Choquet Integral in Pattern Recognition
        1.10.2 Fuzzy Integral in Machine Vision
            1.10.2.1 Fuzzy integral in face recognition
            1.10.2.2 Choquet integral as an image processing filter
        1.10.3 Choquet Integral for Decision Level Fusion
            1.10.3.1 Choquet integral in word recognition
            1.10.3.2 Choquet integral in landmine detection
    1.11 Learning Fuzzy Measures
        1.11.1 Nonlinear Optimization Methods
        1.11.2 Neural Networks
            1.11.2.1 Perceptron-like method
            1.11.2.2 Look-alike gradient descent neural networks
        1.11.3 Genetic Algorithms
        1.11.4 Heuristic Least Squared Error
        1.11.5 Conclusions

2 MINIMUM CLASSIFICATION ERROR USING THE SUGENO MEASURE. 85

2.1 Loss Function for the Minimum Classification Error . . .... 85
2.2 Derivation of the Derivative of the Choquet Integral with respect to the
Sugeno measure ...... ... ... ... .. ..... .. 86
2.3 Time Complexity Analysis of the MCE Training Algorithm . ... 89
2.4 Problems with the MCE Approach ................... .. 92
2.5 Results under the MCE Algorithm ................... .. 92
2.5.1 Description and Design of the Experiments . . 92
2.5.2 Results and Comparison Against Other Methods . .... 95

3 SPARSITY PROMOTION WITHIN CHOQUET INTEGRALS, USING A NAIVE
SAMPLER ...... ......... ................. .. 103

3.1 Constraining the Hierarchical Model for Sparsity Promotion under the Least
Squared Error ..... .. .... ........ ...... 103
3.2 Gibbs Sampler for the Monotonicity Constrained Model . .... 104
3.3 Sampling from the Posterior Distribution of Mu . . ..... 106
3.4 Problems When Sampling from a Gaussian with Small Variance. . .. 109
3.5 Improving the Sparsity Promoting Model . . . .... 109
3.6 Improving the Accuracy Term Sigma and the Sparsity Term Gamma 111
3.7 Interpreting the Results from the Sparsity Promoting Model . ... 112
3.8 Comparison Against Quadratic Programming . . . 113
3.9 Results under Sparsity Promoting Model with Respect to LSE ...... 114
3.9.1 Case I .................. ............... .. 114
3.9.2 Case II .................. ............... .. 114

4 AN IMPROVED LSE SPARSITY ALGORITHM ...... . ..... 120

4.1 Introduction ... . . .... ............ .. 120
4.2 Maximum A Posteriori Gibbs Sampler Model . . ..... 121
4.3 Solving the MAP-EM Problem .............. .. 124
4.4 Slice Sampler for the MAP-EM Sparsity .............. 128
4.5 Observations about MAP-EM Sparsity ........... .. ... 130
4.6 Results Under MAP-EM Sparsity ................ 131
4.6.1 Experiments Using Artificial Data Sets . . .... 131
4.6.1.1 Case I ............... ......... 131
4.6.1.2 Case II ............... ......... 132
4.6.1.3 Case III ............... 133
4.6.2 Conclusions over the Artificial Data Sets . . 134
4.6.3 Experiments with Landmine Detection Data Sets . .... 134
4.6.3.1 Case A2007 ................... .. .... 135











4.6.3.2 Case BJ2007 .................. .... 135
4.6.3.3 Case BJ2006 ....... ......... . ... 136
4.6.4 Conclusions over the Landmine Data Sets . . 136

5 EXTENDING MCE TO A PROBABILISTIC FRAMEWORK USING LOGISTIC
  DISTRIBUTION AND GIBBS SAMPLER ............................ 163

5.1 Logistic Regression for the One Measure Case ............... 163
5.1.1 The Gibbs Sampler for the One Measure Case .............. 164
5.1.2 The Slice Sampler for the One Measure Case Posterior ..... 166
5.1.3 Problems of Assuming a Uniform Distribution ............. 168
5.1.4 Using a Hidden Variable for Logistic Regression ......... 169
5.2 Development of the Algorithm for the Two Measure Case ...... 170
5.2.1 The Gibbs Sampler for the Two Measures Case ............. 171
5.2.2 The Slice Sampler for the Two Measure Case Posterior .... 172
5.2.3 MAP-EM for the Two Measure Case ......................... 174
5.3 Final Thoughts about MCE under the Logistic Framework ...... 176
5.4 Results under Sparsity Promoting Model with Respect to the
    Logistic Algorithms ........................................ 177
5.4.1 Results of the Logistic LASSO Using Artificial Data Sets  177
5.4.1.1 Case I ................................................. 178
5.4.1.2 Case II ................................................ 178
5.4.2 Case III ................................................. 179
5.4.3 The MAP-EM Logistic LASSO Artificial Data Sets ........... 179
5.4.3.1 Case I ................................................. 179
5.4.3.2 Case II ................................................ 179
5.4.4 Case III ................................................. 180
5.4.5 Results of the MCE Logistic LASSO over Artificial Data Sets 180
5.4.5.1 Case I ................................................. 180
5.4.5.2 Case II ................................................ 180
5.4.5.3 Case III ............................................... 181
5.4.6 Results of the MAP-EM MCE Logistic LASSO over Artificial
      Data Sets ................................................ 181
5.4.6.1 Case I ................................................. 181
5.4.6.2 Case II ................................................ 181
5.4.6.3 Case III ............................................... 181
5.4.7 Conclusions over the Artificial Data Sets ................ 181
5.4.8 Experiments in Landmine Detection Data Sets .............. 182
5.4.8.1 Case A2007 ............................................. 182
5.4.8.2 Case BJ2007 ............................................ 182
5.4.8.3 Case BJ2006 ............................................ 182
5.4.9 Conclusions over the Landmine Data Sets .................. 183

6 CONCLUSION ................................................... 219

APPENDIX









A DERIVATION OF THE QUADRATIC PROGRAMMING WITH RESPECT
TO THE GENERAL MEASURE ................... ...... 221

B RELATION BETWEEN DIFFERENT CHOQUET ALGORITHMS ...... 223

C DIFFERENCES BETWEEN THE BAYESIAN SPARSITY METHODS AND
THE FEATURE SELECTION UNDER MUTUAL INFORMATION ...... 225

D MAXIMIZING MUTUAL INFORMATION IN LOGISTIC LASSO ....... 227

LIST OF REFERENCES ................... .......... 229

BIOGRAPHICAL SKETCH ................... .......... 243









LIST OF TABLES


Table page

1-1 Examples of conjugate priors. .................. ....... 79

2-1 Comparison of PFA for Sugeno A-measure trained with LSE against different
detectors .................. ................ . .. 99

2-2 Comparison of PFA for general measure trained with LSE against different detectors
at different thresholds. .................. .. ........ 100

2-3 Comparison of PFA in MCE against different detectors at different thresholds. 101

2-4 Mean MCE PFA against mean general and Sugeno PFA. ........... ..102

2-5 Comparison of MCE Sugeno against several other classifiers for iris data and
breast cancer data. .................. .. .......... 102

3-1 Confusion matrix for artificial data set case I in Gibbs sampler for LSE sparsity. 115

3-2 Measures for artificial data set case I with mean and standard deviation of the
Markov chains in Gibbs sampler for LSE sparsity. ............... 115

3-3 Shapley values for the features in artificial data set case I in Gibbs sampler for
LSE sparsity .................. .................. .. 115

3-4 Confusion matrix for artificial data set case II in Gibbs sampler for LSE sparsity. 115

3-5 Measures for artificial data set case II with the mean and standard deviation of
the Markov chains in Gibbs sampler for LSE sparsity. .............. .116

3-6 Shapley values for the features in artificial data set case II in Gibbs sampler for
LSE sparsity .................. .................. .. 116

4-1 Confusion matrix for artificial data set case I for MAP-EM Sparsity ..... ..137

4-2 Measures artificial data set case I with mean and standard deviation of the Markov
chains for MAP-EM Sparsity. .................. ......... 137

4-3 Shapley indexes for the features in artificial data set case I for MAP-EM Sparsity. 137

4-4 Confusion matrix for artificial data set case II for MAP-EM Sparsity. ..... .137

4-5 Partial measure values for case II with the mean and standard deviation of the
Markov chains for MAP-EM Sparsity. .................. ..... 138

4-6 Shapley indexes for the features in artificial data set case II for MAP-EM Sparsity. 138

4-7 A better confusion matrix for case II. .................. ..... 139

4-8 More sparse Shapley indexes for case II. ................ ..... 139









4-9  Densities for the MCE under Sugeno measures for case II. .............. 140

4-10 Confusion matrix for artificial data set case III for MAP-EM Sparsity.

4-11 Shapley indexes for MAP-EM Sparsity for case III.

4-12 Different data sets and detection algorithms for the landmine fusion problem.

5-1  Confusion matrix for case I using Logistic LASSO. ..................... 184

5-2  Measure parameter values by Logistic LASSO in case I. ................. 184

5-3  Shapley indexes for case I by Logistic LASSO. ......................... 184

5-4  Confusion matrix for case II using Logistic LASSO. .................... 184

5-5  Measure parameter values by Logistic LASSO in case II. ................ 185

5-6  Shapley indexes for case II by Logistic LASSO. ........................ 185

5-7  Confusion matrix for case III using Logistic LASSO. ................... 185

5-8  Shapley indexes for case III by Logistic LASSO. ....................... 186

5-9  Confusion matrix for case I using MAP-EM Logistic LASSO. .............. 186

5-10 Measure parameter values by MAP-EM Logistic LASSO in case I. .......... 186

5-11 Shapley indexes for case I by MAP-EM Logistic LASSO. .................. 186

5-12 Confusion matrix for case II using MAP-EM Logistic LASSO. ............. 187

5-13 Some measure parameter values by MAP-EM Logistic LASSO in case II. .... 187

5-14 Shapley indexes for case II by MAP-EM Logistic LASSO. ................. 187

5-15 Confusion matrix for case III using MAP-EM Logistic LASSO. ............ 187

5-16 Shapley indexes for case III by MAP-EM Logistic LASSO. ................ 188

5-17 Confusion matrix for case I using MCE Logistic LASSO. ................. 188

5-18 Measure parameter values by MCE Logistic LASSO in case I. ............. 188

5-19 Case II confusion matrix by the MCE Logistic LASSO algorithm. ......... 189

5-20 Case III confusion matrix by the MCE Logistic LASSO algorithm. ........ 189

5-21 Case I confusion matrix by the MAP-EM MCE Logistic LASSO algorithm. ... 189

5-22 Measure parameter values by MAP-EM MCE Logistic LASSO in case I. ...... 189

5-23 Case II Shapley indexes by MCE Logistic LASSO. ........................ 190









5-24 Case III Shapley indexes by MCE Logistic LASSO. .............. ..190

5-25 Confusion matrix for case II using MAP-EM MCE Logistic LASSO . .... 190

5-26 Shapley indexes for case II by MAP-EM MCE Logistic LASSO. . ... 190

B-l Different characteristics for Choquet algorithms. ................ 223









LIST OF FIGURES


Figure page

1-1 Example of a maximization iteration in EM. .................. 79

1-2 Example of Slice sampler for the exponential distribution. . .... 80

1-3 Example of a Laplacian with γ = 10. ............... 80

1-4 Examples of the PDF and CDF of a logistic distribution. ........... .81

1-5 Example of a decision boundary h(x) = μ0 + μ1x1 + μ2x2 = 0. ........ 81

1-6 Example of logit transformation. . ............... . ... 82

1-7 Example of a Choquet integral under a belief measure. ............ ..83

1-8 Example of a Choquet integral under a plausibility measure. . .... 83

1-9 Distribution of the outputs of a classification problem. . . 84

1-10 More realistic distribution of the outputs of a classification problem. ..... ..84

2-1 Examples of sensitivity to desired outputs for Sugeno measure where a1 and a2
    represent the desired outputs for mines and non-mines respectively. ...... 98

2-2 Examples of sensitivity to desired outputs for general measures where a1 and
    a2 represent the desired outputs for mines and non-mines respectively. ... 98

3-1 Example of a lattice where the arrows represent the subset relation on it. . 117

3-2 This plot shows the idea that sampling in an interval far away from the mean
of a Gaussian with small variance is similar to sampling from a uniform in the
interval ............... ................ ... .. 118

3-3 Plot of samples for class 1 'o' and class 2 '+' for the first three features in case I. 119

4-1 Plot of samples for the first three features in case I where the second feature
has no value for classification. ............... ......... 141

4-2 Outputs for MAP-EM Sparsity in Case I with "o" for class 1 and "x" for class 2. 142

4-3 Traces for the measure parameter samples generated by MAP-EM Sparsity in
    Case I. ............................................. 143

4-4 Distributions for the measure parameters generated by MAP-EM Sparsity in
    Case I. ............................................. 144

4-5 Outputs for the MCE under Sugeno measures gradient descent method for case I. 145

4-6 ROC curve for MAP-EM Sparsity in case I. .... ....... . 146









4-7 ROC curve for the MCE under Sugeno measures in case I. . ..... 147

4-8 Plot of samples for the first and fifth features in case II. . ... 148

4-9 Outputs for MAP-EM Sparsity in Case II with "o" for class 1 and "x" for class 2. 149

4-10 A better output separation for case II for MAP-EM Sparsity. . .... 150

4-11 Outputs for the MCE under Sugeno measures for case II. ........... ..151

4-12 ROC curves for the MCE under Sugeno measures and MAP-EM Sparsity for
case II. . . . . .. .. . ... ... . 152

4-13 Choquet outputs for MAP-EM Sparsity for case III with "o" for class 1 and
"x" for class 2 .................. ................. .. 153

4-14 ROC curves for the MCE under Sugeno measures and MAP-EM Sparsity for
case III. . . . . . . .......... .. .. .. 154

4-15 Examples of traces for the measure parameter by MAP-EM Sparsity in case II. 155

4-16 Examples of distributions for the measure parameters by MAP-EM for case II. 156

4-17 ROC curves for all the A2007 algorithms and MAP-EM Sparsity. . ... 157

4-18 ROC curves for MCE under Sugeno and MAP-EM Sparsity in data set A2007.. 158

4-19 ROC curves for all the BJ2007 algorithms and MAP-EM Sparsity. . ... 159

4-20 ROC curves for MCE under Sugeno and MAP-EM Sparsity in data set BJ2007. 160

4-21 ROC curves for all the BJ2006 algorithms and MAP-EM Sparsity. . ... 161

4-22 ROC curves for MCE under Sugeno and MAP-EM Sparsity in data set BJ2006. 162

5-1 Choquet outputs for the synthetic case I under Logistic LASSO with "o" for
class 1 and "x" for class 2. .................. .. ...... 191

5-2 Choquet outputs for the synthetic case II under Logistic LASSO with "o" for
class 1 and "x" for class 2. .................. ...... 192

5-3 Choquet outputs for the synthetic case I under MAP-EM Logistic LASSO with
"o" for class 1 and "x" for class 2 ............... .... .. 193

5-4 Choquet outputs for the synthetic case II under MAP-EM Logistic LASSO with
"o" for class 1 and "x" for class 2 ............... .... .. 194

5-5 Choquet outputs for the synthetic case III under MAP-EM Logistic LASSO with
"o" for class 1 and "x" for class 2 ............... .... .. 195

5-6 Choquet outputs for the synthetic case I under MCE Logistic LASSO with "o"
for class 1 and "x" for class 2 .................. ....... .. 196









5-7 Choquet outputs for the synthetic case II under MCE Logistic LASSO with "o"
for class 1 and "x" for class 2 .................. ....... .. 197

5-8 Choquet outputs for the synthetic case I under MAP-EM MCE Logistic LASSO
with "o" for class 1 and "x" for class 2 ................ .. 198

5-9 Choquet outputs for the synthetic case II under MAP-EM MCE Logistic LASSO
with "o" for class 1 and "x" for class 2 ................ .. 199

5-10 Choquet outputs for the synthetic case III under Logistic LASSO with "o" for
class 1 and "x" for class 2 .................. ........... .. 200

5-11 Choquet outputs for the synthetic case III under MCE Logistic LASSO with
"o" for class 1 and "x" for class 2 .................. .. .... .. .. 201

5-12 Traces for the measure parameters learned by the Logistic LASSO for case I. 202

5-13 Distributions for the measure parameters sampled by the Logistic LASSO for
     case I. ............................................ 203

5-14 Some of the traces for the measure parameters learned by the Logistic LASSO
for case II. .................. ............. ... .. 204

5-15 Some of the distributions for the measure parameters sampled by the Logistic
LASSO for case II. .................. .. .............. 205

5-16 Traces for the measure parameters learned by the MAP-EM Logistic LASSO
for case I ................... ............ ...... 206

5-17 Distributions for the measure parameters sampled by the MAP-EM Logistic
LASSO for case I ................... ............... 207

5-18 Some of the traces for the measure parameters learned by the MAP-EM Logistic
LASSO for case II. .................. .. .............. 208

5-19 Some distributions for the measure parameters sampled by the MAP-EM Logistic
LASSO for case II. .................. .. .............. 209

5-20 ROC curves for different algorithms in case I. ................. 210

5-21 ROC curves for different algorithms in case II. .................. 211

5-22 ROC curves for different algorithms in case III. ................ 212

5-23 ROC curves for all the A2007 algorithms and Logistic LASSO. . ... 213

5-24 ROC curves for MCE under Sugeno and the logistic methods in data set A2007. 214

5-25 ROC curves for all the BJ2007 algorithms and Logistic LASSO . .. 215

5-26 ROC curves for MCE under Sugeno and the logistic methods in data set BJ2007. 216









5-27 ROC curves for all the BJ2006 algorithms and Logistic LASSO. . ... 217

5-28 ROC curves for MCE under Sugeno and the logistic methods in data set BJ2006. 218

B-l Relations between the different Choquet algorithms. .............. ..224









LIST OF ABBREVIATIONS


Bel belief measure

EM expectation maximization

GPR ground penetrating radar

FAR false alarm rate

LADAR laser radar

LSE least squared error

MAP maximum a posteriori

MCE minimum classification error

PDF probability density function

PMF probability mass function

MCMC Monte Carlo Markov chain

MSE mean squared error

MLP multilayer perceptron

Nec necessity measure

OWA ordered weighted averaging

PD probability of detection

PFA probability of false alarm

P1 plausibility measure

Pos possibility measure

ROC receiver operating characteristic

SVM support vector machine









Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy

INFORMATION FUSION AND SPARSITY PROMOTION USING CHOQUET
INTEGRALS

By

Andres Mendez-Vazquez

May 2008

Chair: Paul Gader
Major: Computer Engineering

This dissertation addresses problems encountered in combining information from

multiple sources. Novel methods for learning parameters for information aggregation are

proposed.

In practical applications of pattern classification, multiple algorithms are often

developed for the same classification problem. Each algorithm produces confidence values

by which each new sample may be classified. We would like to aggregate these confidence

values to produce the best possible confidence for the given sample. This can be seen as a

particular instance of what is called information fusion.

In addition to learning parameters of aggregation operators to assign the best

confidence for a given sample, we would also like the aggregation operators to use a subset

of the algorithm confidences and achieve the same level of performance as the entire set of

confidences. Using a subset of the algorithms implies lower cost for applications.

Choquet integrals are nonlinear operators based on fuzzy measures that can represent

a wide variety of aggregation operators. Previous research has demonstrated the utility of

Choquet integrals for this problem compared to other methods such as neural networks

and Bayesian approaches.

However, one of the novel results of this research is that the measures learned can be

very sensitive to the choice of desired outputs. In response to this problem, we propose an

alternative training methodology based on Minimum Classification Error (MCE) training









that does not require the use of desired outputs. A problem with this method is that it

depends on a constrained type of fuzzy measure, the Sugeno measure.

There is a need for additional approaches to learning unconstrained fuzzy measures

that are more computationally attractive and provide more robust performance. We

propose an approach to learning unconstrained fuzzy measures that relies on Markov

Chain / Monte Carlo sampling methods. The use of such approaches for learning measures

for Choquet integral fusion is completely novel. In addition, we propose the inclusion of

the Bayesian approach of imposing sparsity promoting prior distributions on the measure

parameters during sampling as a way of selecting subsets of the algorithms for inclusion in

the aggregation. This approach is completely new for learning fuzzy measures.









CHAPTER 1
LITERATURE REVIEW

The literature review has been divided into eleven sections. In (Section 1.1) through (Section

1.4), we explore the basic concepts behind probability theory [1], Bayesian probability,

hierarchical models, Monte Carlo methods [2] and logistic regression [3]. In (Section

1.5) and (Section 1.6), we define the basics behind the minimum classification error

method [4-6] and information theory [7], respectively. Following (Section 1.1) through (Section

1.6), we introduce fuzzy measures and define Sugeno and Choquet integrals [8-13]. We

then focus on the special case of a discrete Choquet integral and its properties.

(Section 1.9) on information fusion is divided into two main groups, Bayesian and

non-Bayesian models for information fusion. Given these concepts, (Section 1.10) discusses

previous work done with the Choquet integral as a classifier and an aggregation operator

for fusion.

In (Section 1.11), the final section of the literature review, we discuss previous

techniques for learning fuzzy measures.

1.1 Probability Theory

In this section, we review basic probability theory [1, 14, 15]. First, we define basic

concepts about sample spaces and events. We then look at the concepts of probability,

measure and event spaces. This leads to defining probability distributions and densities.

After this, we define the concepts of independence, conditional probability and moments.

Following these basic ideas, we look at the ideas of Bayes' rule, likelihood functions, prior

probability and conjugate priors.

1.1.1 Sample Spaces

One of the objectives of probability is to obtain conclusions about a collection of

elements through a series of experiments. It is necessary to identify the set of elements

and the experiments in some way. For this reason, the concepts of a sample space and

event are examined.









Definition 1.1. The set, S, of all possible outcomes of a particular experiment is called a

sample space for the experiment.

If we want to talk about the occurrence of a particular outcome of an experiment, it is

necessary to define what it is known as an event.

Definition 1.2. An event is any collection of possible outcomes of an experiment, that is,

any subset of S (including S itself).

This collection of possible events is normally associated with a subset A of the sample

space S. A mathematical definition for this collection of subsets on the sample space S is

known as a σ-algebra.

Definition 1.3. A collection B of subsets of a set X is said to be a σ-algebra on X, if B

has the following properties:

1. X ∈ B.

2. If A ∈ B, then A^c ∈ B.

3. If A = ∪j Aj and Aj ∈ B for j = 1, 2, ..., then A ∈ B.

This is a natural definition on the subsets of these collections because we would

like to speak about the occurrence of an event, whether an event is dependent on other

events, the union of different events, etc. Now we are ready to define the concept of a

probability.

1.1.2 Kolmogorov Axioms

It is desirable to have a concise set of axioms to describe the possible outcomes from a

sample space. Kolmogorov defined a set of axioms that any probability should fulfill.

Definition 1.4. Given a sample space S and an associated σ-algebra B, a probability

function is a function P with domain B that satisfies:

1. P(A) ≥ 0 for all A ∈ B.

2. P(S) = 1.

3. If A1, A2, ... ∈ B are pairwise disjoint, then P(∪i Ai) = Σi P(Ai).









These three properties are usually referred to as the Kolmogorov axioms. Finally,

it is possible to define the concept of a probability space.

Definition 1.5. A probability space is a triple (S, B, P) where S is a set, B is a

σ-algebra, and P is a probability function.

1.1.3 Probability Distribution

We would like to link each event in the sample space to a value in the real line. This

can give us a way to "measure" these events, which in turn leads to the definition of a

random variable.

Definition 1.6. Given a probability space (S, B, P), then X : S → R is a random

variable if for all a ∈ R the set:


X⁻¹([a, ∞)) = {ω ∈ S | X(ω) ≥ a}, (1-1)


is in B.

This concept is used to define a probability distribution function of the random

variable X over the sets B ∈ B as follows:


Px(B) = P(X⁻¹(B)). (1-2)


For each distribution function, we have an associated function called the cumulative

distribution function (cdf) Fx(x) of a random variable defined by:


Fx(x) = Px(X ≤ x) = P({ω ∈ S | X(ω) ≤ x}) (1-3)


for all x. It is sometimes possible to express the probability measure as a Lebesgue integral

of a function f on a set A,

P(A) = ∫_A f dμ. (1-4)

If the Lebesgue integral is the same as the Riemann integral [14-16], we have:


Fx(x) = ∫_{−∞}^{x} f(t) dt. (1-5)









We have by the Fundamental Theorem of Calculus [17], if f(x) is continuous, that

d/dx Fx(x) = f(x). (1-6)

The function f(x) is called Probability Density Function (PDF), and is denoted by

fx(x). A similar argument can be made for the discrete case, and the function fx(x)
for the discrete case receives the name Probability Mass Function (PMF). It is

possible to prove that any function fx(x) is a PDF if and only if fx(x) ≥ 0 for all x, and

∫ fx(x) dx = 1. Given a probability distribution function, it may not be possible to derive
the PDF if the Lebesgue integral does not correspond to any Riemann integral.

A natural extension of this concept is the joint distribution function.

Definition 1.7. Let X1, ..., Xn be n random variables, all defined on the same probability

space. The joint density function of X1, ..., Xn, denoted by fX1,...,Xn(x1, ..., xn),

is the function fX1,...,Xn(x1, ..., xn) : R^n → R such that for any domain D ⊆ R^n, we have:


∫_D fX1,...,Xn(u1, ..., un) du1 ... dun = P(D). (1-7)

The joint density function has the same properties as the univariate density function.

1.1.4 Independence and Conditional Probability

In order to simplify probability models, it is desirable to assume that events are

mutually independent under certain conditions. This is formally defined as:

Definition 1.8. Two random variables X and Y are said to be independent, if given A

and B in B we have that P(X ∈ A ∩ Y ∈ B) = P(X ∈ A)P(Y ∈ B).

Sometimes, we would like to express the probability of an event once another event

has happened. This is known as conditional probability.

Definition 1.9. Given random variables X and Y, if A and B are events in B, and

Py(B) > 0, the conditional probability of X ∈ A given Y ∈ B, written P(X ∈ A | Y ∈









B), is given by:


P(X ∈ A | Y ∈ B) = P(X ∈ A ∩ Y ∈ B) / P(Y ∈ B) = P(X ∈ A, Y ∈ B) / P(Y ∈ B). (1-8)

It can be proved that if the density functions of random variables X, Y exist, the last

equality holds. From now on we will assume that all our random variables have a PDF or

a PMF [18].

1.1.5 Moments

The different moments of a random variable X are important types of expectations.

They help to describe important properties of the random variable PDF's. For example,

how symmetric is the PDF, how thin or flat is the PDF, etc.

Definition 1.10. For each integer n, the nth moment of X (or Fx(x)), μ'n, is μ'n = E[X^n].

The nth central moment of X, μn, is μn = E[(X − μ)^n], where μ = μ'1 = E[X].

Apart from the mean, E(X), of a random variable, perhaps the most important

moment is the second central moment, which is known as the variance of a random

variable.

Definition 1.11. The variance of a random variable X is its second central moment,

Var X = E[(X − E[X])²]. The positive square root of Var X is the Standard Deviation

(SD) of X.

Other examples of moments are the skewness and the kurtosis.

1.1.6 Bayes' Rule

An important application of conditional probability that allows us to turn around

conditional probabilities is known as Bayes' rule.

Theorem (Bayes' Rule) 1.1. Let A1, A2, ..., An be a partition of the sample space, and

let B be any set. Then, for each i = 1, 2, ..., n,


P(Ai | B) = P(B | Ai)P(Ai) / Σ_{j=1}^{n} P(B | Aj)P(Aj). (1-9)









It is possible to prove that given a PDF for a probability distribution function, Bayes'

rule holds for the PDF [18].

1.1.7 Likelihood Functions

An immediate application for these ideas is the estimation of the parameters for the

PDF's through the use of maximum likelihood estimation. First, we define what is

known as the likelihood function.

Definition 1.12. Let f(x | θ) denote the joint PDF of the vector sample X = (X1, ..., Xn).

Then, given that X = (x1, ..., xn) is observed, the function of θ defined by L(θ | x1, ..., xn)

= f(x1, ..., xn | θ) is called the likelihood function.

If the samples from X are identically distributed and independent, the likelihood

function is defined as

L(θ | x1, ..., xn) = ∏_{i=1}^{n} f(xi | θ). (1-10)

1.1.8 Prior Distributions and Conjugate Priors

A direct application of Bayes' rule is that, given a belief about the PDF π(θ) of

a certain random variable θ (called the prior distribution), and the knowledge of the

distribution of the data given the parameter θ, f(x | θ), we can generate a more accurate

PDF π(θ | x) for θ given the data x (called the posterior probability). Formally, the

posterior is defined as:

π(θ | x) = f(x | θ)π(θ) / f(x), (1-11)

where f(x | θ)π(θ) = f(x, θ), f(x) is the marginal distribution of X, f(x) = ∫ f(x | θ)π(θ) dθ,

and the function f(x | θ) is called the likelihood of the data, given the parameter θ. This

formulation allows us to discover the probability distribution of a given data by guessing

an initial probability or prior and then using the probability of the data or likelihood to

modify this probability and get a better approximation for the parameter of the data

probability or posterior. A special case of priors that are heavily used in statistics are the

conjugate priors. A conjugate prior is a probability with the property that the posterior









PDF also belongs to the same family [19]. These families are collections of PDF's f(x | θ)

that can be rewritten in a specific form. For example, a PDF f(x | θ) belongs to the

exponential family if it can be written as:


f(x | θ) = a(x)b(θ) exp {c(x)d(θ)}, (1-12)

where a and c depend only on x, and b and d depend only on θ. (Table 1-1) shows some

classical examples of conjugate priors.
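To make the idea of conjugacy concrete, the following small Python sketch applies the textbook Beta-Bernoulli conjugate update; the function name and the data are illustrative assumptions, not drawn from Table 1-1 itself.

    # Sketch: Beta prior with a Bernoulli likelihood.  With prior Beta(a, b) and
    # n observed 0/1 samples, the posterior is Beta(a + #ones, b + #zeros), so the
    # posterior stays in the same family as the prior.
    def beta_bernoulli_posterior(a, b, data):
        ones = sum(data)
        return a + ones, b + (len(data) - ones)

    a_post, b_post = beta_bernoulli_posterior(1.0, 1.0, [1, 0, 1, 1, 0, 1])
    print("posterior: Beta(%.1f, %.1f)" % (a_post, b_post))    # Beta(5.0, 3.0)
    print("posterior mean:", a_post / (a_post + b_post))       # 0.625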

Frequently, it is necessary to assume that the parameters of a PDF are distributed

according to another PDF, leading to the Bayesian hierarchical models. In these

models, we can use the properties of prior distributions to enhance our knowledge of the

parameters used in the topmost, or posterior, distributions. Several properties are used to

simplify the models created by this kind of hierarchy. For example, we can use hidden

variables to express special properties from the top distributions. Another example is the

use of non-informative priors [20, 21] when the distribution of the parameters are not well

known or not known at all.

Many interesting applications involve formulations where well-defined analytical

or closed-form solutions are not easy to come by, such as when we try to integrate the

Laplacian distribution over an interval, or when the different layers in a hierarchical model

produce difficult to integrate functions. In such cases, one may explore numerical and/or

stochastic methods to attack these kinds of problems.

1.2 Monte Carlo Simulations

Solution approaches to several problems benefit from the application of numerical

methods in statistics. The first class covers optimization problems, the second refers to

integration problems and a third refers to simulation of Bayesian models by sampling. In

the first case, the most representative cases are simulated annealing [2, 22, 23] and Ex-

pectation Maximization (EM) [2, 24, 25]. In the second case, the most representative

problems refer to Monte Carlo integration and importance sampling [2]. The third









case refers to the problems solved with Metropolis-Hastings algorithms [2, 26, 27]. In

our case, we are especially interested in the EM and Metropolis-Hastings algorithms.

1.2.1 Expectation Maximization

The EM algorithm was introduced by Dempster et al. [24]. Suppose that we observe

the identically and independently distributed samples X1, ..., Xn from a conditional

distribution P(X | θ). We would like to obtain a sequence of parameter values θ1, θ2, ..., θn, ...

such that at step n the log likelihood L(θn+1 | X) = ln P(X | θn+1) is greater than or equal

to L(θn | X). To accomplish this, we introduce a hidden or augmented variable z in the log

likelihood of θ given the data X,

ln L(θ | X, z) = ln P(X, z | θ). (1-13)

This last equation, together with the inequality L(θn+1 | X) ≥ L(θn | X), allows us to derive

the following equality,


θn+1 = argmax_θ { E_{z | X, θn} [ln P(X, z | θ)] }. (1-14)

From this equality one can devise an iterative algorithm for the maximization of θ. Thus,

the EM algorithm has the following steps,


EM Algorithm

1. E-step: Determine the conditional expectation E_{z | X, θn} [ln P(X, z | θ)].

2. M-step: Maximize this expectation with respect to θ.


It can be proven that E_{z | X, θn} [ln P(X, z | θ)] is a convex function. Then, a convex

function is used to maximize the function L(θ | X) as seen in (Figure 1-1). This can be

a problem because the EM algorithm can only find a local maximum depending on the

initialization.
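To make the E and M steps concrete, the following is a minimal sketch of EM for a two-component, one-dimensional Gaussian mixture with known, equal variances and equal mixing weights; the data, the initialization, and the model are illustrative assumptions, not the measures or fusion models studied later in this work.

    import numpy as np

    def em_two_gaussians(x, mu1, mu2, sigma=1.0, iters=50):
        """Minimal EM sketch: two 1-D Gaussians with known, equal variance.
        The hidden variable z is the (unknown) component of each sample."""
        for _ in range(iters):
            # E-step: posterior responsibility of component 1 for each sample.
            p1 = np.exp(-0.5 * ((x - mu1) / sigma) ** 2)
            p2 = np.exp(-0.5 * ((x - mu2) / sigma) ** 2)
            r1 = p1 / (p1 + p2)
            # M-step: maximize the expected complete-data log likelihood,
            # which here reduces to responsibility-weighted means.
            mu1 = np.sum(r1 * x) / np.sum(r1)
            mu2 = np.sum((1 - r1) * x) / np.sum(1 - r1)
        return mu1, mu2

    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 300)])
    print(em_two_gaussians(x, mu1=-0.5, mu2=0.5))   # roughly (-2, 3)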

In addition, the EM algorithm requires a closed form of the expectation of a

random variable which is available in some important cases [28]. It is often the case









that it is not possible to find a precise formulation for the E-step [29]. Additionally, a

closed form may not be available for the M-step, so the maximization of the derivative

cannot be accomplished without resorting to iterative optimization schemes for integral

approximation. Thus, we need to consider a different way to obtain the optimal answer for

the parameters used in a distribution. It has been noted by Robert and Casella [2] that

Monte Carlo Markov Chain (MCMC) methods are better at solving problems where it

is difficult to find closed forms for the quantities in the E and M steps above.

1.2.2 Markov Chains

Before we describe the Metropolis-Hastings algorithm, it is necessary to provide an

overview of Markov chains [2, 30].

At the center of the statistical sampling methods is the concept of a Markov chain.

Definition 1.13. A sequence of random variables {Xi | Xi ~ P, i = 0, 1, 2, ...} is a

Markov chain if, for any n:


P(Xn | Xn−1, ..., X0) = P(Xn | Xn−1) (Memoryless property). (1-15)


This is called the transition probability at time n for the sequence {Xi | Xi ~ P, i = 0, 1, 2, ...}.

If the state space is finite, the transition probability can be represented by a matrix which is

known as the transition matrix.

We seek to generate Markov chains that have the property that the random

variables in the sequence converge (in some sense) to a particular distribution. That

is, if {Xi | Xi ~ P, i = 0, 1, 2, ...} is the Markov chain and Xn ~ πn, then πn → π (in some

sense) as n → ∞. In this case, π is called the target distribution of the Markov chain.

If the measure π is a probability measure, the probability is called stationary. Then,

we hope to find Markov chains that converge to a stationary distribution.

Before we define what a Markov Chain Monte Carlo method (MCMC) is, we need to

introduce the simulation concept.









Definition 1.14. Given a random variable X from a distribution P, the generation of a

sequence of values of the random variable X with distribution P is called a simulation.
Finally, we can define what is known as a Markov Chain Monte Carlo method.

Definition 1.15. A Markov Chain Monte Carlo method for the simulation of a

distribution f is any method producing a well behaved Markov chain whose stationary

distribution is f.
1.2.3 Metropolis-Hastings Algorithm

The first method to be explored is the well known Metropolis-Hastings algo-

rithm [26, 31, 32]. The Metropolis-Hastings algorithm begins with a target density f.

Then, a conditional density q(y | x) is chosen such that it is amenable to simulation,
and is defined with respect to f. Thus, the Metropolis-Hastings algorithm is defined in the

following way (Robert and Casella et al. [2]):

Metropolis-Hastings Algorithm

1. Given x(t),

2. Generate Y(t) ~ q(y | x(t)) and u(t) ~ U(0, 1) for t = 1, ..., n,

3. Take

   x(t+1) = Y(t)  if u(t) ≤ ρ(x(t), Y(t)),
   x(t+1) = x(t)  otherwise,

   where ρ(x, y) = min { [f(y) q(x | y)] / [f(x) q(y | x)], 1 },

4. Return x(1), x(2), ..., x(N).


The symbol "~" represents that samples are being generated from q(y | x(t)) and U(0, 1).

In addition, upper-case random variables represent variables that need to be tested to be

accepted, and lower-case random variables represent random variables with a fixed value.
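As an illustration of the steps above, here is a minimal random-walk Metropolis-Hastings sketch for a one-dimensional target known only up to a normalizing constant; the symmetric Gaussian proposal and the step size are assumptions made so that the q terms cancel in the acceptance ratio.

    import numpy as np

    def metropolis_hastings(log_f, x0, n_samples, step=1.0, seed=0):
        """Minimal random-walk Metropolis-Hastings sketch.
        log_f: log of the (unnormalized) target density f."""
        rng = np.random.default_rng(seed)
        x = x0
        samples = []
        for _ in range(n_samples):
            y = x + step * rng.normal()              # propose Y ~ q(. | x)
            log_rho = min(0.0, log_f(y) - log_f(x))  # acceptance probability
            if np.log(rng.uniform()) < log_rho:      # accept with probability rho
                x = y
            samples.append(x)
        return np.array(samples)

    # Example target: a standard Gaussian, known only up to its constant.
    chain = metropolis_hastings(lambda x: -0.5 * x**2, x0=5.0, n_samples=5000)
    print(chain.mean(), chain.std())   # roughly 0 and 1 after burn-in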









1.2.4 Gibbs Sampler

The Gibbs sampler is another type of MCMC method where the sampling of the

random values of interest is done in a sequential way, and these random values come from

posterior probabilities that are generated from a Bayesian hierarchy.

It can be proven that the Gibbs sampler [2, 30, 33] is equivalent to a composition of

Metropolis-Hasting algorithms [1].

Theorem 1.1. The Gibbs sampler method is equivalent to the composition of n

Metropolis-Hastings algorithms, with acceptance probabilities uniformly equal to 1.

The basic Gibbs sampler for a two random variable cases is the following:


Two Stage Gibbs Sampler

1. Given the random variables X and Y with joint density f(x, y), and conditional
   densities f_{X|Y}(x | y), f_{Y|X}(y | x), the Gibbs sampler generates a Markov chain
   (X(t), Y(t)) for t = 1, 2, ... taking X(0) = x0 and sampling as follows:

   (a) Y(t) ~ f_{Y|X}(y | x(t−1)).

   (b) X(t) ~ f_{X|Y}(x | y(t)).
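The following is a minimal sketch of the two stage Gibbs sampler for a case where both conditionals are available in closed form, a zero-mean bivariate Gaussian with correlation ρ; this toy target and the parameter values are assumptions made purely for illustration.

    import numpy as np

    def gibbs_bivariate_normal(rho, n_samples, seed=0):
        """Two-stage Gibbs sketch for a zero-mean bivariate Gaussian with
        correlation rho, whose full conditionals are the known Gaussians
        X | Y=y ~ N(rho*y, 1-rho^2) and Y | X=x ~ N(rho*x, 1-rho^2)."""
        rng = np.random.default_rng(seed)
        sd = np.sqrt(1.0 - rho**2)
        x, y = 0.0, 0.0
        chain = np.empty((n_samples, 2))
        for t in range(n_samples):
            y = rng.normal(rho * x, sd)   # step (a): sample Y | X
            x = rng.normal(rho * y, sd)   # step (b): sample X | Y
            chain[t] = (x, y)
        return chain

    chain = gibbs_bivariate_normal(rho=0.8, n_samples=10000)
    print(np.corrcoef(chain.T)[0, 1])   # close to 0.8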


A drawback of this method is that samples from Gibbs samplers are not mutually

independent. This means that some correlation exists between samples in the Markov

chain being generated. Therefore, it is sometimes necessary to sub-sample the original

sample to obtain a more independent collection of samples by choosing elements with

greater separation within the original chain.

Another possible drawback, depending on the problem, is that Markov chains can

exhibit slow convergence (i.e., variance in the chain can be too high). A way to resolve

this problem by adding extra steps which essentially Metropolize the Gibbs sampler has

been proposed [1, 34].









1.2.5 Relation of the Gibbs Sampler with Expectation Maximization

The EM algorithm can be seen as a precursor to the two stage Gibbs sampler in

missing data models [2]. Robert, Casella, et al. [2] divide the likelihood of the data

into two likelihoods, the complete-data likelihood (L^c(θ | x, z) = f(x, z | θ)), and the

incomplete-data likelihood (L(θ | x) = g(x | θ)), with missing data density k(z | x, θ) =

L^c(θ | x, z) / L(θ | x). Now, if it is possible to normalize the complete-data likelihood, i.e.:


L*(θ | x, z) = L^c(θ | x, z) / ∫ L^c(θ | x, z) dθ, (1-16)

then it is possible to define the following Gibbs sampler:


General EM Gibbs Sampler

1. z | θ ~ k(z | x, θ).

2. θ | z ~ L*(θ | x, z).


The first step is the expectation step and the second the maximization step. The big

difference with respect to the classical EM is that instead of calculating E(Z | x, θ), the

Gibbs sampler generates a random variable from the density k. In a similar way, instead

of maximizing the expected complete-data log-likelihood, the Gibbs sampler generates

random samples θ from L*, the normalized complete-data likelihood.

Another special case of the Gibbs sampler that is used for generating samples from

the product of distributions is the slice sampler.

1.2.6 The Slice Sampler

The slice sampler is a sampling technique of great importance because it allows us to

sample complex distributions for which closed forms do not exist. For example, if we need

to sample from a product of sigmoid distributions, it is clear that we cannot reduce this

product to a simpler form.

The slice sampler is based on the fundamental theorem of simulations.









Theorem 1.2 (Fundamental Theorem of Simulations). Simulating


X ~ f(x) (1-17)

is equivalent to simulating


(X, U) ~ U{(x, u) | 0 < u < f(x)}. (1-18)

Thus, we need to uniformly simulate from the sub-graph


G(f) = {(x, u) | 0 < u < f(x)}. (1-19)

Robert and Casella [2] point out that a natural solution is to use a random walk on

the set G(f), because such walks usually result in a stationary distribution that is the uniform

distribution on G(f). This can be done in the following way:

Starting from a point:


(x, u) ∈ {(x, u) | 0 < u < f(x)}, (1-20)

we move along the U axis, which is equivalent to

U | X = x ~ U({u | u < f(x)}), (1-21)

which corresponds to moving to the point (x, u') ∈ X × U. Then, we move along the X

axis, which corresponds to the conditional distribution,

X | U = u' ~ U({x | u' < f(x)}), (1-22)

to the point (x', u'), which is a sample in the uniform distribution of the set G(f). Thus,

x' is generated from the distribution f(x). An example of this procedure can be seen in
(Figure 1-2).
This concept remains valid if, given f(x) = C f1(x), we use f1(x), the non-normalized

part of f. Using these facts, it is possible to define a generalized Slice sampler:









Generalized Slice Sampler


1. Given f(x) ∝ ∏_{i=1}^{k} fi(x),

2. At iteration t + 1, simulate

   1. ω1(t+1) ~ U(0, f1(x(t)))
   ...
   k. ωk(t+1) ~ U(0, fk(x(t)))
   k + 1. x(t+1) ~ U(A(t+1)), where
          A(t+1) = {y | fi(y) ≥ ωi(t+1), i = 1, ..., k}.


3. Return x(1), x(2),..., x(N).


This sampler can be seen as a special case of the Gibbs sampler. Its simplicity allows

for its use in different problems for which posterior probabilities are constructed by the

product of many individual distributions.
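A minimal sketch of the two moves (1-21) and (1-22) for the exponential density used in (Figure 1-2) is given below; because that density is monotone, the slice can be solved analytically and no stepping-out search is needed, which is a simplification made only for this illustration.

    import numpy as np

    def slice_sampler_exponential(n_samples, x0=1.0, seed=0):
        """Slice sampler sketch for f(x) = exp(-x), x >= 0 (as in Figure 1-2).
        The slice {x : f(x) > u} is simply the interval [0, -ln u)."""
        rng = np.random.default_rng(seed)
        x = x0
        samples = np.empty(n_samples)
        for t in range(n_samples):
            u = rng.uniform(0.0, np.exp(-x))   # vertical move, Equation (1-21)
            x = rng.uniform(0.0, -np.log(u))   # horizontal move, Equation (1-22)
            samples[t] = x
        return samples

    s = slice_sampler_exponential(20000)
    print(s.mean())   # close to 1, the mean of the exponential distribution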

Now that we have these basic tools for attacking numerical problems in probability

theory, we explore the important concept of sparsity promotion in linear models.

1.3 Sparsity Promoting Distributions

1.3.1 Introduction

In many areas of science, one of the most used methods for studying specific

properties of data sets is the linear model. For example, sometimes it is necessary

to know the dispersion of the data, how binary the data is, if the data can be grouped in two

or more classes, etc. These models can be defined using the following equation:


f(x, μ) = Σ_{i=1}^{m} μi xi, (1-23)

where f represents the output of the linear model, μi the weight given to the ith feature in

the model, and xi is the ith feature.









We are often interested in designing algorithms for learning, or estimating, the weight

vector μ that provides the best approximation to a mapping x → y in some sense. To

assist in this development, a noise term can be added to the basic linear model. If we

consider the following equations:


f(x, μ) = Σ_{i=1}^{m} μi hi(x),    y = f(x, μ) + ε, (1-24)

where y is the desired output for the algorithm, hi(x) represents the evaluation of the

input x by a basis function hi, and ε can either be thought of as noise added to the data

or as the error distribution. This can be written in vector form as:


f(x, μ) = H^T μ,    y = f(x, μ) + ε, (1-25)

where H^T = (h1(x), ..., hm(x)). Now, if we have a collection of different outputs y1, ..., yn,

we can write (Equation 1-25) as:

Y = Hμ + ε, (1-26)

where H is called the design matrix for the problem at hand.

A way to solve this linear problem is the use of the Least Squared Error (LSE) [21],


μ̂ = argmin_μ ||Y − Hμ||². (1-27)
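For a concrete sense of (1-27), the following NumPy sketch builds a synthetic design matrix and computes the LSE estimate; all names and values are made up for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    H = rng.normal(size=(100, 5))                   # design matrix: 100 samples, 5 basis functions
    mu_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])  # only some weights are nonzero
    Y = H @ mu_true + 0.1 * rng.normal(size=100)    # outputs with additive noise

    # Least squared error estimate of Equation (1-27)
    mu_lse, *_ = np.linalg.lstsq(H, Y, rcond=None)
    print(np.round(mu_lse, 2))  # close to mu_true, but small weights rarely become exactly zero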


Many researchers have attempted to devise methods for improving LSE estimation by

reducing the number of features. When using LSE to solve this model, we are trying to

increase prediction accuracy, and interpretability. After all, we would like to decrease

the influence of features that are not important in the reduction of the error between the

predicted output and the desired output. This is an important, sought after feature in

Bayesian classifiers, because their accuracy decreases as the number of features increases

due to the well known bias-variance trade-off [35].

Two standard techniques for improving LSE estimation (i.e. decreasing the number

of features involved) are subset selection and ridge regression. Subset selection can









select a set of features by some property of the data itself or by using the classification

algorithm as a black box [36] to select the possible best set. An example of this method

can be seen in Sindhwani et al. [37], where mutual information and a greedy subset

selection are applied to choose the best possible subset of elements in problems involving

the use of neural networks and support vector machines.

Although subset selection provides interpretable models, it tends to be extremely

sensitive due to the discrete selection process and changes in the data. Furthermore,

the regression coefficients are not learned simultaneously with feature selection. For this

reason, ridge regression (also known as weight decay) provides a much better method of

optimization:


μ̂ = argmin_μ ||Y − Hμ||² + δ μ^T μ, (1-28)


where μ^T μ = ||μ||² is a regularization factor which penalizes large values for the elements of μ.

Although this method allows us to use classic optimization tools for minimization,

which improves stability of solutions, it does not drive elements of μ to zero fast enough.

For example, if we consider (1-28) as a posterior probability problem


π(μ | Y) ∝ exp {−(Y − Hμ)^T (Y − Hμ)} exp {−Σ_j μj²}. (1-29)

The first exponential corresponds to the LSE part of (1-28), and the second exponential

corresponds to the quadratic constraints of the ridge regression. At the same time this last

exponential can be seen as

exp {−Σ_{j=1}^{K} μj²} = ∏_{j=1}^{K} exp {−μj²}, (1-30)

where each exp {−μj²} is proportional to a zero-mean Gaussian. Therefore, the quadratic

term of the ridge regression (Equation 1-28) imposes a Gaussian distribution on the μ

weights. This means that the μ terms tend to go slowly to zero. Furthermore, from the

point of view of ridge regression the following examples are equally likely for a problem









with dimensionality two


A1 = {μ1 = 1, μ2 = 1}

A2 = {μ1 = 0, μ2 = √2}

(both have the same quadratic penalty μ1² + μ2² = 2), which is counterproductive when looking for solutions that promote sparsity. Thus, it is

necessary to impose a more restrictive regularization term on the original LSE.

1.3.2 Sparsity Promotion in Cost Functions : Least Absolute Shrinkage and
Selection Operator

Robert Tibshirani [38] has proposed a modification of the Ridge regression cost

function to
μ̂ = argmin_μ ||Y − Hμ||² + λ Σ_{j=1}^{K} |μj|. (1-31)

This last equation is equivalent to the following cost function [39]:

μ̂ = argmin_μ Σ_{i=1}^{N} ( yi − Σ_{j=1}^{K} μj hj(xi) )² + λ Σ_{j=1}^{K} |μj|. (1-32)

This cost function is known as the Least Absolute Shrinkage and Selection Operator

(LASSO). It can be shown, as in the previous discussion about Ridge regression, that the

regularization term λ Σ_{j=1}^{K} |μj| can be seen as putting a Laplacian distribution (Figure 1-3)

with zero mean on each of the weights μj. As Tibshirani [38] pointed out, this reflects the

greater tendency of the LASSO over the Ridge regression to produce estimates that are

either large or zero.

Several methods can be used to find a possible solution for the cost function

(Equation 1-32). In our case, we are interested in the method developed by Figueiredo [40].

1.3.3 Extending the LASSO to Bayesian Hierarchical Models

Another way to solve the linear model (Equation 1-24) is assuming that the difference

between the desired output and the computed output f(x, μ) is distributed as a zero

mean Gaussian with variance σ². In addition, we can assume that each μj has a normal

distribution with mean zero and variance τj. In this way, we can maximize the posterior

distribution p(μ, σ² | y, τ).

Figueiredo [40] proposed the following hierarchical model to accomplish this


yi ~ N(Hi μ, σ²)        ∀ i = 1, ..., n

μj ~ N(0, τj)           ∀ j = 1, ..., m

τj ~ Exponential(γ/2)   ∀ j = 1, ..., m (1-33)

γ ~ non-informative prior

σ² ~ non-informative prior


Where the y1, ..., yn represent our desired outputs or labels, the Hi's are the rows of the

design matrix H of the data; the μ1, ..., μm represent the weights to be learned; the

τ1, ..., τm represent the hidden variables that promote sparsity on the weights; γ is the

amount of sparsity in the model and σ² is the amount of noise in the model.

It can be proven under this model that the μj's, with the τj's integrated out, have the

following density:


p(μj | γ) = ∫ N(μj | 0, τj) (γ/2) exp {−γ τj / 2} dτj = (√γ / 2) exp {−√γ |μj|} (Laplacian), (1-34)

which has plot (Figure 1-3).

As was observed before, this density promotes sparsity in the μj terms since it has a

sharp peak at zero. If the components of μ are samples from the Laplacian, then they are

likely to be large or zero.

Now, assuming τ1, ..., τm as hidden data, a Maximum A Posteriori (MAP) EM

algorithm can be used to maximize the posterior likelihood:


p(μ, σ² | y, τ) ∝ p(y | μ, σ²) p(μ | τ) p(σ²). (1-35)

The MAP-EM algorithm for this posterior has the following steps









MAP-EM


1. E-step. Compute the Q-function for the μj's:

   Q(μ, σ² | μ(t), σ²(t)) = ∫ log p(μ, σ² | y, τ) p(τ | y, μ(t), σ²(t)) dτ. (1-36)

   In this case, we have:

   Q(μ, σ² | μ(t)) = −n log σ² − ||y − Hμ||² / σ² − μ^T V(t) μ, (1-37)

   with

   V(t) = γ diag {1/|μ1(t)|, ..., 1/|μm(t)|}. (1-38)

2. M-step. Maximize Q with respect to σ² and μ, which yields:

   σ²(t+1) = ||y − Hμ(t)||² / n, (1-39)

   and

   μ(t+1) = (σ²(t+1) V(t) + H^T H)⁻¹ H^T y. (1-40)



Although this is an efficient algorithm, it assumes no relations exist between the weights

μj. Thus, if we want to promote sparsity on models where complex relations exist, we

need to develop other strategies for maximizing the cost functions for the models.
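The following sketch implements one plausible reading of the MAP-EM iteration (1-38) to (1-40) on synthetic data with a fixed sparsity parameter γ; it follows the reconstruction given above (with a small constant guarding the division by |μj|) rather than any particular implementation from this dissertation.

    import numpy as np

    def map_em_lasso(H, y, gamma=10.0, iters=100, eps=1e-8):
        """MAP-EM sketch for the sparsity-promoting linear model, following the
        reconstructed updates (1-38)-(1-40).  gamma controls sparsity; eps guards
        against division by zero once a weight collapses to (near) zero."""
        n, m = H.shape
        mu = np.linalg.lstsq(H, y, rcond=None)[0]      # LSE initialization
        for _ in range(iters):
            V = gamma * np.diag(1.0 / (np.abs(mu) + eps))        # Equation (1-38)
            sigma2 = np.sum((y - H @ mu) ** 2) / n               # Equation (1-39)
            mu = np.linalg.solve(sigma2 * V + H.T @ H, H.T @ y)  # Equation (1-40)
        return mu

    rng = np.random.default_rng(1)
    H = rng.normal(size=(200, 6))
    mu_true = np.array([2.0, 0.0, 0.0, -1.5, 0.0, 0.0])
    y = H @ mu_true + 0.1 * rng.normal(size=200)
    print(np.round(map_em_lasso(H, y), 3))  # weights of irrelevant features shrink toward zero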
1.4 Logistic Classification

In this section, we develop the theory for the basic use of the logistic regression in

classification.

1.4.1 Logistic Distribution

The interest in the logistic distribution comes from the fact that its CDF can be

used to simulate the probability of a Bernoulli random variable.

The logistic distribution has PDF

f(x | m, b) = exp {−(x − m)/b} / ( b [1 + exp {−(x − m)/b}]² ), (1-41)









with parameters m ∈ R and b > 0, and CDF


F(x | m, b) = exp {(x − m)/b} / (1 + exp {(x − m)/b}). (1-42)

The previous CDF is also known as the sigmoid function. This CDF is used to map

continuous or discrete random variables into binary categories: the patient is dead or alive,

it is raining or not, etc. These PDF's and CDF's can be plotted (Figure 1-4).

1.4.2 The Binary Classification Problem

Given m different features x1,..., x,, we would like to find a binary variable y to

classify the collection of vector samples {x = (x1, ..., xm)^T} into two classes.

For this, we can use the linear function f(x1, ..., xm) = Σ_{i=1}^{m} μi xi. Then, it is possible

to use a threshold μ0 to classify the x entries as follows:


y = 0 if Σ_{i=1}^{m} μi xi < μ0,

y = 1 if Σ_{i=1}^{m} μi xi > μ0.

Now, if we define h(x) = −μ0 + Σ_{i=1}^{m} μi xi, we can rewrite the previous conditions in the

form:


y = 0 if h(x) < 0,

y = 1 if h(x) > 0.


Then, our objective is to find weights μ1, ..., μm and a threshold μ0 such that we maximize

the probability of correct classification. Clearly, we can use different types of h(x) instead

of linear ones. For example:


h(x) = μ0 + Σ_{i=1}^{m} μi exp {xi},


h(x) = μ0 + Σ_{i=1}^{m} μi exp {−(xi − ci)² / 2}, (1-43)

where the ci are fixed centers.
More complex examples of h(x) exist, and they depend on the type of problem being

studied [41-43].

An example of how these functions separate different classes can be

seen in (Figure 1-5).

1.4.3 Motivations for Using the Sigmoid

When converting geometric problems of binary classification into a Bayesian

classification, it is desirable to have a probability with the following properties:

1. P(Y = 1 | x) = 0.5 ⇔ h(x) = 0.

2. lim_{h(x)→−∞} P(Y = 1 | x) = 0.

3. lim_{h(x)→+∞} P(Y = 1 | x) = 1.

A good choice for this probability is the sigmoid function,

P(Y = 1 | x) = exp {h(x)} / (1 + exp {h(x)}), (1-44)

and

P(Y = 0 | x) = 1 − exp {h(x)} / (1 + exp {h(x)}) = exp {−h(x)} / (1 + exp {−h(x)}). (1-45)

Then, using the logistic regression (Section 1.4.4), we can construct a decision boundary as

follows:

logit {P(Y = 1 | x)} = log [ P(Y = 1 | x) / (1 − P(Y = 1 | x)) ] = h(x), (1-46)

and

logit {P(Y = 0 | x)} = log [ P(Y = 0 | x) / (1 − P(Y = 0 | x)) ] = −h(x). (1-47)
An example of this kind of mapping can be seen in (Figure 1-6).

1.4.4 Logistic Regression

Logistic regression can be used to investigate the degree to which the data can be

separated into two groups [3, 44], which is important to know if we want to classify it. We









can think of logistic regression in the following terms: if the outputs are almost binary the

model does not have a lot of noise; if the outputs are less binary the model contains some

degree of noise.

In broad terms logistic regression is simply a combination of the geometric idea

in (Section 1.4.2) and the sigmoid in (Equation 1-44). This produces a probability

distribution, P(Y | x), where the labels or desired outputs (Y's) are considered Bernoulli

random variables.

Now, it is possible to rewrite P(Y = 1 | x) as:

P(Y = 1 | x) = exp {h(x)} / (1 + exp {h(x)}). (1-48)

This kind of model can be solved using maximum likelihood [3]. Since Y is a Bernoulli

random variable with mean P(Y = 1 | x), we can interpret the probability of an element x

belonging to the positive class as the probability of Y = 1. Then, we have that

P(yi | xi) = { P(yi = 1 | xi)        if yi = 1
            { 1 − P(yi = 1 | xi)     if yi = 0          (1-49)

          = P(yi = 1 | xi)^{yi} (1 − P(yi = 1 | xi))^{1−yi}.

From this probability it is possible to develop a log likelihood which looks like:

ln L(x, y) = Σ_{i=1}^{n} [ yi ln P(yi = 1 | xi) + (1 − yi) ln (1 − P(yi = 1 | xi)) ]. (1-50)
The optima of this log likelihood cannot be solved for directly. Thus, it is necessary to use

numerical methods to solve it, such as conjugate gradients or Gibbs samplers.
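As a concrete example of fitting (1-50) numerically, the following sketch runs plain gradient ascent on the log likelihood with a linear h(x); the synthetic data, learning rate, and iteration count are assumptions for illustration, and this is not the Gibbs or slice sampling machinery developed in later chapters.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def fit_logistic(X, y, lr=0.1, iters=2000):
        """Gradient ascent on the log likelihood (1-50) for a linear
        h(x) = mu0 + sum_i mu_i * x_i.  A column of ones absorbs mu0."""
        Xb = np.hstack([np.ones((X.shape[0], 1)), X])
        mu = np.zeros(Xb.shape[1])
        for _ in range(iters):
            p = sigmoid(Xb @ mu)                   # P(Y = 1 | x) for every sample
            mu += lr * Xb.T @ (y - p) / len(y)     # gradient of (1-50)
        return mu

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
    y = np.concatenate([np.zeros(100), np.ones(100)])
    mu = fit_logistic(X, y)
    acc = (sigmoid(np.hstack([np.ones((200, 1)), X]) @ mu).round() == y).mean()
    print(mu, acc)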

1.5 Minimum Classification Error

In Minimum Classification Error (MCE) training [4-6], we do not consider cost

functions that use desired outputs as in the well known cost function using LSE

minimization. We instead consider a cost function that depends on a difference between









confidences of different classes [45, 46]. These differences are called dissimilarity mea-

sures.

1.5.1 Dissimilarity Measures

A way to measure the difference between vectors and sets is called a dissimilarity

measure [47].

Definition 1.16. A dissimilarity measure d on a space X is a function d : X × X →

R, where R is the real numbers, with the properties

1. ∃ d0 ∈ R such that −∞ < d0 ≤ d(x, y) < +∞, ∀ x ≠ y ∈ X, and d(x, y) = d0 if x = y.

2. d(x, y) = d(y, x) ∀ x, y ∈ X.

Examples of dissimilarity measures are:

1. The Hamming distance, d(x, y) = Σ_i |xi − yi|.

2. The difference function δi(x) = −di(x) + max_{j≠i} dj(x), where di(x) can be seen
   as the distance to the prototype pi from x.

1.5.2 Loss Functions

In order to use these dissimilarity measures, we need to have a way to measure the

cost of each sample. We can do that using the concept of a loss function [21].

Definition 1.17. In statistics, decision theory and economics, a loss function is a

function that maps an event onto a real number representing a cost or regret associated

with the event.

In our case, we will use the notation li(fu) for the loss function, where i is the

observation y, u is the pattern x, and fu is the prediction f(x).

Examples of loss functions are:


li(fu) = { (di(fu))^a   if di(fu) > 0
         { 0            if di(fu) ≤ 0,        a > 0, (1-51)

li(fu) = 1 / (1 + e^{−a di(fu)}),             a > 0. (1-52)









If the di is differentiable or at least differentiable almost everywhere, these types of loss

functions allow us to use gradient descent methods in the search for the optimal measure.
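The sigmoid loss (1-52) and its derivative with respect to the dissimilarity measure, which is what a gradient descent MCE procedure needs, can be sketched as follows; the function names and the value of a are illustrative.

    import numpy as np

    def sigmoid_loss(d, a=1.0):
        """Loss (1-52): a smooth 0/1-like penalty of the dissimilarity d."""
        return 1.0 / (1.0 + np.exp(-a * d))

    def sigmoid_loss_grad(d, a=1.0):
        """Derivative of (1-52) with respect to d: l'(d) = a * l(d) * (1 - l(d))."""
        l = sigmoid_loss(d, a)
        return a * l * (1.0 - l)

    d = np.linspace(-5, 5, 5)
    print(sigmoid_loss(d, a=2.0))
    print(sigmoid_loss_grad(d, a=2.0))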

1.6 Information Theory

In this section, we will review the basic elements of information theory [7]. Then,

we will use these basic concepts to explore the use of mutual information for algorithm

selection.

1.6.1 Information Entropy

A basic concept used in information theory is that of entropy.

Definition 1.18. Let X be a discrete random variable that can take values {x1, ..., xn}.

The Shannon entropy of X is defined as:


H(X) = E[I(X)] = −Σ_{i=1}^{n} p(xi) log(p(xi)), (1-53)

where I(X) = −log(p(X)) is the self-information of X and p(xi) = p(X = xi) is the

probability mass function.

Other definitions of entropy exist, such as for example the Rényi entropy:

Hα = (1 / (1 − α)) log Σ_{i=1}^{n} p(xi)^α. (1-54)

In a similar way, if X is a continuous random variable, we have:


H(X) = −∫ p(x) log(p(x)) dx. (1-55)


This last equation is called differential entropy.

An immediate extension of entropy is the joint entropy.

Definition 1.19. The joint entropy of a pair of discrete random variables (X, Y) with

joint distribution p(x, y) is defined as:


H(X, Y) = −E[log(p(X, Y))] = −Σ_{x∈X} Σ_{y∈Y} p(x, y) log(p(x, y)). (1-56)

A similar version of this definition exists if X, Y are continuous random variables.









Now, we can define the conditional entropy:
Definition 1.20. If (X, Y) ~ p(x, y), then the conditional entropy H(Y | X) is defined

as:

H(Y | X) = −E_{p(x,y)} [log(p(Y | X))] = −Σ_{x∈X} Σ_{y∈Y} p(x, y) log(p(y | x)). (1-57)

Using conditional entropy, it is possible to prove the following equivalences for random

variables X, Y and a set of random variables {X1, ..., Xn}:

H(X, Y) = H(X) + H(Y | X), (1-58)

H(X, Y | Z) = H(X | Z) + H(Y | X, Z), (1-59)

H(X1, ..., Xn) = Σ_{i=1}^{n} H(Xi | Xi−1, ..., X1) (Chain rule of entropy), (1-60)

H(X | Y) ≤ H(X) (Conditioning reduces entropy). (1-61)

Now, we can begin to define what is known as mutual information.

1.6.2 Mutual Information

Before we define the concept of mutual information, we need to define the concept of relative entropy. The entropy of a random variable is a measure of how much "information" is necessary to describe the random variable. The relative entropy is a dissimilarity measure of the distance between two distributions, p and q. In statistics, it is seen as the expectation of the logarithm of a likelihood ratio, E_p[log(p(X)/q(X))]. It measures the inefficiency of assuming distribution q when the real one is p.

Definition 1.21. The relative entropy or Kullback-Leibler distance between two probability mass functions p(x) and q(x) is defined as:

D(p||q) = Σ_{x∈X} p(x) log( p(x) / q(x) ) = E_p[ log( p(X) / q(X) ) ].   (1-62)

Relative entropy is always non-negative, and is zero if and only if p = q. Although the Kullback-Leibler distance looks like a metric, in reality it is not symmetric and does not satisfy the triangle inequality [7].








Using the Kullback-Leibler distance, we can define the idea of mutual information between two random variables.

Definition 1.22. Given two random variables X and Y with joint probability mass function p(x, y) and marginal probability mass functions p(x) and p(y), the mutual information I(X; Y) is the relative entropy between the joint distribution and the product distribution p(x)p(y), i.e.:

I(X; Y) = Σ_{x∈X} Σ_{y∈Y} p(x, y) log( p(x, y) / (p(x)p(y)) )
        = D( p(x, y) || p(x)p(y) )
        = E_{p(x,y)}[ log( p(X, Y) / (p(X)p(Y)) ) ].   (1-63)

Mutual information has the following useful relations, given random variables X, Y:

I(X; Y) = H(X) - H(X|Y),   (1-64)

I(X; Y) = H(Y) - H(Y|X),   (1-65)

I(X; Y) = H(X) + H(Y) - H(X, Y),   (1-66)

I(X; Y) = I(Y; X),   (1-67)

I(X; X) = H(X) - H(X|X) = H(X).   (1-68)
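As a quick numerical illustration of these relations, the short Python sketch below computes H(X), H(Y), H(X, Y) and I(X; Y) from a small joint probability mass function; the 2 × 2 joint distribution is an invented example.

    import numpy as np

    def entropy(p):
        # Shannon entropy (Equation 1-53) of a probability vector, in bits.
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def mutual_information(pxy):
        # I(X;Y) = H(X) + H(Y) - H(X,Y), Equation 1-66.
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)
        return entropy(px) + entropy(py) - entropy(pxy.ravel())

    pxy = np.array([[0.4, 0.1],
                    [0.1, 0.4]])        # dependent X and Y
    print(mutual_information(pxy))       # strictly positive
    print(mutual_information(np.outer(pxy.sum(axis=1), pxy.sum(axis=0))))  # ~0 under independence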

As in entropy, it is possible to define a conditional mutual information.
Definition 1.23. Given X, Y and Z random variables:

I(X; YIZ) = H(XZ)- H(XY, Z) (1-69)
S( p(X, Y| Z) ] (1
Plog (XlZ)p(YZ) (170)

In addition, as in entropy, we can prove that mutual information is alv--i non-negative.
The chain rule of information for I(X, ..., X,; Y) random variables is written as:
n
I(Xi, ...,'x; Y) I(X; Y|X_1, ..., X). (1 71)
i= 1









1.6.3 Fano's Inequality

Let X be a random variable with distribution p(x). Assume a random variable Y related to X by a conditional distribution p(y|x). Imagine that from Y it is possible to calculate a function g(Y) = X̂, an estimate of X. It is desirable to bound the probability that X̂ ≠ X, defined as:

P_e = Pr(X̂ ≠ X).   (1-72)

We can consider P_e to be a random variable with a given PDF. The bound for P_e can be estimated by Fano's inequality.

Theorem 1.3. Given X, the space of all possible values of the random variable X, Fano's inequality is defined as:

H(P_e) + P_e log(|X| - 1) ≥ H(X|Y),   (1-73)

where H(P_e) = -P_e log P_e - (1 - P_e) log(1 - P_e), if we consider P_e as the probability of a Bernoulli random variable.

This inequality can be weakened to

1 + P_e log(|X|) ≥ H(X|Y)   (1-74)

or

P_e ≥ ( H(X|Y) - 1 ) / log(|X|)   (1-75)

    = ( H(X) - I(X; Y) - 1 ) / log(|X|),   (1-76)

by the fact that H(P_e) is a concave function with maximum value one, and:

log(|X| - 1) ≤ log(|X|).   (1-77)

Notice that it is not possible to control the entropy H(X) and the size of the space X [48]. Thus, it is desirable to maximize the mutual information I(X; Y).
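A small numeric check of the weakened bound (Equation 1-75), with entropies measured in bits, might look like the Python sketch below; the values plugged in are invented for illustration.

    import numpy as np

    def fano_lower_bound(H_X_given_Y_bits, n_outcomes):
        # Equation 1-75: P_e >= (H(X|Y) - 1) / log|X|, entropies in bits.
        return max(0.0, (H_X_given_Y_bits - 1.0) / np.log2(n_outcomes))

    # If H(X|Y) = 3 bits and X can take 16 values, at least half of the
    # estimates g(Y) must be wrong, no matter how g is chosen.
    print(fano_lower_bound(3.0, 16))   # 0.5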









1.6.4 Feature Selection Using Mutual Information

This section is based on [37, 49]. First, it is necessary to consider the standard setting

for a classification and feature selection problem:

1. A collection of samples drawn from a random vector X = (X_1, ..., X_N)^T.

2. Each sample has an associated label y ∈ {1, ..., N}.

3. A collection of classification functions H = { f : X → {1, ..., N} }. In the case of [37], the two functions used for classification are the Support Vector Machine (SVM) and the Multi-Layer Perceptron (MLP).

4. The classification of the classes is made by using a single MLP for all the classes, or multiple SVMs, one for each class.

5. Let Y and Y_f = {f(x) | x ∈ X} be the discrete random variables over {1, ..., N} describing the unknown true labels and the labels predicted by the classifier function f, respectively.

6. A subset of the total set of features G ⊆ X.

7. A collection of classification functions H_G = { f : G → {1, ..., N} }, where H_G denotes the restriction of H to G.

Erdogmus and Principe [49] provide a family of upper and lower bounds on the misclassification probability P_e(f) = Pr(Y_f ≠ Y). If Shannon's definition of entropy (Definition 1.18) is used, the bounds are:

( H(Y) - I(Y; Y_f) - H(P_e(f)) ) / log(N - 1) ≤ P_e(f)   (Fano's bound),   (1-78)

P_e(f) ≤ ( H(Y) - I(Y; Y_f) - H(P_e(f)) ) / min_i H(Y | e, Y_f = i).   (1-79)

The entropy H(P_e(f)) is known as the binary entropy function [7]. Finally, H(Y | e, Y_f = i) is the Shannon entropy of the error distribution when the classifier f incorrectly outputs class i.

Due to the fact that it is not possible to control H(Y) and H(P_e(f)), it is preferable to maximize the mutual information I(Y; Y_f) between Y and Y_f to minimize P_e(f) by using:

f* = max_{G : |G| = K} max_{f ∈ H_G} I(Y; Y_f).   (1-80)









This function (Equation 1-80) is not differentiable with respect to the classifier parameters. For this reason, Sindhwani et al. [37] prefer to simplify the optimization process rather than approximate the objective function. For this purpose, they use the following cost function:

f* = max_{G : |G| = K} I(Y; Y_{f_G}),   (1-81)

where f_G is a classifier that has been trained using any convenient training objective function with the feature set G.

Using (Equation 1-81), it is possible to design two different heuristics for the two

classification functions proposed in [37]. However, before the two heuristics are described

in more detail, it is necessary to describe some of the equations involved.

First, it is necessary to explain how to obtain an estimate of the mutual information

I(Y, Yf) given a specific problem. Assume that a certain problem is given with a total

number of samples S, and f is used to classify the samples in this problem. Thus, we have

the confusion matrix Q_f = (q_ij), where q_ij represents the number of samples in class i that are classified in class j. This can be used to calculate the following probabilities:

P_i = P(Y = i) = (Σ_j q_ij) / S   (Empirical probability of class i),   (1-82)

P̂_j = P(Y_f = j) = (Σ_i q_ij) / S   (Frequency with which the classifier outputs class j),   (1-83)

P_{i|j} = P(Y = i | Y_f = j) = q_ij / (Σ_i q_ij)   (Empirical probability of the true label being class i when the classifier outputs class j).   (1-84)

These probabilities allow us to calculate the following entropies:

H(Y) = -Σ_i P_i log P_i,   (1-85)

H(Y | Y_f = j) = -Σ_i P_{i|j} log P_{i|j},   (1-86)

H(Y | Y_f) = Σ_j P̂_j H(Y | Y_f = j).   (1-87)

In this way, it is possible to estimate I(Y; Y_f) = H(Y) - H(Y | Y_f). Finally, it is possible to define a usefulness indicator I_j that measures the uncertainty involved in f selecting class j:

I_j = P(Y_f = j)( log N - H(Y | Y_f = j) ) / Σ_i P(Y_f = i)( log N - H(Y | Y_f = i) ).   (1-88)
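The estimate of I(Y; Y_f) from a confusion matrix follows directly from (Equations 1-82) through (1-87). The Python sketch below does exactly that; the 2 × 2 confusion matrix is an invented example and natural logarithms are assumed.

    import numpy as np

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def mutual_info_from_confusion(Q):
        # Q[i, j]: number of class-i samples that the classifier labels as j.
        S = Q.sum()
        P_i = Q.sum(axis=1) / S            # P(Y = i), Equation 1-82
        P_j = Q.sum(axis=0) / S            # P(Y_f = j), Equation 1-83
        H_Y = entropy(P_i)                 # Equation 1-85
        H_Y_given_Yf = 0.0
        for j in range(Q.shape[1]):
            col = Q[:, j]
            if col.sum() > 0:
                H_Y_given_Yf += P_j[j] * entropy(col / col.sum())   # Equations 1-86, 1-87
        return H_Y - H_Y_given_Yf          # I(Y; Y_f)

    Q = np.array([[45, 5],
                  [10, 40]])
    print(mutual_info_from_confusion(Q))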
Now, it is necessary to define a way to assign an importance weight to each of the features

under the framework of SVM and MLP. In the case of SVM, the squared reciprocal of the

margin of a SVM is given by:


||w||² = Σ_{i,j} α_i α_j y_i y_j K(x_i, x_j),   (1-89)

where α_i is the Lagrange multiplier and y_i is the label corresponding to the ith support vector x_i. K represents the kernel used to send the features to a high-dimensional space. Now, the derivative of this term with respect to feature k in x_j is:

D_k = Σ_{i,j} α_i α_j y_i y_j ∂K(x_i, x_j)/∂x_j^k.   (1-90)

Now, it is possible to define a final total information flow for feature k over several SVMs:

I_k = Σ_j I_j ( D_k^j / Σ_p D_p^j ),   (1-91)

where j indexes the SVMs, p indexes the features and I_j represents (Equation 1-88). In the case of the MLP, consider a neuron L in the layer indexed by l, and let the layer fed by this layer be indexed by k. In addition, denote the output of a neuron in layer k as O_k and the interconnection weight between layers k and l as w_{kl}. Now, it is possible to recursively compute the importance weight for these neurons by using:

I_L = Σ_k I_k ( |C_{kL}| / Σ_j |C_{kj}| ),   (1-92)









where C_{kL} = Covariance(O_k, w_{kL} O_L) is used as the sensitivity measure. In this way, it is possible to compute the importance weight for each neuron that contains a feature.

Let X = {X_1, ..., X_N} denote the full set of N features, let D_train denote the training set, and let D_test denote the testing set. In addition, we assume an SVM for each class in the problem. In this way, it is possible to describe the following heuristic, which is defined by

Sindhwani [37]:


SVM's Heuristic Feature Selection

1. Initialize G = X. Set the number 0 < K < N of desired features.

2. While |G| > K do:

(a) Train SVMs on D_train(G).

(b) Estimate I_f = I(Y; Y_f) on D_test(G).

(c) For all j calculate I_j.

(d) Assign a weight to each feature k using the total information flow

I_k = Σ_j I_j ( D_k^j / Σ_p D_p^j ).   (1-93)

(e) Eliminate the worst feature according to these weights.

3. End while.

4. Return the multiclass SVM trained using these features.
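The control flow of this heuristic can be summarized by the short Python skeleton below; the callables train_svms, estimate_mi, usefulness and info_flow are placeholders for the quantities in (Equations 1-88) through (1-93) and are not defined in the dissertation, so this is only a structural sketch.

    def svm_feature_selection(features, K, train_svms, estimate_mi, usefulness, info_flow):
        """Backward elimination driven by mutual information (Section 1.6.4)."""
        G = set(features)
        while len(G) > K:
            models = train_svms(G)                       # step (a): one SVM per class
            I_f = estimate_mi(models, G)                 # step (b): I(Y; Y_f) on the test set
            I_j = usefulness(models, I_f)                # step (c): Equation 1-88 per SVM
            weights = {k: info_flow(models, I_j, k) for k in G}   # step (d): Equation 1-93
            G.remove(min(weights, key=weights.get))      # step (e): drop the worst feature
        return train_svms(G)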


The case of the MLP uses the same base framework but instead of multiple SVMs, the

heuristic uses a single MLP. Thus, the heuristic [37] has the following structure:


MLP Heuristic Feature Selection









1. Set the number 0 < K < N of sought features. Randomly select a set G of features
with cardinality K.

2. Initialize RESET = 0.

3. Train MLP on Dtrain(G).

4. Estimate I_f = I(Y; Y_f) on D_test(G).

5. Estimate I_{J_o} for each output layer neuron J_o with:

I_{J_o} = P(Y_f = J_o)( log N - H(Y | Y_f = J_o) ) / Σ_i P(Y_f = i)( log N - H(Y | Y_f = i) ).

6. Recursively compute the weights of the neurons in the non-output layers by using:

I_L = Σ_k I_k ( |C_{kL}| / Σ_j |C_{kj}| ).   (1-94)

7. If G gives the best performance so far, set G* = G.

8. If there are untested features, replace the least informative current feature with the next untested feature and go to step (2).

9. If every feature has been tested, determine the best feature x not currently being used and the worst feature y currently being used.

10. If weight(x) > weight(y), replace y with x in G and go to step (2).

11. If weight(x) < weight(y) and G = G*, go to (14).

12. If weight(x) < weight(y) and G ≠ G*, set G = G* and RESET = RESET + 1.

13. If RESET = 2, go to (14); else go to step (3).

14. Return the MLP trained on the current set of features.


These two heuristics can be used in the frameworks of the SVM and the MLP to select the most important features from the set X.
Sindhwani et al. [37] tested these heuristics with the following data sets:

1. Corral is an artificial data problem with two classes and six binary features. The
first four features are used in a Boolean function to define which class a sample









belongs to. The fifth feature is irrelevant. The sixth feature is correlated to the target, matching 75% of the data points.

2. Parity is an artificial problem with 15 features: five relevant, five irrelevant and five redundant. Each redundant feature is a copy of one of the relevant features.

3. Breast Cancer, Vote and DNA problems from the UCI machine learning repository.

1.7 Fuzzy Measures

A fuzzy measure is defined as follows [12, 50-52]:

Definition 1.24. Let X = {x_1, ..., x_n} be any finite set. A discrete fuzzy measure on X is a function μ : 2^X → [0, 1] with the following properties:

1. μ(∅) = 0 and μ(X) = 1.

2. Given A, B ∈ 2^X, if A ⊆ B then μ(A) ≤ μ(B) (Monotonicity property).

For our purposes, the set X is often considered to contain the names of sources of

information (features, algorithms, agents, sensors, etc.), and for a subset A C X, p(A) is

considered to be the worth of this subset of information.

Different types of fuzzy measures exist, the most researched and used are evidence

measures, possibility measures, fuzzy measures induced by Ordered Weighted Averaging

(OWA) operators or cardinality measures, and Sugeno A-measures. The first two were

proposed and studied by Dempster [53] and Shafer [54]. The OWA operators were

explored in depth by Yager [55]. Finally, the Sugeno A-measure [51] is a nice example of a

measure recursively defined and has been used by many researchers [56-62].

1.7.1 Evidence Measures

Evidence measures are defined using what is known as a basic probability assignment.

Definition 1.25. Given a finite set X, called a frame of discernment, a function m : 2^X → [0, 1] is called a basic probability assignment whenever:

1. m(∅) = 0.

2. Σ_{A⊆X} m(A) = 1.
Example 1.1. Given X = {x_1, x_2, x_3}, we have the following subsets with their basic probability assignments:

m(∅) = 0,   m({x_1}) = 0.1,   m({x_2}) = 0.1,
m({x_3}) = 0.1,   m({x_1, x_2}) = 0.1,   m({x_2, x_3}) = 0.2,
m({x_1, x_3}) = 0.2,   m(X) = 0.2.

Clearly m(∅) = 0 and Σ_{A⊆X} m(A) = 1. In addition, this basic probability assignment is a probability on 2^X, but not on X.

Two measures are defined in evidence theory, the belief measure and the plausibility measure. The former is defined as follows:

Bel(A) = Σ_{B⊆A} m(B),   ∀A ∈ 2^X.   (1-95)

It can be proven that the belief measure has the following properties:

Theorem 1.4. If X is a frame of discernment, then a function Bel : 2^X → [0, 1] is a belief function if and only if it satisfies the following conditions:

1. Bel(∅) = 0.

2. Bel(X) = 1.

3. For every positive integer n and every collection A_1, ..., A_n of subsets of X,

Bel( ∪_{i=1}^{n} A_i ) ≥ Σ_{I⊆{1,...,n}, I≠∅} (-1)^{|I|+1} Bel( ∩_{i∈I} A_i ).   (1-96)

4. For all A ⊆ X,

m(A) = Σ_{B⊆A} (-1)^{|A-B|} Bel(B).   (1-97)









Example 1.2. Using (Example 1.1), we can generate the following measure:

Bel(∅) = 0,   Bel({x_1}) = 0.1,   Bel({x_2}) = 0.1,
Bel({x_3}) = 0.1,   Bel({x_1, x_2}) = 0.3,   Bel({x_2, x_3}) = 0.4,
Bel({x_1, x_3}) = 0.4,   Bel(X) = 1.


The other measure defined in evidence theory is the plausibility measure, which is defined as follows:

Pl(A) = Σ_{B∩A≠∅} m(B),   ∀A ∈ 2^X.   (1-98)

It is possible to rewrite the plausibility measure as:

Pl(A) = 1 - Bel(Ā),   ∀A ∈ 2^X.   (1-99)

Thus, from (Theorem 1.4) we have that:

Pl(∅) = 0,   (1-100)

Pl(X) = 1,   (1-101)

Pl( ∩_{i=1}^{n} A_i ) ≤ Σ_{I⊆{1,...,n}, I≠∅} (-1)^{|I|+1} Pl( ∪_{i∈I} A_i ),   (1-102)

m(A) = Σ_{B⊆A} (-1)^{|A-B|} [ 1 - Pl(B̄) ].   (1-103)
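The belief and plausibility measures of (Example 1.1) and (Example 1.2) can be reproduced with a few lines of Python; the sketch below encodes the basic probability assignment as a dictionary keyed by frozensets and checks the relation Pl(A) = 1 - Bel(Ā).

    X = frozenset({'x1', 'x2', 'x3'})

    # Basic probability assignment from Example 1.1.
    m = {frozenset({'x1'}): 0.1, frozenset({'x2'}): 0.1, frozenset({'x3'}): 0.1,
         frozenset({'x1', 'x2'}): 0.1, frozenset({'x2', 'x3'}): 0.2,
         frozenset({'x1', 'x3'}): 0.2, X: 0.2}

    def bel(A):
        # Equation 1-95: mass committed to subsets of A.
        return sum(v for B, v in m.items() if B <= A)

    def pl(A):
        # Equation 1-98: mass of focal elements intersecting A.
        return sum(v for B, v in m.items() if B & A)

    A = frozenset({'x1', 'x2'})
    print(bel(A), pl(A), 1 - bel(X - A))   # 0.3, 0.9 and 0.9, verifying Equation 1-99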


1.7.2 Possibility Measures

A special case of evidence theory is possibility theory. In this theory, the focal elements F = {A ⊆ X | m(A) > 0} are nested, i.e., A_1 ⊆ A_2 ⊆ ... for all A_i ∈ F. Then, we have the necessity measure:

Definition 1.26. A function Nec : 2^X → [0, 1] is called a necessity measure if and only if

Nec( ∩_{i=1}^{n} A_i ) = min_{i=1,...,n} Nec(A_i).   (1-104)

Possibility measures are defined similarly.









Definition 1.27. A function Pos : 2^X → [0, 1] is called a possibility measure if and only if

Pos( ∪_{i=1}^{n} A_i ) = max_{i=1,...,n} Pos(A_i).   (1-105)

1.7.3 K-Additive Measures

A fuzzy measure is said to be additive if μ(A ∪ B) = μ(A) + μ(B) whenever A ∩ B = ∅. A k-additive measure is defined as follows [63, 64]:

Definition 1.28. A k-additive measure is a fuzzy measure such that μ(A) = 0 for all A ∈ 2^X such that |A| > k, and there is at least one A of k elements such that μ(A) ≠ 0. 1-additivity coincides with additivity.

Example 1.3. Given a discrete space X = {x_1, x_2, ..., x_n}, we can define the following k-additive measure:

μ_{k+}(A) = { |A| / k   if |A| ≤ k
            { 0         if |A| > k.   (1-106)

1.7.4 Sugeno Measures

The Sugeno A-measures are a special class of fuzzy measures. In keeping with

notational convention, we refer to this class of measures using g instead of p.

Definition 1.29. Let X = {x_1, ..., x_n} be any finite set and let λ ∈ (-1, +∞). A Sugeno λ-measure is a function g from 2^X to [0, 1] with properties:

1. g(X) = 1.

2. If A, B ⊆ X with A ∩ B = ∅, then:

g(A ∪ B) = g(A) + g(B) + λ g(A) g(B).   (1-107)


It can be shown that a set function satisfying the conditions in (Definition 1.29) is

a fuzzy measure. In particular, (Equation 1-107) implicitly imposes the monotonicity

constraints on the Sugeno measures. As a convention, the measure of a singleton set {xi}

is called a density and is denoted by gi = g({xi}). In addition, we have that A satisfies the









property

λ + 1 = ∏_{i=1}^{n} ( 1 + λ g_i ).   (1-108)
The parameter A is specific to this class of measures and can be computed from (Equation

1-108) once the densities are known. Tahani and Keller showed that this polynomial has

a real root greater than -1 and several researchers have observed that this polynomial

equation is easily solved numerically [56-58]. By property (Equation 1-107), specifying

a Sugeno A-measure on a set X, with n elements, only requires specifying the n different

densities, thereby reducing the number of free parameters from 2" 2 to n.
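Once the densities g_i are chosen, the parameter λ in (Equation 1-108) can be found with any one-dimensional root finder. The Python sketch below uses plain bisection on the interval that is known to contain the nonzero root greater than -1; the example densities are arbitrary.

    import numpy as np

    def sugeno_lambda(densities, tol=1e-10):
        # Solve lambda + 1 = prod(1 + lambda * g_i), Equation 1-108, for lambda > -1.
        g = np.asarray(densities, dtype=float)
        s = g.sum()
        if np.isclose(s, 1.0):
            return 0.0                                  # additive case
        f = lambda lam: np.prod(1.0 + lam * g) - (1.0 + lam)
        if s < 1.0:                                     # root lies in (0, +inf)
            lo, hi = 1e-9, 1.0
            while f(hi) < 0:
                hi *= 2.0
        else:                                           # root lies in (-1, 0)
            lo, hi = -1.0 + 1e-9, -1e-9
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if f(lo) * f(mid) <= 0:
                hi = mid
            else:
                lo = mid
        return 0.5 * (lo + hi)

    print(sugeno_lambda([0.2, 0.3, 0.1]))   # densities summing to less than one give lambda > 0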

In addition, the Sugeno A-measure satisfies the finite A-rule [65].

Definition 1.30. Let X = {x_1, ..., x_n} be any finite set, g a Sugeno λ-measure and {E_1, ..., E_k} a disjoint collection of subsets of X. The Sugeno λ-measure g satisfies the finite λ-rule:

g( ∪_{i=1}^{k} E_i ) = { (1/λ) [ ∏_{i=1}^{k} ( 1 + λ g(E_i) ) - 1 ]   if λ ≠ 0
                       { Σ_{i=1}^{k} g(E_i)                           if λ = 0.   (1-109)

1.8 Fuzzy Integrals

After the introduction of the fuzzy measure concept, we are ready to introduce some

of the basic definitions used in the integration of fuzzy measures.

Fuzzy integrals can be considered as functional operators from the set of functions F = {f : X → [0, 1]} to R⁺. This can be seen as a way to express the weight of the function in a certain space.

1.8.1 Sugeno Integral

This integral was introduced by Sugeno [51]. It is based on the use of supremum and

infimum.

Definition 1.31. The Sugeno integral of a function f : X → [0, 1] with respect to a fuzzy measure μ is defined by S_μ(f(x_1), ..., f(x_n)) = ∨_{i=1}^{n} ( f(x_{(i)}) ∧ μ(A_{(i)}) ), where (i) indicates that the indices have been permuted so that 0 ≤ f(x_{(1)}) ≤ ... ≤ f(x_{(n)}) ≤ 1, the symbol ∨ represents the supremum, ∧ represents the infimum, and A_{(i)} = {x_{(i)}, ..., x_{(n)}}.
Although this definition can be useful, it is not related to the Lebesgue integral.

Thus, we prefer to use the Choquet integral which has the Lebesgue integral as a special

case.

1.8.2 The Continuous Choquet Integral Under a Fuzzy Measure

Gustav Choquet [10] introduced this integral as a way to understand some physical

models.

Definition 1.32. The Choquet integral of a function f : X → [0, 1] with respect to a fuzzy measure μ on X is defined by:

C_μ(f) = ∫_0^∞ μ({x | f(x) > α}) dα.   (1-110)

Although this definition covers the general case of a Choquet integral, it is not often

used. So, it is necessary to look at the discrete case of the Choquet integral for a more

useful Choquet integral definition.

1.8.3 From the Continuous Case to the Discrete Case

In order to fuse evidence supplied by different sources of information from a discrete

fuzzy set of X, we use the discrete Choquet integral.

Definition 1.33. Let f be a function from X = {x_1, ..., x_n} to [0, 1]. Let {x_{(1)}, ..., x_{(n)}} denote a reordering of the set X such that 0 ≤ f(x_{(1)}) ≤ ... ≤ f(x_{(n)}), and let A_{(i)} be the collection of subsets defined by A_{(i)} = {x_{(i)}, ..., x_{(n)}}. Then, the discrete Choquet integral of f with respect to a fuzzy measure μ on X is defined as:

C_μ(f) = Σ_{i=1}^{n} μ(A_{(i)}) ( f(x_{(i)}) - f(x_{(i-1)}) ) = Σ_{i=1}^{n} f(x_{(i)}) ( μ(A_{(i)}) - μ(A_{(i+1)}) ),   (1-111)

where we take f(x_{(0)}) = 0, A_{(n+1)} = ∅, and x_{(i)} denotes the element of X in position i of the reordering.

The function f can be interpreted as a particular instance of the partial support

(evidence) supplied by each source of information in determining the confidence in









an underlying hypothesis. The integral fuses this objective support with the worth

(importance) of various subsets of the information sources.

Some extra notation is needed to make reference to the objects in the fusion problem. Let Ω denote the set of objects in the fusion problem. Each information source x_i, for i = 1, ..., n, is a function x_i : Ω → [0, 1]. For each ω ∈ Ω we define f_ω : X → [0, 1] by f_ω(x_i) = x_i(ω).
Finally, it is possible to represent (Equation 1-111) as a dot product of vectors:

C_μ(f_ω) = H_{f_ω} u,   (1-112)

where u = (μ_{S_1}, ..., μ_{S_{2^n}})^T is the vector of parameter values of the measure μ over the lattice 2^X, with S_i ∈ 2^X, and H_{f_ω} = (d_1, ..., d_{2^n}) is the row vector with entries:

d_i = { f(x_{(k)}) - f(x_{(k-1)})   if S_i = A_{(k)}
      { 0                           otherwise.   (1-113)
1.8.4 Examples of Discrete Choquet Integral

1.8.4.1 Choquet integral with respect to belief and plausibility measures

With respect to the belief measure Bel of a space X and its basic probability

assignment m, the discrete Choquet integral can be written as [66]:

C_Bel(f) = Σ_{i=1}^{n} f(x_{(i)}) ( Bel(A_{(i)}) - Bel(A_{(i+1)}) )
         = Σ_{i=1}^{n} f(x_{(i)}) ( Σ_{B⊆A_{(i)}} m(B) - Σ_{B⊆A_{(i+1)}} m(B) ),   (1-114)

which can be rewritten as:

C_Bel(f) = Σ_{i=1}^{n} f(x_{(i)}) Σ_{B⊆A_{(i)}, B⊄A_{(i+1)}} m(B).   (1-115)









In the same way, by the definition of the plausibility measure, the Choquet integral is given by:

C_Pl(f) = Σ_{i=1}^{n} f(x_{(i)}) Σ_{B∩A_{(i)}≠∅, B∩A_{(i+1)}=∅} m(B).   (1-116)
The following example shows surfaces that can be generated using a belief measure, a

plausibility measure, and the Choquet integral:

Example 1.4. Let X = {x_1, x_2} and assume a basic probability assignment on X as follows:

m(∅) = 0,   m({x_1}) = 0.2,   m({x_2}) = 0.4,   m(X) = 0.4.   (1-117)

Then, we have:

Bel(∅) = 0,   Bel({x_1}) = 0.2,   Bel({x_2}) = 0.4,   Bel(X) = 1,   (1-118)

and

Pl(∅) = 0,   Pl({x_1}) = 0.6,   Pl({x_2}) = 0.8,   Pl(X) = 1.   (1-119)

The surfaces (x, y, C_μ(f)) can be seen in (Figure 1-7) and (Figure 1-8) for the belief measure and the plausibility measure, respectively.

1.8.4.2 OWA operators as Choquet integrals

In 1988 Yager introduced the concept of an OWA operator.

Definition 1.34. An OWA operator of dimension n is a mapping F : R^n → R that has an associated vector w = (w_1, w_2, ..., w_n)^T such that w_i ∈ [0, 1], 1 ≤ i ≤ n, and Σ_i w_i = 1. Furthermore, F(a_1, a_2, ..., a_n) = Σ_{j=1}^{n} w_j b_j, where b_j is the j-th largest element in the collection {a_1, a_2, ..., a_n}.

Special cases of OWA operators are the mean, median, max and min. The OWA operator is a special case of the discrete Choquet integral where the measures have the following property:

∀ A, B ∈ 2^X such that |A| = |B|,   μ(A) = μ(B).   (1-120)









Such measures are called cardinality measures. The Choquet integral with respect to a cardinality measure is defined as:

C_μ(f) = Σ_{i=1}^{n} f(x_{(i)}) w_i,   (1-121)

with w_i = μ(A_{(i)}) - μ(A_{(i+1)}) for all i = 1, ..., n; since A_{(i)} and A_{(i+1)} always have the same number of elements regardless of the ordering, the weights are independent of the sorting in the integral. In addition, it is clear by the telescoping sum property that

w_1 + w_2 + ... + w_n = μ(A_{(1)}) - μ(A_{(2)}) + ... + μ(A_{(n)}) - μ(A_{(n+1)}) = 1.   (1-122)
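In Python, an OWA operator is therefore just a dot product between a weight vector and the sorted inputs, which gives a minimal sketch of this special case of the Choquet integral; the input values below are arbitrary.

    import numpy as np

    def owa(values, w):
        # OWA operator (Definition 1.34): weights applied to values sorted in decreasing order.
        return float(np.dot(np.sort(values)[::-1], w))

    a = [0.7, 0.4, 0.9]
    print(owa(a, [1.0, 0.0, 0.0]))        # maximum
    print(owa(a, [0.0, 0.0, 1.0]))        # minimum
    print(owa(a, [1/3, 1/3, 1/3]))        # mean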

1.8.4.3 Choquet integral with respect to k-additive measures

The discrete Choquet integral with respect to the k-additive measure can be written

as:


C/( f ) W f(Ex()) (pk+ (A({)) pk+ (A({+I))). (1-123)
i=n-(k-1)

K-additive measures allows us to interpret the Choquet integral as a CDF in the sorted

values {x(1), X(2), .. (n) }, which can be used to study the impact of each sets of size at

most k on the Choquet integral. This integral can be used to study the impact of each

k-level on the power lattice 2x.

1.8.4.4 Choquet integral with respect the Sugeno measure

Taking into consideration that, under the Sugeno λ-measure,

g(A_{(i)}) - g(A_{(i+1)}) = g({x_{(i)}}) ( 1 + λ g(A_{(i+1)}) ),   (1-124)

then, by the finite λ-rule (Definition 1.30):

g(A_{(i)}) - g(A_{(i+1)}) = { g({x_{(i)}}) ∏_{j=i+1}^{n} [ 1 + λ g({x_{(j)}}) ]   if λ ≠ 0
                            { g({x_{(i)}})                                        if λ = 0.   (1-125)

Thus, we can write the discrete Choquet integral (Equation 1-111) as:

C_g(f) = { Σ_{i=1}^{n} f(x_{(i)}) g({x_{(i)}}) ∏_{j=i+1}^{n} [ 1 + λ g({x_{(j)}}) ]   if λ ≠ 0
         { Σ_{i=1}^{n} f(x_{(i)}) g({x_{(i)}})                                        if λ = 0.   (1-126)
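The practical consequence of (Equation 1-126) is that, for a Sugeno λ-measure, the Choquet integral can be evaluated from the n densities alone, without storing the 2^n subset values. A minimal Python sketch of that evaluation is shown below; it builds g(A_{(i)}) recursively from (Equation 1-107), and the densities, λ and evidence values are the same illustrative numbers used in the earlier sketches.

    import numpy as np

    def choquet_sugeno(f, densities, lam):
        # Choquet integral under a Sugeno lambda-measure; lam must satisfy Equation 1-108.
        order = np.argsort(f)                              # x_(1), ..., x_(n), increasing f
        f_sorted = np.asarray(f, dtype=float)[order]
        g_sorted = np.asarray(densities, dtype=float)[order]
        n = len(f_sorted)
        gA = np.zeros(n + 1)                               # gA[n] = g(empty set) = 0
        for i in range(n - 1, -1, -1):                     # Equation 1-107, built top down
            gA[i] = g_sorted[i] + gA[i + 1] + lam * g_sorted[i] * gA[i + 1]
        total, prev = 0.0, 0.0
        for i in range(n):                                 # Equation 1-111
            total += gA[i] * (f_sorted[i] - prev)
            prev = f_sorted[i]
        return total

    lam = 3.109                                            # approximate root of Equation 1-108 for these densities
    print(choquet_sugeno([0.7, 0.4, 0.9], [0.2, 0.3, 0.1], lam))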

1.8.5 Choquet Integral as a General Aggregation Rule

Several authors have proposed different fundamental conditions for aggregation operators [67, 68]. Mesiar and Komorníková [69] proposed a set of common properties for aggregation operators.

Definition 1.35. An aggregation operator is a function Aggreg : ∪_{n∈N} [0, 1]^n → [0, 1] that satisfies:

1. Aggreg(x, x, ..., x) = x (Identity when unary).

2. Aggreg(0, ..., 0) = 0 and Aggreg(1, ..., 1) = 1 (Boundary conditions).

3. If (x_1, ..., x_n) ≤ (y_1, ..., y_n) then Aggreg(x_1, ..., x_n) ≤ Aggreg(y_1, ..., y_n) (Non-decreasing).

The most basic aggregation operators are the minimum, the maximum and the average. It can be verified that the Choquet integral satisfies the basic properties of aggregation operators. Furthermore, Grabisch, Nguyen and Walker [66], and Detyniecki [70] have pointed out that the Choquet integral generalizes operators like the OWA operator and the weighted average.

1.8.6 Relation Between Plausibility and Belief Choquet Integrals

We know that the Belief measure is the conjugate of the Plausibility measure; thus we have that:

C_Bel(f) = Σ_{i=1}^{n} Bel(A_{(i)}) ( f(x_{(i)}) - f(x_{(i-1)}) )
         = Σ_{i=1}^{n} ( 1 - Pl(Ā_{(i)}) ) ( f(x_{(i)}) - f(x_{(i-1)}) )
         = 1 - C_Pl(1 - f),   (1-127)

i.e., the Choquet integral under the Belief measure is the conjugate of the Choquet integral under the Plausibility measure.

1.8.7 Shapley Index

The tool used to measure the importance of each element in the aggregation is the Shapley index [71]. The concept of the Shapley index comes from the idea of a coalitional game.

Definition 1.36. A pair (X, μ), where X is a set of algorithms and μ is a function μ : 2^X → R⁺ with the following properties:

1. μ(∅) = 0.

2. For all A, B ∈ 2^X such that A ∩ B = ∅, μ(A) + μ(B) ≤ μ(A ∪ B),

is called a coalitional game.

The Shapley index for algorithm x_i under the function μ, defined as:

I_{x_i}(μ) = Σ_{S⊆X∖{x_i}} ( (|X| - |S| - 1)! |S|! / |X|! ) ( μ(S ∪ {x_i}) - μ(S) ),   (1-128)

measures the importance of each element in the set X. Clearly, the fuzzy measures satisfy

the properties required for the coalitional game definition. Thus, it is possible to use the

Shapley index to measure the importance of each input in the computation of the Choquet

integral.
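A direct Python translation of (Equation 1-128) is short enough to show here; the two-source measure below is the same invented example used for the Choquet integral sketch, and the two indices sum to μ(X) = 1.

    from itertools import combinations
    from math import factorial

    def shapley(mu, X):
        # Shapley index of each element of X under the set function mu (Equation 1-128).
        n = len(X)
        index = {}
        for x in X:
            others = [y for y in X if y != x]
            total = 0.0
            for k in range(n):
                for S in combinations(others, k):
                    S = frozenset(S)
                    w = factorial(n - len(S) - 1) * factorial(len(S)) / factorial(n)
                    total += w * (mu[S | {x}] - mu[S])
            index[x] = total
        return index

    mu = {frozenset(): 0.0, frozenset({'x1'}): 0.5,
          frozenset({'x2'}): 0.3, frozenset({'x1', 'x2'}): 1.0}
    print(shapley(mu, ['x1', 'x2']))   # roughly {'x1': 0.6, 'x2': 0.4}; they sum to mu(X) = 1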

1.9 Information Fusion

Although the concept of information fusion is easy to grasp, a single precise

definition does not exist. Several have been proposed to define the concept [72-77]. Some

are more general than others, whereas some are defined for a specific field of research. In

1999 a more general definition was given by Wald [76].

Definition 1.37. "Information fusion is a formal framework in which are expressed means and tools for the alliance of data originating from different sources. It aims at obtaining information of greater quality; the exact definition of 'greater quality' will depend upon the application."









1.9.1 Data Level, Feature Level and Decision Level Fusion

Information fusion can be divided in three main fields [78-83]:

1. Data level fusion, which fuses data acquired from multiple sources, such as images
acquired in different spectral bands, directly.

2. Feature level fusion, which combines features calculated on data acquired from
multiple sources.

3. Decision level fusion, which combines decision statistics and confidence-based information derived from algorithms applied to multiple sources.

One important objective of information fusion is to decide which data, features or

algorithms are useful in the classification problem. This issue is of great importance

because it allows us to reduce the dimensionality of the problem and the bias-variance trade-off [35], as well as reducing the costs of applications.

1.9.2 The Non-Gaussian Nature of Decision Level Fusion

Let us consider decision level fusion for a two class problem. When the outputs of

classifiers are plotted, we would hope to obtain an exponential distribution for elements

belonging to one class and a reverse exponential distribution for elements not belonging

to the other class (Figure 1-9) in the ideal case. In reality, we often obtain skewed

distributions like the ones in (Figure 1-10).

A problem with these kinds of outputs is that different classifiers produce information

that can overlap or plainly contradict. It is necessary to design fusion algorithms that can

handle such problematic distributions.

In the following sections, we explore some of the models proposed for this task.

1.9.3 Bayesian Fusion Models

Attempts have been made using Bayesian models to fuse information [78-81, 84, 85]. Although they have been useful in some measure, Bayesian models have problems handling the overlapping information coming from multiple sources because probability models obey the law of the excluded middle [1]. This problem can be solved by adding an extra layer of complexity where the law of the excluded middle does not apply. For example, we can use








the discrete Choquet integral (Definition 1.33) to deal with the problem of overlapping

information and then use independent distributions for calculating each of the individual

measures in the power set of algorithms.

1.9.3.1 Bayesian hierarchical models

A number of papers [84, 86-88] describe attempts to deal with information fusion using Bayesian hierarchical models. It is clear when reading the literature on the subject that hierarchical models are used more as a technique inside other methods for information fusion rather than as methods in and of themselves.

For example, in [84] Bayesian models are used for classification inside a Bayesian network for the fusion of information from hyperspectral images. Another example can be seen in Valin et al. [88], where a Bayesian classification method is fused together with a possibilistic classifier, a neural network and a K-nearest neighbor classifier using a neural network fuser and a possibilistic fuser.

From these examples it is clear that Bayesian hierarchical models are useful as part of more suitable techniques for fusion.

1.9.3.2 Bayesian network models

Bayesian network models [84, 85, 87, 89-92] are based on an idea similar to Markov chains, but instead of considering only chain structures, we look at directed acyclic graphs with the Markov property [89]. These kinds of models can be used to fuse information from a system with multiple sensors. This allows us to obtain a snapshot of the system at a certain moment in time.

An example of feature level fusion using Bayesian networks can be seen in Ferrari and Vaghi [90]. This paper uses a Bayesian network to help in the design of features for ground penetrating radar, electromagnetic induction, and infrared sensors for humanitarian demining. The structure of the Bayesian network sensor model is learned from a database of measurements obtained from a set of known targets, although this is a very difficult problem. Some assumptions are made to simplify the computational complexity of the Bayesian networks: each node is considered discrete and the data samples are assumed to be independent. For training, a K2 algorithm [93] is used.

In the case of decision level fusion, we can see an example in Heckerman [82]. In this paper the author uses Bayesian networks to fuse acoustic information and spatial information at the decision level for a human-robot interaction system for the Swiss National Exhibition (Expo.02). These Bayesian networks are used in an attempt to improve the interaction between human and robot by trying to decide what the goals of the user are. The nodes in the Bayesian networks have the following random variables: UR, user reliability; UG, user goals; ORR, observed recognition results; LSS, laser scanner signal for the position of the robot; Lik, likelihood of data; SNR, speech to noise ratio; and UFR, user in front of the robot. In addition to these basic Bayesian networks, the authors develop a single Bayesian network for the user goal by discovering the causal relations between the nodes in the two Bayesian networks. The training for these Bayesian networks is accomplished using Bayesian network inference techniques [89].

A problem with these models is the fact that, as the number of variables increases and the causal relationships between them become more complex, the conditional probabilities can become extremely difficult to compute. In addition, maintaining the probability properties when training the Bayesian network can be difficult [89].

1.9.4 Non-Bayesian Fusion Models

These kinds of models [94-98] are based on different, and somewhat more general

properties than the ones based in classic probability. For example, neural networks [97] are

models where probabilistic or non-probabilistic rules are learned.

1.9.4.1 Neural networks models

Neural networks [97] can be used in the context of information fusion to fuse sensor

information from sonar and infrared, geospatial information, etc, i.e. data level fusion

[96, 98-100].









An example of data fusion in medical image fusion can be seen in Wang and Yide [99]. Here, a type of Pulse Coupled Neural Network (PCNN), the dendritic neural network [101], is used to fuse medical images that are obtained from different sensors to enhance the data. For example, certain types of Magnetic Resonance Images (MRI) give greater detail of anatomical structures, whereas other types of MRI images give greater contrast between normal and abnormal tissues.

contrast between normal and abnormal tissues. Thus, one of the objectives of the image

fusion is to obtain a precise map of the abnormal tissue. Another example for data fusion

can be seen in Barbera et al. [96], where a neural network is used to fuse information from

unreliable ultrasonic and infrared sensors in a robotic platform for spatial navigation.

A novel neural network that combines OWA operators and MLPs has been used by Wilson et al. [102] to fuse information from ground penetrating radar and electromagnetic induction sensors for hand-held mine detection systems. Although useful, a basic problem with neural networks is the difficulty of training them and their problems with outliers [97]. They also have difficulty training with conflicting information.

1.9.4.2 Kernel models

In the kernel methods [97, 103], training samples are sent to high dimensional spaces

through the use of kernels, which helps in the task of classification.

An example of kernel methods can be seen in Liu et al. [104]. In this paper, the authors fuse information at the data level from a sensor network for traffic control. A Support Vector Machine (SVM) with a radial basis function kernel, for example K(x, x_i) = exp(-||x - x_i||² / σ²), is used to fuse the traffic information, which includes speed, volume, occupancy under incident-free situations and occupancy under incident situations.

Another example of this technique is Singh et al. [105]. In this case, feature level

fusion is used for face recognition using multi-spectral information. First, the Fourier

transform is used to obtain two types of features: amplitude and phase. Then the authors









use the dual-ν SVM [106], based on the following cost function:

minimize   (1/2) w^T w - Σ_i C_i ( νρ - ψ_i )

subject to   y_i ( w^T Φ(x_i) + b ) ≥ ρ - ψ_i,   ψ_i ≥ 0,   (1-129)

where Φ is the mapping function, ρ is the position of the margin, ν the error parameter, C_i(νρ - ψ_i) the cost of the error, w the normal vector and b the bias parameter. These equations help to assign different error costs so as to minimize the expected misclassification cost when fusing the features from the Fourier transform. This is done for every face image at each of the frequencies in the multispectral image. Then, all the fused feature vectors, one for each frequency, are fused again by an SVM. Finally, a vector of fused features is returned for face recognition.

1.9.4.3 Genetic algorithm models

Genetic algorithms [107] have not been used widely in the area of information fusion, although some attempts have been made to propose possible architectures for the data fusion of sensor information [108] and to learn the fuzzy measure parameters in the Choquet integral [109, 110].

Maslov and Gertner [108] studied the different ways that evolutionary algorithms have been used in information fusion. The genetic algorithm is seen as a special case, and rather than proposing an algorithm, they explore what would be a good architecture for data fusion using genetic algorithms.

A more concrete example can be seen in Combarro and Miranda [110], where a genetic algorithm is used to learn the fuzzy measures for a Choquet integral assuming a least squared error fitness function in a convex space of fuzzy measures. Employing a convex space is useful because it is easy to prove that, given two fuzzy measures μ_1, μ_2, the convex combination λμ_1 + (1 - λ)μ_2 with λ ∈ [0, 1] is again a fuzzy measure.









1.9.4.4 Fuzzy set models

These types of models are based on the concepts of fuzzy sets [12, 53, 54, 111], fuzzy

measures [8-13] and fuzzy integrals [8-10, 13]. They have been used in the landmine

detection problem [52, 112], in handwriting recognition [113] and computer vision [57].

An example of decision level fusion using fuzzy sets is shown in Gader et al. [113]. First, a segmentation algorithm is applied to a string of characters to split them into a set of primitives, p_1, ..., p_K, which are read sequentially by subsequent procedures. Second, a feature extraction algorithm is used to assign basic confidences; then, a fuzzy rule system is trained to learn the confidences of blocks of characters. These fuzzy rule systems use a set of rules and variables to assign the confidences. An example of these rules can be seen in (Equation 1-130):

IF complexity is large ∧ ... is medium ∧ n is large THEN CONF is large.   (1-130)

Other applications of these types of models exist, but our focus in the following sections

is on learning fuzzy measures and the Choquet integral since this will be the area of our

research contributions.

1.10 Previous Applications of the Choquet Integral

1.10.1 Choquet Integral in Pattern Recognition

Grabisch et al. [8, 59, 60, 114, 115] have used the fuzzy integral as a classifier in pattern recognition problems (although the Choquet integral is better suited as an aggregation operator). Grabisch [8] proposed the use of the Choquet integral in the framework of optimizing a quadratic problem with a Least Squared Error cost function:

E² = Σ_{ω∈C_1} ( C_μ(f_ω) - α_1 )² + ... + Σ_{ω∈C_d} ( C_μ(f_ω) - α_d )²   (1-131)

with desired outputs α_1, ..., α_d under the constraints imposed by the fuzzy measure.








This method has been applied to the classification of landmines [61, 62]. In addition,

it has been applied to the classification of the iris data set and Wisconsin breast cancer

data set [8].

Wang [114, 115] proposed a genetic algorithm strategy to learn the fuzzy measures.

These algorithms have been applied to an artificial data set with two hundred three-dimensional

samples split into two classes. In addition, it has been applied to the Iris data set and

breast cancer Wisconsin data set.

Although the Choquet integral is not well-suited as a classifier, the results are

comparable to the best classifiers.

1.10.2 Fuzzy Integral in Machine Vision

Some work has been done in trying to use the Choquet integral in computer vision.

For example, attempts have been made in the area of face recognition and morphological

filters. Although the Sugeno integral is not a Choquet integral, it has been used in the

task of face recognition for decision level fusion. This is the reason that we decided to

include the Sugeno integral in this literature review.

1.10.2.1 Fuzzy integral in face recognition

The Choquet integral has been used in face recognition [116-118] as a way to

enhance the classification confidences coming from the different algorithms used in

facial recognition. As in landmine recognition [52, 61, 119, 120] and word recognition [45,

113, 121, 122], the fuzzy integral is used as an aggregation operator to improve the final

confidence using the outputs from the different classifiers.

For example, in several efforts [116, 117] a Sugeno fuzzy integral (Definition 1.31) is

used for decision level fusion. In this algorithm, samples coming from feature extractors

are classified using N classifiers. Then, a Sugeno integral fuser is used to improve the

confidences from the different classifiers.









1.10.2.2 Choquet integral as an image processing filter

Keller, Gader and Grabisch et al. [123-125] have proposed the use of the Choquet

integral as a morphological filter. For example, Grabisch [123] proved the connection

between order filters and the Choquet integral. Hocaoglu and Gader [124] proposed a gray

scale filter using Choquet integral called Choquet Morphological Operator. Keller and

Gader [125] proposed the use of Choquet filters to clean noise in Laser Radar (LADAR)

images. In addition, the Choquet integral has been used as a generalized hit-miss operator

in the landmine detection problem [52].

Grabisch [123] gave a definition of linear and ordered filters that correspond to

Choquet integrals. Grabisch also found a relation between the Choquet integral in the

different filters:

1. Linear Filters correspond to an additive fuzzy measure with respect to the

Choquet integral.

2. Order Statistic Filters correspond to operators that have similarities to the OWA

operators, a special case of the Choquet integral.

A classic example of data level fusion is that given to us by Keller and Gader [125].

In this paper, the OWA operator has been used to enhance pixels in the image known

as non-return pixels. Due to sensor errors, such pixels have very low local pixel value.

Thus, if we applied a threshold to these pixels, we would lose them. Therefore, the OWA

operators can help to enhance these pixels.

1.10.3 Choquet Integral for Decision Level Fusion

A successful use of the Choquet integral has been in the area of decision level fusion.

In this kind of fusion, confidences coming from different sources are aggregated to provide

a more accurate confidence about the classified objects [45, 52, 61, 113, 119-122]. This

framework will be described more thoroughly in the technical approach.









1.10.3.1 Choquet integral in word recognition

Gader and Keller et al. [45, 121, 122, 126] used the Choquet integral in handwritten

word recognition in the United States Postal Service.

One example [121, 122] involves fusing confidences provided by a fuzzy rule system

to obtain a final confidence about the word being analyzed. This was done using discrete

Choquet integrals, and was tested extensively against a neural network, the Choquet

integral performed significantly better.

Another example of fusing handwriting is Cao et al. [126] Here, a Sugeno integral is

used to fuse confidences of several neural network classifiers. The densities used in the

integrals were dynamically quantified by source relevance using a normalization equation.

1.10.3.2 Choquet integral in landmine detection

Gader et al. [52, 61, 119, 120] reported some success in decision level fusion for landmine classification using the Choquet integral as an aggregation operator. The two main examples are the use of the Choquet integral as a morphological operator [52, 119] and as an aggregator for different algorithms under a least squared criterion [61, 62].

In Gader et al. [52], the Choquet integral, together with probabilities for random

sets, was used to define the morphological operators for opening and closing. In addition,

connectivity issues were solved using path connectivity conditions. Then, the procedure

was used for landmine detection at data level fusion.

In [61] a gradient descent algorithm was used for solving the least squared error of a

Choquet integral under Sugeno A-measures. The algorithm was then applied to landmine

detection fusion at the decision level.

1.11 Learning Fuzzy Measures

Although different methods have been proposed for learning fuzzy measures [8, 60, 66,

114, 115, 127-130], many of these methods share several drawbacks. The first drawback is

that the number of parameters and constraints are an exponential function of the number

of inputs. For example, if the input has n features, the number of measure parameters









to be learned is 2^n - 1. Second, many of the algorithms proposed depend on a possible desired output, which many times is considered a magic variable. We show in (Chapter 2) that the performance of the trained parameters can be very sensitive to the chosen outputs. Third, in every algorithm, maintaining the monotonicity relation can be difficult. These problems and their respective solutions are explored in each of the algorithms that have been proposed to learn fuzzy measures for the Choquet integral.

1.11.1 Nonlinear Optimization Methods

A well known method used in learning fuzzy measures is the Least Squared Error

(LSE) cost function [11, 131]:

E² = Σ_{ω∈C_1} ( C_μ(f_ω) - α_1 )² + ... + Σ_{ω∈C_d} ( C_μ(f_ω) - α_d )²,   (1-132)

together with the fuzzy measures constraints. It is possible to express (Equation 1-132) as

a Quadratic programming problem by realizing that the Choquet integral C,(f,) can be

written as an inner product of vectors:


C_μ(f_ω) = H_{f_ω} u,   (1-133)

under a set of constraints defined by the fuzzy measure definition (Definition 1.24) which

are represented in a matrix form:

Au ≤ 0.   (1-134)

Finally, the cost function E² and Au ≤ 0 can be put together as a minimization problem:

min_u   u^T H u + F u + a   s.t.   Au + b ≤ 0.   (1-135)


Although this cost function has appeared in the literature [11, 131], a detailed description

has not. For this reason, a complete description of how to derive this minimization

problem from the LSE cost function for the Choquet integral is in (Appendix A). This

cost function (Equation 1-135) can be solved by traditional methods of optimization [132].

Note that due to the exponential nature of the fuzzy measure parameters, the constraints









increase in an exponential way making the solution incredibly computationally expensive

in terms of both space and time [8, 62].

In particular, H, F, A and a1 are determined by the data and which outputs need to

be learned for each respective class.

An immediate problem in this approach is the use of the same measure for the

different classes. Grabisch and Nicolas [130] addressed this problem with a modified

version of (Equation 1 132) for a two-class problem:

E = (Y,(C1 (lf ]) 2 (2[f]) al)2+...
wEC1
+ E(C2(2[fw) C- c(Q[f])- _n)2, (1 136)
wEC2

where Ω_1 and Ω_2 are functions that compute class-specific confidence values from the

information source outputs. For example, in information fusion we use:


i (<)) = P( x(w)IC1)2-il py
= Fc,(xw)2-i(1- F ( ))i-1

fori 1,-2, (1-137)


whereas in classification we use:

Ω_i(f_ω(x)) = P(x(ω) | C_i),   for i = 1, ..., n.   (1-138)


Note that in (Equation 1-137), Ω_i(f_ω(x)) ∈ [0, 1]. In (Equation 1-138), the values of x are generally quantized so the distribution is discrete and Ω_i(f_ω(x)) ∈ [0, 1]. We can

employ a similar procedure as the one in (Equation 1-132) to convert this cost function

(Equation 1-136) into a quadratic problem under linear constraints. Although this method



1 The α's can be interpreted as the ideal result of evaluating a function F, the function to be approximated, at an input x.









has desirable properties, like simplicity and many possible optimization techniques, two main problems are found when solving these quadratic programs. First, specifying a general fuzzy measure requires 2^n - 2 parameters, which makes the input of the quadratic program exponential in nature [59]. Second, the solution can be sensitive to the desired outputs α_i, as shown in (Section 2). This makes it necessary to study the problem of finding the best possible desired outputs, which clearly increases the complexity of the solution of (Equation 1-135).
1.11.2 Neural Networks

Several methods have been proposed to learn fuzzy measures using neural networks [60,

66, 127, 129].
1.11.2.1 Perceptron-like method

Grabisch [66] proposed using a perceptron-like criterion to learn fuzzy measures. In

this criterion, we want to minimize the number of misclassified samples given samples
from two classes C_1 = {f_{ω_1}^1, ..., f_{ω_l}^1} and C_2 = {f_{ω_1}^2, ..., f_{ω_l}^2}, and u_1 = (μ_{S_1}^1, ..., μ_{S_{2^n}}^1)^T, u_2 = (μ_{S_1}^2, ..., μ_{S_{2^n}}^2)^T the vectors of measure parameters for the two classes.

The criterion is used to find u and an l-dimensional non-negative vector t = (t_1, ..., t_l) such that Σ_{k=1}^{l} t_k is minimum under the constraints of monotonicity for u_1 and u_2:

-C_{μ_1}(π_1(f_{ω_i}^1)) + C_{μ_2}(π_2(f_{ω_i}^1)) ≤ t_i,   (1-139)

-C_{μ_2}(π_2(f_{ω_i}^2)) + C_{μ_1}(π_1(f_{ω_i}^2)) ≤ t_i,   (1-140)

where f_{ω_i}^1 and f_{ω_i}^2 represent the algorithm confidences, for class 1 and class 2 respectively, given by an object ω_i, and π_i represents the CDF of the confidences that belong to class i. These (Equations 1-139 and 1-140) can be written in matrix format using the ideas in (Appendix A):

-H^T_{π_1(f_{ω_i}^1)} u_1 + H^T_{π_2(f_{ω_i}^1)} u_2 ≤ t_i,   (1-141)

H^T_{π_1(f_{ω_i}^2)} u_1 - H^T_{π_2(f_{ω_i}^2)} u_2 ≤ t_i.   (1-142)









Basically, the t_i's represent the decision bias for each pair of confidences f_{ω_i}^1 and f_{ω_i}^2 in the perceptron problem. It can be said that the perceptron wants to obtain a collection of biases t_1, ..., t_l that maintain the constraints (Equations 1-139 and 1-140).

It is possible to define the following terms:

1. u = [u_1^T u_2^T]^T.

2. v = [u^T t^T]^T, the vector to be optimized.

3. C = [0^T 1^T]^T, the vector of costs.

4. The set of constraints of each fuzzy measure in matrix format:

A_1 u_1 + b_1 ≥ 0,   (1-143)

A_2 u_2 + b_2 ≥ 0.   (1-144)

5. B = [b_1^T b_2^T]^T.

6. The vector version of the Choquet integrals:

C_{μ_1}(π_1(f_ω)) = H_{π_1(f_ω)} u_1,   (1-145)

C_{μ_2}(π_2(f_ω)) = H_{π_2(f_ω)} u_2.   (1-146)


7. The matrix of constraints:

Θ = [ A_1                    0                      0
      0                      A_2                    0
      H^T_{π_1(f_ω^1)}       -H^T_{π_2(f_ω^1)}      I
      -H^T_{π_1(f_ω^2)}      H^T_{π_2(f_ω^2)}       I ],   (1-147)

where the last two blocks contain one row for each training object ω in C_1 and C_2, respectively.
Then, the linear problem takes the standard form:


minimize   C^T v

s.t.   Θv + B ≥ 0,   (1-148)


which is simply a linear optimization under linear constraints.









A problem with this scheme is that the parameters of the measures are not really part of the cost function because of the 0^T vector. Therefore, it stresses the minimization of Σ_i t_i more than the learning of the parameters of the measures [66]. In addition, it has the drawback that the matrix of constraints is of size (2^{m+1} + n) × (2^{m+1} + n) for n inputs of dimensionality m.

1.11.2.2 Look-alike gradient descent neural networks

Wang and Wang [60] proposed the use of a gradient descent algorithm to learn the fuzzy measures for the Sugeno λ-measure. A neural network is used to simulate the aggregation by the Choquet integral. In the first layer (the input layer) the confidences are used as inputs. These inputs are then combined in the hidden layer in the form of min-max differences:

δ = max( min_{k=1,...,K} f_k - max_{k=K+1,...,m} f_k, 0 ).   (1-149)

The output layer computes the final sum in the Choquet integral. Wang and Wang point out that the use of a direct gradient is really difficult for the Choquet integral neural network. This is the reason for using small differences, selected in a random way, to optimize the LSE for the Choquet integral.

Keller and Osborn [127] proposed a reward-punishment approach, similar to a neural network. In this approach, given a sample ω, the Choquet integral is calculated for each class, and if the largest integral does not correspond to the label of the sample ω, the density-neurons of the Sugeno λ-measure are punished by reducing their values. In the case that the largest integral corresponds to the correct label for ω, the densities are rewarded by increasing them. The monotonicity property is maintained through the fact that the densities can only have values in the interval [0, 1].

1.11.3 Genetic Algorithms

Genetic algorithms [114, 115, 133] are a non-traditional method to learn fuzzy measures. They are not more widespread because of the extra overhead of the algorithm itself and the difficulty of determining an effective encoding. These algorithms are based on









the effective encoding of the weights, a random selection and mutation of these encodings

and a fitness function to test the fitness of each generation.

The method encodes each weighted Choquet integral as a chromosome, and the

chromosome population has size p corresponding to the total number of samples. The

probability of a chromosome in the population being chosen to be a parent depends on

its fitness. The optimization in the genetic algorithm is performed under the criterion

of minimizing the misclassification rate. The stopping condition of the algorithm is the

minimization of the classification error or zero misclassification rate.

1.11.4 Heuristic Least Squared Error

Given the complexity of using traditional optimization methods for solving an LSE, Grabisch [8] proposed the use of a heuristic LSE to train the fuzzy measures. A brief description of this heuristic follows:


Heuristic Least Squared Error

1. For each sample ω with label y:

(a) Calculate the error e = C_μ(f_ω) - y, and set e_max.

(b) Update each measure value μ_i as follows:

μ_i^{new} = μ_i^{old} - α (e / e_max) ( f(x_{(n)}) - f(x_{(n-1)}) ),   (1-150)

with α ∈ [0, 1].

(c) Verify the monotonicity relations. If e > 0, the verification is done for lower neighbors only in the power lattice; if e < 0, for upper neighbors only.

2. For every node left unmodified in (Step 1) (beginning with the lower levels), verify the monotonicity relations with upper and lower neighbors.



1.11.5 Conclusions

After reviewing the different attempts used to learn fuzzy measures, we arrive at the

following conclusions:









1. Many of the algorithms depend on a desired output. This means that an external
algorithm or methodology needs to be used to try to find the best possible desired
outputs, but as we will see in the next chapter, sometimes the best possible desired
outputs are counter-intuitive.

2. Exponential time complexity due to the parameters and constraints. It is clear that given a problem of dimensionality n, the size of the input for each of these methods becomes at least 2^n - 1, and that is without counting the number of constraints needed to maintain the monotonicity property. It is clear that any method that depends on a matrix of constraints will be completely hopeless after a certain number of algorithms or features is considered. For example, optimization by a quadratic programming method depends heavily on searching for a feasible direction by using the matrix of constraints. It is clear that if the size of the matrix of constraints increases, the search becomes slower.

3. Lack of cost functions with embedded fuzzy measures constraints. This is done
sometimes in optimization and machine learning because it is easier to minimize a
non-constrained cost function than a constrained one.

These are the reasons driving the ideas behind this dissertation. The next chapters will help to shed some light on new ways to solve these issues.

The following items summarize our solutions to the previous problems:

1. In (Chapter 2) a novel MCE algorithm is developed to eliminate the dependency on desired outputs. In addition, it reduces the time complexity to O(Kn log(n) + Mn³ + KMn²), which is not exponential.

2. In (Chapter 3) and (Chapter 4), a new, original algorithm is developed using a Bayesian hierarchy and a Gibbs sampler with the fuzzy measure constraints embedded into the hierarchy. In addition, this algorithm is not limited to the Sugeno λ-measure, and it does not depend on a matrix of constraints. This makes the algorithm extremely fast compared to other algorithms dealing with a complete fuzzy measure.

3. In (Chapter 5) the new probabilistic algorithms are extended to deal with the problem of desired outputs for the cases of a single measure and two measures by simulating the MCE method with logistic regression.

We firmly believe that the new probabilistic algorithms are much better suited to deal with the exponential nature of the values of a fuzzy measure. The reason behind this is that MCMC methods are designed to deal with high-dimensional problems where the classical derivative methods are difficult to use [2].







Table 1-1: Examples of conjugate priors.

Likelihood f(x|θ)            Prior π(θ)                 Posterior π(θ|x)
N(θ, σ²)                     N(μ₀, σ₀²)                 N( (σ₀²x + σ²μ₀)/(σ₀² + σ²), σ₀²σ²/(σ₀² + σ²) )
Binomial(N, θ)               Beta(r, s)                 Beta(r + n, s + N - n)
Poisson(θ)                   Γ(r, s)                    Γ(r + n, s + 1)
Multinomial(θ₁, ..., θ_k)    Dirichlet(α₁, ..., α_k)    Dirichlet(α₁ + n₁, ..., α_k + n_k)

Figure 1-1: Example of a maximization iteration in EM.


Figure 1-2: Example of a slice sampler for the exponential distribution.










Figure 1-3: Example of a Laplacian with parameter 10.
















Figure 1-4: Examples of the PDF and CDF of a logistic distribution (m = 2, s = 0.25).










Figure 1-5: Example of a decision boundary h(x) = μ₀ + μ₁x₁ + μ₂x₂ separating two classes.


















Figure 1-6: Example of logit transformation.

















Figure 1-7: Example of a Choquet integral under a belief measure.






Figure 1-8: Example of a Choquet integral under a plausibility measure.











Figure 1-9: Distribution of the outputs of a classification problem.

Figure 1-10: More realistic distribution of the outputs of a classification problem.









CHAPTER 2
MINIMUM CLASSIFICATION ERROR USING THE SUGENO MEASURE

In this chapter we propose the following dissimilarity measure for the MCE algorithm:

d_i(f_ω) = −C_{g^i}(f_ω) + max_{j≠i} C_{g^j}(f_ω).   (2-1)

Note that (Equation 2-1) allows for multiple classes. The MCE algorithm requires

differentiation. In addition, the function max is differentiable almost everywhere, with a

very simple derivative given by:

∂ max(f(x_1), f(x_2), ..., f(x_n)) / ∂f(x_i) = 1 if f(x_i) = max(f(x_1), f(x_2), ..., f(x_n)), and 0 otherwise.   (2-2)

2.1 Loss Function for the Minimum Classification Error

In our specific optimization, we combine the loss functions given in (Equation 1-51)

and (Equation 1-52) in a single loss function for information fusion:


l_i(f_ω) = 1 / (1 + exp(−β d_i(f_ω)))  if d_i(f_ω) > 0,   and   l_i(f_ω) = 0  if d_i(f_ω) ≤ 0.   (2-3)

For classification, we use a slightly modified version of the loss function (Equation 2-3):

l_i(f_ω) = 2 / (1 + exp(−β d_i(f_ω))) − 1  if d_i(f_ω) > 0,   and   l_i(f_ω) = 0  if d_i(f_ω) ≤ 0.   (2-4)

These functions have the property that correctly classified samples have zero loss.

Thus, only samples that are not correctly classified are taken into consideration for the

accumulative change in the optimization.

With loss function (Equation 2-3), and the dissimilarity measure (Equation 2-1), we

have the following cost function for n classes:

E = Σ_{ω∈C_1} l_1(f_ω) + ... + Σ_{ω∈C_n} l_n(f_ω).   (2-5)
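To make the roles of (Equation 2-1) and (Equation 2-3) concrete, the following short Python sketch (illustrative only; the function names and the use of NumPy are ours, not part of the original implementation) evaluates the dissimilarity and the loss for one sample given the Choquet outputs of each class measure.

import numpy as np

def mce_dissimilarity(choquet_outputs, true_class):
    """Dissimilarity of (Equation 2-1): minus the confidence of the true class
    plus the largest confidence among the competing classes."""
    competitors = np.delete(choquet_outputs, true_class)
    return -choquet_outputs[true_class] + competitors.max()

def mce_loss(d, beta=1.0):
    """Sigmoid loss of (Equation 2-3): zero for correctly classified samples
    (d <= 0), smoothly increasing otherwise."""
    return 1.0 / (1.0 + np.exp(-beta * d)) if d > 0 else 0.0

# Example: three classes, the true class (index 0) has the largest confidence,
# so the sample is correctly classified and contributes zero loss.
d = mce_dissimilarity(np.array([0.9, 0.4, 0.2]), true_class=0)
print(d, mce_loss(d))   # -0.5  0.0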









Hence, for the loss function (Equation 2-3),

∂E/∂g_j^i = Σ_{ω∈C_1} β l_1(f_ω)(1 − l_1(f_ω)) ∂d_1(f_ω)/∂g_j^i + ... + Σ_{ω∈C_n} β l_n(f_ω)(1 − l_n(f_ω)) ∂d_n(f_ω)/∂g_j^i,   (2-6)

where g_j^i represents the jth density for the ith class. Now, the term ∂d_k(f_ω)/∂g_j^i is equal to:

∂d_k(f_ω)/∂g_j^i = −∂C_{g^i}(f_ω)/∂g_j^i if k = i; ∂C_{g^k}(f_ω)/∂g_j^i if k ≠ i and C_{g^k}(f_ω) = max_{s≠i}{C_{g^s}(f_ω)}; and 0 if k ≠ i and C_{g^k}(f_ω) ≠ max_{s≠i}{C_{g^s}(f_ω)}.

The derivations for the loss function (Equation 2-4) can be obtained in the same way. We

still need to find a closed form for the value ∂C_{g^i}(f_ω)/∂g_j^i.
i9gi
2.2 Derivation of the Derivative of the Choquet Integral with respect to the
Sugeno measure

The gradient is obtained by differentiating the discrete Choquet integral (Equation 1-111) with respect to the densities of the Sugeno λ-measure. Thus, each partial derivative of C_g(f_ω) with respect to g_j is given by:

∂C_g(f_ω)/∂g_j = Σ_{i=1}^n ∂g(A_(i))/∂g_j (f(x_(i)) − f(x_(i−1))).   (2-7)

To derive ∂g(A_(i))/∂g_j, consider that, according to (Equation 1-107):

g(A_(i)) = g({x_(i)} ∪ A_(i+1)) = g_(i) + g(A_(i+1)) + λ g_(i) g(A_(i+1)).   (2-8)

The partial derivative of this last (Equation 2-8), with respect to a density g_j, is equal to:

∂g(A_(i))/∂g_j = ∂g_(i)/∂g_j + ∂g(A_(i+1))/∂g_j + (∂λ/∂g_j) g_(i) g(A_(i+1)) + λ (∂g_(i)/∂g_j) g(A_(i+1)) + λ g_(i) ∂g(A_(i+1))/∂g_j.¹   (2-9)

¹ Note that λ is also a function of g_j. This can be seen in (Equation 1-108).









Several cases need to be considered to obtain a general rule for this derivative. First, if (i) = j, we have that:

∂g(A_(i))/∂g_j = 1 + λ g(A_(i+1)) + (∂λ/∂g_j) g_(i) g(A_(i+1)) + (1 + λ g_(i)) ∂g(A_(i+1))/∂g_j,   (2-10)

by the multiplication rule for derivatives. In a similar way, for (i) ≠ j, we have:

∂g(A_(i))/∂g_j = (∂λ/∂g_j) g_(i) g(A_(i+1)) + (1 + λ g_(i)) ∂g(A_(i+1))/∂g_j.   (2-11)

From these last two equations, and the fact that g(A_(n+1)) = 0, we can obtain:

1. Case I, (i) ≠ n, (i) = j:
   ∂g(A_(i))/∂g_j = 1 + λ g(A_(i+1)) + (∂λ/∂g_j) g_(i) g(A_(i+1)) + (1 + λ g_(i)) ∂g(A_(i+1))/∂g_j.   (2-12)

2. Case II, (i) ≠ n, (i) ≠ j:
   ∂g(A_(i))/∂g_j = (∂λ/∂g_j) g_(i) g(A_(i+1)) + (1 + λ g_(i)) ∂g(A_(i+1))/∂g_j.   (2-13)

3. Case III, (i) = n, (i) = j:
   ∂g(A_(i))/∂g_j = 1.   (2-14)

4. Case IV, (i) = n, (i) ≠ j:
   ∂g(A_(i))/∂g_j = 0.   (2-15)








Now, we only need to obtain an expression for ∂λ/∂g_j. Differentiating both sides of (Equation 1-108) with respect to g_j yields:

∂λ/∂g_j = λ ∏_{i=1, i≠j}^n (1 + λ g_i) + Σ_{i=1}^n (∂λ/∂g_j) g_i ∏_{k=1, k≠i}^n (1 + λ g_k).   (2-16)

From this equation, we can get the following:

(∂λ/∂g_j) [1 − Σ_{i=1}^n g_i ∏_{k=1, k≠i}^n (1 + λ g_k)] = λ ∏_{i=1, i≠j}^n (1 + λ g_i),   (2-17)

which can be reduced to:

∂λ/∂g_j = λ ∏_{i=1, i≠j}^n (1 + λ g_i) / [1 − Σ_{i=1}^n g_i ∏_{k=1, k≠i}^n (1 + λ g_k)].   (2-18)

And because we can rewrite λ + 1 = ∏_{i=1}^n (1 + λ g_i) as (λ + 1)/(1 + λ g_j) = ∏_{i=1, i≠j}^n (1 + λ g_i), we have:

∂λ/∂g_j = [λ (λ + 1)/(1 + λ g_j)] / [1 − (λ + 1) Σ_{i=1}^n g_i/(1 + λ g_i)].   (2-19)

We have finally that:

∂λ/∂g_j = λ (λ + 1) / {(1 + λ g_j) [1 − (λ + 1) Σ_{i=1}^n g_i/(1 + λ g_i)]},   λ ≠ 0.   (2-20)

This last equation, together with (Equation 2-10) and (Equation 2-11), gives the derivative of the Choquet integral with respect to the Sugeno λ-measure for λ ≠ 0. Note that the derivation of (Equation 1-108) from (Equation 2-8) assumes that λ ≠ 0, and that the resulting expression for ∂λ/∂g_j in (Equation 2-20) is undefined for λ = 0 (since Σ_i g_i = 1 in that case). We can apply L'Hôpital's rule to evaluate the limit of ∂λ/∂g_j as λ → 0; hence, in the unlikely event that λ = 0 during training, one can take that limiting value for ∂λ/∂g_j.
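As an illustration of (Equation 1-108) and (Equation 2-20), the following Python sketch (ours, not the dissertation's code) computes the Sugeno λ value numerically by bisection and then evaluates ∂λ/∂g_j; it assumes all densities lie strictly between 0 and 1.

import numpy as np

def sugeno_lambda(densities, tol=1e-12):
    """Solve lambda + 1 = prod(1 + lambda * g_i) (Equation 1-108) for the
    nonzero root by bisection: lambda > 0 when sum(g) < 1, -1 < lambda < 0
    when sum(g) > 1, and lambda = 0 when sum(g) = 1."""
    g = np.asarray(densities, dtype=float)
    f = lambda lam: np.prod(1.0 + lam * g) - (1.0 + lam)
    s = g.sum()
    if abs(s - 1.0) < tol:
        return 0.0
    if s < 1.0:                      # root lies in (0, +inf)
        lo, hi = tol, 1.0
        while f(hi) < 0:             # expand until the sign changes
            hi *= 2.0
    else:                            # root lies in (-1, 0)
        lo, hi = -1.0 + tol, -tol
    for _ in range(200):             # plain bisection on the bracketed root
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def dlambda_dg(densities, lam, j):
    """Partial derivative of lambda with respect to density g_j (Equation 2-20)."""
    g = np.asarray(densities, dtype=float)
    denom = (1.0 + g[j] * lam) * (1.0 - (lam + 1.0) * np.sum(g / (1.0 + lam * g)))
    return lam * (lam + 1.0) / denom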









We can then use (Equations 2-7)–(2-20) to obtain a gradient descent

algorithm for the MCE cost function (Equation 2-5).

This new optimization has the advantages that we have been looking for. First, each

class is represented by a unique measure, and second, no desired outputs are necessary

whatsoever. This is a great advantage because the Choquet integral is sensitive to desired

outputs, and eliminating the necessity for desired outputs helps to avoid guessing which

desired outputs are correct.

2.3 Time Complexity Analysis of the MCE Training Algorithm

For this analysis, we assume that |X| = n, there are M classes, and each has N_i elements. In addition, it is easy to prove that once the sorting is done for a sample, the calculation of the Choquet integral can be done in linear time for the Sugeno λ-measure. Then, calculating the Choquet integral with respect to the Sugeno λ-measure has asymptotic complexity O(n log(n) + n) [134]. In addition, calculating the roots for (Equation 1-108) has asymptotic complexity O(n³) [134, 135].
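The linear-time evaluation claimed above can be sketched as follows in Python (an illustration under our own naming, assuming the λ root has already been computed, for example with a routine like the one given earlier).

import numpy as np

def choquet_sugeno(f, densities, lam):
    """Discrete Choquet integral of the confidences f with respect to the
    Sugeno lambda-measure defined by `densities` and the root `lam`.
    After sorting, g(A_(i)) is accumulated with the recursion
    g(A_(i)) = g_(i) + g(A_(i+1)) + lam * g_(i) * g(A_(i+1))  (Equation 2-8),
    so the integral itself costs O(n) once the O(n log n) sort is done."""
    f = np.asarray(f, dtype=float)
    g = np.asarray(densities, dtype=float)
    order = np.argsort(f)            # x_(1), ..., x_(n): ascending confidences
    f_sorted = f[order]
    g_sorted = g[order]
    # measures of the nested sets A_(i) = {x_(i), ..., x_(n)}, built top-down
    gA = np.empty(len(f) + 1)
    gA[-1] = 0.0                     # g(A_(n+1)) = 0
    for i in range(len(f) - 1, -1, -1):
        gA[i] = g_sorted[i] + gA[i + 1] + lam * g_sorted[i] * gA[i + 1]
    prev = 0.0
    total = 0.0
    for i in range(len(f)):
        total += gA[i] * (f_sorted[i] - prev)   # g(A_(i)) * (f(x_(i)) - f(x_(i-1)))
        prev = f_sorted[i]
    return total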

We present the pseudo-code of the general algorithm with the order of operations of

the computational complexity steps in parentheses:


General MCE Algorithm
1. Set learning rate α.

2. for i = 1 to M

   (a) Sort all the samples of class C_i, (O(N_i n log(n))).

3. end for

4. Do

   (a) for k = 1 to M

       i. Set ∇E_k = 0.

       ii. Calculate λ_k for each class C_k, (O(n³)).

       iii. Calculate for each density g_j^(k) the partial derivative:

            ∂λ_k/∂g_j^(k) = λ_k(λ_k + 1) / {(1 + λ_k g_j^(k)) [1 − (λ_k + 1) Σ_{i=1}^n g_i^(k)/(1 + λ_k g_i^(k))]}.

       iv. for h = 1 to K (all training samples)

           A. Calculate d_k(f_h) = −C_{g^(k)}(f_h) + max_{j≠k} C_{g^(j)}(f_h).
           B. Calculate l_k(f_h), (O(1)).
           C. Calculate g^(k)(A_(i)) for all i = 1, ..., n, (O(n)).
           D. Calculate for all i = 1, ..., n the partial derivative of g^(k)(A_(i)) with respect to g_j^(k) for all j = 1, ..., n:

              ∂g^(k)(A_(i))/∂g_j^(k) = ∂g_(i)^(k)/∂g_j^(k) + ∂g^(k)(A_(i+1))/∂g_j^(k) + (∂λ_k/∂g_j^(k)) g_(i)^(k) g^(k)(A_(i+1)) + λ_k (∂g_(i)^(k)/∂g_j^(k)) g^(k)(A_(i+1)) + λ_k g_(i)^(k) ∂g^(k)(A_(i+1))/∂g_j^(k),   (O(1) per entry).

           E. Calculate the partial derivative of C_{g^(k)}(f_h) with respect to g_j^(k) for all j = 1, ..., n:

              ∂C_{g^(k)}(f_h)/∂g_j^(k) = Σ_{i=1}^n (∂g^(k)(A_(i))/∂g_j^(k)) (f(x_(i)) − f(x_(i−1))),   (O(n)).

           F. Calculate for each density g_j^(k) the quantity:

              D_jkh = β l_k(f_h)(1 − l_k(f_h)) ∂d_k(f_h)/∂g_j^(k),   (O(1)).

           G. For each g_j^(k) set g_j^(k) = g_j^(k) − α D_jkh, (O(n)).
           H. ∇E_k = ∇E_k + (D_1kh, ..., D_nkh)^T, (O(1)).

       v. end for

   (b) end for

5. while ||(∇E_1, ..., ∇E_M)^T|| > ε









First, define K = Σ_{i=1}^M N_i to be the total number of samples. Now, the time complexity for a single iteration is:

O(K n log(n))   time complexity of sorting all the samples,
O(M n³)         time complexity of calculating λ_1, ..., λ_M,
O(M n²)         time complexity of calculating ∇λ_1, ..., ∇λ_M,
O(K M)          time complexity of calculating all dissimilarity functions d_k(f_h),
O(K M)          time complexity of calculating all loss functions l_k(f_h),
O(K M n)        time complexity of calculating all measures g^(k)(A_(i)),
O(K M n²)       time complexity of calculating all partial derivatives of g^(k),
O(K M n)        time complexity of calculating all partial derivatives of C_{g^(k)}(f_h),
O(K M n)        time complexity of calculating all D_jkh,
O(K M)          time complexity of updating all g_j^(k),
O(K M)          time complexity of updating all ∇E_k.

We can rewrite this time complexity as:

O(K n log(n) + M n³ + K M n²).   (2-21)

Thus, we have that the time complexity for a single iteration in the MCE is:

Time complexity for a single MCE iteration = O(K n log(n) + M n³ + K M n²).   (2-22)

Then, assuming H iterations in the main while loop, we obtain the time complexity for the MCE:

Time complexity for MCE = O(K n log(n) + H M (n³ + K n²)).   (2-23)









2.4 Problems with the MCE Approach

An immediate problem with the cost function (Equation 2-5) is that it is not a convex function. For example, for dimensionality one, l(x) = 1/(1 + exp(−x)) has the following Hessian:

d²l(x)/dx² = l(x)(1 − l(x))[1 − 2l(x)],   (2-24)

where the factor l(x)(1 − l(x)) > 0 for all x ∈ ℝ, and

1 − 2l(x) < 0 if x > 0,
1 − 2l(x) > 0 if x < 0.


This means that the function l(x) is neither convex nor concave. In addition, it can be

shown that the Choquet integral can be a convex or a concave function depending on

which fuzzy measure is used. This can be seen in (Example 1.4).

It can be seen that a sum of functions l_i(f_ω) with d_i : [0, 1]^n → [−1, 1] is not going to

be a convex or concave function [136]. Therefore, if we use the gradient descent proposed

in (Section 2.1)–(Section 2.2), the most we can hope to obtain is a local minimum. Then,

it is necessary to propose a global optimization technique to solve this problem in a more

efficient way, but for now this is still an open problem.

2.5 Results under the MCE Algorithm

2.5.1 Description and Design of the Experiments

The LSE and MCE training methods were applied to a two-class algorithm fusion

problem in landmine detection and some standard data sets for pattern classification.

First, we will discuss the fusion experiments and later the classification experiments.

The landmine detection problem involved processing Ground Penetrating Radar

(GPR) sensor returns. This is well described in the literature, but is briefly specified

here. The goal is to discriminate between regions of ground that contain buried landmines

from regions of ground that do not contain buried landmines. GPR measurements were

made at multiple locations, some of which contain landmines and some of which do not.









Multiple detection algorithms have been developed by numerous researchers to process

samples obtained from these sensors, as described in [112, 137-141]. Each detection

algorithm involves a complex sequence of processes including signal processing, feature

extraction, and classification. The algorithms produce confidence values as output. The

larger the confidence value, the more likely it is that the input sample was acquired over a

region of ground containing a landmine.

The data set contained 2422 8-dimensional samples, each containing one confidence

value from each of the eight detection algorithms used in the detection problem. The data

set contained 271 mines samples and 2151 non-mine samples.

Three different information fusion algorithms were considered: LSE for general

measures, LSE for Sugeno A-measures, and MCE for Sugeno A-measures.

The probability of detection, PD, and the probability of false alarm, PFA, are used as

performance measures. They are defined as follows

PD(t) = |{ω ∈ Mines : C_μ(f_ω) ≥ t}| / |{ω ∈ Mines}|,   (2-25)

PFA(t) = |{ω ∈ NonMines : C_μ(f_ω) ≥ t}| / |{ω ∈ NonMines}|,   (2-26)

where |·| denotes the set cardinality.
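A minimal Python sketch of these two quantities (the names and the synthetic confidences are ours; only the class sizes 271 and 2151 are taken from the text) is:

import numpy as np

def compute_roc(mine_conf, nonmine_conf, thresholds):
    """PD(t) and PFA(t) of (Equations 2-25) and (2-26): the fraction of mine
    (respectively non-mine) samples whose fused confidence meets threshold t."""
    mine_conf = np.asarray(mine_conf)
    nonmine_conf = np.asarray(nonmine_conf)
    pd = [(mine_conf >= t).mean() for t in thresholds]
    pfa = [(nonmine_conf >= t).mean() for t in thresholds]
    return np.array(pd), np.array(pfa)

# Example: sweep thresholds over the range of observed confidences.
rng = np.random.default_rng(0)
mines = rng.normal(0.8, 0.1, 271)        # illustrative confidences only
nonmines = rng.normal(0.3, 0.15, 2151)
ts = np.linspace(0.0, 1.0, 101)
PD, PFA = compute_roc(mines, nonmines, ts)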

Since gradient descent is sensitive to initialization, we run N-fold cross-validation

M times to obtain a realistic estimate of the expected performance (for one experiment,

N = 5, M = 20 ). In addition, since LSE performance depends on the choice of desired

outputs and the results are sensitive to this choice, we average over a range of reasonable

desired outputs. The following pseudo code depicts the experimental procedure. In this

pseudo code, weights refers to the parameters of the measure to be learned. The function

ComputeROC computes the (Equations 2-25) and (Equations 2-26) for all t in the range

of detections. The values α1, α2 represent the desired outputs of the fuzzy integral in the range [0, 1] for mines and non-mines respectively. The value selected for α1 ranges between 0.5 and 1 for mines, and for α2 we choose values between 0.0 and α1 − 0.1. These









values are used because we want higher desired outputs for mines and lower outputs for
non-mines. Each fold in the cross-validation scheme is represented by A_i.


General Testing Algorithm
1. Initialize

   (a) DataSet = {A_j}_{j=1}^N, where A_i ∩ A_j = ∅ if i ≠ j.

   (b) Number of repetitions for the experiment = M.

2. for i = 1 to M do:

   (a) Randomly initialize Weights.

   (b) K = 1

   (c) for j = 1 to N
       if Algorithm is LSE (this varies the desired outputs)

       i. for α1 = 0.5 to 1

          A. for α2 = 0.0 to α1 − 0.1
          B. C_ijK = Train(Weights, ∪_{l≠j} {A_l}, α1, α2)
          C. K = K + 1

       else

       i. C_ij = Train(Weights, ∪_{l≠j} {A_l}).

3. if Algorithm is LSE (this varies the desired outputs)

   (a) for i = 1 to M
       i. for j = 1 to K − 1

          A. (PD_ij, PFA_ij) = ComputeROC({C_i1j, ..., C_iNj})

4. else

   (a) for i = 1 to M
       i. (PD_i, PFA_i) = ComputeROC({C_i1, ..., C_iN})

5. if Algorithm is LSE (this varies the desired outputs)

   (a) PD = (1/M) Σ_{i=1}^M (1/(K−1)) Σ_{j=1}^{K−1} PD_ij

   (b) PFA = (1/M) Σ_{i=1}^M (1/(K−1)) Σ_{j=1}^{K−1} PFA_ij

6. else

   (a) PD = (1/M) Σ_{i=1}^M PD_i

   (b) PFA = (1/M) Σ_{i=1}^M PFA_i



Here, PD_ijK and PFA_ijK represent the PD and PFA of the jth cross-validation fold of the ith experiment and the Kth variation in desired outputs for the LSE training functions. In a similar fashion, PD_ij and PFA_ij represent the PD and PFA of the jth cross-validation fold of the ith experiment for the MCE training function.

Before examining the results from each algorithm, we show the sensitivity of the

LSE training for the two measures used in the experiments. The Receiver Operating

Characteristic (ROC) plots in (Figure 2-1) and (Figure 2-2) show some of the variations in the PD and PFA due to random initialization under the different desired outputs in
in the PD and FAR due to random initialization under the different desired outputs in

a single experiment. We can see that different desired outputs produce different ROC

curves. In addition, the best ROC curve is not obtained using ideal values like zero

for non-mines and one for mines, but non-intuitive values of 0.8 for mines and 0.2 for

non-mines in the case of a Sugeno A-measure, and 0.5 for mines and 0.1 for non-mines in

the case of a general measure. (Figure 2-1) and (Figure 2-1) show the sensitivity of LSE

schemes to desired outputs and random initialization.

2.5.2 Results and Comparison Against Other Methods

Now, we can show the results obtained from each algorithm. In (Table 2-1), an

average Sugeno λ-measure trained via LSE is compared to each individual detector. For PD's ranging from 80–100%, the table shows the PFA achieved by the Choquet integral with respect to the Sugeno λ-measure, the PFA achieved by each detector, and the reduction of PFA achieved by the Choquet integral with respect to the Sugeno λ-measure compared to each detector. The percentage of reduction ranges between 0.0% and about 51%. Although a Choquet integral, with respect to a Sugeno λ-measure trained with LSE, performs









better than many of the individual results, it is still worse than the best possible detectors

(detector 6 and detector 7).

In (Table 2-2), we compare individual detectors against the general measure trained

using a LSE cost function. It is clear that general measures trained using LSE improve a

certain amount over Sugeno λ-measures trained using LSE. This range of improvement is between 3.25% and about 55%. However, the Choquet integral, with respect to a general

measure trained with LSE, is still not better than the best detectors (detector 6 and

detector 7). (Table 2-3) shows that, in contrast to the Sugeno λ-measure and the general measure trained with LSE, the Sugeno λ-measure trained with MCE is, in general, better than all the individual detectors, with a range of improvement between roughly 0.4% and 65.07%. (Table 2-4) shows the improvement of MCE over the LSE. The range of improvement is between about 11% and 37.51% with respect to the LSE cost functions.

It is possible for the Sugeno A-measure and the general measure trained with LSE to

be as good as the one trained by MCE. For this to happen, it is necessary to have a set of

correct desired outputs. It is clear that, depending on initialization, these desired outputs

can change. This is a limitation for general measures and Sugeno A-measures under LSE

optimizations, and of course, an advantage of MCE training.

The MCE training was also applied to the Iris and Breast Cancer data and compared to the results shown in Xu et al. [114] (note that the appendicitis data is no longer at the Machine Learning website). The Iris data is a three-class problem, whereas the Breast Cancer data is a two-class problem. As in Xu et al. [114], ten-fold cross-validation was performed. We report the average error rates achieved in (Table 2-5). The average error rate achieved on the Iris data was 4.0%, whereas the average error rate achieved on the Breast Cancer data was 22.73%, which compares favorably with the results in Xu et al. [114].

The computational complexity of the proposed training algorithm is not high. First,

the number of free parameters is only n, whereas the number of free parameters for a









general measure is 2^n − 2. The final complexity in Big O notation is:


Time Complexity for MCE = O(Knlog(n) + HM(n3 + Kn2)), (2-27)


where H is the number of iterations in the main loop, K is the total number of training

samples and M is the number of classes. In comparison, the Sequential Quadratic

Optimization, used to solve quadratic problems under constraints, would finish with

an exponential time complexity.




























Figure 2-1: Examples of sensitivity to desired outputs for a Sugeno λ-measure, where α1 and α2 represent the desired outputs for mines and non-mines respectively.

Figure 2-2: Examples of sensitivity to desired outputs for general measures, where α1 and α2 represent the desired outputs for mines and non-mines respectively.












Table 2-1: Comparison of PFA for Sugeno A-measure trained with LSE against different


detectors.

PD PFA


PFA


Reduction


Sugeno detector


98.28
32.79
24.21
18.51
13.56
10.32
8.70
7.16
5.96
5.43
4.79


98.37
57.65
30.03
24.69
19.15
16.32
14.27
12.13
10.18
8.32
7.39


lII II'' ,.
43.11%
1-1 ;
25.l i .
2 1 I I' .'
36.77'.
39.1 1 .
41.1 1 .
41. 1 .
34.7 '.
, ", 'i i' .


PFA
detector2
95.40
68.11
45.23
31.06
15.48
13.67
12.13
10.04
9.07
7.62
7.16


Reduction PFA
detector


Table 2-1: Continued.


PD PFA
Sugeno
100 00 98 28


98.00
96.00
94.00
92.00
90.00
88.00
86.00
84.00
82.00
80.00


32.79
24.21
18.51
13.56
10.32
8.70
7.16
5.96
5.43
4.79


PFA
detector4
89.31
57.46
35.80
27.24
16.36
12.13
9.72
8.69
8.00
6.65
6.00


Reduction PFA
detector
-1"11 1,' 76.99


42 :'



17.11
14 .'
10. '.
17.. .
25. :'
18. :7'
20.1 .


49.09
31.94
24.31
15.53
12.69
10.51
8.37
7.72
6.37
5.16


Reduction PFA
detector
-27 .,' 77.03


: ; -II I' .
24 .
2 .; .'
12 .'
18.7 .
17.1.'
14. .
22.71
14 .
7.1 .


35.24
22.97
13.16
10.93
9.44
6.79
5.25
5.21
4.60
4.60


Table 2-1: Continued.


100.00
98.00
96.00
94.00
92.00
90.00
88.00
86.00
84.00
82.00
80.00


PFA
Sugeno
98.28
32.79
24.21
18.51
13.56
10.32
8.70
7.16
5.96
5.43


PFA
detector7
81.78
38.73
24.08
18.83
13.58
10.88
8.32
6.69
5.63
5.30
4.14


100.00
98.00
96.00
94.00
92.00
90.00
88.00
86.00
84.00
82.00
80.00


-3.11 '
51.".'.
46. 17'.

12.41%
24.51%
28._27' .
28.7 .
34.- .
33. 1 .
33.1"i .


95.07
66.99
40.31
28.36
18.83
14.64
12.18
11.11
9.34
7.81
6.65


Reduction

-3. 7'.
51 1 -, '

34.7:'
27 '-'
2" i

2 ", ".' r.

36 I'.
30.5 .
27 *, .


Reduction

-27 ".'
S' '* I' .
-5. i ;'
-40.71' .
-24.12'.
-_ I' .
-28..
-36.-
-14.51%
-17.'-' .
-4.1 7'.


Reduction

-20.1.-'.

-i ". i' .

1.*.' ,
0.11%
5.15%
-4 -. .

-I, I II I' .
-2 ;' .
-15.71' .


PFA
detector
94.24
62.20
32.03
26.69
18.46
15.20
13.44
12.27
10.55
9.67
9.25


Reduction

-4 2 .
47 '.
24.41%
1i. ', .
26.5 .
32.1".
35.22'
41..' .

4 '
48. 2'












Table 2-2: Comparison of PFA for general measure trained with LSE against different
detectors at different thresholds.


PD PFA general


100.00
98.00
96.00
94.00
92.00
90.00
88.00
86.00
84.00
82.00
80.00


measure
91.17
30.24
21.28
15.54
12.91
10.04
8.43
7.15
6.06
5.43
4.94


PFA
detector
98.37
57.65
30.03
24.69
19.15
16.32
14.27
12.13
10.18
8.32
7.39


Reduction PFA
detector2
7. : 95.40
47 .-.' 68.11
29.1 '. 45.23
37 I"' 31.06
32.1.'. 15.48
38. I '. 13.67
41, :' 12.13
41.0 10.04
40. i-'. 9.07
34.7' 7.62
33..2 7.16


Reduction PFA
detector
4. ; 95.07
55.1 66.99
.; *,' 40.31
4' 7'. 28.36
11, 1.'. 18.83
26.5-' 14.64
30.5 '. 12.18
28.7i' 11.11
33.1 9.34
2 2. 7.81
31 1,' .6.65


Table 2-2: Continued.


PD PFA general


100.00
98.00
96.00
94.00
92.00
90.00
88.00
86.00
84.00
82.00
80.00


measure
91.17
30.24
21.28
15.54
12.91
10.04
8.43
7.15
6.06
5.43
4.94


PFA
detector4
89.31
57.46
35.80
27.24
16.36
12.13
9.72
8.69
8.00
6.65
6.00


Reduction PFA
detector
-2 76.99
47 :;'. 49.09
41i -, 31.94
4 '7' .24.31
21.1.' 15.53
17..-' 12.69
13.. 10.51
17.71% 8.37
24.21% 7.72
18. 7' 6.37
17 i.'' 5.16


Reduction PFA
detector6
-18. 1' 77.03
38.41% 35.24
;; ; 1.' 22.97
36.11' 13.16
1. 7 10.93
2' -.' 9.44
19.7 .'. 6.79
14.51% 5.25
21. 17'. 5.21
14.7', 4.60
4 ;' 4.60


Table 2-2: Continued.

PD PFA general
measure
100.00 91.17
98.00 30.24
96.00 21.28
94.00 15.54
92.00 12.91
90.00 10.04
88.00 8.43
86.00 7.15
84.00 6.06
82.00 5.43
80.00 4.94


PFA
detector7
81.78
38.73
24.08
18.83
13.58
10.88
8.32
6.69
5.63
5.30
4.14


Reduction PFA
detector


-11. 1' .
21 12'
11.2',_
17. I '.
4 '2'.
7..,; '
-1 _
-( 7' .
-7.7 .
-2 II' .
-19 :'


94.24
62.20
32.03
26.69
18.46
15.20
13.44
12.27
10.55
9.67
9.25


Reduction

4.11%
54 '7'.
47 .
45.21%
31.45%
31.41%
30.7>'.
35.61%
35.15%
30.5"' .
25.7 .'.


Reduction



14- .
7 2' .
-18.1 .
-18.1 ,'
-6. :'
-24..- '.
-36.1.-'.
-l '.,
-17.1 .
-7 ,'


Reduction

'. .
51 .

41.77 .
:I1 1 I .

37 -.,'
41.71%
42.5E .
4 ; -'
4'. 1' .












Table 2-3: Comparison of PFA in MCE against different detectors at different thresholds.


PD PFA MCE PFA


100.00
98.00
96.00
94.00
92.00
90.00
88.00
86.00
84.00
82.00
80.00


Sugeno
94.65
26.89
15.80
11.07
8.07
6.65
5.45
4.96
4.38
3.93
3.61


detector
98.37
57.65
30.03
24.69
19.15
16.32
14.27
12.13
10.18
8.32
7.39


Table 2-3: Continued.

PD PFA MCE PFA


100.00
98.00
96.00
94.00
92.00
90.00
88.00
86.00
84.00
82.00
80.00


Sugeno
94.65
26.89
15.80
11.07
8.07
6.65
5.45
4.96
4.38
3.93
3.61


detector4
89.31
57.46
35.80
27.24
16.36
12.13
9.72
8.69
8.00
6.65
6.00


Reduction PFA
detector
3.7"'. 95.40
S: : 68.11
47. I' 45.23
55.1 '. 31.06
57 ','. 15.48
59.2-' 13.67
61.81% 12.13
59.111'. 10.04
57.01% 9.07
S' 7.62
51.111'. 7.16


Reduction PFA


-', '' ".
. '. _'l I' .
", ", % 7' ,
59 :,'.
50.71%
45._' .
4 .
42.91%
4 .
4 1, I'.
39.7 :'.


detector
76.99
49.09
31.94
24.31
15.53
12.69
10.51
8.37
7.72
6.37
5.16


Reduction PFA
detector
0.7'. 95.07
60.5-'. 66.99
I.-. -' 40.31
64 1'. 28.36
47 18.83
51 14.64
".. -' 12.18
a, -". ,' 11.11
51.7;'. 9.34
48.51% 7.81
49.51% 6.65


Reduction PFA


45.-2' .
".,11 I'
54. .' .
4 1 1 .'
47. 1' .
48.1" .
41 .'
4 .

2 1 I


detector
77.03
35.24
22.97
13.16
10.93
9.44
6.79
5.25
5.21
4.60
4.60


Table 2-3: Continued.


PD

100.00
98.00
96.00
94.00
92.00
90.00
88.00
86.00
84.00
82.00
80.00


PFA MCE
Sugeno
94.65
26.89
15.80
11.07
8.07
6.65
5.45
4.96
4.38
3.93
3.61


Reduction

0.1 1'.
0. I i' .

60.81%
1.11 'I ,
57.1'.' .
54 'I.

55.
53.1.' .
49.7 '
4"-, .


Reduction


-2 7' .
2 i ,' .
31.21%
1 -, ;* .
26.17%


5.5 :'

14.7' .
21. I1.'.


PFA
detector7
81.78
38.73
24.08
18.83
13.58
10.88
8.32
6.69
5.63
5.30
4.14


Reduction

-15.7 .'.
l I ",1 .' .
34.".'
41! .
4 11 .,

38.91%
34 -r .
2 .,
22 IP".
2, "2'.
12 '.1'.


PFA
detector
94.24
62.20
32.03
26.69
18.46
15.20
13.44
12.27
10.55
9.67
9.25


Reduction

-0. I '.
56.77'.
", i .

", I'
29.

59 -,.'
58.5 '.
59. I' .
I.I ', .












Table 2-4: Mean MCE PFA against mean general and Sugeno PFA.


PFA MCE

94.65
26.89
15.80
11.07
8.07
6.65
5.45
4.96
4.38
3.93
3.61


PFA general
measure
91.17
30.24
21.28
15.54
12.91
10.04
8.43
7.15
6.06
5.43
4.94


Reduction

'. N '
11.07' ,
25.7' .
28.7 :'
37.51
33.8 ;' ,
35.; ,.'
30.1 ;' ,
27.77'
27.1 .1.
26.77'


PFA Sugeno

98.28
32.79
24.21
18.51
13.56
10.32
8.70
7.16
5.96
5.43
4.79


Table 2-5: Comparison of MCE Sugeno against several other classifiers for iris data and breast cancer data.

Method               Iris data (%)    Breast cancer (%)
Linear               2.00             29.00
Quadratic            2.70             34.40
Nearest neighbor     4.00             34.00
Bayes independent    6.70             28.20
Bayes quadratic      16.00            34.40
Neural net           3.30             28.50
PVM rule             4.00             22.90
QUAD                 3.30             31.50
CLMS                 4.00             27.10
HLMS                 4.70             22.60
WCIPP                4.00             26.20
MCE                  4.00             22.73


100.00
98.00
96.00
94.00
92.00
90.00
88.00
86.00
84.00
82.00
80.00


Reduction

3 i .,
18.01,
34.71,' .
40.1,' ,
40.52'
35.59' ,
37.;7,' ,
30.1, .'
26.59'
27.1. '
24.5 !' ,









CHAPTER 3
SPARSITY PROMOTION WITHIN CHOQUET INTEGRALS, USING A NAIVE
SAMPLER

3.1 Constraining the Hierarchical Model for Sparsity Promotion under the
Least Squared Error

A problem we have in model (1-33) is that it does not constrain the elements in g to satisfy the required properties of a fuzzy measure (Definition 1.24):

1. μ(∅) = 0 and μ(X) = 1.

2. Given A, B ∈ 2^X, if A ⊆ B then μ(A) ≤ μ(B).

If |X| = k, then we seek to estimate the 2^k − 2 parameters μ(A), A ⊂ X. To simplify our notation and treat a fuzzy measure as a parameter vector, we order the subsets of 2^X into the succession {A_1, A_2, ..., A_{2^k − 1}}, and write μ_j = μ(A_j). Now, we will impose these relations using the following strategy based on the Gibbs sampler. Given the model for sparsity promotion in (Section 1-24), and taking j = 1, ..., m, with m = 2^k − 1 and μ_m = μ(X) = 1 fixed, we can try to calculate the joint posterior p(γ, σ², μ_1, ..., μ_m, τ_1, ..., τ_m | y_1, ..., y_n) using the following strategy:

consider the power set lattice of X = {a, b, c} shown in (Figure 3-1). Starting at the top (i.e., X) and moving downward to the singleton elements, it is easy to observe that the value of a fuzzy measure on a particular set, for example μ({a}), depends on the previously assigned values for the sets that contain {a}, i.e., μ({a, b}), μ({a, c}) and μ({a, b, c}). In fact, we must have, for every D such that {a} ⊂ D, μ({a}) ≤ μ(D). Hence, in the general case, we can constrain the value of each measure μ_j by sampling from a distribution on the interval [0, min{μ(A_s) | A_j ⊂ A_s}]. One possible modification of the hierarchical models from (Section 1.3.1) is the following one:


y_i ~ N(H_i μ, σ²)   ∀i = 1, ..., n,
μ_m = 1,
μ_j ~ K_j × N(0, τ_j) × I_{[0, min{μ(A_s)|A_j⊂A_s}]}(μ_j)   ∀j = 1, ..., m − 1,
τ_j ~ exp{−γ² τ_j / 2}   ∀j = 1, ..., m − 1,   (3-1)
γ ~ non-informative prior,
σ² ~ non-informative prior,

where:

I_{[0, min{μ(A_s)|A_j⊂A_s}]}(μ_j) = 1 if μ_j ∈ [0, min{μ(A_s)|A_j ⊂ A_s}], and 0 otherwise,   (3-2)

and K_j is an integration constant that makes K_j × N(0, τ_j) × I_{[0, min{μ(A_s)|A_j⊂A_s}]}(μ_j) a PDF. In addition, K_j × N(0, τ_j) × I_{[0, min{μ(A_s)|A_j⊂A_s}]}(μ_j) denotes the truncated distribution from which the samples of μ_j are drawn. In this model, the row vectors H_i are composed
which the samples of ij are sampled. In this model, the row vectors Hi are composed

of zeros and the differences [f(x(i)) f(x(i-_))] correspond to the correct positions of

the vector pT = (p1,2, p, ..., p)t. In this way, the output of the Choquet integral can be

written as an inner product:



C(ft) =jp(A(k))(fi((k))- fi((k-1))) H (33)
k= 1


as shown in (Equation 1-112) of the literature review. We refer to models of this type as

the Monotonicity Constrained Models ( C'\ ).
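The key constraint in the MCM model is the upper bound min{μ(A_s) | A_j ⊂ A_s} used by the truncation indicator (Equation 3-2). A small Python sketch of that bound over a toy lattice (the dictionary representation and the numeric values are ours, purely for illustration) is:

def monotonic_upper_bound(A_j, mu):
    """Upper bound used when sampling mu(A_j): the minimum of mu over the
    proper supersets of A_j whose values are already fixed, i.e.
    min{ mu(A_s) : A_j strictly contained in A_s } (Equation 3-2)."""
    supersets = [v for s, v in mu.items() if A_j < s]   # frozenset strict-subset test
    return min(supersets)

# Toy lattice over X = {a, b, c}: values for the supersets of {a} have already
# been fixed by the top-down sweep, so mu({a}) is sampled on [0, 0.4] here.
mu = {frozenset('abc'): 1.0,
      frozenset('ab'): 0.4, frozenset('ac'): 0.7, frozenset('bc'): 0.9}
print(monotonic_upper_bound(frozenset('a'), mu))   # 0.4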

3.2 Gibbs Sampler for the Monotonicity Constrained Model

Due to the fact that the monotonicity constraints for a fuzzy measure are imposed sequentially in the MCM model, it is necessary to consider sequential probabilistic numerical methods to solve it. A method that naturally fits this scheme is the Gibbs sampler [2].

The Gibbs sampler that we devised to solve the MCM model is called LSE Sparsity. This name reflects the fact that Gaussian distributions over the data represent an LSE









in a probabilistic framework. An immediate advantage of solving the MCM models by the Gibbs sampler is that the measures in MCM are full measures (i.e., they are not constrained to be Sugeno measures, or any other particular class of fuzzy measures). Another advantage of using a Gibbs sampler is that we can use proportional functions to do the sampling (i.e., we do not need to know the value of each constant of integration K_j).

Now, we are ready to describe the Gibbs sampler for the MCM model. Given an initial point (γ^(0), σ²^(0), μ_1^(0), ..., μ_m^(0), τ_1^(0), ..., τ_m^(0)), and assuming that the samples y_1, ..., y_n are independent and identically distributed, the Gibbs sampler looks like:


Gibbs Sampler For LSE Sparsity

1. Given [μ^(t)]^T = (μ_1^(t), ..., μ_m^(t))^T, generate:

   μ_1^(t+1) ~ p(μ_1 | γ^(t), σ²^(t), τ_1^(t), ..., τ_m^(t), μ_2^(t), ..., μ_m^(t), y_1, ..., y_n)
            ∝ p(y_1, ..., y_n | σ²^(t), μ_1, μ_2^(t), ..., μ_m^(t)) × p(μ_1 | τ_1^(t)) × I_{[0, min{μ(A_s)|A_1⊂A_s}]}(μ_1),
      with p(μ_1 | τ_1^(t)) = N(0, τ_1^(t)),   (3-4)

   ...

   μ_{m−1}^(t+1) ~ p(μ_{m−1} | γ^(t), σ²^(t), τ_1^(t), ..., τ_m^(t), μ_1^(t+1), ..., μ_{m−2}^(t+1), y_1, ..., y_n)
            ∝ p(y_1, ..., y_n | σ²^(t), μ_1^(t+1), ..., μ_{m−2}^(t+1), μ_{m−1}) × p(μ_{m−1} | τ_{m−1}^(t)) × I_{[0, min{μ(A_s)|A_{m−1}⊂A_s}]}(μ_{m−1}),
      with p(μ_{m−1} | τ_{m−1}^(t)) = N(0, τ_{m−1}^(t)).   (3-5)

2. Given [μ^(t+1)]^T = (μ_1^(t+1), ..., μ_m^(t+1))^T, generate:

   τ_1^(t+1) ~ p(τ_1 | γ^(t)) ∝ exp{−(γ^(t))² τ_1 / 2},
   ...
   τ_{m−1}^(t+1) ~ p(τ_{m−1} | γ^(t)) ∝ exp{−(γ^(t))² τ_{m−1} / 2}.

3. γ^(t+1) ~ non-informative prior.

4. σ²^(t+1) ~ non-informative prior.

The final value for each measure parameter μ_j is the mean of the collection of samples {μ_j^(1), μ_j^(2), ..., μ_j^(K)} generated by the previous algorithm.

For this model, we only need to devise a method for the posterior sampling of the μ_j's, and select appropriate non-informative priors for γ and σ².

Something that we need to stress is the problem of using non-informative priors for the random variables γ and σ². This arises from the fact that non-informative priors are not real distributions [1, 2]. So, a posterior generated by non-informative priors cannot be a real probability.
3.3 Sampling from the Posterior Distribution of Mu
Let (y_1, ..., y_n) be a sample from a multivariate normal variable with distribution N(Hμ, σ²I), where μ^T = (μ_1, μ_2, ..., μ_m) represents the mapping of a measure defined on the lattice 2^X \ ∅ into a vector, and H represents the design matrix of values [f(x_(k)) − f(x_(k−1))] described in (Equation 1-112). Then, we can see from the model in (Section 3.2) for the Gibbs sampler that:

μ_j ~ p(y_1, ..., y_n | Hμ, σ²I) p(μ_j | τ_j) I_{[0, min{μ(A_s)|A_j⊂A_s}]}(μ_j).   (3-6)








The last part of (Equation 3-6), I_{[0, min{μ(A_s)|A_j⊂A_s}]}(μ_j), truncates the distribution p(y_1, ..., y_n | Hμ, σ²I) p(μ_j | τ_j) to the interval [0, min{μ(A_s)|A_j ⊂ A_s}]. Consider for a moment the distribution p(y_1, ..., y_n | Hμ, σ²I) p(μ_j | τ_j). Since it is assumed that y_1, ..., y_n are mutually independent, it can be proven that this expression is a univariate Gaussian with respect to μ_j. Thus, to sample from this distribution, we only need to devise a closed-form expression for the Gaussian and truncate it. Denote the mean and variance of this univariate Gaussian by θ and δ². Then, we can write the complete expression of p(μ_j | θ, δ²) as:

p(μ_j | θ, δ²) = p(y_1, ..., y_n | Hμ, σ²I) p(μ_j | τ_j)
             = [1 / ((2π)^{n/2} |σ²I|^{1/2})] exp{−(1/(2σ²)) Σ_{i=1}^n (y_i − H_i μ)²} × [1/√(2π τ_j)] exp{−μ_j²/(2τ_j)},   (3-7)

where H_i is the ith row in the design matrix

H = (H_1^T, ..., H_n^T)^T,   (3-8)

and μ_j is the jth position in the vector μ. Rearranging the terms in p(μ_j | θ, δ²) leads to:

p(μ_j | θ, δ²) ∝ exp{ −(1/(2σ²τ_j)) [ τ_j Σ_{i=1}^n (y_i − H_{i,−j} μ_{−j} − H_{ij} μ_j)² + σ² μ_j² ] },   (3-9)

where H_{i,−j} = (H_{i1}, ..., H_{i,j−1}, H_{i,j+1}, ..., H_{im}) and μ_{−j} = (μ_1, ..., μ_{j−1}, μ_{j+1}, ..., μ_m). (Equation 3-9) can be rearranged, dropping terms that do not involve μ_j, as:

p(μ_j | θ, δ²) ∝ exp{ −(1/(2σ²τ_j)) [ τ_j Σ_{i=1}^n ( (H_{ij})² μ_j² − 2 (y_i − H_{i,−j} μ_{−j}) H_{ij} μ_j ) + σ² μ_j² ] }.   (3-10)

The exponential term can be rewritten as:

−[ ( τ_j Σ_{i=1}^n (H_{ij})² + σ² ) μ_j² − 2 τ_j Σ_{i=1}^n (y_i − H_{i,−j} μ_{−j}) H_{ij} μ_j ] / (2σ²τ_j).   (3-11)

Thus, we simplify this to be equal to:

−( τ_j Σ_{i=1}^n (H_{ij})² + σ² ) [ μ_j² − 2 μ_j τ_j Σ_{i=1}^n (y_i − H_{i,−j} μ_{−j}) H_{ij} / ( τ_j Σ_{i=1}^n (H_{ij})² + σ² ) ] / (2σ²τ_j).   (3-12)

We have finally that:

θ = τ_j Σ_{i=1}^n (y_i − H_{i,−j} μ_{−j}) H_{ij} / ( τ_j Σ_{i=1}^n (H_{ij})² + σ² ),   (3-13)

δ² = σ² τ_j / ( τ_j Σ_{i=1}^n (H_{ij})² + σ² ).   (3-14)

Therefore, after completing the square and removing unnecessary terms, p(μ_j | θ, δ²) ∝ exp{ −(μ_j − θ)² / (2δ²) }. Then, we have that:

μ_j ~ N( τ_j Σ_{i=1}^n (y_i − H_{i,−j} μ_{−j}) H_{ij} / ( τ_j Σ_{i=1}^n (H_{ij})² + σ² ),  σ² τ_j / ( τ_j Σ_{i=1}^n (H_{ij})² + σ² ) ),   ∀j = 1, ..., m.   (3-15)

Using this distribution, we can then sample from the truncated normal on the interval [0, min{μ(A_s)|A_j ⊂ A_s}]. Finally, the truncated distribution looks like:

μ_j ~ N( τ_j Σ_{i=1}^n (y_i − H_{i,−j} μ_{−j}) H_{ij} / ( τ_j Σ_{i=1}^n (H_{ij})² + σ² ),  σ² τ_j / ( τ_j Σ_{i=1}^n (H_{ij})² + σ² ) ) × I_{[0, min{μ(A_s)|A_j⊂A_s}]}(μ_j),   ∀j = 1, ..., m.   (3-16)
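One simple way to realise the truncation in (Equation 3-16) is inverse-CDF sampling. The Python sketch below assumes SciPy is available and uses illustrative values of θ, δ² and the upper bound; it is our own illustration, not the sampler used in the experiments.

import numpy as np
from scipy.stats import norm

def sample_truncated_normal(theta, delta2, upper, rng):
    """Draw mu_j from N(theta, delta2) restricted to [0, upper] by inverting
    the normal CDF on that interval."""
    delta = np.sqrt(delta2)
    lo, hi = norm.cdf(0.0, theta, delta), norm.cdf(upper, theta, delta)
    u = rng.uniform(lo, hi)
    return norm.ppf(u, theta, delta)

rng = np.random.default_rng(1)
draws = [sample_truncated_normal(theta=0.9, delta2=0.1**2, upper=0.4, rng=rng)
         for _ in range(5)]
print(draws)   # all values fall in [0, 0.4]

Note that when the interval [0, upper] lies far in the tail of N(θ, δ²), the two CDF values become numerically indistinguishable; this is exactly the small-variance difficulty discussed in the next section.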

3.4 Problems When Sampling from a Gaussian with Small Variance.
An immediate problem that this method has is that the variance of the normal sampling distribution tends to be small. This produces inaccuracies in the sampling of the measures. A way to address this problem is based on the following observation. Imagine that we are sampling from the Gaussian distributions in (Figure 3-2). It is clear that we have a higher probability of sampling near the mean than away from it, and if the standard deviation decreases, the probability of sampling values near the mean is higher still. Now, if we are trying to sample from an interval that is far away from the mean relative to the standard deviation, the curve of the Gaussian function over that interval is nearly flat. Thus, we can assume that we are sampling from a uniform distribution on the interval of interest. Although this looks like an inefficient solution, our experiments will show that this simple approach works reasonably well, but improvements can be made. We need to stress that this is not a theoretically correct solution. The correct solution is based on the truncated distribution:

p(μ_j | θ, δ²) = exp{−(μ_j − θ)²/(2δ²)} / ∫_0^{min{μ(A_s)|A_j⊂A_s}} exp{−(t − θ)²/(2δ²)} dt.   (3-17)

This is the reason why we are looking for more efficient ways to circumvent this problem.
3.5 Improving the Sparsity Promoting Model

Something that is clear from (Section 3.1)–(Section 3.2) is that we are trying to promote sparsity using a Gaussian distribution. This is clear if we look at the first equation in the Gibbs sampler,

p(μ_j | θ, δ²) ∝ exp{ −(1/(2σ²)) Σ_{i=1}^n (y_i − H_i μ)² } exp{ −μ_j²/(2τ_j) }.   (3-18)

The first exponential represents the minimization term for the classification, and the second exponential represents the sparsity promoting term. Once this is understood, we realize that, given μ_j ∈ [0, 1], we should be using a better sparsity promoting distribution. Figueiredo and others [40] suggested that the Laplacian is a more suitable prior for sparsity promotion than the Gaussian; in our case, since μ_j ≥ 0, this becomes the exponential distribution, which promotes more sparsity in our model. Thus, the new model looks like:

y_i ~ N(H_i μ, σ²)   ∀i = 1, ..., n,
μ_m = 1,
μ_j ~ K_j × exp{−γ μ_j} × I_{[0, min{μ(A_s)|A_j⊂A_s}]}(μ_j)   ∀j = 1, ..., m − 1,   (3-19)
γ ~ non-informative prior,
σ² ~ non-informative prior.

Thus, the new model eliminates the intermediate τ_j, and directly uses an exponential distribution for the measures to be learned. Now the Gibbs sampler looks like:


Improved Gibbs Sampler For LSE Sparsity

1. Given [μ^(t)]^T = (μ_1^(t), ..., μ_m^(t))^T, generate:

   μ_1^(t+1) ~ p(μ_1 | γ^(t), σ²^(t), μ_2^(t), ..., μ_m^(t), y_1, ..., y_n)
            ∝ p(y_1, ..., y_n | σ²^(t), μ_1, μ_2^(t), ..., μ_m^(t)) × p(μ_1 | γ^(t)) × I_{[0, min{μ(A_s)|A_1⊂A_s}]}(μ_1),
      with p(μ_1 | γ^(t)) ∝ exp{−γ^(t) μ_1},   (3-20)

   ...

   μ_{m−1}^(t+1) ~ p(μ_{m−1} | γ^(t), σ²^(t), μ_1^(t+1), ..., μ_{m−2}^(t+1), y_1, ..., y_n)
            ∝ p(y_1, ..., y_n | σ²^(t), μ_1^(t+1), ..., μ_{m−2}^(t+1), μ_{m−1}) × p(μ_{m−1} | γ^(t)) × I_{[0, min{μ(A_s)|A_{m−1}⊂A_s}]}(μ_{m−1}),
      with p(μ_{m−1} | γ^(t)) ∝ exp{−γ^(t) μ_{m−1}}.

2. γ^(t+1) ~ non-informative prior.

3. σ²^(t+1) ~ non-informative prior.

In this new model, each μ_j will be sampled from the distribution:

μ_j ~ N( [ Σ_{i=1}^n (y_i − H_{i,−j} μ_{−j}) H_{ij} − σ²γ ] / Σ_{i=1}^n (H_{ij})²,  σ² / Σ_{i=1}^n (H_{ij})² ) × I_{[0, min{μ(A_s)|A_j⊂A_s}]}(μ_j)   ∀j = 1, ..., m − 1.   (3-21)

Another possible improvement for this new model could be to assume not a single γ as the rate of sparsity for all measure values, but one γ_j for each measure value to be learned. In addition, we would like to have feedback from the sparsity ratio to these γ_j's, to increase the level of sparsity when necessary. In order to do this, we need to include an extra distribution for the γ_j's, which cannot be a non-informative prior. Zare [142] proposed methods for solving these problems.

3.6 Improving the Accuracy Term Sigma and the Sparsity Term Gamma

Here, we drop the assumption that σ² is sampled from a non-informative prior. Instead, we use the Mean Squared Error (MSE) between the desired outputs y_i and the calculated outputs H_i μ:

σ² = MSE = (1/n)(y − Hμ)^T(y − Hμ).   (3-22)

This estimates the accuracy after each iteration of the Gibbs sampler. This modification improves the classification capabilities of the algorithm because when σ² → 0, Var(μ_j) → 0 and values of μ are sampled from regions for which the error (y − Hμ)^T(y − Hμ) is small.

Now, for the sparsity promotion term γ, we use a variation of the idea proposed by Zare [142]. For each term μ_j, we set γ_j = 1/μ_j. Thus, if μ_j → 0 then γ_j → ∞, and large values of γ promote sparsity more than small values of γ. This means that sparsity promotion for μ_j is accelerated as μ_j tends to zero. The new Bayesian hierarchical model is:

y_i ~ N(H_i μ, σ²)   ∀i = 1, ..., n,
μ_m = 1,
μ_j ~ K_j × exp{−γ_j μ_j} × I_{[0, min{μ(A_s)|A_j⊂A_s}]}(μ_j)   ∀j = 1, ..., m − 1,   (3-23)
γ_j = 1/μ_j   ∀j = 1, ..., m − 1,
σ² = (1/n)(y − Hμ)^T(y − Hμ).

For the new Gibbs sampler, we only need to make two modifications using these last two equations; the rest is the same.
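A compact Python sketch of those two feedback updates (our reading of Model 3-23; in particular, the 1/n factor in the MSE and the small guard against division by zero are our assumptions) is:

import numpy as np

def update_noise_and_rates(y, H, mu, eps=1e-12):
    """Feedback steps of (Model 3-23) as sketched here: sigma^2 is set to the
    mean squared error of the current fit, and each gamma_j to 1/mu_j, so the
    exponential prior tightens as a parameter approaches zero."""
    resid = y - H @ mu
    sigma2 = float(resid @ resid) / len(y)     # (1/n)(y - H mu)^T (y - H mu)
    gamma = 1.0 / np.maximum(mu, eps)          # gamma_j = 1/mu_j, guarded near zero
    return sigma2, gamma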

3.7 Interpreting the Results from the Sparsity Promoting Model

We can interpret the vector μ = (μ_1, μ_2, ..., μ_m)^T as a measure of the interaction between algorithms in the nonlinear aggregation of their confidences. We can use the Shapley index to measure the importance of each element in the fusion of confidences.

As we have seen in our experiments using the MCE method (Chapter 2), the Shapley indices tend to have a large standard deviation over an n-fold cross-validation. This makes it difficult to analyze which algorithms are important for the fusion.

Can we say that the introduction of a sparsity promoting distribution will reduce the standard deviation of the Shapley indices in an n-fold cross-validation? If this is the case, the introduction of these PDFs allows us to improve our analysis of the importance of each algorithm in the fusion.









An analysis of this property requires a more in-depth study of the model and LSE Sparsity. This opens new avenues of research for the future.

3.8 Comparison Against Quadratic Programming

A classical way to learn a complete measure is the optimization method proposed by Grabisch et al. [130]. In this method, a quadratic objective function under constraints is defined to obtain the optimal measures. The numbers of constraints and parameters are

exponential functions of the dimensionality of the input and they are stored in a matrix.

Therefore, solving the quadratic program quickly becomes intractable. Although the Gibbs sampler has a similar limitation (the exponential number of parameters to be learned), the Gibbs sampler does not need to store a sparse matrix of constraints. Instead,

it uses a clever way of imposing the constraints within the algorithm, which decreases the

complexity in calculating the solution.

An extra property is the fact that samples generated by Gibbs samplers are attracted

to regions of high probability. This overcomes one of the problems in deterministic

optimization when dealing with non-convex functions that converge to local minima. This

means that a Gibbs sampler looks at the global properties of a probability while gradient

based techniques look at local properties of a function. Therefore, a Gibbs sampler is a

global stochastic optimizer.

It has been seen empirically, using the same hardware for both methods, that

the Gibbs sampler is faster than the quadratic programming, but a complete rate-of-convergence and complexity analysis still needs to be performed. For now, we will leave this to

future research.

LSE Sparsity shares one drawback with quadratic programming, the use of desired

outputs. This makes it necessary to develop new methods that do not depend on desired

outputs.









3.9 Results under Sparsity Promoting Model with Respect to LSE

Here, we explore the results of using the sparsity model under the LSE assumption,

and the use of a naive sampler for the desired distribution. In this section, we present

results on two artificial problems to illustrate the sparsity promotion capability of the

system. In each of them we have a two class synthetic problem with 1000 samples in each

class, and four features in each sample. In addition, each Gibbs sampler chain has a length

of 10, 000 iterations with a burning period of 1000 iterations. Here, "burning period"

refers to the number of samples that are necessary to sample before the Markov chain

stabilize in the target distribution.

3.9.1 Case I

In this case the only separable features are in the odd positions, which contain samples for Class 1 from the distribution N(0.2, 0.1) and for Class 2 from the distribution N(0.8, 0.1). The second and fourth features contain uniform noise. Feature values greater than one or less than zero are clipped to one and zero respectively. If we plot the samples using their first three features (Figure 3-3), we can see the separation between the classes.

The confusion matrix (Table 3-1) shows that the algorithm is able to separate the

classes. The fuzzy measure values are in (Table 3-2) and the Shapley values are in (Table

3-3). Note that the Shapley indices of the informative features (features 1 and 3) are

approximately 20 times larger than those of the non-informative features (features 2 and

4).

3.9.2 Case II

In this case the only separable feature is the first one, which contains samples for Class 1 from N(0.2, 0.1) and for Class 2 from N(0.8, 0.1). The rest of the features contain uniform noise. The same clipping strategy is used. The confusion matrix is shown in (Table 3-4).

The measure values are in the (Table 3-5) and the Shapley values are in (Table 3-6).









Table 3-1: Confusion matrix for artificial data set case I in Gibbs sampler for LSE sparsity.

MCM        class 1    class 2
class 1    1000       0
class 2    0          1000


Table 3-2: Measures for artificial data set case I with mean and standard deviation of the Markov chains in Gibbs sampler for LSE sparsity.

Measure               Mean        Standard deviation
μ({x1})               0.450600    0.033273
μ({x2})               0.010050    0.007843
μ({x3})               0.468070    0.038589
μ({x4})               0.007468    0.006883
μ({x3, x4})           0.526480    0.0…
μ({x2, x4})           0.035829    0.016663
μ({x2, x3})           0.506710    0.055892
μ({x1, x4})           0.499880    0.042394
μ({x1, x3})           0.965110    0.020748
μ({x1, x2})           0.497820    0.046170
μ({x1, x2, x3})       0.991380    0.011033
μ({x1, x2, x4})       0.533740    0.050566
μ({x1, x3, x4})       0.985270    0.014186
μ({x2, x3, x4})       0.574300    0.055924
μ(X)                  1.000000    NA


Table 3-3: Shapley values for the features in artificial data set case I in Gibbs sampler for LSE sparsity.

Feature    Shapley value
1          0.462290
2          0.024708
3          0.487550
4          0.025449


Table 3-4: Confusion matrix for artificial data set case II in Gibbs sampler for LSE sparsity.

MCM        class 1    class 2
class 1    998        2
class 2    0          1000













Table 3-5: Measures for artificial data set case II with the mean and standard deviation of
the Markov chains in Gibbs sampler for LSE sparsity.


Measures


Mean


,P({lX})
p({x2})



p({x23, 4})

p({x2, x3})
p({x1, X3})
p({Xl2, X3})
P({Xlx, x1})

P({Xl,3, x4})
p({X2,x3, x4})
,P(X)


0.484600
0.019192
0.022071
0.025016
0.049664
0.051202
0.034180
0.,I l II
0.986030
0.605990
0.991370
0.664200
0.996480
0.113400
1.000000


Standard
deviation
0.042570
0.010254
0.010317
0.011738
0.016633
0.021779
0.016434
0.069511
0.035203
0.064725
0.013091
0.086878
0.020278
0.014942
NA


Table 3-6: Shapley values for the features in artificial data set case II in Gibbs sampler for
LSE sparsity.

Shapley
value
1 0.729930
2 (0 1I I ,' -.,
3 0.204660
4 0.035554



















































Figure 3-1: Example of a lattice where the arrows represent the subset relation on it.












































Figure 3-2: This plot shows the idea that sampling in an interval far away from the mean of a Gaussian with small variance is similar to sampling from a uniform in the interval (curves shown for mean 0 with SD 1 and SD 0.7071).
































Figure 3-3: Plot of samples for class 1 'o' and class 2 '+' for the first three features in case I.
























CHAPTER 4
AN IMPROVED LSE SPARSITY ALGORITHM

In this chapter, we develop an alternative model for LSE Sparsity which is closer to Figueiredo's original model than the one in (Chapter 3). This new model has been proposed as a way to fix the problem of using a Gaussian distribution to simulate an embedded Laplacian distribution for each μ_j in (Model 3-19).

4.1 Introduction

The improved LSE sparsity algorithm is based on the following important observation concerning the Bayesian model defined in (Chapter 3) to promote sparsity when using the Gibbs sampler (Model 3-4). In this Gibbs sampler, μ_j is being sampled from:

μ_j ~ p(y_1, ..., y_n | Hμ, σ²) p(μ_j | τ_j),   (4-1)

which is the likelihood of μ given the data, L(μ | y_1, ..., y_n), multiplied by a prior μ_j ~ N(0, τ_j). In (Model 1-33), Figueiredo used EM to maximize the a posteriori probability for μ and σ² [40]:

p(μ, σ² | y_1, ..., y_n, τ) ∝ p(y_1, ..., y_n | μ, σ²) p(μ | τ) p(σ²),   (4-2)

which embeds a Laplacian distribution for μ in the final solution. This is different from the Gibbs sampler (Model 3-20) used to solve (Model 3-19). In that case the Gibbs sampler tries to maximize the likelihood

L(μ | y_1, ..., y_n) = ∏_{i=1}^n N(y_i | H_i μ, σ²).   (4-3)

Therefore, if we want to have an embedded Laplacian distribution as in Figueiredo's solution, it is necessary to develop a version of the Maximum A Posteriori (MAP) EM for the Gibbs sampler and the Choquet integral. For this, we use a modified version of the Gibbs sampler over missing data as explained by Robert and Casella [2].









4.2 Maximum A Posteriori Gibbs Sampler Model

Using the idea of Robert and Casella [2], consider the following two Bayesian hierarchies. First, one with an incomplete-data likelihood:

y_i ~ N(H_i μ, σ²)   ∀i = 1, ..., n,
μ_m = 1,
μ_j ~ exp{−γ_j μ_j}   ∀j = 1, ..., m − 1,   (4-4)
γ_j = 1/μ_j   ∀j = 1, ..., m − 1,
σ² = (1/n)(y − Hμ)^T(y − Hμ).

Now, if we augment the previous model with a hidden variable τ_j, we have the model with a complete-data likelihood:

y_i ~ N(H_i μ, σ²)   ∀i = 1, ..., n,
μ_m = 1,
μ_j ~ N(0, τ_j)   ∀j = 1, ..., m − 1,
τ_j ~ exp{−γ_j² τ_j / 2}   ∀j = 1, ..., m − 1,   (4-5)
γ_j = 1/μ_j   ∀j = 1, ..., m − 1,
σ² = (1/n)(y − Hμ)^T(y − Hμ).
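The link between the two hierarchies is the scale-mixture identity of (Equation 1-34): mixing N(0, τ) over τ ~ Exponential(γ²/2) yields the Laplacian with parameter γ. A quick Monte Carlo check of this identity in Python (purely illustrative; the sample size and the value of γ are arbitrary choices of ours) is:

import numpy as np

# tau ~ Exponential(rate = gamma^2 / 2) and mu | tau ~ N(0, tau) give the
# Laplacian marginal p(mu) = (gamma/2) exp(-gamma |mu|)  (Equation 1-34).
rng = np.random.default_rng(0)
gamma = 3.0
tau = rng.exponential(scale=2.0 / gamma**2, size=200_000)
mu = rng.normal(0.0, np.sqrt(tau))

# Compare empirical and Laplacian tail probabilities P(|mu| > c) = exp(-gamma c).
for c in (0.2, 0.5, 1.0):
    print(c, np.mean(np.abs(mu) > c), np.exp(-gamma * c))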


Now, from these two models, we have the following incomplete and complete a posteriori probabilities for each of the individual μ_j:

p(μ_j, σ² | y, γ_j) ∝ p(y | μ_j, σ²) p(μ_j | γ_j) p(σ²),   (4-6)

p(μ_j, σ² | y, τ_j) ∝ p(y | μ_j, σ²) p(μ_j | τ_j) p(σ²).   (4-7)









Using (Equation 4-7), (Equation 4-6), and assuming that the τ_j's are independent and identically distributed (iid), we can define the density for the hidden data τ_j:

k(τ_j | y, μ_j, σ², γ_j) = p(μ_j, σ² | y, τ_j) p(τ_j | γ_j) / p(μ_j, σ² | y, γ_j).   (4-8)

Note that, by the well-known formula (Equation 1-34) and (Model 4-5), p(μ_j | γ_j) is given by:

p(μ_j | γ_j) = (γ_j / 2) exp{−γ_j |μ_j|}.   (4-9)

The distribution (4-8) is simply:

k(τ_j | y, μ_j, σ², γ_j) ∝ p(y | μ_j, σ²) p(μ_j | τ_j) p(τ_j | γ_j) p(σ²) / [ p(y | μ_j, σ²) p(μ_j | γ_j) p(σ²) ]
                        ∝ N(0, τ_j) exp{−γ_j² τ_j / 2} / exp{−γ_j |μ_j|}
                        ∝ (1/√τ_j) exp{ −μ_j²/(2τ_j) − γ_j² τ_j/2 + γ_j |μ_j| }.   (4-10)

In this last expression, we know that μ_j ≥ 0. Therefore, τ_j has distribution:

τ_j ~ C_j (1/√τ_j) exp{ −μ_j²/(2τ_j) − γ_j² τ_j/2 + γ_j μ_j },   (4-11)

where C_j is a constant of integration. We can go one step further by moving exp{γ_j μ_j} into the proportionality factor, i.e.:

τ_j ~ C_j' (1/√τ_j) exp{ −μ_j²/(2τ_j) − γ_j² τ_j/2 }.   (4-12)

This can be converted into a gamma distribution if we take z_j = 1/τ_j:

z_j ~ C_j'' z_j^{k−1} exp{−z_j/θ},   (4-13)

with k and θ determined by μ_j and γ_j. In this way, we see that τ_j is governed by an inverse gamma distribution:

τ_j ~ C_j τ_j^{−α−1} exp{−β/τ_j},   (4-14)

with α and β determined accordingly.









Now, if we consider p(σ²) to be constant, we have that the complete posterior probability of μ_j is:

p(μ_j | y, τ_j, σ²) = p(μ_j, σ² | y, τ_j) / p(σ²) ∝ p(y | μ_j, σ²) p(μ_j | τ_j) = [ ∏_{i=1}^n p(y_i | H_i μ, σ²) ] p(μ_j | τ_j).   (4-15)

Finally, it is known that μ_j needs to be sampled in the interval [0, min{μ(A_s)|A_j ⊂ A_s}]. Therefore, we have that:

μ_j ~ K_j [ ∏_{i=1}^n p(y_i | H_i μ, σ²) ] × p(μ_j | τ_j) × I_{[0, min{μ(A_s)|A_j⊂A_s}]}(μ_j).   (4-16)

This allows us to define the following Gibbs sampler for the MAP-EM:


Gibbs Sampler for MAP-EM Sparsity

1. Given [μ^(t)]^T = (μ_1^(t), ..., μ_m^(t))^T, generate:

   z_1^(t+1) ~ C_1'' z_1^{k−1} exp{−z_1/θ},
   ...
   z_{m−1}^(t+1) ~ C_{m−1}'' z_{m−1}^{k−1} exp{−z_{m−1}/θ},

   and set τ_j^(t+1) = 1/z_j^(t+1) for j = 1, ..., m − 1.

2. Given [τ^(t+1)]^T = (τ_1^(t+1), ..., τ_m^(t+1))^T, generate:

   μ_1^(t+1) ~ K_1 p(y_1, ..., y_n | μ_1, μ_2^(t), ..., μ_{m−1}^(t), σ²^(t)) p(μ_1 | τ_1^(t+1)) × I_{[0, min{μ(A_s)|A_1⊂A_s}]}(μ_1)
            ∝ [ ∏_{i=1}^n p(y_i | H_i μ, σ²^(t)) ] exp{−μ_1²/(2τ_1^(t+1))} × I_{[0, min{μ(A_s)|A_1⊂A_s}]}(μ_1),   (4-17)

   ...

   μ_{m−1}^(t+1) ~ K_{m−1} p(y_1, ..., y_n | μ_1^(t+1), ..., μ_{m−2}^(t+1), μ_{m−1}, σ²^(t)) p(μ_{m−1} | τ_{m−1}^(t+1)) × I_{[0, min{μ(A_s)|A_{m−1}⊂A_s}]}(μ_{m−1})
            ∝ [ ∏_{i=1}^n p(y_i | H_i μ, σ²^(t)) ] exp{−μ_{m−1}²/(2τ_{m−1}^(t+1))} × I_{[0, min{μ(A_s)|A_{m−1}⊂A_s}]}(μ_{m−1}).

3. Update σ²^(t+1) = (1/n)(y − Hμ^(t+1))^T(y − Hμ^(t+1)).


Notice the similarities of this algorithm (Model 4-17) with the one developed by Figueiredo [40]. This is due to the exponential term in the probability distribution function (Equation 4-12), exp{−μ_j²/(2τ_j)}, which resembles the term exp{−(1/2) μ^T diag(τ_1^{−1}, ..., τ_m^{−1}) μ} in Figueiredo's solution for MAP-EM. Although this looks like a good solution, we still have a problem. This comes from the fact that:

E[τ_j | μ_j^(t), γ_j = 1/μ_j^(t)] ∝ (μ_j^(t))².   (4-18)

Therefore:

τ_j^(t) tracks (μ_j^(t))² as t → ∞,   (4-19)

which is not the desired behavior for promoting sparsity in the parameter values of μ. We propose the following modification to overcome this problem.

4.3 Solving the MAP-EM Problem

First, recall that in Figueiredo [40] the logarithm of the posterior is:

log p(μ, σ² | y, τ) ∝ −(n/2) log σ² − (1/(2σ²)) ||y − Hμ||² − (1/2) μ^T Υ(τ) μ,   (4-20)

where μ^T Υ(τ) μ represents the regularization term Σ_j μ_j²/τ_j of a Ridge regression. This means that (Equation 4-20) is simply a Ridge regression, which, as we know, does not promote sparsity as much as (Equation 1-32). However, if the expectation of the term 1/τ_j is taken, then:

E[τ_j^{−1} | y, μ^(t), γ] = γ / |μ_j^(t)|.   (4-21)

This allows one to write each of the terms of the regularization as:

γ μ_j if μ_j ≥ 0, and −γ μ_j if μ_j < 0, i.e., γ |μ_j|.   (4-22)

This last expression corresponds to the Laplacian distribution. Therefore, it is clear that in Figueiredo's MAP-EM, the expectation converts the Ridge regression into Tibshirani's sparsity promoting penalty (Equation 1-32), which is then handled by the maximization step.

These are the reasons why we decided to use a variation of the classical EM Gibbs (Section 1.2.4), the slice sampler (Section 1.2.6), and (Equation 4-15) to overcome the problem pointed out in (Section 4.2).

Consider then the complete likelihood of the data (Model 4-5):

L*(μ, σ² | y, τ) = p(y, τ | μ, σ²) = p(y | μ, σ², τ) p(τ | μ, σ²).   (4-23)

If we assume that, given μ, it is not necessary to know τ, we can write (Equation 4-23) as:

L*(μ, σ² | y, τ) = p(y | μ, σ²) p(τ | μ, σ²) = p(y | μ, σ²) p(μ, τ | σ²) / p(μ | σ²).   (4-24)

Now, from (Model 4-5) and (Equation 1-34), we know that μ and τ depend on γ. Therefore, we can make the following assumptions:

p(τ | μ, σ²) = p(τ | μ, σ², γ) = p(τ | γ),   (4-25)

p(μ | σ²) = p(μ | σ², γ) = p(μ | γ).   (4-26)

In this way, we have that:

L*(μ, σ² | y, τ) = p(y | μ, σ²) p(μ | τ, γ) p(τ | γ) / p(μ | γ).   (4-27)

In a similar way, we have the incomplete-data likelihood L(μ, σ² | y) = p(y | μ, σ²). We then have the following distribution for τ:

k(τ | μ, σ², γ, y) = L*(μ, σ² | y, τ) / L(μ, σ² | y) = p(μ | τ) p(τ | γ) / p(μ | γ).   (4-28)

Due to the fact that we are using slice samplers [2] to sample this distribution, it is possible to remove the constant of integration p(μ | γ) (remember that μ and γ are given). Then, we have:

k(τ | μ, σ², γ, y) ∝ p(μ | τ) p(τ | γ).   (4-29)

It has been shown (Equation 1-34) that integrating τ out of (Equation 4-29) produces the marginal probability of μ, which is a Laplacian distribution with parameter γ.

Given a fixed μ*, it is possible to use Markov sampling to estimate the marginal value p(μ* | γ) [143, 144] by observing that:

p(μ* | γ) = ∫ p(μ*, τ | γ) dτ = ∫ p(μ* | τ, γ) p(τ | γ) dτ = E_τ( p(μ* | τ, γ) ).   (4-30)

Therefore, we have the following estimate of the marginal of μ* given γ:

p̂(μ* | γ) = (1/M) Σ_{i=1}^M p(μ* | τ^(i), γ) ≈ E_τ( p(μ* | τ, γ) ),   (4-31)

which converges to [145]:

(1/M) Σ_{i=1}^M p(μ* | τ^(i), γ) → p(μ* | γ) = (γ/2) exp{−γ |μ*|}   as M → ∞.   (4-32)

Intuitively, this tells us that if we sample τ and then μ sequentially to infinity, we have that:

p(μ^(n)) → (γ/2) exp{−γ |μ|}   as n → ∞.   (4-33)









Using all these concepts, we can modify the Gibbs sampler in (Model 4-17) as follows:


New Gibbs Sampler for MAP-EM Sparsity

1. Given [μ^(t)]^T = (μ_1^(t), ..., μ_m^(t))^T and [γ^(t)]^T = (γ_1^(t), ..., γ_m^(t))^T, generate:

   τ_1^(t+1) ~ C_1 (1/√τ_1) exp{ −(μ_1^(t))²/(2τ_1) − (γ_1^(t))² τ_1/2 },
   ...
   τ_{m−1}^(t+1) ~ C_{m−1} (1/√τ_{m−1}) exp{ −(μ_{m−1}^(t))²/(2τ_{m−1}) − (γ_{m−1}^(t))² τ_{m−1}/2 }.

2. Given [τ^(t+1)]^T = (τ_1^(t+1), ..., τ_m^(t+1))^T and [μ^(t)]^T = (μ_1^(t), ..., μ_m^(t))^T, generate:

   μ_1^(t+1) ~ K_1 p(y_1, ..., y_n | μ_1, μ_2^(t), ..., μ_{m−1}^(t), σ²^(t)) p(μ_1 | τ_1^(t+1)) × I_{[0, min{μ(A_s)|A_1⊂A_s}]}(μ_1)
            ∝ [ ∏_{i=1}^n p(y_i | H_i μ, σ²^(t)) ] exp{ −μ_1²/(2τ_1^(t+1)) } × I_{[0, min{μ(A_s)|A_1⊂A_s}]}(μ_1),   (4-34)

   ...

   μ_{m−1}^(t+1) ~ K_{m−1} p(y_1, ..., y_n | μ_1^(t+1), ..., μ_{m−2}^(t+1), μ_{m−1}, σ²^(t)) p(μ_{m−1} | τ_{m−1}^(t+1)) × I_{[0, min{μ(A_s)|A_{m−1}⊂A_s}]}(μ_{m−1})
            ∝ [ ∏_{i=1}^n p(y_i | H_i μ, σ²^(t)) ] exp{ −μ_{m−1}²/(2τ_{m−1}^(t+1)) } × I_{[0, min{μ(A_s)|A_{m−1}⊂A_s}]}(μ_{m−1}).

3. Update γ_j^(t+1) = 1/μ_j^(t+1) for all j = 1, ..., m − 1.

4. Update σ²^(t+1) = (1/n)(y − Hμ^(t+1))^T(y − Hμ^(t+1)).








Using the truncated Laplacian distribution on an interval [0, a], a > 0, forces us to use the mode instead of the mean to obtain the final values for the μ_j's. The reason is that we are imposing a truncated exponential distribution on the μ_j's, which makes it difficult to drive the mean to zero fast enough because of the lack of symmetry of the exponential distribution. The mode is a better estimator for sparsity, but it is not perfect. More research needs to be done to obtain a better estimator for the final values of the μ_j's.

In addition to the L2 norm (Euclidean distance) for the learning functions, it is possible to use different types of learning functions, for example the L1 norm (absolute value), the logistic function, or a Gaussian regression function. In (Chapter 5), we will explore the use of the logistic regression for implementing the MCE criterion.
4.4 Slice Sampler for the MAP-EM Sparsity
In order to improve on the sampling method of LSE Sparsity from (Chapter 3), we have decided to use the slice sampler (Section 1.2.6). For MAP-EM Sparsity, it is necessary to develop two slice samplers, one for the τ_j's and the other for the μ_j's. First, given [τ^(t)]^T = (τ_1^(t), ..., τ_m^(t))^T and [μ^(t)]^T = (μ_1^(t), ..., μ_m^(t))^T, we sample:

a_j ~ U( 0, exp{ −(μ_j^(t))² / (2τ_j^(t)) } ),   (4-35)

b_j ~ U( 0, exp{ −(γ_j^(t))² τ_j^(t) / 2 } ),   (4-36)

c_j ~ U( 0, 1/√(τ_j^(t)) ),   (4-37)

for all j = 1, ..., m − 1. We then need to generate the following sets:

A_j = { τ_j | exp{ −(μ_j^(t))² / (2τ_j) } > a_j },   (4-38)

B_j = { τ_j | exp{ −(γ_j^(t))² τ_j / 2 } > b_j },   (4-39)

C_j = { τ_j | 1/√τ_j > c_j }.   (4-40)

If we solve all the inequalities with respect to τ_j, we obtain:

−(μ_j^(t))² / (2 log(a_j)) < τ_j < +∞,   (4-41)

0 < τ_j < −(2/(γ_j^(t))²) log(b_j),   (4-42)

0 < τ_j < 1/c_j².   (4-43)

The first inequality (Equation 4-41) is due to the fact that:

exp{ −(μ_j^(t))² / (2τ_j) } ∈ (0, 1).   (4-44)

We then use these values to generate the following intervals for each j:

A_j = ( −(μ_j^(t))² / (2 log a_j), +∞ ),   (4-45)

B_j = ( 0, −(2/(γ_j^(t))²) log b_j ),   (4-46)

C_j = ( 0, 1/c_j² ].   (4-47)

Then we sample each τ_j^(t+1) uniformly from the interval:

τ_j^(t+1) ~ U( max{ 0, −(μ_j^(t))²/(2 log a_j) },  min{ −(2/(γ_j^(t))²) log b_j,  1/c_j² } ).   (4-48)

Now that we have the updated τ^(t+1) and γ^(t+1), we can sample:

r_i ~ U( 0, p(y_i | H_i μ, σ²) )   (4-49)

for all i = 1, ..., n. In addition, we sample:

r_μ ~ U( 0, p(μ_j | τ_j^(t+1)) )   (4-50)

and

r_u ~ U( 0, I_{[0, min{μ(A_s)|A_j⊂A_s}]}(μ_j) ).   (4-51)

Therefore, by the same steps as for the τ's, we have the following intervals from √(2πσ²) p(y_i | H_i μ, σ²) > r_i:

[ ( y_i − H_{i,−j} μ_{−j} − √(−2σ² log(r_i)) ) / H_{ij},  ( y_i − H_{i,−j} μ_{−j} + √(−2σ² log(r_i)) ) / H_{ij} ],   (4-52)

where H_{i,−j} is the row vector H_i without the jth position H_{ij}. In addition to these intervals, we have, for r_μ and r_u:

[ −√( −2τ_j log r_μ ),  √( −2τ_j log r_μ ) ],   (4-53)

[ 0, min{μ(A_s) | A_j ⊂ A_s} ].   (4-54)

Then, we have the following set:

A_μ = [ max{ max_{i=1,...,n} { ( y_i − H_{i,−j} μ_{−j} − √(−2σ² log(r_i)) ) / H_{ij} },  −√(−2τ_j log r_μ),  0 },
        min{ min_{i=1,...,n} { ( y_i − H_{i,−j} μ_{−j} + √(−2σ² log(r_i)) ) / H_{ij} },  √(−2τ_j log r_μ),  min{μ(A_s) | A_j ⊂ A_s} } ].   (4-55)

Finally, we sample μ_j^(t+1) from the uniform U(A_μ).
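The τ_j update of (Equations 4-35)–(4-48) can be sketched in a few lines of Python (our own illustration; the variable names and toy values are assumptions, and γ_j is set to 1/μ_j as in the model):

import numpy as np

def slice_sample_tau(mu_j, gamma_j, tau_prev, rng):
    """One slice-sampling update for tau_j following (4-45) to (4-48):
    auxiliary uniforms under the three factors of the target
    tau^(-1/2) exp(-mu^2/(2 tau)) exp(-gamma^2 tau / 2), then a uniform draw
    from the intersection of the resulting intervals."""
    a = rng.uniform(0.0, np.exp(-mu_j**2 / (2.0 * tau_prev)))
    b = rng.uniform(0.0, np.exp(-gamma_j**2 * tau_prev / 2.0))
    c = rng.uniform(0.0, 1.0 / np.sqrt(tau_prev))
    lo = 0.0 if mu_j == 0.0 else -mu_j**2 / (2.0 * np.log(a))
    hi = min(-2.0 * np.log(b) / gamma_j**2, 1.0 / c**2)
    return rng.uniform(max(lo, 0.0), hi)

rng = np.random.default_rng(0)
tau = 0.5
for _ in range(3):
    tau = slice_sample_tau(mu_j=0.3, gamma_j=1.0 / 0.3, tau_prev=tau, rng=rng)
    print(tau)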
4.5 Observations about MAP-EM Sparsity
Three clear advantages of the new MAP-EM Sparsity model over the original
Figueiredos's solution for the particular framework of the Choquet integral are:

1. It naturally generates the Laplacian distribution for each μ_j by sampling τ_j, instead of using a complicated formulation for the expectation of 1/τ_j.

2. It uses a probabilistic numerical solution for the Bayesian hierarchy, which can be considered a stochastic global optimizer. Figueiredo's solution obtains a function that is maximized by using derivatives, which may or may not find a global optimum.











3. The new MAP-EM Sparsity can handle the monotonicity property of the fuzzy measures. In comparison, Figueiredo's solution would need to be heavily modified to handle these relations, increasing its complexity.

In addition, the new MAP-EM Sparsity solves some of the problems that can be found in LSE Sparsity:

1. The lack of a correct distribution for the τ_j's.

2. The problem of trying to simulate an embedded sparsity-promoting distribution by using a Gaussian distribution.

3. The use of a naive sampler for the truncated distribution of the µ_j's.

MAP-EM Gibbs solves these problems by maximizing a posterior distribution and using slice samplers for the truncated distributions.

Although this is an improvement over LSE Sparsity, MAP-EM Gibbs still depends on desired outputs. For this reason, in Chapter 5, we solve this problem using logistic regression and a convenient Bayesian hierarchical model.

4.6 Results Under MAP-EM Sparsity

In this section, we show the results of the MAP-EM Sparsity algorithm for artificial

data sets and landmine data sets.

4.6.1 Experiments Using Artificial Data Sets

In this section, we present results on artificial problems to illustrate the sparsity

promotion capability of the MAP-EM algorithm. In each problem, we have a two class

synthetic problem with 1,000 samples per class. For case I and case II, the Gibbs sampler

was run for 1,100 iterations with a burn-in period of 100 iterations. In case III, the

MAP-EM Sparsity was run for 400 iterations with a burn-in period of 100 iterations, because more iterations produce unstable solutions.

4.6.1.1 Case I

In this case, we consider a problem with samples of dimensionality four where the

only separable features are in the odd positions. These features contain values from the

distribution N(0.8, 0.1) for class 1, and from the distribution N(0.2, 0.1) for class 2. The









second and fourth features contain uniform noise. Feature values greater than one or less than zero were clipped. If we plot the samples in their first three features (Figure 4-1), we can see the separation between the classes.
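The case I data can be generated along the following lines. This is a sketch only; 0.1 is interpreted here as the standard deviation of the normal distributions, and the seed and helper name are arbitrary.

import numpy as np

rng = np.random.default_rng(42)

def make_class(mean, n=1000):
    # Informative features in the odd positions (indices 0 and 2), uniform
    # noise in the even positions (indices 1 and 3), clipped to [0, 1].
    x = np.empty((n, 4))
    x[:, [0, 2]] = rng.normal(mean, 0.1, size=(n, 2))
    x[:, [1, 3]] = rng.uniform(0.0, 1.0, size=(n, 2))
    return np.clip(x, 0.0, 1.0)

class1, class2 = make_class(0.8), make_class(0.2)
print(class1.mean(axis=0), class2.mean(axis=0))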

The confusion matrix is in (Table 4-1), the fuzzy measure values are in (Table

4-2) and the Shapley values are in (Table 4-3). Note that the Shapley indices of the

informative features (features 1 and 3) are approximately 20 times larger than those

of the non-informative features (features 2 and 4). The sample outputs of the Choquet

integral using the learned measure can be seen in (Figure 4-2). If we compare them with

the outputs (Figure 4-5) of the MCE under Sugeno measures (Chapter 2), it is noticeable

that MAP-EM is better at separating the classes. However, if we look at the ROC curves,

(Figure 4-6) and (Figure 4-7), the two algorithms look similar.
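The Shapley indices reported throughout this chapter summarize the worth of each source under a learned measure. A small illustrative implementation of the standard Shapley index for a fuzzy measure stored as a dictionary over subsets is sketched below; the dictionary representation and the function name are ours.

from itertools import combinations
from math import factorial

def shapley_indices(mu, m):
    """Shapley importance index of each of the m sources for a fuzzy measure.

    `mu` maps frozensets of source indices (0..m-1) to measure values, with
    mu[frozenset()] = 0 and mu[full set] = 1.
    """
    phi = []
    for i in range(m):
        others = [k for k in range(m) if k != i]
        total = 0.0
        for r in range(m):
            w = factorial(m - r - 1) * factorial(r) / factorial(m)
            for A in combinations(others, r):
                A = frozenset(A)
                total += w * (mu[A | {i}] - mu[A])
        phi.append(total)
    return phi

# Toy 2-source example: a measure that puts all the worth on source 0.
mu = {frozenset(): 0.0, frozenset({0}): 1.0,
      frozenset({1}): 0.0, frozenset({0, 1}): 1.0}
print(shapley_indices(mu, 2))   # -> [1.0, 0.0]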

The traces of the measures can be seen in (Figure 4-3), which are the values that the

parameters take at each iteration when sampling. Finally, the sample distributions of each

parameter are in (Figure 4-4). The non-important parameters have sample distributions

that look like exponential distributions. In addition, the important parameters have

sample distributions different from the exponential.

4.6.1.2 Case II

For this case, an eight-dimensional problem is proposed. Here, the only separable

features are the second and fifth, and they can be seen in (Figure 4-8). The rest of the

features contain uniform noise.

The confusion matrix is shown in (Table 4-4). Some of the measure values are in

(Table 4-5). The Shapley values are in (Table 4-6). These values show that the important

features are the second and the fifth.

The outputs in (Figure 4-9) and the confusion matrix (Table 4-4) show that the

MAP-EM Sparsity has difficulty separating the samples in this problem.









Another attempt with different desired outputs, which are calculated from the outputs of a method based on logistic regression (Chapter 5), produces a better separation

of the classes. This can be seen in (Table 4-7), (Table 4-8) and (Figure 4-10).

This means that the MAP-EM Sparsity is highly sensitive to desired outputs because

its learning function is LSE under a probabilistic interpretation (Section 2.5). However,

when we compare with the output of the MCE under Sugeno measures in (Figure 4-11)

and the ROC curves (Figure 4-12), it is clear that the MAP-EM separates the classes

better. In addition, interpreting the densities in the MCE under Sugeno measures (Table

4-9) is difficult, when comparing with the Shapley index in (Table 4-8).

Finally, examples of traces (Figure 4-15) and distributions (Figure 4-16) show that

MAP-EM Sparsity, at least for this case, is able to produce distributions where the

unimportant parameters have exponential distributions, and the important parameters

have a non-exponential distribution.

4.6.1.3 Case III

This case is a slight variation of the Corral data set (Section 1.6.4). In this variation, two classes with six-feature samples x = (x_1, x_2, x_3, x_4, x_5, x_6)^T are defined as follows:

1. The fifth feature is sampled from a Bernoulli distribution with parameter .

2. The first four features obey the following logical equations:

(x_1 ∨ x_2) ∧ (x_3 ∨ x_4) = 1 if x ∈ class 1,   (4-56)
(x_1 ∨ x_2) ∧ (x_3 ∨ x_4) = 0 if x ∈ class 2.   (4-57)

3. The sixth feature is sampled from a Bernoulli distribution with parameter p = if x ∈ class 1, and p = if x ∈ class 2 (a small generation sketch follows this list).
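A sketch of how such data can be generated is shown below. Because the Bernoulli parameters of the fifth and sixth features are not legible in the source, the values used here (0.5, and 0.75 / 0.25 per class) are placeholders only.

import numpy as np

rng = np.random.default_rng(7)

def sample_case3(label, n=1000):
    # Rejection-sample binary vectors until the logical condition of
    # Eqs. 4-56 / 4-57 matches the requested class label.
    rows = []
    while len(rows) < n:
        x = rng.integers(0, 2, size=4)
        if ((x[0] or x[1]) and (x[2] or x[3])) == (label == 1):
            x5 = rng.random() < 0.5                            # assumed parameter
            x6 = rng.random() < (0.75 if label == 1 else 0.25)  # assumed parameters
            rows.append(np.concatenate([x, [x5, x6]]).astype(float))
    return np.array(rows)

class1, class2 = sample_case3(1), sample_case3(2)
print(class1.mean(axis=0).round(2))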

In this case, we decide not to show the measure parameter values, but the Shapley indexes

(Table 4-11). From these values, it is clear that the first four features are the most

important ones. These values are comparable to the importance indexes developed in [37].









Finally, the confusion matrix is in (Table 4-10), the outputs of the Choquet integral are in (Figure 4-13), and the ROC curve is in (Figure 4-14). From the ROC curves we can see that MAP-EM is better than MCE under Sugeno. For example, at 90% PD, MCE has a PFA of 10.1% while MAP-EM has 0%.

4.6.2 Conclusions over the Artificial Data Sets

The following conclusions can be inferred from the previous cases:

1. Depending on the problem, the MAP-EM Sparsity algorithm may generate sparse distributions (exponential distributions) for the non-important parameters.

2. The measure parameter values from MAP-EM are easier to interpret than the ones from MCE under Sugeno measures.

3. The MAP-EM Sparsity algorithm is able to identify which combinations of features are important.

4. The MAP-EM Sparsity algorithm is sensitive to desired outputs.

5. The MAP-EM algorithm is better at the task of classification than MCE under Sugeno measures. This suggests that MAP-EM may be better at the fusion task than the MCE under Sugeno measures.

4.6.3 Experiments with Landmine Detection Data Sets

Several data sets are considered. They are referred to as BJ2006, BJ2007, and A2007.

BJ2006 and BJ2007 consist of GPR data collected on multiple occasions from three U. S.

locations and one European location. A2007 consists of GPR data collected at a single

test site in the U. S. For each data set, a prescreener algorithm runs on all the GPR data

collected at the site and provides locations at which landmines might be present. The

pre-screener also provides a confidence that a mine is present at each location. Several

other landmine detection algorithms were applied to the data at only those locations

given by the pre-screener. Each of the other landmine detection algorithms also produced

confidence values at each of the locations defined by the pre-screener. The confidence

values were aggregated by the Choquet integral using measures trained by the various









algorithms, and cross-validation (Section 2.5). For historical reasons, different sets of

algorithms or versions of algorithms were applied to each data set.

The exact nature of the algorithms is not used by the Choquet integral and so they

are not described here. Some of the algorithms are published and their descriptions

can be found in [137, 139, 146-150]. Here we provide the individual performances of

the algorithms and the performance of the Choquet integrals using the many methods

developed to optimize the measures. The relative performance of these training algorithms can be assessed using results from these data sets, which were collected across the U.S. and part of Europe in support of the landmine detection problem.

The names given to the algorithms at the different test sites are in (Table 4-12).

4.6.3.1 Case A2007

In this particular case, we have 436 mine encounters and 1,135 non-mines. In

addition, a 10-fold cross-validation test is used to train and test the algorithms. The

ROC curves for the different algorithms and the fusion are in (Figure 4-17), and a comparison with the MCE under Sugeno measures is in (Figure 4-18). In this case, MAP-EM Sparsity is better than the MCE under the Sugeno measure. In addition, at 90% PD, MAP-EM reduces the false alarm rate by a factor of three compared to the best individual algorithm, HMMConf.

4.6.3.2 Case BJ2007

In this case, we have 1,593 mine encounters and 3,689 non-mines. In addition, a lane-

based cross-validation test is used to train and test the algorithm. The ROC curves for

the different algorithms and the fusion are in (Figure 4-19) and a comparison with the

MCE under Sugeno measures is in (Figure 4-20). In this case, MCE under the Sugeno

measure is doing better than MAP-EM. We believe this is due to the fact that we are

already selecting the best possible subset of algorithms. In addition, we need to do a

better selection of desired outputs for MAP-EM Sparsity.









4.6.3.3 Case BJ2006

In this problem, we have 1,593 mine encounters and 3,689 non-mines. In addition, a

lane based cross-validation test is used to train and test the algorithms. The ROC curves

for the different algorithms and the fusion are in (Figure 4-21) and a comparison with the

MCE under Sugeno measures is in (Figure 4-22). When looking at these results, MAP-EM

Sparsity improves over MCE under Sugeno measures, but still we have a high dependence

on the desired outputs.

4.6.4 Conclusions over the Landmine Data Sets

The following conclusions can be inferred from the landmine cases:

1. Depending on the data set, MAP-EM can be superior to MCE under Sugeno measures.

2. The MAP-EM Sparsity algorithm is still sensitive to desired outputs. It is necessary
to develop new techniques to avoid this problem.










Table 4-1: Confusion matrix for artificial data set case I for MAP-EM Sparsity.


          class 1   class 2
class 1   1000      0
class 2   0         1000


Table 4-2: Measures artificial data set case I with mean and standard deviation of the Markov chains for MAP-EM Sparsity.

Measure              Mean    Standard deviation
µ({x1})              0.415   0.0275740
µ({x2})              0.001   0.0070956
µ({x3})              0.421   0.0309940
µ({x4})              0.020   0.0087525
µ({x3, x4})          0.455   0.0397650
µ({x2, x4})          0.043   0.0114070
µ({x2, x3})          0.463   0.0319010
µ({x1, x4})          0.493   0.0353220
µ({x1, x3})          0.987   0.0054752
µ({x1, x2})          0.440   0.0398200
µ({x1, x2, x3})      0.999   0.0036539
µ({x1, x2, x4})      0.530   0.0319250
µ({x1, x3, x4})      0.999   0.0050597
µ({x2, x3, x4})      0.474   0.0315100
µ(X)                 1.000   NA

Table 4-3: Shapley indexes for the features in artificial data set case I for MAP-EM Sparsity.

Feature   Shapley value
1         0.489000
2         0.013667
3         0.469830
4         0.027500


Table 4-4: Confusion matrix for artificial data set case II for MAP-EM Sparsity.


          class 1   class 2
class 1   897       103
class 2   68        932










Table 4-5: Partial measure values for case II with the mean and standard deviation of the
Markov chains for MAP-EM Sparsity.


Measure            Mean    Standard deviation
µ({x1})            0.000   2.4571e-07
µ({x2})            0.000   5.4967e-07
µ({x3})            0.000   1.7770e-07
µ({x4})            0.000   2.2294e-07
µ({x5})            0.000   8.5032e-07
µ({x6})            0.000   1.4586e-07
µ({x7})            0.000   2.1505e-07
µ({x8})            0.000   4.0928e-07
µ({x1, x2})        0.000   1.2712e-05
µ({x1, x3})        0.000   4.4331e-06
µ({x1, x4})        0.000   3.9189e-06
µ({x1, x5})        0.000   8.7484e-06
µ({x1, x6})        0.000   4.4174e-06
µ({x1, x7})        0.000   3.2225e-06
µ({x1, x8})        0.000   5.1592e-06
µ({x2, x3})        0.000   7.4844e-06
µ({x2, x4})        0.000   1.0085e-05
µ({x2, x5})        0.345   0.0737
µ({x2, x6})        0.000   1.2249e-05
µ(X)               1.000   NA


Table 4-6: Shapley indexes for the features in artificial data set case II for MAP-EM
Sparsity.

Shapley
value
1 0.039369
2 0.403490
3 0.050593
4 0.019267
5 0.402920
6 0.020917
7 0.032217
8 0.031233










Table 4-7: A better confusion matrix for case II.


          class 1   class 2
class 1   998       2
class 2   0         1000


Table 4-8: More sparse Shapley indexes for case II.


Shapley
value
1 0.0095476
2 0.4575400
3 0.0155760
4 0.0103170
5 0.4733600
6 0.0108480
7 0.0101810
8 0.0126360


Table 4-9: Densities for the MCE under Sugeno measures for case II.

Source   Density (class 1)   Density (class 2)
1        0.6346              0.3043
2        0.0000              1.0000
3        0.6227              0.3018
4        0.5921              0.1940
5        0.0000              1.0000
6        0.6187              0.2904
7        0.5757              0.2696
8        0.6667              0.2801













Table 4-10: Confusion matrix for artificial data set case III for MAP-EM Sparsity.


          class 1   class 2
class 1   1000      0
class 2   0         1000


Table 4-11: Shapley indexes for MAP-EM Sparsity for case III.

Shapley
value
1 0.33322
2 0.16677
3 0.16677
4 0.33322
5 1.66670e-05
6 1.66670e-05


Table 4-12: Different data sets and detection algorithms for the landmine fusion problem.


BJ2006        BJ2007        A2007
EHD           HMM-DTXT      ACNA-EHDConf
GFIT          EHD           HMM-DTXT
GMRF          SCF           F1A-EHDConf
HMM-DTXT      Prescreener   F1V4
Prescreener                 Glconf
ROCA                        HMMConf
SCF                         Radial
TFCM                        SCF

























Figure 4-1: Plot of samples for the first three features in case I where the second feature has no value for classification.


































Figure 4-2: Outputs for MAP-EM Sparsity in case I with "o" for class 1 and "x" for class 2.


















Figure 4-3: Traces for the measure parameter samples generated by MAP-EM Sparsity in case I.
























Figure 4-4: Distributions for the measure parameters generated by MAP-EM Sparsity in case I.




































Figure 4-5: Outputs for the MCE under Sugeno measures gradient descent method for case I.


Figure 4-6: ROC curve for MAP-EM Sparsity in case I.




























Figure 4-7: ROC curve for the MCE under Sugeno measures in case I.













Figure 4-8: Plot of samples for the second and fifth features in case II.


Figure 4-9: Outputs for MAP-EM Sparsity in case II with "o" for class 1 and "x" for class 2.







































Figure 4-10: A better output separation for case II for MAP-EM Sparsity.














Figure 4-11: Outputs for the MCE under Sugeno measures for case II.










































Figure 4-12: ROC curves for the MCE under Sugeno measures and MAP-EM Sparsity for case II.






























Figure 4-13: Choquet outputs for MAP-EM Sparsity for case III with "o" for class 1 and "x" for class 2.




























































Figure 4-14: ROC curves for the MCE under Sugeno measures and MAP-EM Sparsity for case III.

















Figure 4-15: Examples of traces for the measure parameters by MAP-EM Sparsity in case II.


^WAOW""^




















Figure 4-16: Examples of distributions for the measure parameters by MAP-EM for case II.





























Figure 4-17: ROC curves for all the A2007 algorithms and MAP-EM Sparsity.













Figure 4-18: ROC curves for MCE under Sugeno and MAP-EM Sparsity in data set A2007.

Figure 4-19: ROC curves for all the BJ2007 algorithms and MAP-EM Sparsity.


Figure 4-20: ROC curves for MCE under Sugeno and MAP-EM Sparsity in data set BJ2007.










































Figure 4-21: ROC curves for all the BJ2006 algorithms and MAP-EM Sparsity.
























































Figure 4-22: ROC curves for MCE under Sugeno and MAP-EM Sparsity in data set BJ2006.

























CHAPTER 5
EXTENDING MCE TO A PROBABILISTIC FRAMEWORK USING LOGISTIC
DISTRIBUTION AND GIBBS SAMPLER

We propose an extension of the previous ideas that combines sparsity promoting

Gibbs samplers with the MCE concept. This idea provides an efficient mechanism for

learning full measures, without the problem of desired outputs.

5.1 Logistic Regression for the One Measure Case

Recall that the Choquet integral can be written (Appendix A) in the form:


C_µ(f_w) = H_w µ.   (5-1)

An immediate problem with applying logistic regression in this context is that the range of the Choquet integral is the interval [0, 1]. Hence, we need to map these values to the interval [-τ, τ]. This can be done using the following mapping:

T(C_µ(f_w)) = 2τ H_w µ - τ.   (5-2)

Now, given that we would like to fuse information to improve the classification of two classes, we can assume that the class label

y_i = 1 if f_{w_i} ∈ Class 1,   y_i = 0 if f_{w_i} ∈ Class 2,

is a random variable with a Bernoulli probability distribution. This allows us to use the logistic regression framework (Section 1.4.4) for the fusion task. Thus, we have the following probabilities for a two class problem:

P(Y = 1 | f_w) = exp{ T(C_µ(f_w)) } / ( 1 + exp{ T(C_µ(f_w)) } ),   (5-3)

and

P(Y = 0 | f_w) = 1 - exp{ T(C_µ(f_w)) } / ( 1 + exp{ T(C_µ(f_w)) } )
             = exp{ -T(C_µ(f_w)) } / ( 1 + exp{ -T(C_µ(f_w)) } ).   (5-4)
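As a small illustration of Equations 5-2 through 5-4, the following sketch maps a Choquet output in [0, 1] to the two class probabilities through the scaled logistic link; choquet_value is assumed to be the already computed value H_w µ, and τ is the user-chosen scale.

import numpy as np

def class_probabilities(choquet_value, tau):
    """Map a Choquet output in [0, 1] to P(Y=1|f_w), P(Y=0|f_w)
    through the scaled logistic link T(c) = 2*tau*c - tau."""
    t = 2.0 * tau * choquet_value - tau
    p1 = np.exp(t) / (1.0 + np.exp(t))
    return p1, 1.0 - p1

for c in (0.0, 0.5, 0.9):
    print(c, class_probabilities(c, tau=5.0))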

We can use the probabilities (Equation 5-3) and (Equation 5-4), together with the exponential distribution for sparsity promotion, to design the following Bayesian hierarchical model:

y_i ~ [ exp{T(C_µ(f_{w_i}))} / (1 + exp{T(C_µ(f_{w_i}))}) ]^{y_i} [ exp{-T(C_µ(f_{w_i}))} / (1 + exp{-T(C_µ(f_{w_i}))}) ]^{1-y_i},   i = 1, ..., n,

µ_m = 1,   (5-5)

µ_j ~ K_j exp{ -γ_j µ_j } I_[0, min{µ(A_s) | A_j ⊂ A_s}](µ_j),   j = 1, ..., m-1,

γ_j = 1 / µ_j,   j = 1, ..., m-1,

where µ_m = µ(X), and K_1, ..., K_{m-1} are integration constants. It is clear that this model avoids the necessity of desired outputs; at most we use the class label y_i to know to which class the data belongs.
5.1.1 The Gibbs Sampler for the One Measure Case
Assuming that the data y_i's are independent, for each µ_j the conditional distribution is:

p(µ_j | y, f_{w_1}, ..., f_{w_n}, µ_{-j}, γ_j) = p(y | f_{w_1}, ..., f_{w_n}, µ) p(µ_j | γ_j) / p(y | f_{w_1}, ..., f_{w_n}, µ_{-j}, γ_j).

Because p(y | f_{w_1}, ..., f_{w_n}, µ_{-j}, γ_j) does not depend on µ_j, we can write this as:

p(µ_j | y, f_{w_1}, ..., f_{w_n}, µ_{-j}, γ_j) ∝ p(y | f_{w_1}, ..., f_{w_n}, µ) p(µ_j | γ_j),   (5-6)








which is finally:

p(µ_j | y, f_{w_1}, ..., f_{w_n}, µ_{-j}, γ_j) ∝ ∏_{i=1}^{n} [ exp{T(C_µ(f_i))} / (1 + exp{T(C_µ(f_i))}) ]^{y_i} [ exp{-T(C_µ(f_i))} / (1 + exp{-T(C_µ(f_i))}) ]^{1-y_i} exp{ -γ_j µ_j } I_[0, min{µ(A_s) | A_j ⊂ A_s}](µ_j).   (5-7)

Therefore, the Gibbs sampler for the one measure case for logistic regression under the monotonicity constraint is:

Gibbs Sampler for Logistic LASSO

1. Given [γ^(t)]^T = (γ_1^(t), ..., γ_{m-1}^(t))^T, generate:

µ_1^(t+1) ~ p(µ_1 | y_1, ..., y_n, f_{w_1}, ..., f_{w_n}, µ_2^(t), ..., µ_{m-1}^(t), γ_1^(t))
   ∝ K_1' ∏_{i=1}^{n} [ exp{T(C_µ(f_i))} / (1 + exp{T(C_µ(f_i))}) ]^{y_i} [ exp{-T(C_µ(f_i))} / (1 + exp{-T(C_µ(f_i))}) ]^{1-y_i} × exp{ -γ_1^(t) µ_1 } I_[0, min{µ(A_s) | A_1 ⊂ A_s}](µ_1),
⋮                                                                                     (5-8)
µ_{m-1}^(t+1) ~ p(µ_{m-1} | y_1, ..., y_n, f_{w_1}, ..., f_{w_n}, µ_1^(t+1), ..., µ_{m-2}^(t+1), γ_{m-1}^(t))
   ∝ K_{m-1}' ∏_{i=1}^{n} [ exp{T(C_µ(f_i))} / (1 + exp{T(C_µ(f_i))}) ]^{y_i} [ exp{-T(C_µ(f_i))} / (1 + exp{-T(C_µ(f_i))}) ]^{1-y_i} × exp{ -γ_{m-1}^(t) µ_{m-1} } I_[0, min{µ(A_s) | A_{m-1} ⊂ A_s}](µ_{m-1}).

2. For each j, update:

γ_j^(t+1) = 1 / µ_j^(t+1).

It is clear that sampling from the distributions (Equation 5-8) can be difficult; for that reason, we use the slice sampler.








5.1.2 The Slice Sampler For the One Measure Case Posterior
We do the following to sample from the distributions (Equation 5-8) for each µ_j. First, we sample from uniform distributions for each f_{w_i}:

w_i ~ U( 0, exp{T(C_µ(f_{w_i}))} / (1 + exp{T(C_µ(f_{w_i}))}) )   if f_{w_i} ∈ Class 1,   (5-9)

and

w_i ~ U( 0, exp{-T(C_µ(f_{w_i}))} / (1 + exp{-T(C_µ(f_{w_i}))}) )   if f_{w_i} ∈ Class 2.   (5-10)

In addition, we need a sample for the exponential factor that represents the sparsity distribution for µ_j:

w_exp ~ U( 0, exp{ -γ_j µ_j } ).   (5-11)

Finally, to maintain the structure of the slice sampler, we sample:

w_u ~ U( 0, I_[0, min{µ(A_s) | A_j ⊂ A_s}](µ_j) ) = U(0, 1).   (5-12)

Now, we need to build the set:

A = { µ_j | f_i(µ_j) > w_i, i = 1, ..., n, f_exp(µ_j) > w_exp and f_unif(µ_j) > w_u }   (5-13)

and sample uniformly from it. In set A, the functions in the constraints have the following definitions:

f_i(µ_j) = [ exp{T(C_µ(f_{w_i}))} / (1 + exp{T(C_µ(f_{w_i}))}) ]^{y_i} [ exp{-T(C_µ(f_{w_i}))} / (1 + exp{-T(C_µ(f_{w_i}))}) ]^{1-y_i},   i = 1, ..., n,   (5-14)

f_exp(µ_j) = exp{ -γ_j µ_j },   (5-15)

f_unif(µ_j) = I_[0, min{µ(A_s) | A_j ⊂ A_s}](µ_j).   (5-16)







In this way, by the same procedure as in (Section 4.4), we have the following intervals from the inequalities f_i(µ_j) > w_i, i = 1, ..., n:

[ (1 / (2τ H_{ij})) ( log( w_i / (1 - w_i) ) - (2τ K_{-j} - τ) ), +∞ )   if f_{w_i} ∈ Class 1,   (5-17)

( -∞, (1 / (2τ H_{ij})) ( -log( w_i / (1 - w_i) ) - (2τ K_{-j} - τ) ) ]   if f_{w_i} ∈ Class 2,   (5-18)

where H_{ij} is the position (i, j) in the design matrix H, H_{-j} is the design matrix H without column j, µ_{-j} is the measure vector µ without position j, and K_{-j} = H_{i,-j} µ_{-j}. In addition, we need the intervals

( -∞, -(1/γ_j) log(w_exp) ]   (5-19)

and

[ 0, min{ µ(A_s) | A_j ⊂ A_s } ]   (5-20)

from the inequalities f_exp(µ_j) > w_exp and f_unif(µ_j) > w_u, respectively.

Finally, assuming that |Class 1| = n_1 and |Class 2| = n_2, we have the interval [B_1, B_2], where

B_1 = max{ max_{i ∈ Class 1} { (1 / (2τ H_{ij})) ( log( w_i / (1 - w_i) ) - (2τ K_{-j} - τ) ) }, 0 },   (5-21)

B_2 = min{ min_{i ∈ Class 2} { (1 / (2τ H_{ij})) ( -log( w_i / (1 - w_i) ) - (2τ K_{-j} - τ) ) }, -(1/γ_j) log(w_exp), min{ µ(A_s) | A_j ⊂ A_s } }.   (5-22)

Then, we obtain a sample for µ_j by sampling uniformly from the interval A = [B_1, B_2].
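The following sketch performs one update of a single measure parameter under this conditional. Rather than the closed-form interval intersection derived above, it uses a generic shrinkage slice step (Neal, 2003) on the same unnormalized conditional, that is, the logistic likelihood, the exponential prior, and the monotonicity indicator; tau_scale plays the role of τ in Equation 5-2, mono_upper stands for min{µ(A_s) | A_j ⊂ A_s}, and all names and the toy data are illustrative.

import numpy as np

def log_conditional(mu_j, j, mu, H, y, tau_scale, gamma_j, mono_upper):
    """Unnormalized log conditional of mu_j in the Logistic LASSO model."""
    if not (0.0 <= mu_j <= mono_upper):
        return -np.inf
    m = mu.copy(); m[j] = mu_j
    t = 2.0 * tau_scale * (H @ m) - tau_scale          # Eq. 5-2
    loglik = np.sum(y * t - np.log1p(np.exp(t)))       # Bernoulli log-likelihood
    return loglik - gamma_j * mu_j                      # exponential prior

def shrinkage_slice_step(j, mu, H, y, tau_scale, gamma_j, mono_upper, rng):
    """One slice update of mu_j by interval shrinkage, used here instead of
    the closed-form interval intersection in the text."""
    logp0 = log_conditional(mu[j], j, mu, H, y, tau_scale, gamma_j, mono_upper)
    log_slice = logp0 + np.log(rng.uniform())          # vertical slice level
    lo, hi = 0.0, mono_upper                            # bracketing interval
    while True:
        prop = rng.uniform(lo, hi)
        if log_conditional(prop, j, mu, H, y, tau_scale, gamma_j,
                           mono_upper) > log_slice:
            return prop
        if prop < mu[j]:
            lo = prop
        else:
            hi = prop

rng = np.random.default_rng(3)
H = rng.uniform(0.0, 1.0, size=(50, 3))
mu = np.array([0.2, 0.5, 1.0])
y = (H @ np.array([0.0, 1.0, 1.0]) > 0.6).astype(float)
mu[1] = shrinkage_slice_step(1, mu, H, y, tau_scale=4.0, gamma_j=2.0,
                             mono_upper=1.0, rng=rng)
print(mu)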









5.1.3 Problems of Assuming a Uniform Distribution

In our experiments, we discovered that in problems where we knew that the optimal measure was

µ({x1}) = 1,  µ({x2}) = 0,  µ({x3}) = 0,
µ({x1, x2}) = 1,  µ({x2, x3}) = 0,  µ({x1, x3}) = 1,
µ(X) = 1,

instead of obtaining the optimal measure, we obtained a suboptimal solution like (0.76, 0, 0, 0.89, 0, 0.83, 1). In this case, it was possible to improve the learned measures if we sampled all the measure parameters and normalized them by the maximum measure parameter, which is µ(X) by the monotonicity property.

This pointed out that using a uniform distribution over the interval

[0, min{ µ(A_s) | A_j ⊂ A_s }]

might not be the best possible distribution to enforce the monotonicity property in our model.

The decision to employ the uniform distribution over [0, min{ µ(A_s) | A_j ⊂ A_s }] was due to its simplicity and ease of sampling. It is clear that a different probability distribution is needed to enforce the monotonicity property of the fuzzy measure in our model. For example, we could consider a half-Gaussian distribution with mode at min{ µ(A_s) | A_j ⊂ A_s } to decrease the probability of sampling improbable measures. At this moment we do not have a clear answer to this question, but we hope that these observations will open new avenues of research.









5.1.4 Using a Hidden Variable for Logistic Regression

As in (Chapter 3) and (Chapter 4), we can then modify (Model 5-5) as follows:

y_i ~ [ exp{T(C_µ(f_{w_i}))} / (1 + exp{T(C_µ(f_{w_i}))}) ]^{y_i} [ exp{-T(C_µ(f_{w_i}))} / (1 + exp{-T(C_µ(f_{w_i}))}) ]^{1-y_i},   i = 1, ..., n,

µ_m = 1,

µ_j ~ K_j exp{ -µ_j² / (2τ_j) } I_[0, min{µ(A_s) | A_j ⊂ A_s}](µ_j),   j = 1, ..., m-1,   (5-23)

τ_j ~ D_j exp{ -γ_j τ_j / 2 },   j = 1, ..., m-1,

γ_j = 1 / µ_j,   j = 1, ..., m-1.

(Model 5-23) allows us to write the following Gibbs sampler:

Gibbs Sampler for MAP-EM Logistic LASSO

1. Given [µ^(t)]^T = (µ_1^(t), ..., µ_{m-1}^(t))^T and [γ^(t)]^T = (γ_1^(t), ..., γ_{m-1}^(t))^T, generate:

τ_1^(t+1) ∝ C_1 exp{ -(µ_1^(t))² / (2τ_1) } exp{ -γ_1^(t) τ_1 / 2 },
⋮
τ_{m-1}^(t+1) ∝ C_{m-1} exp{ -(µ_{m-1}^(t))² / (2τ_{m-1}) } exp{ -γ_{m-1}^(t) τ_{m-1} / 2 }.

2. Given [µ^(t)]^T = (µ_1^(t), ..., µ_{m-1}^(t))^T and [τ^(t+1)]^T = (τ_1^(t+1), ..., τ_{m-1}^(t+1))^T, generate:

µ_1^(t+1) ~ p(µ_1 | y_1, ..., y_n, f_{w_1}, ..., f_{w_n}, µ_2^(t), ..., µ_{m-1}^(t), τ_1^(t+1))
   ∝ K_1' ∏_{i=1}^{n} [ exp{T(C_µ(f_i))} / (1 + exp{T(C_µ(f_i))}) ]^{y_i} [ exp{-T(C_µ(f_i))} / (1 + exp{-T(C_µ(f_i))}) ]^{1-y_i} × exp{ -µ_1² / (2τ_1^(t+1)) } I_[0, min{µ(A_s) | A_1 ⊂ A_s}](µ_1),
⋮                                                                                     (5-24)
µ_{m-1}^(t+1) ~ p(µ_{m-1} | y_1, ..., y_n, f_{w_1}, ..., f_{w_n}, µ_1^(t+1), ..., µ_{m-2}^(t+1), τ_{m-1}^(t+1))
   ∝ K_{m-1}' ∏_{i=1}^{n} [ exp{T(C_µ(f_i))} / (1 + exp{T(C_µ(f_i))}) ]^{y_i} [ exp{-T(C_µ(f_i))} / (1 + exp{-T(C_µ(f_i))}) ]^{1-y_i} × exp{ -µ_{m-1}² / (2τ_{m-1}^(t+1)) } I_[0, min{µ(A_s) | A_{m-1} ⊂ A_s}](µ_{m-1}).

3. For each j, update:

γ_j^(t+1) = 1 / µ_j^(t+1).

To sample the τ's in this new Gibbs sampler, we can use the procedure in (Section 4.4). The µ's can be sampled by using a combination of the slice samplers in (Section 4.4) and (Section 5.1.2). In this way, we have two possible methods for the single measure case under logistic regression.
5.2 Development of the Algorithm for the Two Measure Case

In this section, we address the MCE idea in a Bayesian framework using logistic regression. First, consider the following dissimilarity measure:

d(f_w) = C_{µ1}(φ1[f_w]) - C_{µ2}(φ2[f_w])   (5-25)

as a function of f_w, which is a modified version of the dissimilarity measure used in MCE. (Equation 5-25) can be written as (Appendix A):

d(f_w) = H_{φ1[f_w]} µ_1 - H_{φ2[f_w]} µ_2,   (5-26)

where H_{φ1[f_w]} and H_{φ2[f_w]} represent the difference vectors in each Choquet integral, and φ1 and φ2 represent the mapping functions for the samples f_w into the class 1 and class 2 ranges, respectively. Although (Equation 5-26) has a range in [-1, 1], this can cause problems in logistic regression because, at the most extreme points, the sigmoids have values:

exp{-1} / (1 + exp{-1}) ≈ 0.2689,   exp{1} / (1 + exp{1}) ≈ 0.7311,
exp{-(-1)} / (1 + exp{-(-1)}) ≈ 0.7311,   exp{-(1)} / (1 + exp{-(1)}) ≈ 0.2689.

Therefore, it is necessary to multiply the function d(f_w) by a convenient τ to get better classification values from the sigmoids. In this way, we have the following probabilities for each of the classes:

P(Y = 1 | f_w) = exp{ τ d(f_w) } / ( 1 + exp{ τ d(f_w) } ),   (5-27)

P(Y = 0 | f_w) = exp{ -τ d(f_w) } / ( 1 + exp{ -τ d(f_w) } ).   (5-28)
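A small sketch of the two-measure dissimilarity and the resulting class probabilities is given below. The measures are stored as dictionaries over subsets, the Choquet integral is computed in its usual sorted-difference form, and the mapping functions φ1 and φ2 (not specified here) are taken to be the identity for illustration.

import numpy as np

def choquet(x, mu):
    """Discrete Choquet integral of x (values in [0, 1]) with respect to the
    fuzzy measure mu, given as a dict over frozensets of source indices."""
    order = np.argsort(x)[::-1]                 # sources sorted by value
    total, prev = 0.0, 0.0
    subset = frozenset()
    for k in order:
        idx = int(k)
        subset = subset | {idx}
        total += x[idx] * (mu[subset] - prev)
        prev = mu[subset]
    return total

def class_probs(x, mu1, mu2, tau):
    """Dissimilarity of Eq. 5-25 pushed through the scaled logistic link
    (Eqs. 5-27, 5-28), with identity mappings in place of phi_1, phi_2."""
    d = choquet(x, mu1) - choquet(x, mu2)
    p1 = np.exp(tau * d) / (1.0 + np.exp(tau * d))
    return p1, 1.0 - p1

mu1 = {frozenset({0}): 0.9, frozenset({1}): 0.1, frozenset({0, 1}): 1.0}
mu2 = {frozenset({0}): 0.1, frozenset({1}): 0.9, frozenset({0, 1}): 1.0}
print(class_probs(np.array([0.8, 0.2]), mu1, mu2, tau=5.0))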

Thus, we can use the following hierarchical model for the two measure case, µ_1 and µ_2:

y_i ~ [ exp{τ d(f_{w_i})} / (1 + exp{τ d(f_{w_i})}) ]^{y_i} [ exp{-τ d(f_{w_i})} / (1 + exp{-τ d(f_{w_i})}) ]^{1-y_i},   i = 1, ..., n,

µ_kj ~ K_kj exp{ -γ_kj µ_kj / 2 } I_[0, min{µ_k(A_s) | A_kj ⊂ A_s}](µ_kj),   k = 1, 2 and j = 1, ..., m,   (5-29)

γ_kj = 1 / µ_kj,   k = 1, 2 and j = 1, ..., m.

Now, we can develop a Gibbs sampler to solve this model.

5.2.1 The Gibbs Sampler for the Two Measures Case

The Gibbs sampler for the Bayesian model (Model 5-29) has the following structure:

Gibbs Sampler for MCE Logistic LASSO

1. Given [γ^(t)]^T = (γ_11^(t), ..., γ_1m^(t), γ_21^(t), ..., γ_2m^(t))^T, generate:

µ_11^(t+1) ~ p(µ_11 | y_1, ..., y_n, f_{w_1}, ..., f_{w_n}, µ_12^(t), µ_13^(t), ..., µ_1m^(t), µ_21^(t), µ_22^(t), ..., µ_2m^(t))
   ∝ K_11' ∏_{i=1}^{n} [ exp{τ d(f_i)} / (1 + exp{τ d(f_i)}) ]^{y_i} [ exp{-τ d(f_i)} / (1 + exp{-τ d(f_i)}) ]^{1-y_i} × exp{ -γ_11^(t) µ_11 / 2 } I_[0, min{µ_1(A_s) | A_11 ⊂ A_s}](µ_11),
⋮                                                                                     (5-30)
µ_2(m-1)^(t+1) ~ p(µ_2(m-1) | y_1, ..., y_n, f_{w_1}, ..., f_{w_n}, µ_11^(t+1), µ_12^(t+1), ..., µ_1(m-1)^(t+1), µ_21^(t+1), µ_22^(t+1), ..., µ_2(m-2)^(t+1))
   ∝ K_2(m-1)' ∏_{i=1}^{n} [ exp{τ d(f_i)} / (1 + exp{τ d(f_i)}) ]^{y_i} [ exp{-τ d(f_i)} / (1 + exp{-τ d(f_i)}) ]^{1-y_i} × exp{ -γ_2(m-1)^(t) µ_2(m-1) / 2 } I_[0, min{µ_2(A_s) | A_2(m-1) ⊂ A_s}](µ_2(m-1)).

2. For each k and j, update:

γ_kj^(t+1) = 1 / µ_kj^(t+1).

The simplicity of this model is appealing if we compare it with the gradient descent model in Chapter 2. In addition, it works for a full measure and not just the Sugeno λ-measure.

Now, we can use the slice sampler to sample from the posterior distributions (Equation 5-30) in the Gibbs sampler.
5.2.2 The Slice Sampler for the Two Measure Case Posterior

Given the posterior probability (Model 5-30) for µ_1j, we sample from the following uniforms:

w_i ~ U( 0, exp{τ d(f_{w_i})} / (1 + exp{τ d(f_{w_i})}) )   if f_{w_i} ∈ Class 1,   (5-31)

and

w_i ~ U( 0, exp{-τ d(f_{w_i})} / (1 + exp{-τ d(f_{w_i})}) )   if f_{w_i} ∈ Class 2.   (5-32)

In addition, we need to sample from the exponential sparsity term:

w_exp ~ U( 0, exp{ -γ_1j µ_1j / 2 } )   (5-33)

and

w_u ~ U( 0, I_[0, min{µ_1(A_s) | A_1j ⊂ A_s}](µ_1j) ) = U(0, 1).   (5-34)

Now, by the same method as in (Section 4.4), we sample from the following intervals for µ_1j:

[ (1 / (τ H_{φ1(f_i), j})) ( log( w_i / (1 - w_i) ) - τ K_{-1j} ), +∞ )   if f_{w_i} ∈ Class 1,   (5-35)

( -∞, (1 / (τ H_{φ1(f_i), j})) ( -log( w_i / (1 - w_i) ) - τ K_{-1j} ) ]   if f_{w_i} ∈ Class 2,   (5-36)

where K = C_{µ1}(φ1(f_w)) - C_{µ2}(φ2(f_w)) and K_{-1j} = K - H_{φ1(f_w), j} µ_1j. In addition, we have:

( -∞, -(2/γ_1j) log(w_exp) ]   (5-37)

and

[ 0, min{ µ_1(A_s) | A_1j ⊂ A_s } ].   (5-38)

Given |Class 1| = n_1 and |Class 2| = n_2, we have the interval [B_1, B_2], where:

B_1 = max{ max_{i ∈ Class 1} { (1 / (τ H_{φ1(f_i), j})) ( log( w_i / (1 - w_i) ) - τ K_{-1j} ) }, 0 },   (5-39)

B_2 = min{ min_{i ∈ Class 2} { (1 / (τ H_{φ1(f_i), j})) ( -log( w_i / (1 - w_i) ) - τ K_{-1j} ) }, -(2/γ_1j) log(w_exp), min{ µ_1(A_s) | A_1j ⊂ A_s } }.   (5-40)

Then, we simply sample µ_1j uniformly from the interval [B_1, B_2].

In a similar way, we have for µ_2j:

( -∞, (1 / (τ H_{φ2(f_i), j})) ( -log( w_i / (1 - w_i) ) + τ K_{-2j} ) ]   if f_{w_i} ∈ Class 1,   (5-41)

and

[ (1 / (τ H_{φ2(f_i), j})) ( log( w_i / (1 - w_i) ) + τ K_{-2j} ), +∞ )   if f_{w_i} ∈ Class 2,   (5-42)

where

K = C_{µ1}(φ1(f_w)) - C_{µ2}(φ2(f_w)),   (5-43)

K_{-2j} = C_{µ1}(φ1(f_w)) - C_{µ2}(φ2(f_w)) + H_{φ2(f_w), j} µ_2j.   (5-44)

In addition, we have:

( -∞, -(2/γ_2j) log(w_exp) ]   (5-45)

and

[ 0, min{ µ_2(A_s) | A_2j ⊂ A_s } ].   (5-46)

The rest is similar to the case for µ_1j.

This allows us to implement the MCE in a probabilistic framework by using the Gibbs sampler to enforce the monotonicity constraints of the fuzzy measures.
5.2.3 MAP-EM for the Two Measure Case

As in the case of one measure, it is possible to modify the original model for the two measure case by adding a hidden variable as follows:

y_i ~ [ exp{τ d(f_{w_i})} / (1 + exp{τ d(f_{w_i})}) ]^{y_i} [ exp{-τ d(f_{w_i})} / (1 + exp{-τ d(f_{w_i})}) ]^{1-y_i},   i = 1, ..., n,

µ_kj ~ K_kj exp{ -µ_kj² / (2τ_kj) } I_[0, min{µ_k(A_s) | A_kj ⊂ A_s}](µ_kj),   k = 1, 2 and j = 1, ..., m,   (5-47)

τ_kj ~ D_kj exp{ -γ_kj τ_kj / 2 },   k = 1, 2 and j = 1, ..., m,

γ_kj = 1 / µ_kj,   k = 1, 2 and j = 1, ..., m.









We then have the following Gibbs sampler:


Gibbs Sampler for MAP-EM MCE Logistic LASSO

1. Given [µ^(t)]^T = (µ_11^(t), ..., µ_1m^(t), µ_21^(t), ..., µ_2m^(t))^T and [γ^(t)]^T = (γ_11^(t), ..., γ_1m^(t), γ_21^(t), ..., γ_2m^(t))^T, generate:

τ_11^(t+1) ∝ C_11 exp{ -(µ_11^(t))² / (2τ_11) } exp{ -γ_11^(t) τ_11 / 2 },
⋮
τ_2(m-1)^(t+1) ∝ C_2(m-1) exp{ -(µ_2(m-1)^(t))² / (2τ_2(m-1)) } exp{ -γ_2(m-1)^(t) τ_2(m-1) / 2 }.

2. Given [µ^(t)]^T = (µ_11^(t), ..., µ_1m^(t), µ_21^(t), ..., µ_2m^(t))^T and [τ^(t+1)]^T = (τ_11^(t+1), ..., τ_1m^(t+1), τ_21^(t+1), ..., τ_2m^(t+1))^T, generate:

µ_11^(t+1) ~ p(µ_11 | y_1, ..., y_n, f_{w_1}, ..., f_{w_n}, µ_12^(t), µ_13^(t), ..., µ_1m^(t), µ_21^(t), µ_22^(t), ..., µ_2m^(t), τ_11^(t+1))
   ∝ K_11' ∏_{i=1}^{n} [ exp{τ d(f_i)} / (1 + exp{τ d(f_i)}) ]^{y_i} [ exp{-τ d(f_i)} / (1 + exp{-τ d(f_i)}) ]^{1-y_i} × exp{ -µ_11² / (2τ_11^(t+1)) } I_[0, min{µ_1(A_s) | A_11 ⊂ A_s}](µ_11),
⋮                                                                                     (5-48)
µ_2(m-1)^(t+1) ~ p(µ_2(m-1) | y_1, ..., y_n, f_{w_1}, ..., f_{w_n}, µ_11^(t+1), µ_12^(t+1), ..., µ_1(m-1)^(t+1), µ_21^(t+1), µ_22^(t+1), ..., µ_2(m-2)^(t+1), τ_2(m-1)^(t+1))
   ∝ K_2(m-1)' ∏_{i=1}^{n} [ exp{τ d(f_i)} / (1 + exp{τ d(f_i)}) ]^{y_i} [ exp{-τ d(f_i)} / (1 + exp{-τ d(f_i)}) ]^{1-y_i} × exp{ -µ_2(m-1)² / (2τ_2(m-1)^(t+1)) } I_[0, min{µ_2(A_s) | A_2(m-1) ⊂ A_s}](µ_2(m-1)).

3. For each k and j, update:

γ_kj^(t+1) = 1 / µ_kj^(t+1).

The slice sampler for this Gibbs sampler can be built using a combination of the slice samplers in (Section 4.4) and (Section 5.2.2).

5.3 Final Thoughts about MCE under the Logistic Framework

Recall that the cost function (Equation 2-5) for the MCE is not a convex function.

In practice, this means that the cost function can have multiple local minima. Further, the only measures that the original MCE could handle are Sugeno λ-measures. These are the reasons why the MCE algorithm based on gradient descent has the following drawbacks:

1. The MCE in (Equation 2-5) can only handle Sugeno λ-measures. This measure was selected because of its recursive definition.

2. It is necessary to introduce global optimization ideas on top of the gradient descent
to solve the problem of the cost function not being a convex function. This only
increases the computational complexity of the solution.

These are the main reasons for developing an MCE in the probabilistic framework.

The original Monte Carlo methods were developed by physicists to integrate high dimensional functions [2, 151, 152]. The method is based on the fact that an integral of a function f over a region D can be interpreted as an expectation:

I = ∫_D f(x) dx = V E[ f(x) h(x) ],   (5-49)

where V is the volume of a bounding box for D, the expectation is taken with respect to the uniform distribution over the bounding box, and

h(x) = 1 if x ∈ D, and h(x) = 0 otherwise.   (5-50)

In this way, given random samples {x_1, ..., x_N} drawn uniformly from the bounding box,

I ≈ (V/N) Σ_{i=1}^{N} f(x_i) h(x_i).   (5-51)

Compared with methods based on classical integration and differentiation, Monte Carlo methods tend to be more efficient for solving high dimensional problems.
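As a minimal illustration of Equations 5-49 through 5-51, the sketch below estimates the area of the unit disc by drawing uniform samples from the bounding box [-1, 1]² (so V = 4) and averaging f(x)h(x) with f ≡ 1.

import numpy as np

rng = np.random.default_rng(0)
N = 100_000
x = rng.uniform(-1.0, 1.0, size=(N, 2))
h = np.sum(x**2, axis=1) <= 1.0          # h(x) = 1 inside D, 0 outside
estimate = 4.0 * h.mean()                 # (V / N) * sum of f(x_i) h(x_i), f = 1
print(estimate, np.pi)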

Given the last discussion, MCE Logistic has the following advantages:









1. MCE Logistic is efficient in dealing with high dimensionality problems. This points
out that an MCE Logistic might be better at dealing with the exponential number of
measure parameters than conventional optimization techniques.

2. MCE Logistic can use any type of fuzzy measure rather than being limited to the Sugeno λ-measure.

3. The MCE Logistic solution is attracted to regions of high probability, making it a better method for finding global optima.

Although this is better than using gradient descent for the cost function (Equation 2-5),

we still have one possible drawback. Once a sample is generated, it is accepted with

probability one. This might increase the standard deviation of the generated samples. In

turn, this can make it difficult for the Markov chain to converge [2].

Therefore, it is necessary to devise a better global optimization method than the

Gibbs sampler. For this, we have been thinking of using a modified Gibbs sampler where

each measure is sampled through the use of a Metropolis-Hastings-like probability of

acceptance. This will allow us to create a better global optimizer for the Choquet integral

training problem. This is known as Metropolization of the Gibbs sampler. This is beyond

the scope of this dissertation, but we hope that this will be a future research subject.

In addition, in (Appendix C) we discuss the differences between these new algorithms

and the feature selection under mutual information (Section 1.6.4).

5.4 Results under Sparsity Promoting Model with Respect to Logistic and
MCE Logistic Algorithms

In this section, we show the results of the new logistic algorithms using artificial data

sets and landmine data sets.

5.4.1 Results of the Logistic LASSO Using Artificial Data Sets

The confusion matrices were created using the mean of each class under the Choquet integral,

m_i = E_{w ∈ class i}[ C_µ(f_w) ],   (5-52)

and comparing each output with these means. This was done because we do not have desired outputs for this type of algorithm.
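A sketch of this confusion-matrix construction is given below. It assumes that each output is assigned to the class whose mean output (Equation 5-52) is nearest, which is one reading of "comparing each output with these means"; the function name and toy outputs are illustrative.

import numpy as np

def confusion_from_outputs(out1, out2):
    """Build a 2x2 confusion matrix by nearest class-mean assignment."""
    m1, m2 = out1.mean(), out2.mean()
    def assign(out):
        return np.where(np.abs(out - m1) <= np.abs(out - m2), 1, 2)
    cm = np.zeros((2, 2), dtype=int)
    for true_cls, out in ((1, out1), (2, out2)):
        pred = assign(out)
        cm[true_cls - 1, 0] = np.sum(pred == 1)
        cm[true_cls - 1, 1] = np.sum(pred == 2)
    return cm

rng = np.random.default_rng(0)
out1 = np.clip(rng.normal(0.8, 0.1, 1000), 0, 1)   # Choquet outputs, class 1
out2 = np.clip(rng.normal(0.2, 0.1, 1000), 0, 1)   # Choquet outputs, class 2
print(confusion_from_outputs(out1, out2))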









5.4.1.1 Case I

The first case is a two class problem of dimensionality four, with the only good

information coming from the first feature and the third feature. The second and fourth

features are samples from a uniform distribution on [0, 1]. The first and third features are drawn from N(0.8, 0.1) for class 1, and from N(0.2, 0.1) for class 2. In addition, each class has

1,000 samples. A plot of the first three features can be seen on (Figure 4-1). The Gibbs

sampler is run for 1,100 iterations with a burn-in period of 100 iterations.

The measure parameters can be seen in (Table 5-2). It is very interesting to note that

the measures for individual features one and three are both zero, but the measure of the

pair of features is 0.98, almost one. This illustrate the ability of the Choquet integral to

focus on the worth of subsets if information sources. The outputs of the Choquet integral

can be seen on (Figure 5-1), the Shapley indices can be seen in (Table 5-3), the confusion

matrix can be seen in (Table 5-1), and the ROC curve is in (5-20).

The traces (Figures 5-12) and distributions (Figure 5-13) show that the Logistic

LASSO tends to do a good job of send non-important parameters to zero. Finally, it is

clear that the Logistic LASSO identifies the bad features and separates the classes well.

5.4.1.2 Case II

The same problem as in case II (Section 4.6.1) is considered. Some of the measure parameter values can be seen in (Table 5-5), the Shapley indices are in (Table 5-6), and the ROC curve is in (Figure 5-21).

When looking at the Shapley indices, it is clear that the only important features

are the second and the fifth. The outputs of the Choquet integral using the learned

parameters can be seen in (Figure 5-2). The confusion matrix is in (Table 5-4).

Some of the distributions and traces for the measure parameters are in (Figure 5-15)

and (Figure 5-14) respectively.









5.4.2 Case III

The same problem as in case III (Section 4.6.1) is considered. The confusion matrix is in (Table 5-7), the Choquet outputs are in (Figure 5-10), the Shapley indices are in (Table 5-8), and the ROC curve is in (Figure 5-22). Again, the Shapley indices tell us that the first four features are the important ones. In addition, the algorithm does a good job of separating the two classes, but not as good as MAP-EM Sparsity.

5.4.3 The MAP-EM Logistic LASSO Artificial Data Sets

5.4.3.1 Case I

The same case as the case I in (Section 4.6.1) is considered. The measure parameters

values can be seen in (Table 5-10), the Shapley indices are in (Table 5-11), the ROC curve

is in (Figure 5-20), and the confusion matrix is in (Table 5-9). It is easy to see that the

only important features are the first and third. The outputs of the Choquet integral using

the learned parameters can be seen in (Figure 5-3). The distribution and traces for the

measure parameters are in (Figure 5-17) and (Figure 5-16) respectively.

If we compare the distributions of MAP-EM Logistic to the direct application of the

LASSO in Logistic, it is possible to notice that MAP-EM Logistic tends to be less able to

send unimportant parameters to zero. Still the MAP-EM Logistic LASSO does a good job

identifying the bad features.

5.4.3.2 Case II

The same problem as case II (Section 4.6.1) is considered. Some of the measure

parameter values can be seen in (Table 5-13), the Shapley indices are in (Table 5-14), the

ROC curve is in (Figure 5-21), and the confusion matrix is in (Table 5-12).

If we look at the Shapley indices, it is easy to see that the only important features

are the second and the fifth. The outputs of the Choquet integral using the learned

parameters by MAP-EM Logistic LASSO can be seen in (Figure 5-4).

Some of the distributions and traces for the measure parameters are in (Figure 5-19)

and (Figure 5-18) respectively.









5.4.4 Case III

The same case as case III in (Section 4.6.1) is considered. The confusion matrix is in (Table 5-15), the Choquet outputs are in (Figure 5-5), the Shapley indices are in (Table 5-16), and the ROC curve is in (Figure 5-22). Again, the Shapley indices tell us that the first four features are the important ones. In addition, the algorithm does a good job of separating the two classes, though again not as good as MAP-EM Sparsity.

5.4.5 Results of the MCE Logistic LASSO over Artificial Data Sets

5.4.5.1 Case I

This is the same data set as the one in (Section 5.4.3.1). The confusion matrix is in

(Table 5-17), the ROC is in (Figure 5-20), and the measure parameter values are in (Table

5-18).

We decided not to show the Shapley indices because they do not make that much

sense. This tells us that it is necessary to create a new importance index for problems

with a measure for each class. The outputs of the Choquet differences can be seen in

(Figure 5-6).

It is clear that we have difficulties interpreting the relation between the two measures.

In addition, simple subtraction, as in the classic dissimilarity measure (Section 1.5.1), does

not seem to be very effective at separating separating the classes. Therefore, we need to

propose a better dissimilarity operator for the MCE, but that is beyond the scope of this

dissertation.

5.4.5.2 Case II

This is the same problem as the one in (Section 4.6.1). Because it is difficult to interpret the parameters in the measures, we prefer to show only the Shapley indices (Table 5-23). The Shapley indices make sense: if the measure for class one is used, the Shapley values for the important features, the second and the fifth, are higher than the rest; for the measure for class two, the reverse happens.









The outputs of the Choquet differences are in (Figure 5-7), the confusion matrix is in

(Table 5-19), and the ROC curve is in (Figure 5-21).

5.4.5.3 Case III

This case is the same as the one in (Section 4.6.1). The Shapley indices are in (Table 5-24), the Choquet outputs are in (Figure 5-11), the ROC curve is in (Figure

5-22), and the confusion matrix is in (Table 5-20). In this case, the Shapley indices are

difficult to interpret. Again, it is necessary to produce a better importance index.

5.4.6 Results of the MAP-EM MCE Logistic LASSO over Artificial Data Sets

5.4.6.1 Case I

The same data set as the one in (Section 5.4.3.1) is considered. The confusion matrix is in (Table

5-21), the ROC curve is in (Figure 5-20), and the measure parameter values are in (Table

5-22). The outputs of the Choquet differences are in (Figure 5-8).

5.4.6.2 Case II

This is the same problem as the one in (Section 4.6.1). The outputs of the Choquet

differences are in (Figure 5-9), the confusion matrix is in (Table 5-25), and the ROC

curve is in (Figure 5-21). In this case, the Shapley indices make sense (Table 5-26). If the measure for class one is used, the Shapley values for the important features, the second and the fifth, are higher than the rest. For the measure for class two, they are below the rest.

5.4.6.3 Case III

This case is the same as the one in (Section 4.6.1). The ROC curve is in (Figure

5-22), and the rest of the results are similar to those for the MCE Logistic algorithm.

5.4.7 Conclusions over the Artificial Data Sets

The following conclusions can be inferred from the previous cases:


1. MCE Logistic LASSO is much worse than the Logistic LASSO for case II.

2. The measure parameter values from Logistic LASSO and MAP-EM Logistic LASSO are easier to interpret than the ones from the MCE Logistic methods.

3. From these experiments and the ones in (Section 4.6.1), it seems that MCE methods are not the best way to avoid the problem of desired outputs. However, the MCE Logistic methods seem to be more efficient than the MCE under Sugeno measures.

4. Although the MCE Logistic methods improve over the MCE under Sugeno measures, it is clear that better methods than MCE are needed to avoid desired outputs when learning multiple measures.

5.4.8 Experiments in Landmine Detection Data Sets

Several data sets are considered. They are referred to as BJ2006, BJ2007, and A2007.

BJ2006 and BJ2007 consist of GPR data collected on multiple occasions from three U. S.

locations and one European location. The general descriptions are in (Section 4.6.3). The names given to the algorithms at the different test sites are in (Table 4-12).

5.4.8.1 Case A2007

The ROC curves for the different algorithms and the fusion by Logistic LASSO are in

(Figure 5-23) and a comparison with all the logistic methods and the MCE under Sugeno

measures is in (Figure 5-24). In this case, all the logistic methods are better than the MCE

under the Sugeno measure.

5.4.8.2 Case BJ2007

The ROC curves for the different algorithms and the fusion by Logistic LASSO are in

(Figure 5-25) and a comparison with the MCE under Sugeno measures and all the logistic

methods is in (Figure 5-26). When looking at these results, it seems that MCE and MAP-EM Sparsity perform similarly for this particular case, with MCE under Sugeno close to being the best one. A possible reason is that the algorithms being fused are already the best ones for fusion, which means that any fuser will do a similar job.

5.4.8.3 Case BJ2006

The ROC curves for the different algorithms and the fusion by Logistic LASSO are in

(Figure 5-27) and a comparison with all the logistic methods and the MCE under Sugeno

measures is in (Figure 5-28). In this case, all the logistic methods are better than the









MCE under the Sugeno measure. In addition, MCE-based methods are better than the

Logistic LASSO and the MAP-EM Logistic in this data set.

5.4.9 Conclusions over the Landmine Data Sets

The following conclusions can be inferred from the landmine cases:

1. Depending on the data set, the logistic methods can be superior to MCE under Sugeno measures.

2. Fusing sources of information that are already good does not add much (BJ2007).

3. It is necessary to investigate ways to improve the dissimilarity measure to obtain better class separation for MCE algorithms.

4. We still have problems dealing with the exponential nature of the power set of information sources.










Table 5-1: Confusion matrix for case I using Logistic LASSO.


          class 1   class 2
class 1   1000      0
class 2   0         1000


Table 5-2: Measure parameter values by Logistic LASSO in case I.

Measure              Mean         Standard deviation
µ({x1})              1.6294e-25   2.2733e-24
µ({x2})              2.8931e-33   3.9158e-32
µ({x3})              1.1049e-32   1.8291e-31
µ({x4})              ≈7e-32       1.3706e-30
µ({x3, x4})          2.6878e-28   3.1461e-27
µ({x2, x4})          3.0910e-31   4.1886e-30
µ({x2, x3})          6.4258e-27   8.8310e-26
µ({x1, x4})          1.0534e-24   1.4935e-23
µ({x1, x3})          0.9835       0.0119
µ({x1, x2})          8.4211e-20   1.2990e-18
µ({x1, x2, x3})      0.9975       0.0024
µ({x1, x2, x4})      2.9422e-18   4.1793e-17
µ({x1, x3, x4})      0.9961       0.0039
µ({x2, x3, x4})      4.6740e-23   7.0642e-22
µ(X)                 1.0000       NA

Table 5-3: Shapley indexes for case I by Logistic LASSO.

Shapley
value
1 0.4980900
2 0.0021428
3 0.4980900
4 0.0016679



Table 5-4: Confusion matrix for case II using Logistic LASSO.

C'M class 1 class 2
class 1 1000 0
class 2 0 1000










Table 5-5: Measure parameter values by Logistic LASSO in case II.

Measure                        Mean         Standard deviation
µ({x1})                        1.7208e-49   2.2424e-48
µ({x2, x4})                    8.8076e-46   1.5720e-44
µ({x2, x5})                    0.92906      0.007928
µ({x2, x6})                    4.8835e-48   1.0958e-46
µ({x1, x2, x4})                8.0982e-40   1.0938e-38
µ({x1, x2, x5})                0.9572       0.008481
µ({x1, x2, x6})                3.0630e-41   5.4893e-40
µ({x1, x2, x4, x5, x8})        0.9875       0.0048588
µ({x1, x2, x4, x5, x7})        0.9911       0.0024660
µ({x1, x2, x4, x5, x6})        0.9914       0.0026171
µ(X)                           1.0000       NA


Table 5-6: Shapley indexes for case II by Logistic LASSO.

Shapley
value
1 0.0020323
2 0.,' I I :111111
3 0.0026450
4 0.0020117
5 0.,!' I, : 11
6 0.0012476
7 0.0017010
8 0.0017552


Table 5-7: Confusion matrix for case III using Logistic LASSO.


          class 1   class 2
class 1   1000      0
class 2   0         1000









Table 5-8: Shapley indexes for case III by Logistic LASSO.

Shapley
value
1 0.2527900
2 0.1683600
3 0.1663100
4 0.2531700
5 0.0785320
6 0.0808410

Table 5-9: Confusion matrix for case I using MAP-EM Logistic LASSO.


          class 1   class 2
class 1   1000      0
class 2   0         1000


Table 5-10: Measure parameter values by MAP-EM Logistic LASSO in case I.

Measure              Mean     Standard deviation
µ({x1})              0.0100   0.0255
µ({x2})              0.0000   0.0110
µ({x3})              0.0100   0.0279
µ({x4})              0.0000   0.0120
µ({x3, x4})          0.0600   0.0862
µ({x2, x4})          0.0000   0.0411
µ({x2, x3})          0.0500   0.0951
µ({x1, x4})          0.0100   0.0837
µ({x1, x3})          0.9800   0.0357
µ({x1, x2})          0.0000   0.0760
µ({x1, x2, x3})      1.0000   0.0050
µ({x1, x2, x4})      0.0100   0.2243
µ({x1, x3, x4})      1.0000   0.0403
µ({x2, x3, x4})      0.0100   0.2367
µ(X)                 1.0000   NA


Table 5-11: Shapley indexes for case I by MAP-EM Logistic LASSO.

Shapley
value
1 0.49000
2 1.95160e-18
3 0.50667
4 0.00333









Table 5-12: Confusion matrix for case II using MAP-EM Logistic LASSO.


          class 1   class 2
class 1   1000      0
class 2   0         1000


Table 5-13: Some measure parameter values by MAP-EM Logistic LASSO in case II.

Measure               Mean   Standard deviation
µ({x1})               0.00   5.3652e-07
µ({x2, x4})           0.00   4.3158e-05
µ({x2, x5})           0.93   0.0083167
µ({x2, x6})           0.00   4.6945e-05
µ({x1, x2, x4})       0.00   0.00050468
µ({x1, x2, x5})       0.96   0.00845410
µ({x1, x2, x6})       0.00   0.00034276
µ({x2, x5, x6})       0.94   0.00774640
µ({x2, x5, x7})       0.93   0.01191200
µ({x2, x5, x8})       0.94   0.00900450
µ(X)                  1.00   NA

Table 5-14: Shapley indexes for case II by MAP-EM Logistic LASSO.

Shapley
value
1 0.0033810
2 0.4915200
3 0.0039524
4 0.0032857
5 0.4901000
6 0.0019762
7 0.0030000
8 0.0027857

Table 5-15: Confusion matrix for case III using MAP-EM Logistic LASSO.

C'M class 1 class 2
class 1 1000 0
class 2 0 1000











Table 5-16: Shapley indexes for case III by MAP-EM Logistic LASSO.

Shapley
value
1 0.2478300
2 0.1723300
3 0.1586700
4 0.2558300
5 0.0896670
6 0.0756670




Table 5-17: Confusion matrix for case I using MCE Logistic LASSO.

C'M class 1 class 2
class 1 1000 0
class 2 0 1000




Table 5-18: Measure parameter values by MCE Logistic LASSO in case I.

                     Measure for class 1      Measure for class 2
Measure              Mean     Std. dev.       Mean     Std. dev.
µ({x1})              0.0000   0.0000          0.0018   0.0155
µ({x2})              0.0000   0.0000          0.0000   0.0000
µ({x3})              0.0000   0.0000          0.0000   0.0001
µ({x4})              0.0000   0.0000          0.0000   0.0000
µ({x3, x4})          0.0000   0.0000          0.2650   0.2314
µ({x2, x4})          0.0003   0.0020          0.0000   0.0000
µ({x2, x3})          0.3512   0.2268          0.0938   0.1872
µ({x1, x4})          0.1320   0.2119          0.2980   0.2312
µ({x1, x3})          0.0514   0.1402          0.0170   0.0787
µ({x1, x2})          0.3941   0.1914          0.2980   0.2400
µ({x1, x2, x3})      0.7635   0.1858          0.7980   0.1573
µ({x1, x2, x4})      0.7928   0.1582          0.8295   0.1425
µ({x1, x3, x4})      0.8439   0.1275          0.7889   0.1651
µ({x2, x3, x4})      0.8195   0.1457          0.8211   0.1408
µ(X)                 1.0000   NA              1.0000   NA










Table 5-19: Case II confusion matrix by the MCE Logistic LASSO algorithm.


          class 1   class 2
class 1   710       290
class 2   116       884


Table 5-20: Case III confusion matrix by the MCE Logistic LASSO algorithm.


          class 1   class 2
class 1   1000      0
class 2   26        974


Table 5-21: Case I confusion matrix by the MAP-EM MCE Logistic LASSO algorithm.


            class 1    class 2
class 1     1000       8
class 2     0          992


Table 5-22: Measure parameter values by MAP-EM MCE Logistic LASSO in case I.


Measure (class 1)   Mean     Standard deviation
μ({x1})             0.0010   0.0125
μ({x2})             0.0010   0.0035
μ({x3})             0.0010   0.0139
μ({x4})             0.0010   0.0032
μ({x3,x4})          0.0090   0.0832
μ({x2,x4})          0.0030   0.0065
μ({x2,x3})          0.0310   0.0779
μ({x1,x4})          0.0100   0.0765
μ({x1,x3})          0.0050   0.0742
μ({x1,x2})          0.0580   0.0681
μ({x1,x2,x3})       0.0840   0.2526
μ({x1,x2,x4})       0.0790   0.2467
μ({x1,x3,x4})       0.1870   0.2587
μ({x2,x3,x4})       0.2250   0.2565
μ(X)                1.0000   NA

Measure (class 2)   Mean     Standard deviation
μ({x1})             0.0160   0.0172
μ({x2})             0.0030   0.0034
μ({x3})             0.0147   0.0167
μ({x4})             0.0027   0.0029
μ({x3,x4})          0.0960   0.0819
μ({x2,x4})          0.0078   0.0070
μ({x2,x3})          0.0946   0.0827
μ({x1,x4})          0.0882   0.0776
μ({x1,x3})          0.0859   0.0774
μ({x1,x2})          0.0961   0.0831
μ({x1,x2,x3})       0.3930   0.2437
μ({x1,x2,x4})       0.4797   0.2425
μ({x1,x3,x4})       0.3780   0.2454
μ({x2,x3,x4})       0.4681   –
μ(X)                1.0000   NA










Table 5-23: Case II Shapley indexes by MCE Logistic LASSO.


Source    Shapley value (class 1)    Shapley value (class 2)
1         0.055272                   0.161500
2         0.349770                   0.042336
3         0.056492                   0.143170
4         0.046570                   0.142310
5         0.333170                   0.055509
6         0.053005                   0.153240
7         0.051758                   0.150790
8         0.053964                   0.151140


Table 5-24: Case III Shapley indexes by MCE Logistic LASSO.


Source    Shapley value (class 1)    Shapley value (class 2)
1         0.147600                   0.146300
2         0.228980                   0.192820
3         0.221980                   0.199830
4         0.155220                   0.162430
5         0.231070                   0.219650
6         0.015150                   0.078964


Table 5-25: Confusion matrix for case II using MAP-EM MCE Logistic LASSO.


            class 1    class 2
class 1     –          –
class 2     328        830


Table 5-26: Shapley indexes for case II by MAP-EM MCE Logistic LASSO.


Source    Shapley value (class 1)    Shapley value (class 2)
1         0.048179                   0.145580
2         0.306800                   0.066623
3         0.045593                   0.141850
4         0.053748                   0.136600
5         0.387920                   0.067783
6         0.054574                   0.147220
7         0.051310                   0.146870
8         0.051883                   0.147490






























Figure 5-1: Choquet outputs for the synthetic case I under Logistic LASSO with "o" for class 1 and "x" for class 2.


Figure 5-2: Choquet outputs for the synthetic case II under Logistic LASSO with "o" for class 1 and "x" for class 2.











































Figure 5-3: Choquet outputs for the synthetic case I under MAP-EM Logistic LASSO with "o" for class 1 and "x" for class 2.











































Figure 5-4: Choquet outputs for the synthetic case II under MAP-EM Logistic LASSO with "o" for class 1 and "x" for class 2.


















































Figure 5-5: Choquet outputs for the synthetic case III under MAP-EM Logistic LASSO with "o" for class 1 and "x" for class 2.


Figure 5-6: Choquet outputs for the synthetic case I under MCE Logistic LASSO with "o" for class 1 and "x" for class 2.


Figure 5-7: Choquet outputs for the synthetic case II under MCE Logistic LASSO with "o" for class 1 and "x" for class 2.














Figure 5-8: Choquet outputs for the synthetic case I under MAP-EM MCE Logistic LASSO with "o" for class 1 and "x" for class 2.









































Figure 5-9: Choquet outputs for the synthetic case II under MAP-EM MCE Logistic LASSO with "o" for class 1 and "x" for class 2.


















Figure 5-10: Choquet outputs for the synthetic case III under Logistic LASSO with "o" for class 1 and "x" for class 2.

















Figure 5-11: Choquet outputs for the synthetic case III under MCE Logistic LASSO with "o" for class 1 and "x" for class 2.


Figure 5-12: Traces for the measure parameters learned by the Logistic LASSO for case I. (One trace panel per measure parameter, from the singletons g(x1), g(x2), g(x3), g(x4) through the pairs and triples to g(X).)












Figure 5-13: Distributions for the measure parameters sampled by the Logistic LASSO for case I. (One histogram per measure parameter.)























Figure 5-14: Some of the traces for the measure parameters learned by the Logistic LASSO for case II. (Panels shown include g(x2,x4), g(x2,x5), and g(x2,x7).)


Figure 5-15: Some of the distributions for the measure parameters sampled by the Logistic LASSO for case II. (Panels shown include g(x2,x4) and g(x2,x6).)


Figure 5-16: Traces for the measure parameters learned by the MAP-EM Logistic LASSO for case I.





















Figure 5-17: Distributions for the measure parameters sampled by the MAP-EM Logistic LASSO for case I.




















Figure 5-18: Some of the traces for the measure parameters learned by the MAP-EM Logistic LASSO for case II. (Panels shown include g(x2,x4), g(x2,x5), and g(x2,x7).)
















Figure 5-19: Some distributions for the measure parameters sampled by the MAP-EM Logistic LASSO for case II. (Panels shown include g(x2,x4), g(x2,x5), and g(x2,x6).)


Figure 5-20: ROC curves for different algorithms in case I. (Legend: Logistic LASSO PD/PFA: 100/0.0, 90/0.0; MAP-EM Logistic LASSO PD/PFA: 100/0.0, 90/0.0; MAP-EM MCE Logistic LASSO PD/PFA: 100/0.0, 90/0.0.)


Figure 5-21: ROC curves for different algorithms in case II. (Legend: Logistic LASSO PD/PFA: 100/0.0, 90/0.0; MAP-EM Logistic LASSO PD/PFA: 100/0.0, 90/0.0; MCE Logistic LASSO PD/PFA: 100/98.4, 90/67.8; MAP-EM MCE Logistic LASSO PD/PFA: 100/98.4, 90/67.8; MCE under Sugeno measure PD/PFA: 100/99.8, 90/72.8.)












Figure 5-22: ROC curves for different algorithms in case III. (Legend: Logistic LASSO PD/PFA: 100/0.0, 90/0.0; MAP-EM Logistic LASSO PD/PFA: 100/0.0, 90/0.0; MCE Logistic PD/PFA: 100/1.0, 90/0.0; MAP-EM MCE Logistic PD/PFA: 100/26, 90/0.0; MCE under Sugeno PD/PFA: 100/46.0, 90/10.1.)











Figure 5-23: ROC curves for all the A2007 algorithms and Logistic LASSO. (Legend entries include EHDConf, HMM-DTXT, F1A EHDConf, F1V4, Glconf, HmmConf, Radial, and SCF, each with PD/FAR operating points.)




















Figure 5-24: ROC curves for MCE under Sugeno and the logistic methods in data set A2007. (Legend: Logistic LASSO PD/FAR: 95/0.00158, 90/0.00023, 85/0.00023; MAP-EM Logistic LASSO PD/FAR: 95/0.00246, 90/0.00033, 85/0.00023; MCE Logistic LASSO PD/FAR: 95/0.00339, 90/0.00023, 85/0.00023; MAP-EM MCE Logistic LASSO PD/FAR: 95/0.00209, 90/0.00028, 85/0.00023; MCE under Sugeno PD/FAR: 95/0.00246, 90/0.00042, 85/0.00028.)












































Figure 5-25: ROC curves for all the BJ2007 algorithms and Logistic LASSO. (Legend: EHD PD/FAR: 95/0.00783, 90/0.00236, 85/0.00141; HMM-DTXT PD/FAR: 95/0.00775, 90/0.00307, 85/0.00146; Prescreener PD/FAR: 95/0.01989, 90/0.00600, 85/0.00373; SCF PD/FAR: 95/0.03589, 90/0.01736, 85/0.01034; Logistic LASSO PD/FAR: 95/0.00475, 90/0.00124, 85/0.00059.)
































Figure 5-26: ROC curves for MCE under Sugeno and the logistic methods in data set BJ2007. (Legend: MCE under Sugeno PD/FAR: 95/0.00354, 90/0.00107, 85/0.00049; Logistic LASSO PD/FAR: 95/0.00475, 90/0.00124, 85/0.00059; MAP-EM Logistic PD/FAR: 95/0.00385, 90/0.00105, 85/0.00049; MCE Logistic PD/FAR: 95/0.00336, 90/0.00122, 85/0.00071; MAP-EM MCE Logistic PD/FAR: 95/0.00380, 90/0.00151, 85/0.00080.)














































Figure 5-27: ROC curves for all the BJ2006 algorithms and Logistic LASSO. (Legend: EHD PD/FAR: 95/0.00783, 90/0.00236, 85/0.00141; GFIT PD/FAR: 95/0.02540, 90/0.00639, 85/0.00363; ROCA PD/FAR: 95/0.01768, 90/0.00507, 85/0.00258; SCF PD/FAR: 95/0.03589, 90/0.01736, 85/0.01034; TFCM PD/FAR: 95/0.02677, 90/0.00529, 85/0.00300; Logistic LASSO PD/FAR: 95/0.00880, 90/0.00202, 85/0.00102.)
















Figure 5-28: ROC curves for MCE under Sugeno and the logistic methods in data set BJ2006. (Legend: MCE under Sugeno PD/FAR: 95/0.00519, 90/0.00168, 85/0.00105; MAP-EM Logistic LASSO PD/FAR: 95/0.00875, 90/0.00202, 85/0.00095; MAP-EM MCE Logistic PD/FAR: 95/0.00363, 90/0.00137, 85/0.00078; MCE Logistic LASSO PD/FAR: 95/0.00397, 90/0.00141, 85/0.0007; Logistic LASSO PD/FAR: 95/0.00880, 90/0.00202, 85/0.00102.)

CHAPTER 6
CONCLUSION

Learning fuzzy measures has become an important task in recent years for the development of information fusion algorithms. In this dissertation, we addressed several problems in the use of the Choquet integral as an aggregation operator for information fusion.

For minimum classification error training under the Sugeno λ-measure, we developed a gradient descent algorithm that eliminates the need for desired outputs. In addition, it enforces the monotonicity properties of the Sugeno λ-measure. This new algorithm reduces the time complexity of the classic LSE method by Grabisch through the use of the Sugeno λ-measure and the elimination of a matrix of constraints. Other problems with Grabisch's method are that it does not handle general measures and does not provide a way to eliminate unnecessary sources of information.
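
As a concrete illustration of the point about the constraint matrix: a Sugeno λ-measure is fully determined by its densities g_i, because λ follows from the normalization condition λ + 1 = ∏_i (1 + λ g_i) and every subset value then follows from λ and the densities, so no explicit monotonicity constraints are required. The short Python sketch below shows this construction; it is only an illustration under the stated assumptions (densities in (0, 1), scipy's brentq as the root finder) and not the implementation used in this work.

    import numpy as np
    from scipy.optimize import brentq

    def sugeno_lambda(densities):
        """Solve lambda + 1 = prod_i (1 + lambda * g_i) for the nonzero root."""
        g = np.asarray(densities, dtype=float)        # assumed to lie in (0, 1)
        if np.isclose(g.sum(), 1.0):                  # densities summing to 1 give lambda = 0
            return 0.0
        f = lambda lam: np.prod(1.0 + lam * g) - (1.0 + lam)
        # sum(g) < 1 implies lambda > 0; sum(g) > 1 implies -1 < lambda < 0
        return brentq(f, 1e-6, 1e6) if g.sum() < 1 else brentq(f, -1.0 + 1e-12, -1e-6)

    def lambda_measure(subset, densities, lam):
        """g_lambda(A) = (prod_{i in A}(1 + lam*g_i) - 1) / lam, with the lam -> 0 limit sum(g_i)."""
        g = np.asarray(densities, dtype=float)[list(subset)]
        return g.sum() if np.isclose(lam, 0.0) else (np.prod(1.0 + lam * g) - 1.0) / lam

    g = [0.2, 0.3, 0.4]                               # made-up densities for three sources
    lam = sugeno_lambda(g)
    print(lam, lambda_measure([0, 1], g, lam), lambda_measure([0, 1, 2], g, lam))  # last value is 1.0

Because monotonicity is automatic for any λ-measure, the gradient descent never has to enforce the lattice constraints explicitly, which is one reading of why this formulation avoids the constraint matrix used in the general-measure LSE method.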

In response to this, we developed an algorithm based on the Gibbs sampler under an LSE criterion, built on a Bayesian hierarchical framework. Using this framework, we were able to introduce a sparsity term into the learning structure for each of the parameters in the fuzzy measure. This algorithm was improved by using a combination of MAP and EM Gibbs steps. A problem with these algorithms is that they still depend on a series of desired outputs.

Building on the LSE sparsity model under a naive sampler, we developed a maximum a posteriori EM algorithm based on Gibbs and slice samplers to improve over the naive sampler. This algorithm still depends on desired outputs, but it improves the sparsity and regularization of the measure parameters.

To address this problem, we developed several versions of minimum classification error training for the probabilistic framework by using logistic regression as the learning function. In addition, we were able to solve the training of a single measure without desired outputs. These new algorithms eliminate the need for desired outputs for problems with one or two measures. They can be considered the probabilistic version of the sparsity method by Tibshirani under the logistic learning function. It was shown that these new algorithms are more efficient than the MCE gradient descent algorithm.









APPENDIX A
DERIVATION OF THE QUADRATIC PROGRAMMING WITH RESPECT TO THE
GENERAL MEASURE

Given the following quadratic cost function:

E^2 = \sum_{x \in C^1} \big( C_\mu(f(x)) - \alpha_1 \big)^2 + \cdots + \sum_{x \in C^n} \big( C_\mu(f(x)) - \alpha_n \big)^2,    (A-1)

and the definition of the Choquet integral (1-111), we have that

E^2 = \sum_{x \in C^1} \Big( \sum_{i=1}^{n} \mu(A_{(i)}) \big( f(x_{(i)}) - f(x_{(i-1)}) \big) - \alpha_1 \Big)^2 + \cdots
    + \sum_{x \in C^n} \Big( \sum_{i=1}^{n} \mu(A_{(i)}) \big( f(x_{(i)}) - f(x_{(i-1)}) \big) - \alpha_n \Big)^2.    (A-2)

This last equation can be simplified by making the following substitutions:

u = (\mu_1, \mu_2, \ldots, \mu_{123\ldots n-1}, \mu_{123\ldots n})^t,    (A-3)

\Gamma_x = (0, \ldots, f(x_{(1)}) - 0, \ldots, f(x_{(n)}) - f(x_{(n-1)}), \ldots, 0)^t,    (A-4)

where u is the mapping of the lattice 2^X into a vector, and \Gamma_x is the vector of differences in which only n positions are different from zero. Thus, we have

E^2 = \sum_{x \in C^1} (\Gamma_x^t u - \alpha_1)^2 + \cdots + \sum_{x \in C^n} (\Gamma_x^t u - \alpha_n)^2
    = \sum_{x \in C^1} \big( u^t \Gamma_x \Gamma_x^t u - 2\alpha_1 \Gamma_x^t u + \alpha_1^2 \big) + \cdots
    + \sum_{x \in C^n} \big( u^t \Gamma_x \Gamma_x^t u - 2\alpha_n \Gamma_x^t u + \alpha_n^2 \big),

which is equivalent to

E^2 = u^t D u - 2\Gamma^t u + \alpha^2,    (A-5)

where

D = \sum_{x \in C^1} \Gamma_x \Gamma_x^t + \cdots + \sum_{x \in C^n} \Gamma_x \Gamma_x^t,    (A-6)

\Gamma = \sum_{x \in C^1} \alpha_1 \Gamma_x + \cdots + \sum_{x \in C^n} \alpha_n \Gamma_x,    (A-7)

\alpha^2 = \sum_{x \in C^1} \alpha_1^2 + \cdots + \sum_{x \in C^n} \alpha_n^2.    (A-8)










Now, the constraints can be written down as follows:

\mu_1 - \mu_{12} \le 0,
\mu_1 - \mu_{13} \le 0,
  \vdots
\mu_1 - \mu_{123\ldots n} \le 0,
  \vdots
\mu_{123\ldots n-1} - \mu_{123\ldots n} \le 0,

which can be written in matrix notation as

A u = \begin{pmatrix}
  1 & -1 &  0 & \cdots & 0 \\
  1 &  0 & -1 & \cdots & 0 \\
  \vdots & & & \ddots & \vdots \\
  0 & \cdots & 0 & 1 & -1
\end{pmatrix}
\begin{pmatrix} \mu_1 \\ \mu_{12} \\ \mu_{13} \\ \vdots \\ \mu_{123\ldots n-1} \\ \mu_{123\ldots n} \end{pmatrix} \le \mathbf{0}.    (A-9)

This matrix inequality (A-9) has three terms: A, a matrix of zeros, ones, and minus ones in which each row encodes one monotonicity constraint; u, the vector of measures; and \mathbf{0}, a vector of zeros. From Equations (A-5) and (A-9) we obtain the following optimization under constraints:

\min_{u} \; \tfrac{1}{2}\, u^t D u - \Gamma^t u, \qquad \text{s.t. } A u + b \ge 0,    (A-10)

where the inequalities of (A-9) are multiplied by -1 to put them in the form A u + b \ge 0 with b = \mathbf{0}.
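
For concreteness, the following Python sketch assembles \Gamma_x, D, and \Gamma for a toy three-source problem and solves the resulting constrained quadratic program. It is only an illustrative sketch: the subset ordering, the helper names, the target outputs α, and the choice of scipy's SLSQP solver are assumptions made here, not the procedure used elsewhere in this work.

    import numpy as np
    from itertools import combinations
    from scipy.optimize import minimize

    N = 3  # toy number of sources
    subsets = [frozenset(s) for r in range(1, N + 1) for s in combinations(range(N), r)]
    idx = {s: k for k, s in enumerate(subsets)}       # column of each subset in u

    def gamma(f):
        """Difference vector Gamma_x of Eq. (A-4) for one confidence vector f."""
        g = np.zeros(len(subsets))
        prev, remaining = 0.0, set(range(N))
        for i in np.argsort(f):                       # f(x_(1)) <= ... <= f(x_(n))
            g[idx[frozenset(remaining)]] = f[i] - prev
            prev = f[i]
            remaining.discard(i)
        return g

    def learn_measure(F, y, alphas=(0.0, 1.0)):
        """Minimize (1/2) u^t D u - Gamma^t u of Eq. (A-10) under monotonicity."""
        D = np.zeros((len(subsets), len(subsets)))
        G = np.zeros(len(subsets))
        for f, c in zip(F, y):
            gx = gamma(f)
            D += np.outer(gx, gx)                     # Eq. (A-6)
            G += alphas[c] * gx                       # Eq. (A-7)
        cons = [{'type': 'ineq', 'fun': lambda u, a=idx[s], b=idx[t]: u[b] - u[a]}
                for s in subsets for t in subsets if s < t and len(t) == len(s) + 1]
        cons.append({'type': 'eq', 'fun': lambda u: u[idx[frozenset(range(N))]] - 1.0})
        u0 = np.array([len(s) / N for s in subsets])  # additive measure as a starting point
        res = minimize(lambda u: 0.5 * u @ D @ u - G @ u, u0, method='SLSQP',
                       bounds=[(0.0, 1.0)] * len(subsets), constraints=cons)
        return {tuple(sorted(s)): round(v, 3) for s, v in zip(subsets, res.x)}

    # made-up data: sources 0 and 1 track the label, source 2 is noise
    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, 200)
    F = np.column_stack([y + 0.1 * rng.standard_normal(200),
                         y + 0.1 * rng.standard_normal(200),
                         rng.random(200)]).clip(0, 1)
    print(learn_measure(F, y))

Note that the LSE objective itself has no mechanism for eliminating the irrelevant third source; that is the role of the sparsity-promoting priors developed in the main chapters.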









APPENDIX B
RELATION BETWEEN DIFFERENT CHOQUET ALGORITHMS


Table B-1: Different characteristics for Choquet algorithms.

Algorithm                   Measure type       Cost function structure                        Optimization method
LSE                         General measures   Sum of L2 norms                                Different possible optimization methods
MCE under Sugeno λ-measure  λ-measures         Sum of sigmoid functions with                  Different possible optimization methods
                                               dissimilarity measures
LSE Sparsity                General measures   Bayesian hierarchy for L2 norms                Naive sampler
MAP-EM Sparsity             General measures   Bayesian hierarchy for L2 and L1 norms         Gibbs sampler and slice sampler
                                               and simulated Laplacian
Logistic LASSO              General measures   Bayesian hierarchy for a logistic              Gibbs sampler and slice sampler
                                               regression with LASSO
MAP-EM Logistic LASSO       General measures   Bayesian hierarchy for a logistic              Gibbs sampler and slice sampler
                                               regression with simulated Laplacian
MCE Logistic LASSO          General measures   Bayesian hierarchy for a logistic              Gibbs sampler and slice sampler
                                               regression with MCE and LASSO
MAP-EM MCE Logistic LASSO   General measures   Bayesian hierarchy for a logistic              Gibbs sampler and slice sampler
                                               regression with MCE and simulated Laplacian























































Figure B-1: Relations between the different Choquet algorithms.









APPENDIX C
DIFFERENCES BETWEEN THE BAYESIAN SPARSITY METHODS AND THE
FEATURE SELECTION UNDER MUTUAL INFORMATION

In this appendix, we compare the Maximum Output Information (MOI) algorithms [37]
and the new algorithms proposed in this dissertation.

Before we list the possible differences between MOI and the Bayesian sparsity methods, it
is useful to list the common characteristics of each collection. The Bayesian sparsity
methods share the following characteristics:

1. The Choquet integral and fuzzy measures are used to define general aggregation
functions for decision level fusion.

2. A Bayesian hierarchical model that depends on some type of learning function for
the data is used.

3. A set of sparsity promoting distributions is chosen for the parameter values of μ.

4. Gibbs and slice samplers solve the problem sequentially.
In the case of MOI methods, we have:

1. SVM's + MLP's are used in classification problems.

2. A series of weights are defined by mutual information estimations over the confusion
matrix, geared toward maximizing the mutual information between the desired
outputs and the estimated algorithm outputs.

3. A measure of importance for each feature is defined in the case of SVM's by partial
derivatives over the margin cost function. And in the case of the MLP, it is defined
by recursion on the weights of the neural network.

4. A heuristic selects from the features after each training of the SVM's and MLP's and
the calculation of the importance measure for each feature.

If we compare the two lists, we can see the most obvious differences:

1. MOI methods are based on heuristics applied to SVMs and MLPs, seeking to
increase the mutual information in order to reduce the probability of error in classification
problems. Instead, Bayesian sparsity methods are based on a Bayesian hierarchy over
the parameter values of a fuzzy measure for a Choquet integral. In addition, the
Bayesian sparsity methods are geared towards implementing some type of learning
function, which is solved by using Gibbs samplers.

2. SVMs and neural networks are used for classification problems. Bayesian sparsity
methods are designed to solve decision level fusion problems.

3. In MOI, feature selection is done by greedy selection and direct search. In Bayesian
sparsity models, algorithm selection is based on embedding a Laplacian distribution
on the parameters of the fuzzy measure, and is performed automatically by the
Gibbs sampler.

These are the reasons why the two collections of algorithms are quite different. In addition,
instead of going through a complex heuristic to maximize the mutual information between
the desired outputs and the estimated outputs, the mutual information can be calculated
directly in the Bayesian sparsity methods. For example, we could calculate the mutual
information I(y; ŷ) for Logistic LASSO (Appendix D).








APPENDIX D
MAXIMIZING MUTUAL INFORMATION IN LOGISTIC LASSO
Given the Bayesian hierarchy (Model 5-6), we have the following probability
distribution for the random variable y:

p(\mathbf{y} \mid \mathbf{f}, \mu) = \prod_{j} \left[ \frac{\exp\{T\, C_\mu(f^j)\}}{1 + \exp\{T\, C_\mu(f^j)\}} \right]^{y_j} \left[ \frac{\exp\{-T\, C_\mu(f^j)\}}{1 + \exp\{-T\, C_\mu(f^j)\}} \right]^{1-y_j}.    (D-1)

Therefore, we have that ŷ = E(y | f, μ). As can be seen in Sindhwani et al. [37] and
in Fano's inequality, we would like to maximize I(y; ŷ) with respect to μ. For this, it is
necessary to define the joint probability of y and ŷ. This can be done by observing that

p(y, ŷ) = p(y | ŷ) p(E(y | f, μ)) = p(y | ŷ, f, μ) p(E(y | f, μ)) = K p(y | f, μ),    (D-2)

with p(E(y | f, μ)) = K. Now, we can write the mutual information as

I(y; ŷ) = H(y) − H(y | ŷ).    (D-3)
Since we do not have control over the entropy of y [7], we are forced to minimize
H(y | ŷ). We have the following cases:

1. If y and ŷ are independent:

I(y; ŷ) = 0 and H(y) − 1 ≤ Pe,    (D-4)

because y is a Bernoulli random variable, its space of possible values is Y = {0, 1},
so log_2 |Y| = 1. Therefore, we do not have any way to control Pe. Thus, we are only
interested in the case when y and ŷ are not independent.

2. If y and ŷ are not independent:

I(y; ŷ) = H(y) − H(y | ŷ) and H(y | ŷ) − 1 ≤ Pe.    (D-5)

Thus, it is necessary to minimize H(y | ŷ) in order to increase the mutual information
between y and ŷ and to lower the bound on Pe. We know that

H(y | ŷ) = −E(log p(y | f, μ)).    (D-6)

Therefore, we need to maximize p(y | f, μ) = p(y | ŷ), which is the probability of the
Bernoulli random variable y.
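
This can be checked numerically. The sketch below evaluates the Bernoulli likelihood of Eq. (D-1) for two hand-built measures on synthetic confidences and reports -mean log p(y | f, μ), the empirical counterpart of H(y | ŷ) in Eq. (D-6). The measure values, the temperature T, and the data are made-up illustrations; only the Choquet integral and the logistic link follow the definitions above.

    import numpy as np

    def choquet(f, mu):
        """Discrete Choquet integral of f with respect to the measure mu
        (a dict from sorted index tuples to values in [0, 1])."""
        remaining = list(range(len(f)))
        prev = out = 0.0
        for i in np.argsort(f):                        # ascending order of confidences
            out += mu[tuple(sorted(remaining))] * (f[i] - prev)
            prev = f[i]
            remaining.remove(i)
        return out

    def cond_entropy_estimate(F, y, mu, T=10.0):
        """-mean log p(y | f, mu) from Eq. (D-1): the empirical version of Eq. (D-6)."""
        c = np.array([choquet(f, mu) for f in F])
        p1 = 1.0 / (1.0 + np.exp(-T * c))              # logistic link on the Choquet output
        return -np.mean(y * np.log(p1) + (1 - y) * np.log(1 - p1))

    rng = np.random.default_rng(1)
    y = rng.integers(0, 2, 500)
    F = np.column_stack([0.05 + 0.9 * y, rng.random(500)])   # source 0 informative, source 1 noise
    informative = {(0,): 0.9, (1,): 0.0, (0, 1): 1.0}
    noisy = {(0,): 0.0, (1,): 0.9, (0, 1): 1.0}
    for name, mu in [('informative', informative), ('noisy', noisy)]:
        print(name, round(cond_entropy_estimate(F, y, mu), 3))

The measure that concentrates its mass on the informative source yields the smaller estimate of H(y | ŷ) and therefore the larger I(y; ŷ) = H(y) − H(y | ŷ), which is exactly the behavior the sampler exploits when it drifts toward high-probability regions of p(y | f, μ).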









Finally, the maximization of p(y | f, μ) is carried out by the Gibbs sampler because it is
attracted to regions of high probability. This means that the Gibbs sampler forces the
Choquet integral C_μ to separate the confidence samples.

Therefore, Logistic LASSO maximizes the mutual information between y and ŷ. In
addition to this feature, algorithm selection is performed automatically by assuming a
Laplacian prior distribution for each of the μ's. This avoids the use of subset selection by
an external algorithm to decide which collection of algorithms is important for the fusion
task.

A similar argument can be made for the rest of the Bayesian sparsity methods. Still,
we need to stress that the maximization of I(y; ŷ) depends on the learning function used
in the Bayesian sparsity model.









LIST OF REFERENCES


[1] G. Casella and R. L. Berger, Statistical Inference, 2nd ed. Pacific Grove: Duxbury
Press, 2002.

[2] C. P. Robert and G. Casella, Monte Carlo Statistical Methods (Springer Texts in
Statistics). Secaucus, NJ, USA: Springer-Verlag New York, Inc., 2005.

[3] P. Komarek, "Logistic regression for data mining and high-dimensional
classification," Ph.D. dissertation, Carnegie Mellon University, 2004.

[4] B.-H. Juang, W. Chou, and C.-H. Lee, "Minimum classification error rate methods for speech recognition," IEEE Trans. Speech Audio Process., vol. 5, pp. 257-266, May 1997.

[5] S. Katagiri, B.-H. Juang, and C.-H. Lee, "Pattern recognition using a family of
design algorithms based upon the generalized probabilistic descent method," Proc.
IEEE, vol. 86, pp. 2345-2372, November 1998.

[6] M. G. Rahim, B.-H. Juang, and C.-H. Lee, "Discriminative utterance verification
for connected digit recognition," IEEE Trans. Speech Audio Process., vol. 5, pp.
266-277, May 1997.

[7] T. M. Cover and J. A. Thomas, Elements of Information Theory. New York, NY, USA: Wiley-Interscience, 1991.

[8] M. Grabisch, "A new algorithm for identifying fuzzy measures and its application to pattern recognition," in Fourth IEEE International Conference on Fuzzy Systems, Yokohama, Japan, March 1995, pp. 145-150.

[9] S. Kim, H. R. Tizhoosh, and M. Kamel, "Choquet integral-based aggregation of image template matching algorithms," in Fuzzy Information Processing Society, 2003, NAFIPS 2003, 22nd International Conference of the North American, July 2003, pp. 143-148.

[10] G. Choquet, "Theory of capacities," Annales de l'Institut Fourier, 1955, vol 5, pp
131-295.

[11] M. Grabisch, T. Murofushi, and M. Sugeno, Fuzzy Measures and Integrals: Theory and Applications, ser. Studies in Fuzziness and Soft Computing. Heidelberg: Physica-Verlag, 2000.









[12] Z. Wang and G. J. Klir, Fuzzy Measure Theory. Norwell, MA, USA: Kluwer Academic Publishers, 1993.

[13] S. Auephanwirayakul, J. Keller, and P. D. Gader, "Generalized Choquet fuzzy
integral fusion," Information Fusion, vol. 3, no. 1, pp. 69-85, 2002.

[14] M. Capinski and P. E. Kopp, Measure, Integral and Probability. New York: Springer, 1999.

[15] R. L. Wheeden and A. Zygmund, Measure and Integral: An Introduction to Real Analysis, ser. Monographs and Textbooks in Pure and Applied Mathematics 43. New York-Basel: Marcel Dekker, Inc., 1977.

[16] W. Rudin, Real & Complex Analysis, 3rd ed. New York, NY: McGraw-Hill, Inc., 1987.

[17] W. Rudin, Principles of Mathematical Analysis (International Series in Pure & Applied Mathematics). New York: McGraw-Hill Publishing Co., September 1976.

[18] A. Papoulis and U. S. Pillai, Probability, Random Variables and Stochastic Processes. McGraw-Hill Science/Engineering/Math, December 2001.

[19] H. Raiffa and R. Schlaifer, Applied Statistical Decision Theory. Cambridge: Division of Research, Graduate School of Business Administration, Harvard University, 1961.

[20] H. Jeffreys, "An invariant form for the prior probability in estimation problems," Proceedings of the Royal Society of London, Series A, vol. 186, no. 1007, pp. 453-461, 1946.

[21] R. O. Duda, P. E. Hart, and D. G. Stork, Pattern Classification. New York: Wiley-Interscience Publication, 2000.

[22] S. Kirkpatrick, C. D. Gelatt, and M. P. Vecchi, "Optimization by simulated
annealing," Science, Number 4598, 13 May 1983, vol. 220, 4598, pp. 671-680,
1983.

[23] E. Aarts and J. Korst, Simulated Annealing and Boltzmann Machines: A Stochastic
Approach to Combinatorial Optimization and Neural Cori,,T',.:.' New York, NY,
USA: John Wiley & Sons, Inc., 1989.









[24] A. P. Dempster, N. M. Laird, and D. B. Rubin, "Maximum likelihood from incomplete data via the EM algorithm," Journal of the Royal Statistical Society, vol. B39, pp. 1-38, 1977.

[25] S. Borman, "The Expectation Maximization algorithm. A short tutorial," University
of Notre Dame, Tech. Rep., July 2004, this report introduces the expectation
maximization algorithm including a proof of convergence.

[26] S. Chib and E. Greenberg, "Understanding the metropolis-hastings algorithm,"
American Statistician, no. 4, pp. 327-335, Nov 1995.

[27] G. Casella and E. I. George, "Explaining the Gibbs sampler," Journal of the
American Statistical Association, vol. 46, no. 3, pp. 167-174, Aug. 1992.

[28] J. Bilmes, "A gentle tutorial on the EM algorithm and its application to parameter
estimation for Gaussian mixture and hidden Markov models," University of Berkeley,
Tech. Rep., 1997, CSI-TR-97-021.

[29] G. C. G. Wei and M. A. Tanner, "A Monte Carlo implementation of the EM
algorithm and the poor man's data augmentation algorithms," Journal of the
American Statistical Association, Theory and Methods, vol. 85, no. 411, pp. 699-704,
September 1990.

[30] B. Walsh, "Markov chain Monte Carlo and Gibbs sampling," 2004.

[31] N. Metropolis, A. Rosenbluth, M. Rosenbluth, A. H. Teller, and E. Teller, "Equations of state calculations by fast computing machines," Journal of Chemical Physics, vol. 21, pp. 1087-1092, 1953.

[32] W. K. Hastings, "Monte Carlo sampling methods using Markov chains and their
applications," Biometrika, vol. 57, pp. 97-109, 1970.

[33] S. Geman and D. Geman, "Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 6, no. 6, pp. 721-741, November 1984.

[34] W. R. Gilks, N. G. Best, and K. K. C. Tan, "Adaptive rejection Metropolis sampling
within Gibbs sampling," Applied Statistics, vol. 44, no. 4, pp. 455-472, 1995.









[35] P. Domingos, "A unified bias-variance decomposition and its applications," in Proc.
17th International Conf. on Machine Learning. Morgan Kaufmann, San Francisco,
CA, 2000, pp. 231-238.

[36] R. Kohavi and D. Sommerfield, "Feature subset selection using the wrapper method: Overfitting and dynamic search space topology," in KDD, 1995, pp. 192-197.

[37] V. Sindhwani, S. Rakshit, D. Deodhare, D. Erdogmus, J. Principe, and P. Niyogi, "Feature selection in MLPs and SVMs based on maximum output information," IEEE Transactions on Neural Networks, vol. 15, no. 4, pp. 937-948, 2004.

[38] R. Tibshirani, "Regression shrinkage and selection via the lasso," Journal of the Royal Statistical Society, Series B (Statistical Methodology), vol. 58, no. 1, pp. 267-288, 1996.

[39] P. E. Gill, W. Murray, and M. H. Wright, Practical Optimization. Burlington, MA:
Academic Press, 1981.

[40] M. A. T. Figueiredo, "Adaptive sparseness for supervised learning," IEEE Trans. Pattern Anal. Mach. Intell., vol. 25, no. 9, pp. 1150-1159, 2003.

[41] C. Gv-i~n. r and G. Lanckriet, "C(' i :teristics in flight data estimation with
logistic regression and support vector machines," in Proceedings of the 1st Inter-
national Conference on Research in Air Transportation (ICRAT), Zilina, Slovakia,
2004.

[42] J. Friedman, T. Hastie, and R. Tibshirani, "Additive logistic regression: a statistical view of boosting," 1998.

[43] J. Cramer, "The origins of logistic regression," Tinbergen Institute,
Tinbergen Institute Discussion Papers 02-119/4, Dec. 2002, available at
http://ideas.repec.org/p/dgr/uvatin/20020119.html.

[44] Y. So, "A tutorial on logistic regression," 1999.

[45] C. Wen-Tsong and P. Gader, "Word level discriminative training for handwritten word recognition," in Proceedings of the Seventh International Workshop on Frontiers in Handwritten Recognition, September 2000, pp. 393-402.









[46] H. Mizutani, "Discriminative learning for error and minimum reject classification,"
in Fourteenth International Conference in Pattern Recognition, vol. 1, 1998, pp.
136-140.

[47] S. Theodoridis and K. Koutroumbas, Pattern Recognition, Third Edition. New
York: Academic Press, February 2006.

[48] D. Xu and J. Principe, "Learning from examples with quadratic mutual information," in IEEE Neural Networks for Signal Processing, Proceedings of the IEEE Workshop, 1998, pp. 155-164.

[49] D. Erdogmus and J. C. Principe, "Lower and upper bounds for misclassification probability based on Renyi's information," J. VLSI Signal Process. Syst., vol. 37, no. 2-3, pp. 305-317, 2004.

[50] M. Grabisch and M. Sugeno, "Multi-attribute classification using fuzzy integral," in IEEE International Conference on Fuzzy Systems, March 1992, pp. 47-54.

[51] M. Sugeno, "Theory of fuzzy integrals and its applications," Doctoral Thesis, Tokyo
Institute of Technology, Tokyo, Japan, 1974.

[52] P. Gader, W.-H. Lee, and A. Mendez-Vasquez, "Continuous Choquet integrals with respect to random sets with applications to landmine detection," in Proceedings of the IEEE International Conference on Fuzzy Systems, 2004, vol. 1, Budapest, Hungary, July 2004, pp. 523-528.

[53] A. P. Dempster, "A generalization of Bayesian inference," Journal of the Royal Statistical Society (Series B), vol. 2, no. 30, pp. 205-247, 1968.

[54] G. Shafer, A Mathematical Theory of Evidence. Princeton, NJ: University Press,
1976.

[55] R. R. Yager, "On ordered weighted averaging aggregation operators in multicriteria decisionmaking," IEEE Trans. Syst. Man Cybern., vol. 18, no. 1, pp. 183-190, 1988.

[56] K. Leszczynski, P. Penczek, and W. Grochulski, "Sugeno's fuzzy measure and fuzzy clustering," Fuzzy Sets and Systems, vol. 15, no. 2, pp. 147-158, March 1985.

[57] H. Tahani and J. Keller, "Information fusion in computer vision using the fuzzy integral," IEEE Transactions on Systems, Man and Cybernetics, vol. 20, no. 3, pp. 733-741, 1990; reprinted as an appendix to G. Klir and Z. Wang, Fuzzy Measure Theory, Plenum Press, 1992.

[58] J. M. Keller, P. D. Gader, H. Tahani, J. H. Chiang, and M. Mohamed, "Advances in fuzzy integration for pattern recognition," Fuzzy Sets and Systems, vol. 65, no. 1, pp. 273-283, 1994.

[59] M. Grabisch, "Fuzzy integral for classification and feature extraction," in Fuzzy Measures and Integrals: Theory and Applications, M. Grabisch, T. Murofushi, and M. Sugeno, Eds. Heidelberg: Physica-Verlag, 2000, pp. 348-374.

[60] J. Wang and Z. Wang, "Using neural networks to determine sugeno measures by
statistics," Neural Netw., vol. 10, no. 1, pp. 183-195, 1997.

[61] P. Gader, A. Mendez-Vasquez, K. Chamberlin, J. Bolton, and A. Zare, "Multi-sensor and algorithm fusion with the Choquet integral: applications to landmine detection," in 2004 IEEE International Geoscience and Remote Sensing Symposium, IGARSS '04, Proceedings, vol. 3, September 2004, pp. 1605-1608.

[62] A. Mendez-Vazquez, P. Gader, J. M. Keller, and K. Chamberlin, "Minimum classification error training for Choquet integrals with applications to landmine detection," January 2007, to be published.

[63] M. Grabisch, "Modelling data by the Choquet integral," in Information Fusion in Data Mining, V. Torra, Ed. Heidelberg: Physica-Verlag, 2003, pp. 135-148.

[64] M. Grabisch, "k-order additive discrete fuzzy measures and their representation," Fuzzy Sets and Systems, vol. 92, pp. 167-189, December 1997.

[65] Z. Wang and G. J. Klir, Fuzzy Measure Theory. Norwell, MA, USA: Kluwer Academic Publishers, 1993.

[66] M. Grabisch, H. Nguyen, and E. Walker, Fundamentals of Uncertainty Calculi, with Applications to Fuzzy Inference. Dordrecht: Kluwer Academic Publishers, 1995.

[67] G. A ior and T. E, "On the representation of some aggregation functions," in Proceedings of ISMVL, 1986, pp. 111-114.









[68] S. Ovchinnikov, Aggregation and Fusion of Imperfect Information (Studies in Fuzziness and Soft Computing, 12). Lavoisier, 1997, ch. Aggregation Operators for Fusion under Fuzziness, pp. 3-10.

[69] M. R. and K. M., "Aggregation operators," in Proceedings of the XI Conference on Applied Mathematics PRIM '96, S. K. Herceg D., Ed. Institute of Mathematics, Novi Sad, 1997, pp. 193-211.

[70] M. Detyniecki, "Mathematical aggregation operators and their application to video querying," Ph.D. dissertation, Laboratoire d'Informatique de Paris, 2001.

[71] L. S. Shapley., "A value for n-person games," in Contributions to the Theory of
Games Volume II, ser. Annals of Mathematical Studies, H. Kuhn and A. Tucker,
Eds. Princeton, NJ: Princeton University Press, 1953, vol. 28, pp. 307-317.

[72] D. Hall and J. LIinas, "An introduction to multisensor data fusion," in In: Proceed-
ings of the IEEE, vol. 85(1), 1997, pp. 6-23.

[73] L. A. Klein, Sensor and Data Fusion Concepts and Applications. Bellingham, WA,
USA: Society of Photo-Optical Instrumentation Engineers (SPIE), 1999.

[74] H. Li, B. S. Manjunath, and S. K. Mitra, "Multisensor image fusion using the wavelet transform," Graphical Models and Image Processing, vol. 57, no. 3, pp. 235-245, 1995.

[75] M. Mangolini, "Apport de la fusion d'images satellitaires multicapteurs au niveau pixel en télédétection et photo-interprétation," Ph.D. dissertation, Université Nice Sophia Antipolis, France, 1994, 174 p.

[76] L. Wald, "Definitions and terms of reference in data fusion," International Archives of Photogrammetry and Remote Sensing, vol. 32, part 7-4-3 W6, Valladolid, Spain, pp. 2-6, June 1999.

[77] C. Pohl and J. L. V. Genderen, "Multisensor image fusion in remote sensing: concepts, methods and applications," International Journal of Remote Sensing, vol. 19, no. 5, pp. 823-854, March 1998.

[78] A. H. Gunatilaka and B. A. Baertlein, "Feature-level and decision-level fusion of noncoincidently sampled sensors for landmine detection," IEEE Trans. Pattern Anal. Mach. Intell., vol. 23, no. 6, pp. 577-589, 2001.









[79] U. Handmann, G. Lorenz, and W. von Seelen, "Fusion von basisalgorithmen
zur segmentierung von strasenverkehrsszenen," in DAGM-Symposium, 1998, pp.
101-108.

[80] G. T. Marika, "Fusion of texture measures for urban area characterization."

[81] E. den Breejen, K. Schutte, and F. Cremer, "Sensor fusion for anti-personal
landmine detection, a case study," 1999.

[82] P. J. Prodanov and A. Drygajlo, "Bayesian networks based multi-modality fusion for error handling in human-robot dialogues under noisy conditions," Speech Communication, vol. 45, no. 3, pp. 231-248, 2005.

[83] A. Noore, R. Singh, and M. Vatsa, "Robust memory-efficient data level information
fusion of multimodal biometric images," Information Fusion, vol. 8, pp. 337-346,
October 2007.

[84] A. Pinz, M. Prantl, H. Ganster, and H. Kopp-Borotschnig, "Active fusion: a new method applied to remote sensing image interpretation," Pattern Recogn. Lett., vol. 17, no. 13, pp. 1349-1359, 1996.

[85] V. Pavlovic, "Dynamic Bayesian networks for information fusion with application to human-computer interfaces," Ph.D. dissertation, University of Illinois at Urbana-Champaign, January 1999.

[86] Y. Zhang, Q. Ji, and C. Looney, "Active information fusion for decision making under uncertainty," in Proceedings of the Fifth International Conference on Information Fusion, 2002, vol. 1, 2002, pp. 643-650.

[87] F. Fung, K. Laskey, M. Pool, M. Takikawa, and E. Wright, "Plasma: Combining predicate logic and probability for information fusion and decision support," AAAI Spring Symposium on Challenges to Decision Support in a Changing World, February 2006.

[88] P. Valin, F. Rhéaume, C. Tremblay, D. Grenier, A.-L. Jousselme, and E. Bossé, "Comparative implementation of two fusion schemes for multiple complementary FLIR imagery classifiers," Information Fusion, vol. 7, no. 2, pp. 197-206, 2006.

[89] R. E. Neapolitan, Learning Bayesian Networks. Prentice Hall, December 2000, Northeastern Illinois University, 1st edition.









[90] S. Ferrari and A. Vaghi, "Demining sensor modeling and feature-level fusion by Bayesian networks," IEEE Sensors Journal, vol. 6, pp. 471-483, April 2006.

[91] P. de Oude, B. Ottens, and G. Pavlin, "Information fusion with distributed probabilistic networks," in Artificial Intelligence and Applications, 2005, pp. 195-201.

[92] D. Heckerman, "A Tutorial on Learning Bayesian Networks," Microsoft Research, Tech. Rep. MSR-TR-95-06, March 1995.

[93] G. F. Cooper and E. Herskovits, "A Bayesian method for the induction of probabilistic networks from data," Mach. Learn., vol. 9, no. 4, pp. 309-347, 1992.

[94] O. Parsons and G. A. Carpenter, "ARTMAP neural networks for information fusion and data mining: map production and target recognition methodologies," Neural Netw., vol. 16, no. 7, pp. 1075-1089, 2003.

[95] B. V. Dasarathy, "Information fusion in the context of human-machine interfaces."
Information Fusion, vol. 6, no. 2, pp. 117-118, 2005.

[96] H. Barbera, A. Skarmeta, M. Izquierdo, and J. Blaya, "Neural networks for sonar and infrared sensors fusion," in Proceedings of the Third International Conference on Information Fusion, 2000, FUSION 2000, vol. 2, July 2000, pp. 18-25.

[97] S. Haykin, Neural Networks: A Comprehensive Foundation. Upper Saddle River, NJ, USA: Prentice Hall PTR, 1998.

[98] E. I. Shubnikov, "Multisensor information fusion in neural networks on the basis of diffraction optics," Optics and Spectroscopy, vol. 98, no. 2, pp. 284-290, 2005.

[99] Z. Wang and Y. Ma, "Medical image fusion using m-PCNN," Information Fusion, April 2007, in press, corrected proof, available online April 19, 2007.

[100] L. Yiyao, Y. V. Venkatesh, and C. C. Ko, "A knowledge-based neural network for fusing edge maps of multi-sensor images," Information Fusion, vol. 2, no. 2, pp. 121-133, 2001.

[101] R. Eckhorn, H. J. Reitboeck, M. Arndt, and P. Dicke, "Feature linking via synchronization among distributed assemblies: Simulations of results from cat visual cortex," Neural Computation, vol. 2, no. 3, pp. 293-307, 1990.









[102] J. Wilson, P. Gader, K. Ho, W.-H. Lee, R. Stanley, and T. Glenn, "Region
processing of ground penetrating radar and electromagnetic induction for handheld
landmine detection," in Proceedings of SPIE, vol. 5415. Orlando, FL: SPIE, April
2004, pp. 933-944.

[103] C. J. C. Burges, "A tutorial on support vector machines for pattern recognition," Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 121-167, 1998.

[104] H. Liu, X. Wang, D. Tan, and L. Wang, "Study on traffic information fusion
algorithm based on support vector machines," isda, vol. 1, pp. 183-187, 2006.

[105] R. Singh, M. Vatsa, and A. Noore, "Hierarchical fusion of multi-spectral face images
for improved recognition performance," Information Fusion, August 2006, in press,
corrected proof, available online August 17 2006.

[106] H. Chew, R. Bogner, and C. Lim, "Dual nu-support vector machine with error rate and training size biasing," in International Conference on Acoustics, Speech and Signal Processing, ICASSP 2001, USA, 2001, pp. 1269-1272.

[107] D. Whitley, "A genetic algorithm tutorial," Statistics and Computing, vol. 4, pp. 65-85, 1994.

[108] I. V. Maslov and I. Gertner, "Multi-sensor fusion: an evolutionary algorithm approach," Information Fusion, vol. 7, no. 3, pp. 304-330, September 2006.

[109] Z. Wang, K. Xu, J. Wang, and G. J. Klir, "Using genetic algorithms to determine nonnegative monotone set functions for information fusion in environments with random perturbation," International Journal of Intelligent Systems, vol. 14, no. 10, pp. 949-962, August 1999.

[110] E. F. Combarro and P. Miranda, "Identification of fuzzy measures from sample data
with genetic algorithms," Computers and Operations Research, vol. 33, no. 10, pp.
3046-3066, 2006.

[111] G. J. Klir and B. Yuan, Fuzzy Sets and Fuzzy Logic: Theory and Applications. Upper Saddle River, NJ, USA: Prentice-Hall, Inc., 1995.

[112] H. Frigui, P. D. Gader, W.-H. Lee, and J. N. Wilson, "Detection and discrimination of landmines in ground penetrating radar using an eigenmine and fuzzy membership function approach," in Proceedings of the SPIE Conference on Detection and Remediation Technologies for Mines and Minelike Targets IX, Orlando, FL, USA, April 2004.









[113] P. Gader, J. Keller, and J. Cai, "A fuzzy logic system for the detection and recognition of handwritten street numbers," IEEE Transactions on Fuzzy Systems, vol. 3, no. 1, pp. 83-95, 1995.

[114] K. Xu, Z. Wang, P.-A. Heng, and K.-S. Leung, "Classification by nonlinear integral projections," IEEE Transactions on Fuzzy Systems, vol. 11, no. 2, pp. 187-201, 2003.

[115] W. Wang, Z. Wang, and G. J. Klir, "Genetic algorithms for determining fuzzy measures from data," Journal of Intelligent and Fuzzy Systems, vol. 6, no. 2, pp. 171-183, 1998.

[116] J. Haddadnia and K. Faez, "Hybrid n-feature extraction with fuzzy integral in human face recognition," in Video/Image Processing and Multimedia Communications, 4th EURASIP-IEEE Region 8 International Symposium on VIPromCom, 2002, pp. 93-98.

[117] K.-C. Kwak and W. Pedrycz, "Face recognition using fuzzy integral and wavelet decomposition method," IEEE Transactions on Systems, Man and Cybernetics, Part B, vol. 34, no. 4, pp. 1666-1675, August 2004.

[118] Y. Wu, H. Liu, and H. Zha, "Modeling facial expression space for recognition," 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2005), pp. 1968-1973, August 2005.

[119] P. Gader, W.-H. Lee, and X. Zhang, "Renyi entropy with respect to Choquet capacities," in 2004 IEEE International Conference on Fuzzy Systems, Proceedings, vol. 1, July 2004, pp. 529-533.

[120] M. A. Schatten, P. D. Gader, J. Bolton, A. Zare, and A. Mendez-Vasquez, "Sensor fusion for airborne landmine detection," in Proceedings of SPIE Volume 6217, Detection and Remediation Technologies for Mines and Minelike Targets XI, May 2006.

[121] P. Gader and J. Keller, "Applications of fuzzy set theory to handwriting recognition," in Proceedings of the Third IEEE Conference on Fuzzy Systems, 1994, IEEE World Congress on Computational Intelligence, June 1994, pp. 910-917.

[122] P. D. Gader, M. A. Mohamed, and J. M. Keller, "Dynamic-programming-based
handwritten word recognition using the Choquet fuzzy integral as the match
function," Journal of Electronic Imaging, vol. 5, no. 1, pp. 15-24, January 1996.









[123] M. Grabisch and M. Schmitt, "Mathematical morphology, order filters and fuzzy logic," in Proceedings of 1995 IEEE International Conference on Fuzzy Systems, International Joint Conference of the Fourth IEEE International Conference on Fuzzy Systems and the Second International Fuzzy Engineering Symposium, vol. 4, March 1995, pp. 2103-2108.

[124] A. Hocaoglu and P. Gader, "An interpretation of discrete Choquet integrals in morphological image processing," The 12th IEEE International Conference on Fuzzy Systems, 2003, FUZZ '03, vol. 2, pp. 1291-1295, May 2003.

[125] J. Keller, P. Gader, R. Krishnapuram, X. Wang, A. Koksal Hocaoglu, H. Frigui, and J. Moore, "A fuzzy logic automatic target detection system for ladar range images," in The 1998 IEEE International Conference on Fuzzy Systems, Proceedings, IEEE World Congress on Computational Intelligence, vol. 1, May 1998, pp. 71-76.

[126] J. Cao, M. Shridhar, and M. Ahmadi, "Fusion of classifiers with fuzzy integrals," in ICDAR '95: Proceedings of the Third International Conference on Document Analysis and Recognition (Volume 1). Washington, DC, USA: IEEE Computer Society, 1995, p. 108.

[127] J. M. Keller and J. Osborn, "Training the fuzzy integral," Int. J. Approx. Reasoning,
vol. 15, no. 1, pp. 1-24, 1996.

[128] J.-H. Chiang, "Choquet fuzzy integral-based hierarchical network for decision analysis," IEEE Transactions on Fuzzy Systems, vol. 7, no. 1, pp. 63-71, February 1999.

[129] P. D. Gader, B. Nelson, A. Hocaoglu, S. Auephanwiriyakul, and M. Khabou, "Neural versus heuristic development of Choquet fuzzy integral fusion algorithms for land mine detection," in Neuro-fuzzy Pattern Recognition, H. Bunke and A. Kandel, Eds. World Scientific Publ. Co., 2000, pp. 205-226.

[130] M. Grabisch and J. Nicolas, "Classification by fuzzy integral: Performance and tests," Fuzzy Sets and Systems, vol. 65, no. 2-3, pp. 255-271, 1994.

[131] M. Grabisch, "The application of fuzzy integrals in multicriteria decision making,"
European Journal of Operational Research, vol. 89, pp. 445-456, 1996.

[132] M. S. Bazaraa, H. D. Sherali, and C. M. Shetty, Nonlinear Programming: Theory
and Algorithms. Hoboken, NJ: John Wiley & Sons, Inc, 1993, class notes for the
optimization class he taught in the ECE Dept. at University of Texas, Austin.









[133] D. Wang, X. Wang, and J. Keller, "Determining fuzzy integral densities using a
genetic algorithm for pattern recognition," in Proceedings of NAFIPS'97, Syracuse,
NY, September 1997, pp. 263-267.

[134] T. H. Cormen, C. Stein, R. L. Rivest, and C. E. Leiserson, Introduction to Algo-
rithms. New York: McGraw-Hill Higher Education, 2001.

[135] W. H. Press, S. A. Teukolsky, W. T. Vetterling, and B. P. Flannery, Numerical Recipes in C: The Art of Scientific Computing. New York, NY, USA: Cambridge University Press, 1992.

[136] S. Boyd and L. Vandenberghe, Convex Optimization. Cambridge University Press,
March 2004.

[137] P. D. Gader, W.-H. Lee, and J. N. Wilson, "Detecting landmines with ground
penetrating radar using feature-based rules order statistics, and adaptive whitening,"
IEEE Trans. Geoscience and Remote Sensing, vol. 42, no. 11, pp. 2522-2534, 2004.

[138] K. C. Ho, L. Carin, P. D. Gader, and J. N. Wilson, "On using the spectral features
from ground penetrating radar for landmine-clutter discrimination," submitted to
IEEE Trans. Geoscience and Remote Sensing.

[139] H. Frigui and P. D. Gader, "Detection and discrimination of landmines based on edge histogram descriptors and fuzzy k-nearest neighbors," in Proceedings of the IEEE International Conference on Fuzzy Systems, Vancouver, BC, Canada, July 2006.

[140] P. A. Torrione and L. M. Collins, "Application of texture feature classification
methods to landmine / clutter discrimination in off-road GPR data," in Proc.
IGARSS, 2004, pp. 1621-1624.

[141] P. A. Torrione, C. S. Throckmorton, and L. M. Collins, "Performance of an adaptive feature-based processor for a wideband ground penetrating radar system," IEEE Transactions on Aerospace and Electronic Systems, vol. 42, no. 2, pp. 644-659, April 2006.

[142] A. Zare and P. D. Gader, "Sparsity promoting iterated constrained endmember
detection for hyperspectral imagery," IEEE Geoscience and Remote Sensing Letters,
2007.









[143] M. A. Tanner and W. H. Wong, "The calculation of posterior distributions by data
augmentation," Journal of the American Statistical Association, vol. 82, no. 398, pp.
528-540, 1987.

[144] M. A. Tanner, Tools for Statistical Inference: Methods for the Exploration of Posterior Distributions and Likelihood Functions, 3rd ed., ser. Springer Series in Statistics. Springer-Verlag, 1996.

[145] L. Tierney, "Markov chains for exploring posterior distributions," The Annals of Statistics, vol. 22, no. 4, pp. 1701-1728, 1994.

[146] J. N. Wilson, P. D. Gader, W. H. Lee, H. Frigui, and K. C. Ho, "A rigorous
evaluation of algorithms using ground penetrating radar for landmine detection
and discrimination," IEEE Trans. Geosci. Remote Sensing, no. 45, pp. 2560-2572,
2007.

[147] H. Frigui, K. Ho, and P. D. Gader, "Real-time land mine detection with ground penetrating radar using discriminative and adaptive hidden Markov models," EURASIP Journal on Applied Signal Processing, vol. 2005, no. 12, pp. 1867-1885, July 2005.

[148] Y. Zhao, P. D. Gader, Y. Zhang, and P. Chen, "Training DHMMs of mine and
clutter to minimize landmine detection errors," IEEE Trans. Geoscience and Remote
Sensing, vol. 41, no. 5, pp. 1016-1024, May 2003.

[149] W.-H. Lee, P. D. Gader, and J. N. Wilson, "Optimizing the area under a receiver
operating characteristic curve with application to landmine detection," IEEE Trans.
Geoscience and Remote Sensing, vol. 45, no. 2, pp. 389-398, February 2007.

[150] K. C. Ho, L. Carin, P. D. Gader, and J. N. Wilson, "An investigation of using
the spectral characteristics from ground penetrating radar for landmine/clutter
discrimination," IEEE Trans. Geoscience and Remote Sensing, accepted for
publication.

[151] A. Smith and G. Roberts, "Bayesian computation via the Gibbs sampler and related
Markov chain Monte Carlo methods," J. R. Statist. Soc., vol. 55, pp. 3-23, 1993.

[152] J. S. Liu, Monte Carlo Strategies in Scientific Computing. Springer, October 2002.









BIOGRAPHICAL SKETCH

Andres Mendez-Vazquez is a Ph.D. student at the University of Florida. He received his
bachelor's degree from the University of Yucatan. His research interests include landmine
detection, statistical methods for machine learning, and fuzzy measures and Choquet
integration.





PAGE 1

1

PAGE 2

2

PAGE 3

TomySisterbecauseshelovesmewithoutquestions. TomyProfessorsbecausetheyanswermyquestions. TomyFriendsbecausetheywerepatientwithme. Thankyou 3

PAGE 4

IwouldliketothankDr.PaulGader,Dr.GerhardRitter,Dr.JoeWilson,Dr.ManuelBermudezandDr.MarkSchmaltzfortheirpatienceandunderstanding.IwouldalsoliketothankDr.JimKellerforhissupportformyrstjournalpaper.IwouldalsoliketothankAshishMyles,MariaVelezmoro,PedroB.Morales,IlPark,AlexandraCamacho,ArturoCamacho,BrunoMaciel,ErnestHall,ChristianCampbell,GanesanRamachandran,XhueiHueandArtBarnesfortheiradviceandfriendship.Finally,Iwouldliketothankmyco-workersatthelabAlinaZare,RaaziaMazhar,RyanBusser,GyeongyongHeo,JeremyBolton,JohnMcElroy,SeanMatthews,SenihaEsenYukselandXupingZhangfortheirunderstanding. 4

PAGE 5

page ACKNOWLEDGMENTS ................................. 4 LISTOFTABLES ..................................... 10 LISTOFFIGURES .................................... 13 LISTOFABBREVIATIONS ............................... 17 ABSTRACT ........................................ 18 CHAPTER 1LITERATUREREVIEW .............................. 20 1.1ProbabilityTheory ............................... 20 1.1.1SampleSpaces .............................. 20 1.1.2KolmogorovAxioms ........................... 21 1.1.3ProbabilityDistribution ........................ 22 1.1.4IndependenceandConditionalProbability .............. 23 1.1.5Moments ................................. 24 1.1.6Bayes'Rule ............................... 24 1.1.7LikelihoodFunctions .......................... 25 1.1.8PriorDistributionsandConjugatePriors ............... 25 1.2MonteCarloSimulations ............................ 26 1.2.1ExpectationMaximization ....................... 27 1.2.2MarkovChains ............................. 28 1.2.3Metropolis-HastingsAlgorithm ..................... 29 1.2.4GibbsSampler .............................. 30 1.2.5RelationoftheGibbsSamplerwithExpectationMaximization ... 31 1.2.6TheSliceSampler ............................ 31 1.3SparsityPromotingDistributions ....................... 33 1.3.1Introduction ............................... 33 1.3.2SparsityPromotioninCostFunctions:LeastAbsoluteShrinkageandSelectionOperator ......................... 36 1.3.3ExtendingtheLASSOtoaBayesianHierarchicalModels ...... 36 1.4LogisticClassication .............................. 38 1.4.1LogisticDistribution .......................... 38 1.4.2TheBinaryClassicationProblem ................... 39 1.4.3MotivationsforUsingtheSigmoid ................... 40 1.4.4LogisticRegression ........................... 40 1.5MinimumClassicationError ......................... 41 1.5.1DissimilarityMeasures ......................... 42 1.5.2LossFunctions .............................. 42 1.6InformationTheory ............................... 43 5

PAGE 6

.......................... 43 1.6.2MutualInformation ........................... 44 1.6.3Fano'sInequality ............................ 46 1.6.4FeatureSelectionUsingMutualInformation ............. 47 1.7FuzzyMeasures ................................. 52 1.7.1EvidenceMeasures ........................... 52 1.7.2PossibilityMeasures ........................... 54 1.7.3K-AdditiveMeasures .......................... 55 1.7.4SugenoMeasures ............................ 55 1.8FuzzyIntegrals ................................. 56 1.8.1SugenoIntegral ............................. 56 1.8.2TheContinuousChoquetIntegralUnderaFuzzyMeasure ...... 57 1.8.3FromtheContinuousCasetotheDiscreteCase ........... 57 1.8.4ExamplesofDiscreteChoquetIntegral ................ 58 1.8.4.1Choquetintegralwithrespecttobeliefandplausibilitymeasures ............................ 58 1.8.4.2OWAoperatorsasChoquetintegrals ............ 59 1.8.4.3Choquetintegralwithrespecttok-additivemeasures ... 60 1.8.4.4ChoquetintegralwithrespecttheSugenomeasure .... 60 1.8.5ChoquetIntegralasaGeneralAggregationRule ........... 61 1.8.6RelationBetweenPlausibilityandBeliefChoquetIntegrals ..... 61 1.8.7ShapleyIndex .............................. 62 1.9InformationFusion ............................... 62 1.9.1DataLevel,FeatureLevelandDecisionLevelFusion ......... 63 1.9.2TheNon-GaussianNatureofDecisionLevelFusion ......... 63 1.9.3BayesianFusionModels ......................... 63 1.9.3.1Bayesianhierarchicalmodels ................. 64 1.9.3.2Bayesiannetworkmodels ................... 64 1.9.4Non-BayesianFusionModels ...................... 65 1.9.4.1Neuralnetworksmodels ................... 65 1.9.4.2Kernelmodels ......................... 66 1.9.4.3Geneticalgorithmmodels .................. 67 1.9.4.4Fuzzysetmodels ....................... 68 1.10PreviousApplicationsoftheChoquetIntegral ................ 68 1.10.1ChoquetIntegralinPatternRecognition ............... 68 1.10.2FuzzyIntegralinMachineVision ................... 69 1.10.2.1Fuzzyintegralinfacerecognition .............. 69 1.10.2.2Choquetintegralasanimageprocessinglter ....... 70 1.10.3ChoquetIntegralforDecisionLevelFusion .............. 70 1.10.3.1Choquetintegralinwordrecognition ............ 71 1.10.3.2Choquetintegralinlandminedetection ........... 71 1.11LearningFuzzyMeasures ............................ 71 1.11.1NonlinearOptimizationMethods ................... 72 1.11.2NeuralNetworks ............................. 74 1.11.2.1Perceptron-likemethod .................... 74 6

PAGE 7

........ 76 1.11.3GeneticAlgorithms ........................... 76 1.11.4HeuristicLeastSquaredError ..................... 77 1.11.5Conclusions ............................... 77 2MINIMUMCLASSIFICATIONERRORUSINGTHESUGENOMEASURE .. 85 2.1LossFunctionfortheMinimumClassicationError ............. 85 2.2DerivationoftheDerivativeoftheChoquetIntegralwithrespecttotheSugenomeasure ................................. 86 2.3TimeComplexityAnalysisoftheMCETrainingAlgorithm ......... 89 2.4ProblemswiththeMCEApproach ...................... 92 2.5ResultsundertheMCEAlgorithm ...................... 92 2.5.1DescriptionandDesignoftheExperiments .............. 92 2.5.2ResultsandComparisonAgainstOtherMethods ........... 95 3SPARSITYPROMOTIONWITHINCHOQUETINTEGRALS,USINGANAIVESAMPLER ...................................... 103 3.1ConstrainingtheHierarchicalModelforSparsityPromotionundertheLeastSquaredError ................................. 103 3.2GibbsSamplerfortheMonotonicityConstrainedModel ........... 104 3.3SamplingfromthePosteriorDistributionofMu ............... 106 3.4ProblemsWhenSamplingfromaGaussianwithSmallVariance. ...... 109 3.5ImprovingtheSparsityPromotingModel ................... 109 3.6ImprovingtheAccuracyTermSigmaandtheSparsityTermGamma ... 111 3.7InterpretingtheResultsfromtheSparsityPromotingModel ........ 112 3.8ComparisonAgainstQuadraticProgramming ................ 113 3.9ResultsunderSparsityPromotingModelwithRespecttoLSE ....... 114 3.9.1CaseI .................................. 114 3.9.2CaseII .................................. 114 4ANIMPROVEDLSESPARSITYALGORITHM ................. 120 4.1Introduction ................................... 120 4.2MaximumAPosterioriGibbsSamplerModel ................ 121 4.3SolvingtheMAP-EMProblem ........................ 124 4.4SliceSamplerfortheMAP-EMSparsity ................... 128 4.5ObservationsaboutMAP-EMSparsity .................... 130 4.6ResultsUnderMAP-EMSparsity ....................... 131 4.6.1ExperimentsUsingArticialDataSets ................ 131 4.6.1.1CaseI ............................. 131 4.6.1.2CaseII ............................. 132 4.6.1.3CaseIII ............................ 133 4.6.2ConclusionsovertheArticialDataSets ............... 134 4.6.3ExperimentswithLandmineDetectionDataSets .......... 134 4.6.3.1CaseA2007 .......................... 135 7

PAGE 8

......................... 135 4.6.3.3CaseBJ2006 ......................... 136 4.6.4ConclusionsovertheLandmineDataSets ............... 136 5EXTENDINGMCETOAPROBABILISTICFRAMEWORKUSINGLOGISTICDISTRIBUTIONANDGIBBSSAMPLER ..................... 163 5.1LogisticRegressionfortheOneMeasureCase ................ 163 5.1.1TheGibbsSamplerfortheOneMeasureCase ............ 164 5.1.2TheSliceSamplerFortheOneMeasureCasePosterior ....... 166 5.1.3ProblemsofAssumingaUniformDistribution ............ 168 5.1.4UsingaHiddenVariableforLogisticRegresion ............ 169 5.2DevelopmentoftheAlgorithmfortheTwoMeasureCase .......... 170 5.2.1TheGibbsSamplerfortheTwoMeasuresCase ............ 171 5.2.2TheSliceSamplerfortheTwoMeasureCasePosterior ....... 172 5.2.3MAP-EMFortheTwoMeasureCase ................. 174 5.3FinalThoughtsaboutMCEundertheLogisticFramework ......... 176 5.4ResultsunderSparsityPromotingModelwithRespecttoLogisticandMCELogisticAlgorithms ............................... 177 5.4.1ResultsoftheLogisticLASSOUsingArticialDataSets ...... 177 5.4.1.1CaseI ............................. 178 5.4.1.2CaseII ............................. 178 5.4.2CaseIII ................................. 179 5.4.3TheMAP-EMLogisticLASSOArticialDataSets ......... 179 5.4.3.1CaseI ............................. 179 5.4.3.2CaseII ............................. 179 5.4.4CaseIII ................................. 180 5.4.5ResultsoftheMCELogisticLASSOoverArticialDataSets .... 180 5.4.5.1CaseI ............................. 180 5.4.5.2CaseII ............................. 180 5.4.5.3CaseIII ............................ 181 5.4.6ResultsoftheMAP-EMMCELogisticLASSOoverArticialDataSets .................................... 181 5.4.6.1CaseI ............................. 181 5.4.6.2CaseII ............................. 181 5.4.6.3CaseIII ............................ 181 5.4.7ConclusionsovertheArticialDataSets ............... 181 5.4.8ExperimentsinLandmineDetectionDataSets ............ 182 5.4.8.1CaseA2007 .......................... 182 5.4.8.2CaseBJ2007 ......................... 182 5.4.8.3CaseBJ2006 ......................... 182 5.4.9ConclusionsovertheLandmineDataSets ............... 183 6CONCLUSION .................................... 219 APPENDIX 8

PAGE 9

.......................... 221 BRELATIONBETWEENDIFFERENTCHOQUETALGORITHMS ...... 223 CDIFFERENCESBETWEENTHEBAYESIANSPARSITYMETHODSANDTHEFEATURESELECTIONUNDERMUTUALINFORMATION ...... 225 DMAXIMIZINGMUTUALINFORMATIONINLOGISTICLASSO ....... 227 LISTOFREFERENCES ................................. 229 BIOGRAPHICALSKETCH ................................ 243 9

PAGE 10

Table page 1-1Examplesofconjugatepriors. ............................ 79 2-1ComparisonofPFAforSugeno-measuretrainedwithLSEagainstdierentdetectors. ....................................... 99 2-2ComparisonofPFAforgeneralmeasuretrainedwithLSEagainstdierentdetectorsatdierentthresholds. ................................ 100 2-3ComparisonofPFAinMCEagainstdierentdetectorsatdierentthresholds. 101 2-4MeanMCEPFAagainstmeangeneralandSugenoPFA. ............. 102 2-5ComparisonofMCESugenoagainstseveralotherclassiersforirisdataandbreastcancerdata. .................................. 102 3-1ConfusionmatrixforarticialdatasetcaseIinGibbssamplerforLSEsparsity. 115 3-2MeasuresforarticialdatasetcaseIwithmeanandstandarddeviationoftheMarkovchainsinGibbssamplerforLSEsparsity. ................. 115 3-3ShapleyvaluesforthefeaturesinarticialdatasetcaseIinGibbssamplerforLSEsparsity. ..................................... 115 3-4ConfusionmatrixforarticialdatasetcaseIIinGibbssamplerforLSEsparsity. 115 3-5MeasuresforarticialdatasetcaseIIwiththemeanandstandarddeviationoftheMarkovchainsinGibbssamplerforLSEsparsity. ............... 116 3-6ShapleyvaluesforthefeaturesinarticialdatasetcaseIIinGibbssamplerforLSEsparsity. ..................................... 116 4-1ConfusionmatrixforarticialdatasetcaseIforMAP-EMSparsity. ....... 137 4-2MeasuresarticialdatasetcaseIwithmeanandstandarddeviationoftheMarkovchainsforMAP-EMSparsity. ............................ 137 4-3ShapleyindexesforthefeaturesinarticialdatasetcaseIforMAP-EMSparsity. 137 4-4ConfusionmatrixforarticialdatasetcaseIIforMAP-EMSparsity. ...... 137 4-5PartialmeasurevaluesforcaseIIwiththemeanandstandarddeviationoftheMarkovchainsforMAP-EMSparsity. ........................ 138 4-6ShapleyindexesforthefeaturesinarticialdatasetcaseIIforMAP-EMSparsity. 138 4-7AbetterconfusionmatrixforcaseII. ........................ 139 4-8MoresparseShapleyindexesforcaseII. ...................... 139 10

PAGE 11

............ 139 4-10ConfusionmatrixforarticialdatasetcaseIIIforMAP-EMSparsity. ...... 140 4-11ShapleyindexesforMAP-EMSparsityforcaseIII. ................ 140 4-12Dierentdatasetsanddetectionalgorithmsforthelandminefusionproblem. .. 140 5-1ConfusionmatrixforcaseIusingLogisticLASSO. ................ 184 5-2MeasureparametervaluesbyLogisticLASSOincaseI. .............. 184 5-3ShapleyindexesforcaseIbyLogisticLASSO. ................... 184 5-4ConfusionmatrixforcaseIIusingLogisticLASSO. ................ 184 5-5MeasureparametervaluesbyLogisticLASSOincaseII. ............. 185 5-6ShapleyindexesforcaseIIbyLogisticLASSO. .................. 185 5-7ConfusionmatrixforcaseIIIusingLogisticLASSO. ............... 185 5-8ShapleyindexesforcaseIIIbyLogisticLASSO. .................. 186 5-9ConfusionmatrixforcaseIusingMAP-EMLogisticLASSO. ........... 186 5-10MeasureparametervaluesbyMAP-EMLogisticLASSOincaseI. ........ 186 5-11ShapleyindexesforcaseIbyMAP-EMLogisticLASSO. ............. 186 5-12ConfusionmatrixforcaseIIusingMAP-EMLogisticLASSO. .......... 187 5-13SomemeasureparametervaluesbyMAP-EMLogisticLASSOincaseII. .... 187 5-14ShapleyindexesforcaseIIbyMAP-EMLogisticLASSO. ............. 187 5-15ConfusionmatrixforcaseIIIusingMAP-EMLogisticLASSO. .......... 187 5-16ShapleyindexesforcaseIIIbyMAP-EMLogisticLASSO. ............ 188 5-17ConfusionmatrixforcaseIusingMCELogisticLASSO. ............. 188 5-18MeasureparametervaluesbyMCELogisticLASSOincaseI. .......... 188 5-19CaseIIconfusionmatrixbytheMCELogisticLASSOalgorithm. ........ 189 5-20CaseIIIconfusionmatrixbytheMCELogisticLASSOalgorithm. ........ 189 5-21CaseIconfusionmatrixbytheMAP-EMMCELogisticLASSOalgorithm. ... 189 5-22MeasureparametervaluesbyMAP-EMMCELogisticLASSOincaseI. .... 189 5-23CaseIIShapleyindexesbyMCELogisticLASSO. ................. 190 11

PAGE 12

................ 190 5-25ConfusionmatrixforcaseIIusingMAP-EMMCELogisticLASSO. ....... 190 5-26ShapleyindexesforcaseIIbyMAP-EMMCELogisticLASSO. ......... 190 B-1DierentcharacteristicsforChoquetalgorithms. .................. 223 12

PAGE 13

Figure page 1-1ExampleofamaximizationiterationinEM. .................... 79 1-2ExampleofSlicesamplerfortheexponentialdistribution. ............ 80 1-3ExampleofaLaplacianwith=10. ........................ 80 1-4ExamplesofthePDFandCDFofalogisticdistribution. ............. 81 1-5Exampleofadecisionboundaryh(x)=0+1x1+2x22=0. ......... 81 1-6Exampleoflogittransformation. .......................... 82 1-7ExampleofaChoquetintegralunderabeliefmeasure. .............. 83 1-8ExampleofaChoquetintegralunderaplausibilitymeasure. ........... 83 1-9Distributionoftheoutputsofaclassicationproblem. .............. 84 1-10Morerealisticdistributionoftheoutputsofaclassicationproblem. ....... 84 2-1ExamplesofsensitivitytodesiredoutputsforSugenomeasurewhere1and2representthedesiredoutputsforminesandnon-minesrespectively. ....... 98 2-2Examplesofsensitivitytodesiredoutputsforgeneralmeasureswhere1and2representthedesiredoutputsforminesandnon-minesrespectively. ..... 98 3-1Exampleofalatticewherethearrowsrepresentthesubsetrelationonit. .... 117 3-2ThisplotshowstheideathatsamplinginanintervalfarawayfromthemeanofaGaussianwithsmallvarianceissimilartosamplingfromauniformintheinterval. ........................................ 118 3-3Plotofsamplesforclass1`o'andclass2`+'fortherstthreefeaturesincaseI. 119 4-1PlotofsamplesfortherstthreefeaturesincaseIwherethesecondfeaturehasnovalueforclassication. ............................ 141 4-2OutputsforMAP-EMSparsityinCaseIwith\o"forclass1and\x".forclass2. 142 4-3TracesforthemeasureparametersamplesgeneratedbyMAP-EMSparsityinCaseI. ......................................... 143 4-4DistributionsforthemeasureparametersgeneratedbyMAP-EMSparsityinCaseI. ......................................... 144 4-5OutputsfortheMCEunderSugenomeasuresgradientdescentmethodforcaseI. 145 4-6ROCcurveforMAP-EMSparsityincaseI. .................... 146 13

PAGE 14

............ 147 4-8PlotofsamplesfortherstandfththreefeaturesincaseII. .......... 148 4-9OutputsforMAP-EMSparsityinCaseIIwith\o"forclass1and\x"forclass2. 149 4-10AbetteroutputseparationforcaseIIforMAP-EMSparsity. ........... 150 4-11OutputsfortheMCEunderSugenomeasuresforcaseII. ............. 151 4-12ROCcurvesfortheMCEunderSugenomeasuresandMAP-EMSparsityforcaseII. ......................................... 152 4-13ChoquetoutputsforMAP-EMSparsityforcaseIIIwith\o"forclass1and\x"forclass2. .................................... 153 4-14ROCcurvesfortheMCEunderSugenomeasuresandMAP-EMSparsityforcaseIII. ........................................ 154 4-15ExamplesoftracesforthemeasureparameterbyMAP-EMSparsityincaseII. 155 4-16ExamplesofdistributionsforthemeasureparametersbyMAP-EMforcaseII. 156 4-17ROCcurvesforalltheA2007algorithmsandMAP-EMSparsity. ........ 157 4-18ROCcurvesforMCEunderSugenoandMAP-EMSparsityindatasetA2007. 158 4-19ROCcurvesforalltheBJ2007algorithmsandMAP-EMSparsity. ........ 159 4-20ROCcurvesforMCEunderSugenoandMAP-EMSparsityindatasetBJ2007. 160 4-21ROCcurvesforalltheBJ2006algorithmsandMAP-EMSparsity. ........ 161 4-22ROCcurvesforMCEunderSugenoandMAP-EMSparsityindatasetBJ2006. 162 5-1ChoquetoutputsforthesyntheticcaseIunderLogisticLASSOwith\o"forclass1and\x"forclass2. .............................. 191 5-2ChoquetoutputsforthesyntheticcaseIIunderLogisticLASSOwith\o"forclass1and\x"forclass2. .............................. 192 5-3ChoquetoutputsforthesyntheticcaseIunderMAP-EMLogisticLASSOwith\o"forclass1and\x"forclass2 .......................... 193 5-4ChoquetoutputsforthesyntheticcaseIIunderMAP-EMLogisticLASSOwith\o"forclass1and\x"forclass2 .......................... 194 5-5ChoquetoutputsforthesyntheticcaseIIIunderMAP-EMLogisticLASSOwith\o"forclass1and\x"forclass2 .......................... 195 5-6ChoquetoutputsforthesyntheticcaseIunderMCELogisticLASSOwith\o"forclass1and\x"forclass2 ............................ 196 14

PAGE 15

............................ 197 5-8ChoquetoutputsforthesyntheticcaseIunderMAP-EMMCELogisticLASSOwith\o"forclass1and\x"forclass2 ....................... 198 5-9ChoquetoutputsforthesyntheticcaseIIunderMAP-EMMCELogisticLASSOwith\o"forclass1and\x"forclass2 ....................... 199 5-10ChoquetoutputsforthesyntheticcaseIIIunderLogisticLASSOwith\o"forclass1and\x"forclass2 .............................. 200 5-11ChoquetoutputsforthesyntheticcaseIIIunderMCELogisticLASSOwith\o"forclass1and\x"forclass2 .......................... 201 5-12TracesforthemeasureparameterslearnedbytheLogisticLASSOforcaseI. .. 202 5-13DistributionsforthemeasureparameterssampledbytheLogisticLASSOforcaseI. ......................................... 203 5-14SomeofthetracesforthemeasureparameterslearnedbytheLogisticLASSOforcaseII. ....................................... 204 5-15SomeofthedistributionsforthemeasureparameterssampledbytheLogisticLASSOforcaseII. .................................. 205 5-16TracesforthemeasureparameterslearnedbytheMAP-EMLogisticLASSOforcaseI. ....................................... 206 5-17DistributionsforthemeasureparameterssampledbytheMAP-EMLogisticLASSOforcaseI. ................................... 207 5-18SomeofthetracesforthemeasureparameterslearnedbytheMAP-EMLogisticLASSOforcaseII. .................................. 208 5-19SomedistributionsforthemeasureparameterssampledbytheMAP-EMLogisticLASSOforcaseII. .................................. 209 5-20ROCcurvesfordierentalgorithmsincaseI. ................... 210 5-21ROCcurvesfordierentalgorithmsincaseII. ................... 211 5-22ROCcurvesfordierentalgorithmsincaseIII. .................. 212 5-23ROCcurvesforalltheA2007algorithmsandLogisticLASSO. .......... 213 5-24ROCcurvesforMCEunderSugenoandthelogisticmethodsindatasetA2007. 214 5-25ROCcurvesforalltheBJ2007algorithmsandLogisticLASSO. ......... 215 5-26ROCcurvesforMCEunderSugenoandthelogisticmethodsindatasetBJ2007. 216 15

PAGE 16

......... 217 5-28ROCcurvesforMCEunderSugenoandthelogisticmethodsindatasetBJ2006. 218 B-1RelationsbetweenthedierentChoquetalgorithms. ................ 224 16

PAGE 17

17

PAGE 18

Thisdissertationaddressesproblemsencounteredincombininginformationfrommultiplesources.Novelmethodsforlearningparametersforinformationaggregationareproposed. Inpracticalapplicationsofpatternclassication,multiplealgorithmsareoftendevelopedforthesameclassicationproblem.Eachalgorithmproducescondencevaluesbywhicheachnewsamplemaybeclassied.Wewouldliketoaggregatethesecondencevaluestoproducethebestpossiblecondenceforthegivensample.Thiscanbeseenasaparticularinstanceofwhatiscalledinformationfusion. Inadditiontolearningparametersofaggregationoperatorstoassignthebestcondenceforagivensample,wewouldalsoliketheaggregationoperatorstouseasubsetofthealgorithmcondencesandachievethesamelevelofperformanceastheentiresetofcondences.Usingasubsetofthealgorithmsimplieslowercostforapplications. Choquetintegralsarenonlinearoperatorsbasedonfuzzymeasuresthatcanrepresentawidevarietyofaggregationoperators.PreviousresearchhasdemonstratedtheutilityofChoquetintegralsforthisproblemcomparedtoothermethodssuchasneuralnetworksandBayesianapproaches. However,oneofthenovelresultsofthisresearchisthatthemeasureslearnedcanbeverysensitivetothechoiceofdesiredoutputs.Inresponsetothisproblem,weproposeanalternativetrainingmethodologybasedonMinimumClassicationError(MCE)training 18

PAGE 19

Thereisaneedforadditionalapproachestolearningunconstrainedfuzzymeasuresthataremorecomputationallyattractiveandprovidemorerobustperformance.WeproposeanapproachtolearningunconstrainedfuzzymeasuresthatreliesonMarkovChain/MonteCarlosamplingmethods.TheuseofsuchapproachesforlearningmeasuresforChoquetintegralfusioniscompletelynovel.Inaddition,weproposetheinclusionoftheBayesianapproachofimposingsparsitypromotingpriordistributionsonthemeasureparametersduringsamplingasawayofselectingsubsetsofthealgorithmsforinclusionintheaggregation.Thisapproachiscompletelynewforlearningfuzzymeasures. 19

PAGE 20

Theliteraturereviewhasbeendividedintoeightsections.In(Section 1.1 )-(Section 1.4 ),weexplorethebasicconceptsbehindprobabilitytheory[ 1 ],Bayesianprobability,hierarchicalmodels,MonteCarlomethods[ 2 ]andlogisticregression[ 3 ].In(Section 1.5 )and(Section 1.6 ),wedenethebasicsbehindtheminimumclassicationerrormethod[ 4 { 6 ]andinformationtheory[ 7 ]respectively.Following(Section 1.1 )-(Section 1.6 ),weintroducefuzzymeasuresanddeneSugenoandChoquetintegrals[ 8 { 13 ].WethenfocusonthespecialcaseofadiscreteChoquetintegralanditsproperties. (Section 1.9 )oninformationfusionisdividedintotwomaingroups,Bayesianandnon-Bayesianmodelsforinformationfusion.Giventheseconcepts,(Section 1.10 )discussespreviousworkdonewiththeChoquetintegralasaclassierandanaggregationoperatorforfusion. In(Section 1.11 ),thenalsectionoftheliteraturereview,wediscussprevioustechniquesforlearningfuzzymeasures. 1 14 15 ].First,wedenebasicconceptsaboutsamplespacesandevents.Wethenlookattheconceptsofprobability,measureandeventspaces.Thisleadstodeningprobabilitydistributionsanddensities.Afterthis,wedenetheconceptsofindependence,conditionalprobabilityandmoments.Followingthesebasicideas,welookattheideasofBayes'rule,likelihoodfunctions,priorprobabilityandconjugatepriors. 20

PAGE 21

1. 2. IfA2B,thenAc2B. 3. IfA=[1j=1AjandAj2Sforj=1;2;:::,thenA2B. 1. 2. 3. IfA1;A2;:::2Barepairwisedisjoint,thenP([1i=1Ai)=P1i=1P(Ai).

PAGE 22

Foreachdistributionfunction,wehaveanassociatedfunctioncalledthecumulativedistributionfunction(cdf)FX(x)ofarandomvariabledenedby: forallx.ItissometimespossibletoexpresstheprobabilitymeasureasaLebesgueintegralofafunctionfonasetA, IftheLebesgueintegralisthesameastheRiemannintegral[ 14 { 16 ],wehave: 22

PAGE 23

17 ],iff(x)iscontinuous,that dxFX(x)=f(x):(1{6) Thefunctionf(x)iscalledProbabilityDensityFunction(PDF),andisdenotedbyfX(x).Asimilarargumentcanbemadeforthediscretecase,andthefunctionfX(x)forthediscretecasereceivesthenameProbabilityMassFunction(PMF).ItispossibletoprovethatanyfunctionfX(x)isaPDFifandonlyiffX(x)0forallx,andRSfX(x)dx=1.Givenaprobabilitydistributionfunction,itmaynotbepossibletoderivethePDFiftheLebesgueintegraldoesnotcorrespondtoanyRiemannintegral. Anaturalextensionofthisconceptisthejointdistributionfunction. Thejointdensityfunctionhasthesamepropertiesastheunivariatedensityfunction.

PAGE 24

ItcanbeprovedthatifthedensityfunctionsofrandomvariablesX,Yexist,thelastequalityholds.FromnowonwewillassumethatallourrandomvariableshaveaPDForaPMF[ 18 ]. 24

PAGE 25

18 ]. wheref(xj)()=f(x;),f(x)isthemarginaldistributionofXf(x)=Rf(xj)()d,andthefunctionf(xj)iscalledthelikelihoodofthedata,giventheparameter.Thisformulationallowsustodiscovertheprobabilitydistributionofagivendatabyguessinganinitialprobabilityorpriorandthenusingtheprobabilityofthedataorlikelihoodtomodifythisprobabilityandgetabetterapproximationfortheparameterofthedataprobabilityorposterior.Aspecialcaseofpriorsthatareheavilyusedinstatisticsaretheconjugatepriors.Aconjugatepriorisaprobabilitywiththepropertythattheposterior 25

PAGE 26

19 ].ThesefamiliesarecollectionsofPDF'sf(xj)thatcanberewritteninaspecicform.Forexample,ifaPDFf(xj)belongstotheexponentialfamilyifitcanbewrittenas: whereaandcdependonlyonx,andbandddependsonlyon.(Table 1-1 )showssomeclassicalexamplesofconjugatepriors. Frequently,itisnecessarytoassumethattheparametersofaPDFaredistributedaccordingtoanotherPDF,leadingtotheBayesianhierarchicalmodels.Inthesemodels,wecanusethepropertiesofpriordistributionstoenhanceourknowledgeoftheparametersusedinthetopmost,orposterior,distributions.Severalpropertiesareusedtosimplifythemodelscreatedbythiskindofhierarchy.Forexample,wecanusehiddenvariablestoexpressspecialpropertiesfromthetopdistributions.Anotherexampleistheuseofnon-informativepriors[ 20 21 ]whenthedistributionoftheparametersarenotwellknownorknownatall. Manyinterestingapplicationsinvolveformulationswherewell-denedanalyticalorclosed-formsolutionsarenoteasytocomeby,suchaswhenwetrytointegratetheLaplaciandistributionoveraninterval,orwhenthedierentlayersinahierarchicalmodelproducediculttointegratefunctions.Insuchcases,onemayexplorenumericaland/orstochasticmethodstoattackthesekindsofproblems. 2 22 23 ]andEx-pectationMaximization(EM)[ 2 24 25 ].Inthesecondcase,themostrepresentativeproblemsrefertoMonteCarlointegrationandimportancesampling[ 2 ].Thethird 26

PAGE 27

2 26 27 ].Inourcase,weareespeciallyinterestedintheEMandMetropolis-Hastingsalgorithms. 24 ]SupposethatweobservetheidenticallyindependentsamplesX1;:::;XnfromaconditionaldistributionP(Xj).Wewouldliketoobtainasequenceofparametervalues1;2;:::;n;:::suchthatatstepntheloglikelihoodL(n+1jX)=lnP(Xjn+1)isgreaterorequalthanL(njx).Toaccomplishthis,weintroduceahiddenoraugmentedvariablezintheloglikelihoodofgiventhedatax, lnL(jX;z)=lnP(X;zj):(1{13) ThislastequationtogetherwiththeinequalityL(n+1jX)>L(njX)allowsustoderivethefollowingequality, Fromthisequalityonecandeviseaniterativealgorithmforthemaximizationof.Thus,theEMalgorithmhasthefollowingsteps, 2. ItcanbeproventhatEZjX;nflnP(X;zj)gisaconvexfunction.Then,aconvexfunctionisusedtomaximizethefunctionL(jX)asseenin(Figure 1-1 ).ThiscanbeaproblembecausetheEMalgorithmcanonlyndalocalmaximadependingontheinitialization. Inaddition,theEMalgorithmrequiresaclosedformoftheexpectationofarandomvariablewhichisavailableinsomeimportantcases[ 28 ].Itisoftenthecase 27

PAGE 28

29 ].Additionally,aclosedformmaynotbeavailablefortheM-step,sothemaximizationofthederivativecannotbeaccomplishedwithoutresortingtoiterativeoptimizationschemesforintegralapproximation.Thus,weneedtoconsideradierentwaytoobtaintheoptimalanswerfortheparametersusedinadistribution.IthasbeennotedbyRobertandCasella[ 2 ]thatMonteCarloMarkovChain(MCMC)methodsarebetteratsolvingproblemswhereitisdiculttondclosedformsforthequatitiesintheEandMstepsabove. 2 30 ]. AtthecenterofthestatisticalsamplingmethodsistheconceptofaMarkovchain. Ifthemeasureisaprobabilitymeasure,theprobabilityiscalledstationary.Then,wehopetondMarkovchainsthatconvergetoastationarydistribution. BeforewedenewhataMarkovChainMonteCarloMethod(MCMC)is,weneedtointroducethesimulationconcept. 28

PAGE 29

26 31 32 ].TheMetropolis-Hastingsalgorithmbeginswithatargetdensityf.Then,aconditionaldensityq(yjx)ischosensuchthatitisamenableeasytosimulation,andisdenedwithrespecttof.Thus,theMetropolis-Hastingsalgorithmisdenedinthefollowingway(RobertandCasellaetal.[ 2 ]): Givenx(0), 2. GenerateY(t)q(yjx(t))andu(t)U(0;1)fort=1;:::;n, 3. Take 4. Returnx(1);x(2);:::;x(2);:::;x(n). Thesymbol\"representsthatsamplesarebeinggeneratedfromq(yjx(t))andU(0;1).Inaddition,upper-caserandomvariablesrepresentvariablesthatneedtobetestedtobeaccepted,andlower-caserandomvariablesrepresentrandomvariableswithaxedvalue. 29

PAGE 30

ItcanbeproventhattheGibbssampler[ 2 30 33 ]isequivalenttoacompositionofMetropolis-Hastingalgorithms[ 1 ]. GiventherandomvariablesXandYwithjointdensityf(x;y),andconditionalprobabilitiesfXjY(xjy);fYjX(yjx),theGibbssamplergeneratesaMarkovchain(X(t);Y(t))fort=1;2;:::takingX0=x0andsamplingasfollows: (a) (b) AdrawbackofthismethodisthatsamplesfromGibbssamplersarenotmutuallyindependent.ThismeansthatsomecorrelationexistsbetweensamplesintheMarkovchainbeinggenerated.Therefore,itissometimesnecessarytosub-sampletheoriginalsampletoobtainamoreindependentcollectionofsamplesbychoosingelementswithgreaterseparationwithintheoriginalchain. Anotherpossibledrawback,dependingontheproblem,isthatMarkovchainscanexhibitslowconvergence(i.e.,varianceinthechaincanbetoohigh).AwaytoresolvethisproblembyaddingextrastepswhichessentiallyMetropolizetheGibbssamplerhasbeenproposed[ 1 34 ]. 30

PAGE 31

2 ].Robert,Casella,etal.[ 2 ]dividethelikelihoodofthedataintotwolikelihoods,thecomplete-datalikelihood(LC(jx)=f(x;zj)),andtheincomplete-datalikelihood(L(jx)=g(xj)),withmissingdatadensityk(zjx;)=LC(jx) thenitispossibletodenethefollowingGibbssampler: 2. Therststepistheexpectationstepandthesecondthemaximizationstep.ThebigdierencewithrespecttotheclassicalEMisthatinsteadofcalculatingE(Zjx;),theGibbssamplergeneratesarandomvariablefromthedensityk.Inasimilarway,insteadofmaximizingtheexpectedcomplete-datalog-likelihood,theGibbssamplergeneratesrandomsamples,fromtheL-thenormalizedcomplete-datalikelihood. AnotherspecialcaseoftheGibbssamplerthatisusedforgeneratingsamplesfromtheproductofdistributionsistheslicesampler. Theslicesamplerisbasedonthefundamentaltheoremofsimulations. 31

PAGE 32

Thus,weneedtouniformlysimulatefromthesub-graph RobertandCasellaetal.[ 2 ]pointoutthatanaturalsolutionistousearandomwalkonthesetG(f)becausetheyusuallyresultinastationarydistributionthatistheuniformdistributiononG(f).Thiscanbedoneinthefollowingway: Startingfromapoint: (x;u)2f(x;u)j0
PAGE 33

Givenf(x)/Qki=1fi(x), 2. Atiterationt+1,simulate 1:w(t+1)1U(0;f1(x(t)))...k:w(t+1)kU(0;fk(x(t)))k+1:x(t+1)U(A(t+1))A(t+1)=nyjfi(y)w(t+1)i;i=1:::;ko: Returnx(1);x(2);:::;x(N). ThissamplercanbeseenasaspecialcaseoftheGibbssampler.Itssimplicityallowsforitsuseindierentproblemsforwhichposteriorprobabilitiesareconstructedbytheproductofmanyindividualdistributions. Nowthatwehavethesebasictoolsforattackingnumericalproblemsinprobabilitytheory,weexploretheimportantconceptofsparsitypromotioninlinearmodels. 1.3.1Introduction wherefrepresentstheoutputofthelinearmodel,itheweightgiventotheithfeatureinthemodel,andxiistheithfeature. 33

PAGE 34

whereyisthedesiredoutputforthealgorithm,hi(x)representstheevaluationoftheinputxbyabasisfunctionhi,andcaneitherbethoughtofasnoiseaddedtothedataorastheerrordistribution.Thiscanbewritteninvectorformas: whereHTi=(h1(x);:::;hm(x)).Now,ifwehaveacollectionofdierentoutputsy1;:::;yn,wecanwrite(Equation 1{25 )as: whereHiscalleddesignmatrixfortheproblemathand. AwaytosolvethislinearproblemistheuseoftheLeastSquaredError(LSE)[ 21 ], ManyresearchershaveattemptedtodevisemethodsforimprovingLSEestimationbyreducingthenumberoffeatures.WhenusingLSEtosolvethismodel,wearetryingtoincreasepredictionaccuracy,andinterpretability.Afterall,wewouldliketodecreasetheinuenceoffeaturesthatarenotimportantinthereductionoftheerrorbetweenthepredictedoutputandthedesiredoutput.Thisisanimportant,soughtafterfeatureinBayesianclassiers,becausetheiraccuracydecreasesasthenumberoffeaturesincreasesduetothewellknownbias-variationtrade-o[ 35 ]. TwostandardtechniquesforimprovingLSEestimation(i.e.decreasingthenumberoffeaturesinvolved)aresubsetselectionandridgeregression.Subsetselectioncan 34

PAGE 35

36 ]toselectthepossiblebestset.AnexampleofthismethodcanbeseeninSindhwanietal.[ 37 ],wheremutualinformationandagreedysubsetselectionareappliedtochoosethebestpossiblesubsetofelementsinproblemsinvolvingtheuseofneuralnetworksandsupportvectormachines. Althoughsubsetselectionprovidesinterpretablemodelsittendstobeextremelysensitiveduetothediscreteprocessselectionandchangesinthedata.Furthermore,theregressioncoecientsarenotlearnedsimultaneouslywithfeatureselection.Forthisreason,ridgeregression(alsoknownasweightdecay)providesamuchbettermethodofoptimization: wherekk2isaregularizationfactorwhichpenalizeslargevaluesfortheelementsof. Althoughthismethodallowsustouseclassicoptimizationtoolsforminimization,whichimprovesstabilityofsolutions,itdoesnotdriveelementsoftozerofastenough. Forexample,ifweconsider( 1{28 )asaposteriorprobabilityproblem TherstexponentialcorrespondstotheLSEpartof( 1{28 ),andthesecondexponentialcorrespondstothequadraticconstraintsoftheridgeregression.Atthesametimethislastexponentialcanbeseenas expT;=KYj=1exp2j;(1{30) whereeachexpf2igisproportionaltoazero-meanGaussian.Therefore,thequadratictermoftheridgeregression(Equation 1{28 )imposesaGaussiandistributiononthe'sweights.Thismeansthatthetermstendtogoslowlytozero.Furthermore,fromthepointofviewofridgeregressionthefollowingexamplesareequallylikelyforaproblem 35

PAGE 36

22=1 2A2=f1=02=1g 38 ]hasproposedamodicationoftheRidgeregressioncostfunctionto Thislastequationisequivalenttothefollowingcostfunction[ 39 ] ThiscostfunctionisknownasLeastAbsoluteShrinkageandSelectionOperator(LASSO).Itcanbeshown,asinthepreviousdiscussionaboutRidgeregression,thattheregularizationtermPKj=1jjjcanbeseenasputtingaLaplaciandistribution(Figure 1-3 )withzeromeanoneachoftheweightsj.AsTibshirani[ 38 ]pointedoutthisreectsthegreatertendencyoftheLASSOovertheRidgeregressiontoproduceestimatesthatareeitherlargeorzero. Severalmethodscanbeusedtondapossiblesolutionforthecostfunction(Equation 1{32 ).Inourcase,weareinterestedinthemethoddevelopedbyFigueiredo[ 40 ]. 1{24 )isassumingthatthedierencebetweenthedesiredoutputandthecomputedoutputf(x;)isdistributedasazeromeanGaussianwithvariance2.Inaddition,wecanassumethateachhasanormal 36

PAGE 37

Figueiredo[ 40 ]proposedthefollowinghierarchicalmodeltoaccomplishthis Wherethey1;:::;ynrepresentourdesiredoutputsorlabels,theHi'saretherowofthedesignmatrixHofthedata;the1;:::;mrepresenttheweightstobelearned;the1;:::;mrepresentthehiddenvariablesthatpromotesparsityontheweights;istheamountofsparsityinthemodel,and2istheamountofnoiseinthemodel. Itcanbeprovenunderthismodelthatthej'swithrespecttothej'shavethefollowingdensity: whichhasplot(Figure 1-3 ). Aswasobservedbefore,thisdensitypromotessparsityinthejtermssinceithasasharppeakatzero.IfthecomponentsofaresamplesfromtheLaplacian,thentheyarelikelytobelargeorzero. Now,assuming1;:::;jashiddendata,aMaximumAPosteriori(MAP)EMalgorithmcanbeusedtomaximizetheposteriorlikelihood: TheMAP-EMalgorithmforthisposteriorhasthefollowingsteps 37

PAGE 38

Inthiscase,wehave: with 2. and Althoughthisisanecientalgorithm,itassumesnorelationsexistbetweentheweightsj.Thus,ifwewanttopromotesparsityonmodelswherecomplexrelationsexist,weneedtodevelopotherstrategiesformaximizingthecostfunctionsforthemodels. ThelogisticdistributionhasPDF b b2;(1{41) 38

PAGE 39

ThepreviousCDFisalsoknownasthesigmoidfunction.ThisCDFisusedtomapcontinuousordiscreterandomvariablesintobinarycategories-thepatientisdeadoralive,itisrainingornot,etc.ThesePDF'sandCDF'scanbeplotted(Figure 1-4 ). Forthis,wecanusethelinearfunctionf(x1;:::;xm)=Pmi=1ixi.Then,itispossibletouseathresholdw0toclassifythex'sentriesasfollow:

PAGE 40

Morecomplexexamplesofh(x)exist,andtheydependonthetypeofproblembeingstudied[ 41 { 43 ]. Anexampleofthesefunctionsthathelpseehowtheyseparatedierentclassescanbeseenin(Figure 1-5 ). 1. 2. limh(x)!P(Y=1jx)=0. 3. limh(x)!1P(Y=1jx)=1. Agoodchoiceforthisprobabilityisthesigmoidfunction. and Then,usingthelogisticregression(Section 1.4.4 ),wecanconstructadecisionboundaryasfollows logitfP(Y=1jx)g=logP(Y=1jx) 1P(Y=1jx)=h(x)(1{46) and logitfP(Y=0jx)g=logP(Y=0jx) 1P(Y=0jx)=h(x):(1{47) Anexampleofthiskindofmappingcanbeseenin(Figure 1-6 ). Logisticregressioncanbeusedtoinvestigatethedegreetowhichthedatacanbeseparatedintotwogroups[ 3 44 ],whichisimportanttoknowifwewanttoclassifyit.We 40

PAGE 41

Inbroadtermslogisticregressionissimplyacombinationofthegeometricideain(Section 1.4.2 )andthesigmoidin(Equation 1{44 ).Thisproducesaprobabilitydistribution,P(Yjx),wherethelabelsordesiredoutputs(Y's)areconsideredBernoullirandomvariables. Now,itispossibletorewriteP(Y=1jx)as: Thiskindofmodelcanbesolvedusingmaximumlikelihood[ 3 ].SinceYisaBernoullirandomvariablewithmeanP(Y=1jx),wecaninterprettheprobabilityofanelementxbelongingtothepositiveclassastheprobabilityofY=1.Then,wehavethat (1{49) =P(yi=1jxi)yi(1P(yi=1jxi)1yi: lnL(x;y)=nXi=1(yilnP(yi=1jxi)+(1yi)ln(1P(yi=1jxi))):(1{50) Theoptimaofthisloglikelihoodcannotbesolvedfordirectly.Thus,itisnecessarytousenumericalmethodstosolveit,suchasconjugategradientsorGibbssamplers. 4 { 6 ],wedonotconsidercostfunctionsthatusedesiredoutputsasinthewellknowncostfunctionusingLSEminimization.Weinsteadconsideracostfunctionthatdependsonadierencebetween 41

PAGE 42

45 46 ].Thesedierencesarecalleddissimilaritymea-sures. 47 ]. 1. 2. 1. TheHammingdistance,d(x;y)=Pni=1jxiyij. 2. Thedierencefunctiondi(x)=di(x)+maxj6=idi(x)wherethedi(x)canbeseenasthedistancetotheprototypeifromx. 21 ]. Examplesoflossfunctionsare: 1+e(di(f!));>0: 42

PAGE 43

7 ].Then,wewillusethesebasicconceptstoexploretheuseofmutualinformationforalgorithmselection. 1lognXi=1p(xi):(1{54) Inasimilarway,ifXisacontinuousrandomvariable,wehave: Thislastequationiscalleddierentialentropy. Animmediateextensionofentropyisthejointentropy. AsimilarversionofthisdenitionexistsifX;Yarecontinuousrandomvariables. 43

PAGE 44

Usingconditionalentropy,itispossibletoprovethefollowingequivalencesforrandomvariablesX;YandasetofrandomvariablesfX1;:::;Xng. (1{60) Now,wecanbegintodenewhatisknownasmutualinformation. Relativeentropyisalwaysnon-negative,andiszeroifandonlyifp=q.AlthoughKullback-Leiblerdistancelookslikeametric,inrealityitisnotsymmetricanddoesnotsatisfythetriangleinequality[ 7 ]. 44

PAGE 45

MutualinformationhasthefollowingusefulrelationsgivenX;Yrandomvariables: Asinentropy,itispossibletodeneaconditionalmutualinformation. (1{69) =Ep(x;y;z)logp(X;YjZ) Inaddition,asinentropy,wecanprovethatmutualinformationisalwaysnon-negative.ThechainruleofinformationforI(X1;:::;Xn;Y)randomvariablesiswrittenas: 45

PAGE 46

WecanconsiderPetobearandomvariablewithagivenPDF.TheboundforPecanbeestimatedbytheFano'sinequality. 1+Pelog(jXj)H(XjY)(1{74) or log(jXj) (1{75) log(jXj) (1{76) bythefactthatH(Pe)isaconcaveparabolawithmaximumvalueone,and: NoticethatitisnotpossibletocontroltheentropyH(X)andthesizeofthespaceX[ 48 ].Thus,itisdesirabletomaximizethemutualinformationI(X;Y). 46

PAGE 47

37 49 ].First,itisnecessarytoconsiderthestandardsettingforaclassicationandfeatureselectionproblem: 1. AcollectionofsamplesdrawnfromarandomvectorX=(X1;:::;XN)T. 2. Eachsamplehasanassociatedlabely2f1;:::;Ng. 3. AcollectionofclassicationfunctionsH=ff:X!f1;:::;Ngg.Inthecaseof[ 37 ],thetwofunctionsusedforclassicationaretheSupportVectorMachine(SVM)andtheMultiLayerPerceptron(MLP). 4. TheclassicationoftheclassesismadebyusingasingleMLPforalltheclasses,ormultipleSVM'soneforeachclass. 5. LetYandYf=ff(x)jx2Xgbethediscreterandomvariablesoverf1;:::;Ng,describingtheunknowntruelabelsandthepredictedlabelsbytheclassierfunctionfrespectively. 6. AsubsetofthetotalsetoffeaturesGX. 7. AcollectionofclassicationfunctionsHG=ff:G!f1;:::;NggwhereHGdenotestherestrictionofHonG. ErdogmusandPrincipe[ 49 ]provideafamilyofupperandlowerboundsonthemisclassicationPe=Pr(Yf6=Y).Now,ifShannon'sdenitionforentropy(Denition 1.18 )isused,theboundsare: log(N1)Pe(f)(Fano'sbound) (1{78) miniH(Yje;Yf=i): Thisentropyisknownasthebinaryentropyfunction[ 7 ].Finally,H(Yje;Yf=i)istheShannonentropyoftheerrordistributionwhentheclassierfincorrectlyoutputclassi. DuetothefactthatitisnotpossibletocontrolH(Y)andH(Pe(f)),itispreferabletomaximizethemutualinformationI(Y;Yf)betweenYandYftominimizePe(f)byusing, 47

PAGE 48

1{80 )isnotdierentiablewithrespecttotheclassierparameters.ForthisreasonSindhwanietal.[ 37 ]preferto\approximatetheoptimizationprocessratherthanapproximatetheobjectivefunction."Forthispurpose,theyusethefollowingcostfunction: wherefGisaclassierthathasbeentrainedusinganyconvenienttrainingobjectivefunctionwiththefeaturesetG. Using(Equation 1{81 ),itispossibletodesigntwodierentheuristicsforthetwoclassicationfunctionsproposedin[ 37 ].However,beforethetwoheuristicsaredescribedinmoredetail,itisnecessarytodescribesomeoftheequationsinvolved. First,itisnecessarytoexplainhowtoobtainanestimateofthemutualinformationI(Y;Yf)givenaspecicproblem.AssumethatacertainproblemisgivenwithatotalnumberofsamplesS,andfisusedtoclassifythesamplesinthisproblem.Thus,wehavetheconfusionmatrixQf=(qij),whereqijrepresentsthenumberofsamplesinclassiareclassiedinclassj.Thiscanbeusedtocalculatethefollowingprobabilities: (1{82) (1{83) outputsclassj),bPij=bP(Y=ijYf=j)=qij (1{84) truelabelbeingclassiwhentheclassieroutputsclassj). Theseprobabilitiesallowustocalculatethefollowingentropies: 48

PAGE 49

Inthisway,itispossibletoestimatebI(Y;Yf)=bH(Y)bH(YjYf).Finally,itispossibletodeneausefulnessindicatorIjthatmeasurestheuncertaintyinvolvedinfselectingclassj: Now,itisnecessarytodeneawaytoassignanimportanceweighttoeachofthefeaturesundertheframeworkofSVMandMLP.InthecaseofSVM,thesquaredreciprocalofthemarginofaSVMisgivenby: whereiistheLagrangemultiplierandyiisthelabelcorrespondingtotheithsupportvectorxi.Krepresentsthekernelusedtosendthefeaturestoahighdimensionalspace.Now,thederivativeofthistermwithrespecttofeaturekinxjis: Now,itispossibletodeneanaltotalinformationowforfeaturekoverseveralSVM's: wherejindexestheSVMs,pindexesthefeaturesandIjrepresents(Equation 1{88 ).InthecaseofMLP,considertheneuronLinthelayerindexedbyl,andthelayerbeingfedbythislayerbeindexedbyk.Inaddition,denotetheoutputofaneuroninlayerkasOkandtheinterconnectionweightbetweenlayerkandlaswkl.Now,itispossibletorecursivelycomputetheimportanceweightfortheseneuronsbyusing: 49

PAGE 50

LetX=fX1;:::;XNgdenote,thefullsetofNfeatures;Dtraindenotesthetrainingset,andDtestdenotesthetestingset.Inaddition,weassumeaSVMforeachclassintheproblem.Inthisway,itispossibletodescribethefollowingheuristic,whichisdenedbySindhwani[ 37 ]: InitializeG=X.Setthenumber0
PAGE 51

Setthenumber0weight(y)replaceywithxinGandgotostep( 2 ). 11. Ifweight(x)
PAGE 52

2. 3. 12 50 { 52 ]: 1. 2. GivenA;B22X,ifABthen(A)(B)(MonotonicityProperty). Dierenttypesoffuzzymeasuresexist,themostresearchedandusedareevidencemeasures,possibilitymeasures,fuzzymeasuresinducedbyOrderedWeightedAveraging(OWA)operatorsorcardinalitymeasures,andSugeno-measures.ThersttwowereproposedandstudiedbyDempster[ 53 ]andShafer[ 54 ].TheOWAoperatorswereexploredindepthbyYager[ 55 ].Finally,theSugeno-measure[ 51 ]isaniceexampleofameasurerecursivelydenedandhasbeenusedbymanyresearchers[ 56 { 62 ]. 1.

PAGE 53

m(fx3g)=0:1;m(fx1;x2g)=0:1;m(fx2;x3g)=0:2; m(fx1;x3g)=0:2;m(X)=0:2: Itcanbeproventhatthebeliefmeasurehasthefollowingproperties: 1. 2. 3. ForeverypositiveintegernandeverycollectionA1;:::;AnofsubsetsofX, ForallAX m(A)=XBA(1)jABjBel(B):(1{97) 53

PAGE 54

1.1 ),wecangeneratethefollowingmeasure: Itispossibletorewritetheplausibilitymeasureas: Thus,from(Theorem 1.4 )wehavethat: 54

PAGE 55

63 64 ]: jXjjAjk0jAj>k:(1{106) 1. IfA;BXwithA\B=;,then: Itcanbeshownthatasetfunctionsatisfyingtheconditionsin(Denition 1.29 )isafuzzymeasure.Inparticular,(Equation 1{107 )implicitlyimposesthemonotonicityconstraintsontheSugenomeasures.Asaconvention,themeasureofasingletonsetfxigiscalledadensityandisdenotedbygi=g(fxig).Inaddition,wehavethatsatisesthe 55

PAGE 56

Theparameterisspecictothisclassofmeasuresandcanbecomputedfrom(Equation 1{108 )oncethedensitiesareknown.TahaniandKellershowedthatthispolynomialhasarealrootgreaterthan-1andseveralresearchershaveobservedthatthispolynomialequationiseasilysolvednumerically[ 56 { 58 ].Byproperty(Equation 1{107 ),specifyingaSugeno-measureonasetX,withnelements,onlyrequiresspecifyingthendierentdensities,therebyreducingthenumberoffreeparametersfrom2n2ton. Inaddition,theSugeno-measuresatisesthenite-rule[ 65 ]. FuzzyintegralscanbeconsideredasfunctionaloperatorsonthesetoffunctionsF=ff:X![0;1]gtotheR+.Thiscanbeseenasawaytoexpresstheweightofthefunctioninacertainspace. 51 ].Itisbasedontheuseofsupremumandinmum.

PAGE 57

10 ]introducedthisintegralasawaytounderstandsomephysicalmodels. AlthoughthisdenitioncoversthegeneralcaseofaChoquetintegral,itisnotoftenused.So,itisnecessarytolookatthediscretecaseoftheChoquetintegralforamoreusefulChoquetintegraldenition. 57

PAGE 58

Someextranotationisneededtomakeareferencetoobjectsinthefusionproblem.Letdenotethesetofobjectsinthefusionproblem.Eachinformationsourcexi,fori=1;:::;nisafunctionxi:![0;1].Foreach!2,wedenef!:X![0;1]byf!(xi)=xi(!). Finally,itispossibletorepresent(Equation 1{111 )asadotproductofvectors whereu=(S1;:::;Sn)Tisthevectorofparametervaluesforthemeasureofthelattice2XwithSi22X,andthevectorHfisequaltotherowvectorHf=(d1;:::;dn)suchthat: 1.8.4.1Choquetintegralwithrespecttobeliefandplausibilitymeasures 66 ]: whichcanberewrittenas: 58

PAGE 59

Thefollowingexampleshowssurfacesthatcanbegeneratedusingabeliefmeasure,aplausibilitymeasure,andtheChoquetintegral: 1-7 )and(Figure 1-8 )forthebeliefmeasureandplausibilitymeasurerespectively. 59

PAGE 60

withwi=(A(i))(A(i+1))foralli=1;:::;nwhich,sinceAiandAi+1alwayshavethesamenumberofelements,areindependentofthesortingintheintegral.Inaddition,itisclearbythetelescopingsumpropertythat then,bythenite-rule(Denition 1.30 ): 60

PAGE 61

1{111 )as: 67 68 ].MesiarandKomornkova[ 69 ]proposedasetofcommonpropertiesfortheaggregationoperators. 1. Aggreg(x;x;;x)=x(Identitywhenunary). 2. Aggreg(0;:::;0)=0andAggreg(1;:::;1)=1(Boundaryconditions). 3. If(x1;:::;xn)(y1;:::;yn)thenAggreg(x1;:::;xn)Aggreg(y1;:::;yn)(Non-decreasing). 66 ],andDetyniecki[ 70 ]havepointedoutthattheChoquetintegralgeneralizesoperatorsliketheOWAoperatorandtheweightedaverage. 61

PAGE 62

71 ].TheconceptofShapleyindexcomesfromtheideaofacoalitionalgame. 1. ForallA;B22XsuchthatATB=;,(A)+(B)(ASB) measurestheimportanceofeachelementinthesetX.Clearly,thefuzzymeasuressatisfythepropertiesrequiredforthecoalitionalgamedenition.Thus,itispossibletousetheShapleyindextomeasuretheimportanceofeachinputinthecomputationoftheChoquetintegral. 72 { 77 ].Somearemoregeneralthanothers,whereassomearedenedforaspeciceldofresearch.In1999amoregeneraldenitionwasgivenbyWald[ 76 ].

PAGE 63

78 { 83 ]: 1. Datalevelfusion,whichfusesdataacquiredfrommultiplesources,suchasimagesacquiredindierentspectralbands,directly. 2. Featurelevelfusion,whichcombinesfeaturescalculatedondataacquiredfrommultiplesources. 3. Decisionlevelfusion,whichcombinesdecisionstatisticsandcondencebaseinformationderivedfromalgorithmsappliedtomultiplesources. Oneimportantobjectiveofinformationfusionistodecidewhichdata,featuresoralgorithmsareusefulintheclassicationproblem.Thisissueisofgreatimportancebecauseitallowsustoreducethedimensionalityoftheproblemandthebias-variationtrade-o[ 35 ]aswellasreducingcostsofapplications. 1-9 )intheidealcase.Inreality,weoftenobtainskeweddistributionsliketheonesin(Figure 1-10 ). Aproblemwiththesekindsofoutputsisthatdierentclassiersproduceinformationthatcanoverlaporplainlycontradict.Itisnecessarytodesignfusionalgorithmsthatcanhandlesuchproblematicdistributions. Inthefollowingsections,weexploresomeofthemodelsproposedforthistask. 78 { 81 84 85 ].Althoughtheyhavebeenusefulinsomemeasure,Bayesianmodelshaveproblemshandlingtheoverlappinginformationcomingfrommultiplesourcesbecauseprobabilitymodelsobeythelawoftheexclusivemiddle[ 1 ].Thisproblemcanbesolvedaddinganextralayerofcomplexitywherethelawoftheexclusivemiddledoesnotapply.Forexample,wecanuse 63

PAGE 64

1.33 )todealwiththeproblemofoverlappinginformationandthenuseindependentdistributionsforcalculatingeachoftheindividualmeasuresinthepowersetofalgorithms. 84 86 { 88 ]describeattemptstodealwithinformationfusionusingBayesianhierarchicalmodels.Itisclearwhenreadingtheliteratureaboutthesubjectthathierarchicalmodelsareusedmoreasatechniqueinsideothermethodsforinformationfusionratherthanmethodsinandofthemselves. Forexample,in[ 84 ]theBayesianmodelsareusedforclassicationinsideaBayesiannetworkforthefusionofinformationfromhyperspectralimages.AnotherexamplecanbeseeninValinetal.[ 88 ],whereaBayesianclassicationmethodisfusedtogetherwithapossibilisticclassier,aneuralnetworkandaK-nearestneighborhoodusinganeuralnetworkfuserandaPossibilisticfuser. FromtheseexamplesitisclearthattheBayesianhierarchicalmodelsareusefulaspartofmoresuitabletechniquesforfusion. 84 85 87 89 { 92 ]arebasedonanideasimilartoMarkovchains,butinsteadofconsideringonlychainstructures,welookatdirectedacyclicgraphswiththeMarkovproperty[ 89 ].Thesekindsofmodelscanbeusedtofuseinformationfromasystemwithmultiplesensors.Thisallowsustoobtainansnapshotofthesystematacertainmomentintime. AnexampleoffeaturelevelfusionusingBayesiannetworkscanbeseeninFerrariandVaghi[ 90 ].ThispaperusesaBayesiannetworktohelpinthedesignoffeaturesforagroundpenetratingradar,electromagneticinduction,andinfraredsensorsforhumanitariandemining.ThestructureoftheBayesiannetworksensormodelislearnedfromadatabaseofmeasurementsobtainedfromasetofknowntargets,althoughthisisaverydicultproblem.Someassumptionsaremadetosimplifythecomputational 64

PAGE 65

93 ]isused. Inthecaseofdecisionlevelfusion,wecanseeanexampleinHeckerman[ 82 ].InthispapertheauthorusesBayesiannetworkstofuseacousticinformationandspatialinformationatdecisionlevelfusionforahuman-robotinteractionsystemfortheSwissNationalExhibition(Expo.02).TheseBayesiannetworksareusedinanattempttoimproveinteractionbetweenhumanandrobot,bytryingtodecidewhatthegoalsoftheuserare.ThenodesintheBayesiannetworkshavethefollowingrandomvariables:UR,userreliability;UG,usergoals;ORR,observedrecognitionresults;LSS,laserscannersignalforthepositionoftherobot;Lik,likelihoodofdata;SNR,speechtonoiseratio;andUFR,userinfrontoftherobot.InadditiontothesebasicBayesiannetworks,theauthorsdevelopasingleBayesiannetworkfortheusergoalbydiscoveringthecausalrelationsbetweenthenodesinthetwoBayesiannetworks.ThetrainingfortheseBayesiannetworksisaccomplishedusingBayesiannetworkinferencetechniques[ 89 ]. Aproblemwiththesemodelsisthefactthat,asthenumberofvariablesincreasesandthecausalrelationshipsbetweenthembecomemorecomplex,calculatingtheconditionalprobabilitycanbecomeextremelydiculttocompute.Inaddition,maintainingtheprobabilitypropertieswhentrainingtheBayesnetworkcanbedicult[ 89 ]. 94 { 98 ]arebasedondierent,andsomewhatmoregeneralpropertiesthantheonesbasedinclassicprobability.Forexample,neuralnetworks[ 97 ]aremodelswhereprobabilisticornon-probabilisticrulesarelearned. 97 ]canbeusedinthecontextofinformationfusiontofusesensorinformationfromsonarandinfrared,geospatialinformation,etc,i.e.datalevelfusion[ 96 98 { 100 ]. 65

PAGE 66

99 ].Here,atypeofPulseCoupledNeuralNetworks(PCNN)dendriticneuralnetwork[ 101 ]isusedtofusemedicalimagesthatareobtainedfromdierentsensorstoenhancethedata.Forexample,certaintypesofMagneticResonanceImages(MRI)givegreaterdetailofanatomicalstructures,whereasothertypesofMRIimagesgivegreatercontrastbetweennormalandabnormaltissues.Thus,oneoftheobjectivesoftheimagefusionistoobtainaprecisemapoftheabnormaltissue.AnotherexamplefordatafusioncanbeseeninBarberaetal.[ 96 ],whereaneuralnetworkisusedtofuseinformationfromunreliableultrasonicandinfraredsensorsinaroboticplatformforspatialnavigation. AnovelneuralnetworkthatcombineOWAoperatorsandMLP'shasbeenusedbyWilsonetal.[ 102 ].tofuseinformationfromgroundpenetratingradarsandelectromagneticinductionsensorsforhand-heldminedetectionsystems.Althoughuseful,abasicproblemwithneuralnetworksistheabilitytotrainthemandtheirproblemwithoutliers[ 97 ].Theyalsohavedicultytrainingwithconictinginformation. 97 103 ],trainingsamplesaresenttohighdimensionalspacesthroughtheuseofkernels,whichhelpsinthetaskofclassication. AnexampleofkernelmethodscanbeseeninLiuetal.[ 104 ].Inthispaper,theauthorsfuseinformationatdatalevelfromasensornetworkfortraccontrol.ASupportVectorMachine(SVM)withradialbasisfunctions,forexamplef(x)=expnkxk2 AnotherexampleofthistechniqueisSinghetal.[ 105 ].Inthiscase,featurelevelfusionisusedforfacerecognitionusingmulti-spectralinformation.First,theFouriertransformisusedtoobtaintwotypesoffeatures:amplitudeandphase.Thentheauthors 66

PAGE 67

106 ],basedinthefollowingcostfunction: (1{129) tominimize1 2kwkXiCi(vi); 107 ]havenotbeenusedwidelyintheareaofinformationfusion.Although,someattemptshavebeenmadetoproposepossiblearchitecturesfordatafusionofsensorinformation[ 108 ],andthelearningofthefuzzymeasureparametersintheChoquetintegral[ 109 110 ]. MaslovandGertner[ 108 ]studiedthedierentwaysthatevolutionaryalgorithmshavebeenusedininformationfusion.Thegeneticalgorithmisseenasaspecialcase,andmorethanproposinganalgorithm,theyexplorewhatwouldbeagoodarchitecturefordatafusionusinggeneticalgorithms. AmoreconcreteexamplecanbeseeninCombarroandMiranda[ 110 ],whereageneticalgorithmisusedtolearnthefuzzymeasuresforaChoquetintegralassumingaleastsquarederrortnessfunctioninaconvexspaceoffuzzymeasures.Employingconvexspaceisusefulbecauseitiseasytoprovethat,giventwofuzzymeasures1;2,theconvexcombination1+(1)2with2[0;1]isagainafuzzymeasure. 67

PAGE 68

12 53 54 111 ],fuzzymeasures[ 8 { 13 ]andfuzzyintegrals[ 8 { 10 13 ].Theyhavebeenusedinthelandminedetectionproblem[ 52 112 ],inhandwritingrecognition[ 113 ]andcomputervision[ 57 ]. AnexampleofdecisionlevelfusionusingfuzzysetsisshowninGaderetal.[ 113 ].First,asegmentationalgorithmisappliedtoastringofcharacterstosplitthemintoasetofprimitives,p1;:::;pK,whicharereadsequentiallybysubsequentprocedures.Second,afeatureextractionalgorithmisusedtoassignbasiccondences,then,afuzzyrulesystemistrainedtolearnthecondencesofblockofcharacters.Thesefuzzyrulesystemsuseasetofrulesandvariablestoassignthecondences.Anexampleoftheserulescanbeseenin(Equation 1{130 ): Otherapplicationsofthesetypesofmodelsexist,butourfocusinthefollowingsectionsisonlearningfuzzymeasuresandtheChoquetintegralsincethiswillbetheareaofourresearchcontributions. 1.10.1ChoquetIntegralinPatternRecognition 8 59 60 114 115 ]haveusedthefuzzyintegralasaclassierinpatternrecognitionproblems(AlthoughtheChoquetintegralisbettersuitedasanaggregationoperator).Grabisch[ 8 ]proposedtheuseoftheChoquetintegralintheframeworkofoptimizingaquadraticproblemwithaLeastSquaredErrorcostfunction: 2X!2C1(C(f!)1)2+:::+1 2X!2Cl(C(f!)l)2;(1{131) withdesiredoutputs1;:::;lundertheconstraintsimposedbythefuzzymeasure. 68

PAGE 69

61 62 ].Inaddition,ithasbeenappliedtotheclassicationoftheirisdatasetandWisconsinbreastcancerdataset[ 8 ]. Wang[ 114 115 ]proposedageneticalgorithmstrategytolearnthefuzzymeasures.Thesealgorithmshavebeenappliedtoanarticialdatasetwithtwohundredthree-dimensionalsamplessplitintotwoclasses.Inaddition,ithasbeenappliedtotheIrisdatasetandbreastcancerWisconsindataset. AlthoughtheChoquetintegralisnotwell-suitedasaclassier,theresultsarecomparabletothebestclassiers. 116 { 118 ]asawaytoenhancetheclassicationcondencescomingfromthedierentalgorithmsusedinfacialrecognition.Asinlandminerecognition[ 52 61 119 120 ]andwordrecognition[ 45 113 121 122 ],thefuzzyintegralisusedasanaggregationoperatortoimprovethenalcondenceusingtheoutputsfromthedierentclassiers. Forexample,inseveraleorts[ 116 117 ]aSugenofuzzyintegral(Denition 1.31 )isusedfordecisionlevelfusion.Inthisalgorithm,samplescomingfromfeatureextractorsareclassiedusingNclassiers.Then,aSugenointegralfuserisusedtoimprovethecondencesfromthedierentclassiers. 69


123 { 125 ]haveproposedtheuseoftheChoquetintegralasamorphologicallter.Forexample,Grabisch[ 123 ]provedtheconnectionbetweenorderltersandtheChoquetintegral.HocaogluandGader[ 124 ]proposedagrayscalelterusingChoquetintegralcalledChoquetMorphologicalOperator.KellerandGader[ 125 ]proposedtheuseofChoquetlterstocleannoiseinLaserRadar(LADAR)images.Inaddition,theChoquetintegralhasbeenusedasageneralizedhit-missoperatorinthelandminedetectionproblem[ 52 ]. Grabisch[ 123 ]gaveadenitionoflinearandorderedltersthatcorrespondtoChoquetintegrals.GrabischalsofoundarelationbetweentheChoquetintegralinthedierentlters: 1. 2. AclassicexampleofdatalevelfusionisthatgiventousbyKellerandGader[ 125 ].Inthispaper,theOWAoperatorhasbeenusedtoenhancepixelsintheimageknownasnon-returnpixels.Duetosensorerrors,suchpixelshaveverylowlocalpixelvalue.Thus,ifweappliedathresholdtothesepixels,wewouldlosethem.Therefore,theOWAoperatorscanhelptoenhancethesepixels. 45 52 61 113 119 { 122 ].Thisframeworkwillbedescribedmorethoroughlyinthetechnicalapproach. 70


45 121 122 126 ]usedtheChoquetintegralinhandwrittenwordrecognitionintheUnitedStatesPostalService. Oneexample[ 121 122 ]involvesfusingcondencesprovidedbyafuzzyrulesystemtoobtainanalcondenceaboutthewordbeinganalyzed.ThiswasdoneusingdiscreteChoquetintegrals,andwastestedextensivelyagainstaneuralnetwork,theChoquetintegralperformedsignicantlybetter. AnotherexampleoffusinghandwritingisCaoetal.[ 126 ]Here,aSugenointegralisusedtofusecondencesofseveralneuralnetworkclassiers.Thedensitiesusedintheintegralsweredynamicallyquantiedbysourcerelevanceusinganormalizationequation. 52 61 119 120 ]reportedsomesuccessindecisionlevelfusionforlandmineclassicationusingtheChoquetintegralasanaggregatoroperator.ThetwomainexamplesaretheuseoftheChoquetintegralasamorphologicaloperator[ 52 119 ]andasaggregatorfordierentalgorithmsunderaleastsquaredcriterion[ 61 62 ]. InGaderetal.[ 52 ],theChoquetintegral,togetherwithprobabilitiesforrandomsets,wasusedtodenethemorphologicaloperatorsforopeningandclosing.Inaddition,connectivityissuesweresolvedusingpathconnectivityconditions.Then,theprocedurewasusedforlandminedetectionatdatalevelfusion. In[ 61 ]agradientdescentalgorithmwasusedforsolvingtheleastsquarederrorofaChoquetintegralunderSugeno-measures.Thealgorithmwasthenappliedtolandminedetectionfusionatthedecisionlevel. 8 60 66 114 115 127 { 130 ],manyofthesemethodsshareseveraldrawbacks.Therstdrawbackisthatthenumberofparametersandconstraintsareanexponentialfunctionofthenumberofinputs.Forexample,iftheinputhasnfeatures,thenumberofmeasureparameters 71


(Section 2) that the performance of the trained parameters can be very sensitive to the chosen outputs. Third, in every algorithm, maintaining the monotonicity relation can be difficult. These problems and their respective solutions are explored in each of the algorithms that have been proposed to learn fuzzy measures for the Choquet integral.

One of the first approaches is to minimize a least squared error (LSE) cost function [11, 131]:

    E = \tfrac{1}{2}\sum_{\omega\in C_1}\left(C_\mu(f_\omega)-\alpha_1\right)^2 + \dots + \tfrac{1}{2}\sum_{\omega\in C_l}\left(C_\mu(f_\omega)-\alpha_l\right)^2,    (1-132)

with desired outputs \alpha_1,\dots,\alpha_l, together with the fuzzy measure constraints. It is possible to express (Equation 1-132) as a quadratic programming problem by realizing that the Choquet integral C_\mu(f_\omega) can be written as an inner product of vectors (Equation 1-133), under the set of constraints defined by the fuzzy measure definition (Definition 1.24), which is represented in matrix form as Au + b \geq 0 (Equation 1-134). Finally, the cost function E and the constraints can be put together as a minimization problem:

    \min_u \; \tfrac{1}{2}u^{t}Hu + \Gamma u \quad \text{s.t.}\; Au + b \geq 0,    (1-135)

where H and \Gamma are built from the training data (the additive constant from the squared error is dropped). Although this cost function has appeared in the literature [11, 131], a detailed description has not. For this reason, a complete description of how to derive this minimization problem from the LSE cost function for the Choquet integral is given in (Appendix A). This cost function (Equation 1-135) can be solved by traditional methods of optimization [132]. Note that due to the exponential nature of the fuzzy measure parameters, the constraints


8 62 ]. Inparticular,H,,Aand Animmediateprobleminthisapproachistheuseofthesamemeasureforthedierentclasses.GrabischandNicolas[ 130 ]addressedthisproblemwithamodiedversionof(Equation 1{132 )foratwo-classproblem: where1and2arefunctionsthatcomputeclassspeciccondencevaluesfromtheinformationsourceoutputs.Forexample,ininformationfusionweuse: whereasinclassicationweuse: Notethatin(Equation 1{137 ),i(f!(x))2[0;1].In(Equation 1{138 ),thevaluesofxaregenerallyquantizedsothedistributionisdiscreteandi(f!(x))2[0;1].Wecanemployasimilarprocedureastheonein(Equation 1{132 )toconvertthiscostfunction(Equation 1{136 )intoaquadraticproblemunderlinearconstraints.Althoughthismethod 73


59 ].Second,thesolutioncanbesensitivetothedesiredoutputsiasshownin(Section 2 ).Thismakesitnecessarytostudytheproblemofndingthebestpossibledesiredoutputs.Thisclearlyincreasesthecomplexityofthesolutionof(Equation 1{135 ). 60 66 127 129 ]. 66 ]proposedusingaperceptron-likecriteriontolearnfuzzymeasures.Inthiscriterion,wewanttominimizethenumberofmisclassiedsamplesgivensamplesfromtwoclassesC1=f1w1;:::;f1wnandC2=f2w1;:::;f2wn,andu1=11;:::;1n1T,u2=21;:::;2n2Tthevectorsofmeasureparametersforbothclasses. Thecriterionisusedtonduandanl-dimensionalnon-negativevectort=(t1;:::;tn)suchthatPnk=1tkisminimumundertheconstraintsofmonotonicityforu1andu2: wheref1!iandf2!irepresentsalgorithmcondences,forclass1andclass2respectively,givenbyanobject!i,andirepresentstheCDFofcondencesthatbelongtoclassi.These(Equation 1{140 )inmatrixformatusingtheideasin(Appendix A ): 74


1{140 ). Itispossibletodenethefollowingterms: 1. 2. 3. 4. Thesetofconstraintsofeachfuzzymeasureinmatrixformat: 5. 6. ThevectorversionoftheChoquetintegrals: 7. Thematrix: =2666666664A10T0...00TA20...0HT1(f1!i)HT2(f1!i)1.........0HT1(f2!j)HT2(f2!j)::::::013777777775(1{147) Then,thelinearproblemtakesthestandardform: minimizeCTvs.t.v+B0; whichissimplyalinearoptimizationunderlinearconstraints. 75
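Both the quadratic program (Equation 1-135) and the linear program above rest on the same device: once the inputs of a sample are sorted, the Choquet integral is linear in the vector of measure values. The following sketch shows that construction under our own indexing conventions; it is an illustration, not the Appendix A derivation.

    import numpy as np
    from itertools import combinations

    def subset_index(n):
        # Fix an ordering A_1, ..., A_{2^n - 1} of the non-empty subsets of {0, ..., n-1}.
        sets = [frozenset(c) for r in range(1, n + 1)
                for c in combinations(range(n), r)]
        return {A: j for j, A in enumerate(sets)}

    def choquet_row(f, index):
        # Row vector H(f) such that C_mu(f) = H(f) . u, where u_j = mu(A_j).
        # Uses the decreasing-sort form of the discrete Choquet integral.
        f = np.asarray(f, dtype=float)
        order = np.argsort(f)[::-1]          # sources sorted by decreasing value
        H = np.zeros(len(index))
        A = set()
        for i, src in enumerate(order):
            A.add(int(src))                  # A_(i): sources with the i largest values
            nxt = f[order[i + 1]] if i + 1 < len(f) else 0.0
            H[index[frozenset(A)]] += f[src] - nxt
        return H

Stacking choquet_row(f, index) for every training sample gives the design matrix used in both formulations; fixing \mu(X) = 1 and listing the pairwise subset inequalities recovers the constraint system A.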


66 ].Inaddition,ithasthedrawbackthatthematrixofconstraintsisofsize2m+1+n2m+1+nforninputsofdimensionalitym. 60 ]proposedtheuseofagradientdescentalgorithmtolearnthefuzzymeasuresfortheSugeno-measure.Therefore,aneuralnetworkisusedtosimulatetheaggregationbytheChoquetintegral.Intherstlayer(inputlayer)thecondencesareusedasinputs.Theseinputsarethencombinedinthehiddenlayerintheformofmin-maxdierences: TheoutputlayeristhenalsumintheChoquetintegral.WangandWangpointoutthattheuseofadirectgradientisreallydicultfortheChoquetintegralneuralnetwork.ThisisthereasontousesmalldierencesselectedinarandomwaytooptimizetheLSEfortheChoquetintegral. KellerandOsborn[ 127 ]proposedtousea\reward-punishment"approach,similartoaneuralnetwork.Inthisapproach,givenasamplew,theChoquetintegraliscalculatedforeachclass,andifthelargestintegraldoesnotcorrespondtothelabelforthesamplew,thedensity-neuronsoftheSugeno-measurearepunishedbyreducingtheirvalues.Inthecasethatthelargestintegralcorrespondstothecorrectlabelforw,thedensitiesarerewardedbyincreasingthem.Themonotonicitypropertyismaintainedthroughthefactthatthedensitiescanonlyhavevaluesintheinterval[0;1]. 114 115 133 ]areanontraditionalmethodtolearnfuzzymeasures.Theyarenotmorewidespreadbecauseoftheextraoverheadofthealgorithmitselfandthedicultindeterminingeectiveencoding.Thesealgorithmsarebasedon 76


ThemethodencodeseachweightedChoquetintegralasachromosome,andthechromosomepopulationhassizepcorrespondingtothetotalnumberofsamples.Theprobabilityofachromosomeinthepopulationbeingchosentobeaparentdependsonitstness.Theoptimizationinthegeneticalgorithmisperformedunderthecriterionofminimizingthemisclassicationrate.Thestoppingconditionofthealgorithmistheminimizationoftheclassicationerrororzeromisclassicationrate. 8 ]proposedtheuseofanheuristicLSEtotrainthefuzzymeasures.Asmalldescriptionofthisheuristicisdescribedas: Foreachsample!withlabely: (a) Calculateerrore=C(f!)y,andsetemax. (b) updateeachmeasureiasfollows: emax(x(ni)x(ni1));(1{150) with2[0;1]. (c) Verifythemonotonicityrelations.Ife>0,thevericationisdoneforlowerneighborsonlyinthepowerlattice;ife<0,forupperneighborsonly. 2. Foreverynodeleftunmodiedin(Step 1 )(beginwithlowerlevels),verifymonotonicityrelationswithupperandlowerneighbors. 77


Many of the algorithms depend on a desired output. This means that an external algorithm or methodology needs to be used to try to find the best possible desired outputs, but as we will see in the next chapter, sometimes the best possible desired outputs are counter-intuitive.
2. Exponential time complexity due to the parameters and constraints. Given a problem with dimensionality n, the size of the input for each of these methods is at least 2^n - 1, and that is without counting the number of constraints needed to maintain the monotonicity property. Any method that depends on a matrix of constraints becomes intractable after a certain number of algorithms or features are considered. For example, optimization by a quadratic programming method depends heavily on searching for a feasible direction using the matrix of constraints; as the size of that matrix grows, the search becomes slower.
3. Lack of cost functions with embedded fuzzy measure constraints. This matters because, in optimization and machine learning, it is usually easier to minimize an unconstrained cost function than a constrained one.

These are the reasons driving the ideas behind this dissertation. The next chapters will shed some light on new ways to solve these issues. The following items summarize our solutions to the previous problems:
1. In (Chapter 2) a novel MCE algorithm is developed to eliminate the dependence on desired outputs. In addition, it reduces the time complexity to O(K n log(n) + M n^3 + K M n^2), which is not exponential.
2. In (Chapter 3) and (Chapter 4), a new algorithm is developed using a Bayesian hierarchy and a Gibbs sampler with the fuzzy measure constraints embedded into the hierarchy. This algorithm is not limited to the Sugeno \lambda-measure, and it does not depend on a matrix of constraints. This makes the algorithm extremely fast compared to other algorithms dealing with a complete fuzzy measure.
3. In (Chapter 5) the new probabilistic algorithms are extended to deal with the problem of desired outputs for the cases of a single measure and two measures, by simulating the MCE method with logistic regression.

We firmly believe that the new probabilistic algorithms are much better suited to deal with the exponential number of values in a fuzzy measure. The reason is that MCMC methods are designed for high dimensional problems where classical derivative-based methods are difficult to use [2].


Examples of conjugate priors (posterior, likelihood, prior).
Example of a maximization iteration in EM.


Example of a slice sampler for the exponential distribution.
Example of a Laplacian with parameter 10.


Examples of the PDF and CDF of a logistic distribution.
Example of a decision boundary h(x) = 0 in the (x_1, x_2) plane.


Example of the logit transformation.


Example of a Choquet integral under a belief measure.
Example of a Choquet integral under a plausibility measure.


Distribution of the outputs of a classification problem.
More realistic distribution of the outputs of a classification problem.


In this chapter we propose the following dissimilarity measure for the MCE algorithm:

    d_i(f_\omega) = -C_{\mu_i}(f_\omega) + \max_{j\neq i} C_{\mu_j}(f_\omega).    (2-1)

Note that (Equation 2-1) allows for multiple classes. The MCE algorithm requires differentiation. In addition, the function max is differentiable almost everywhere, with a very simple derivative given by (Equation 2-2). The loss functions (Equation 1-51) and (Equation 1-52) are combined in a single loss function for information fusion:

    l_i(f_\omega) = \begin{cases} \dfrac{1}{1+e^{-\xi d_i(f_\omega)}}, & d_i(f_\omega) > 0 \\ 0, & d_i(f_\omega) \leq 0. \end{cases}    (2-3)

For classification, we use a slightly modified version of the loss function (Equation 2-3):

    l_i(f_\omega) = \begin{cases} \dfrac{1}{1+e^{-\xi d_i(f_\omega)}} - \dfrac{1}{2}, & d_i(f_\omega) > 0 \\ 0, & d_i(f_\omega) \leq 0. \end{cases}    (2-4)

These functions have the property that correctly classified samples have zero loss. Thus, only samples that are not correctly classified are taken into consideration for the accumulated change in the optimization. With the loss function (Equation 2-3) and the dissimilarity measure (Equation 2-1), we have the following cost function for n classes:


2{3 ), @gji=X!2C1l1(f!)(1l1(f!))@d1(f!) wheregjirepresentsthejthdensityforithclass.Now,theterm@dk(f!) 2{4 )canbeobtainedinthesameway.Westillneedtondaclosedformforthevalue@Cgi(f!) 1{111 )withrespecttothedensitiesoftheSugeno-measure.Thus,eachpartialderivativeofCg(f!)withrespecttogj Toderive@g(A(i)) 1{107 ): Thepartialderivativeofthislast(Equation 2{8 ),withrespecttoadensitygj,isequalto: @gjg(i)g(A(i+1))+@g(i) 1{108 ). 86


@gj+g(A(i+1))+g(i)@g(A(i+1)) @gj+(1+g(i))@g(A(i+1)) bythemultiplicationruleforderivatives.Inasimilarway,for(i)6=j,wehave: @gjg(i)g(A(i+1))+g(i)@g(A(i+1)) @gj+(1+g(i))@g(A(i+1)) Fromtheselasttwoequations,andthefactthatg(A(n+1))=0,wecanobtain: 1. @g(A(i)) @gj+:::+(1+g(i))@g(A(i+1)) 2. @g(A(i)) @gj+(1+g(i))@g(A(i+1)) 3. @g(A(i)) 4. @g(A(i)) 87


@gj.Dierentiatingbothsidesof(Equation 1{108 )withrespecttogjyields: @gj=nYi=1;i6=j(1+gi)!+:::+@ @gjnXi=1ginYk=1;k6=i(1+gk)!: Fromthisequation,wecangetthefollowing: @gj1nXi=1ginYk=1;k6=i(1+gk)!=nYi=1;i6=j(1+gi)!: Whichcanbereducedto: @gj=Qni=1;i6=j(1+gi) 1Pni=1giQnk=1;k6=i(1+gk):(2{18) Andbecausewecanrewrite+1=Qni=1(1+gi)as1+ @gj=1+ Wehavenallythat: @gj=2+ Thislastequation,togetherwith(Equation 2{10 )and(Equation 2{11 ),wecangetthederivativeoftheChoquetintegralwithrespecttotheSugeno-measurefor6=0. Notethatthederivationof(Equation 1{108 )from(Equation 2{8 )assumesthat6=0,andthattheresultingexpressionfor@ @gjin(Equation 2{20 )isundenedfor=0(SincePni=1gi=1).WecanapplyL'Hopital'sruletoseethatlim!0@ @gj=n.Hence,intheunlikelyeventthat=0duringtrainingonecantake@ @gj=n. 88
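For reference, the Sugeno \lambda-measure quantities that these derivatives manipulate can be computed as in the following sketch. The helper names are ours, and the root bracketing is a simple heuristic rather than the O(n^3) polynomial-root procedure mentioned in the complexity analysis below.

    import numpy as np
    from scipy.optimize import brentq

    def sugeno_lambda(g, eps=1e-10):
        # Non-zero root of  lambda + 1 = prod_i (1 + lambda * g_i),  lambda > -1.
        g = np.asarray(g, dtype=float)
        F = lambda lam: np.prod(1.0 + lam * g) - lam - 1.0
        s = g.sum()
        if abs(s - 1.0) < 1e-12:
            return 0.0                      # densities already sum to one
        if s < 1.0:                         # root lies in (0, infinity)
            hi = 1.0
            while F(hi) < 0.0:
                hi *= 2.0
            return brentq(F, eps, hi)
        return brentq(F, -1.0 + eps, -eps)  # root lies in (-1, 0)

    def sugeno_measure(g, lam, subset):
        # g_lambda(A) from the recursion g(A + {i}) = g(A) + g_i + lam * g(A) * g_i.
        gA = 0.0
        for i in subset:
            gA = gA + g[i] + lam * gA * g[i]
        return gA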


2{7 )-(Equation 2{20 )toobtainagradientdescentalgorithmfortheMCEcostfunction(Equation 2{5 ). Thisnewoptimizationhastheadvantagesthatwehavebeenlookingfor.First,eachclassisrepresentedbyauniquemeasure,andsecond,nodesiredoutputsarenecessarywhatsoever.ThisisagreatadvantagebecausetheChoquetintegralissensitivetodesiredoutputs,andeliminatingthenecessityfordesiredoutputshelpstoavoidguessingwhichdesiredoutputsarecorrect. 134 ].Inaddition,calculatingtherootsfor(Equation 1{108 )hasasymptoticcomplexityO(n3)[ 134 135 ]. Wepresentthepseudo-codeofthegeneralalgorithmwiththeorderofoperationsofthecomputationalcomplexitystepsinparentheses: Setlearningrate. 2. fori=1toM. (a) SortallthesamplesofclassCi,(O(Minlog(n))). 3. endfor 4. Do (a) fork=1toM i. SetrEk=0 CalculatekforeachclassCk(O(n3)).


Calculateforeachdensityg(k)jthepartialderivative: forh=1toMi Calculatedk(f!h)=Cg(k)(k[f!h])+maxj;j6=kCg(j)(j[f!h]),(O(Mn)). B. Calculatelk(f!h),(O(1)). C. Calculateg(k)(A(i))foralli=1;:::;n(O(n)). D. Calculateforalli=1;:::;nthepartialderivativeofg(k)(A(i))withrespectg(k)jforallj=1;:::;n: CalculatethepartialderivativeCg(k)(f!h)withrespecttog(k)jforallj=1;:::;n: Calculateforeachdensityg(k)jthequantity: Foreachg(k)j,setg(k)j=g(k)jDjkh,(O(n)). H. v. endfor (b) endfor 5. whilek(rE1;:::;rEM)Tk>


Thus, the time complexity for a single iteration of the MCE is

    O(K n \log(n) + M n^3 + K M n^2).    (2-22)

Then, assuming H iterations in the main while loop, we obtain the time complexity for the MCE:

    O(K n \log(n) + H M (n^3 + K n^2)).    (2-23)
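To make the structure of the training loop concrete, the following simplified sketch implements the MCE cost of (Equation 2-5) with the loss (2-3) and dissimilarity (2-1), and minimizes it by plain finite-difference descent over the densities (it reuses sugeno_lambda from the earlier sketch). It mirrors the cost being optimized, not the cached sorting and closed-form derivatives of the pseudo-code above, so its complexity is worse; the sharpness value xi=5.0 is arbitrary.

    import numpy as np

    def choquet_sugeno(f, g, lam):
        # Discrete Choquet integral of f under the Sugeno lambda-measure with densities g.
        order = np.argsort(f)[::-1]
        out, gA = 0.0, 0.0
        for i, src in enumerate(order):
            gA = gA + g[src] + lam * gA * g[src]              # g(A_(i))
            nxt = f[order[i + 1]] if i + 1 < len(f) else 0.0
            out += (f[src] - nxt) * gA
        return out

    def mce_cost(G, data, labels, xi=5.0):
        # Sum of losses (Equation 2-3) over misclassified samples; G[k] holds class-k densities.
        lams = [sugeno_lambda(g) for g in G]
        cost = 0.0
        for f, y in zip(data, labels):
            C = [choquet_sugeno(f, G[k], lams[k]) for k in range(len(G))]
            d = -C[y] + max(C[j] for j in range(len(G)) if j != y)   # (Equation 2-1)
            if d > 0:
                cost += 1.0 / (1.0 + np.exp(-xi * d))
        return cost

    def train_mce(G, data, labels, eta=0.05, iters=100, h=1e-4):
        # Coordinate-wise finite-difference descent; densities kept in (0, 1].
        G = [np.asarray(g, dtype=float) for g in G]
        for _ in range(iters):
            for g in G:
                for j in range(len(g)):
                    g0 = g[j]
                    g[j] = g0 + h; up = mce_cost(G, data, labels)
                    g[j] = g0 - h; dn = mce_cost(G, data, labels)
                    g[j] = np.clip(g0 - eta * (up - dn) / (2 * h), 1e-6, 1.0)
        return G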


A problem with the cost function (Equation 2-5) is that it is not a convex function. For example, in dimensionality one the loss l(x) is a logistic sigmoid; its second derivative contains the factor l(x)(1 - l(x)) > 0 for all x \in \mathbb{R} together with the factor (1 - 2 l(x)), which is \leq 0 if x \geq 0 and > 0 if x < 0, so the curvature changes sign (see 1.4). It can be seen that a sum of functions l_i(f_\omega) with d_i : [0,1]^n \to [-1,1] is not going to be a convex or a concave function [136]. Therefore, if we use the gradient descent proposed in (Section 2.1)-(Section 2.2), the most we can hope to obtain is a local minimum. It would then be necessary to propose a global optimization technique to solve this problem more efficiently, but for now this remains an open problem.

2.5.1 Description and Design of the Experiments

The landmine detection problem involved processing Ground Penetrating Radar (GPR) sensor returns. This is well described in the literature, but is briefly specified here. The goal is to discriminate between regions of ground that contain buried landmines and regions of ground that do not. GPR measurements were made at multiple locations, some of which contain landmines and some of which do not.


[112, 137-141]. Each detection algorithm involves a complex sequence of processes including signal processing, feature extraction, and classification. The algorithms produce confidence values as output. The larger the confidence value, the more likely it is that the input sample was acquired over a region of ground containing a landmine.

The data set contained 2422 eight-dimensional samples, each containing one confidence value from each of the eight detection algorithms used in the detection problem. The data set contained 271 mine samples and 2151 non-mine samples.

Three different information fusion algorithms were considered: LSE for general measures, LSE for Sugeno \lambda-measures, and MCE for Sugeno \lambda-measures.

The probability of detection, PD, and the probability of false alarm, PFA, are used as performance measures. They are defined as follows:

    PD(t) = \dfrac{|\{\omega \in \text{Mines} : C_\mu(f_\omega) \geq t\}|}{|\{\omega \in \text{Mines}\}|},    (2-25)
    PFA(t) = \dfrac{|\{\omega \in \text{NonMines} : C_\mu(f_\omega) \geq t\}|}{|\{\omega \in \text{NonMines}\}|},    (2-26)

where |\cdot| denotes set cardinality and t is a detection threshold.

Since gradient descent is sensitive to initialization, we run N-fold cross-validation M times to obtain a realistic estimate of the expected performance (for one experiment, N = 5, M = 20). In addition, since LSE performance depends on the choice of desired outputs and the results are sensitive to this choice, we average over a range of reasonable desired outputs. The following pseudocode depicts the experimental procedure. In this pseudocode, Weights refers to the parameters of the measure to be learned. The function ComputeROC computes (Equation 2-25) and (Equation 2-26) for all t in the range of detections. The values \alpha_1, \alpha_2 represent the desired outputs of the fuzzy integral in the range [0, 1], for mines and non-mines respectively. The value selected for \alpha_1 ranges between 0.5 and 1 for mines, and for \alpha_2 we choose values between 0.0 and \alpha_1 - 0.1. These ranges are swept in the procedure below.


Initialize (a) DataSet=fAigNi=1whereAiTAj=;,ifi6=j. (b) Numberofrepetitionsforexperiment=M; 2. fori=1toMdo: (a) RandomlyinitializeWeights. (b) K=1 (c) forj=1toN i. for1=0.5to1(1,2) A. for2=0.0to1-0.1 B. C. K=K+1; i. 3. ifAlgorithmisLSE(Thisvariesthedesiredoutput) (a) fori=1toM i. forj=1toK-1 A. else (a) fori=1toM i. ifAlgorithmisLSE(Thisvariesthedesiredoutput) (a)


else (a) Beforeexaminingtheresultsfromeachalgorithm,weshowthesensitivityoftheLSEtrainingforthetwomeasuresusedintheexperiments.TheReceiverOperatingCharacteristic(ROC)plotsin(Figure 2-1 )and(Figure 2-1 )showsomeofthevariationsinthePDandFARduetorandominitializationunderthedierentdesiredoutputsinasingleexperiment.WecanseethatdierentdesiredoutputsproducedierentROCcurves.Inaddition,thebestROCcurveisnotobtainedusingidealvalueslikezerofornon-minesandoneformines,butnon-intuitivevaluesof0.8forminesand0.2fornon-minesinthecaseofaSugeno-measure,and0.5forminesand0.1fornon-minesinthecaseofageneralmeasure.(Figure 2-1 )and(Figure 2-1 )showthesensitivityofLSEschemestodesiredoutputsandrandominitialization. 2-1 ),anaverageSugeno-measuretrainedviaLSEiscomparedtoeachindividualdetector.ForPD'srangingfrom80-100%,thetableshowsthePFAachievedbytheChoquetintegralwithrespecttoSugeno-measure,thePFAachievedbyeachdetector,andthereductionofPFAachievedbytheChoquetIntegralwithrespecttoSugeno-measurecomparedtoeachdetector.Thepercentageofreductionrangesbetween0.09%to51.84%.AlthoughaChoquetintegral,withrespecttoaSugeno-measuretrainedwithLSE,performs 95


In(Table 2-2 ),wecompareindividualdetectorsagainstthegeneralmeasuretrainedusingaLSEcostfunction.ItisclearthatgeneralmeasurestrainedusingLSEimproveacertainamountoverSugeno-measurestrainedusingLSE.Thisrangeofimprovementisbetween3.25%and55.60%.However,theChoquetintegral,withrespecttoageneralmeasuretrainedwithLSE,isstillnotbetterthanthebestdetectors(detector6anddetector7).(Table 2-3 )showsthat,incontrasttotheSugeno-measureandthegeneralmeasuretrainedwithLSE,theSugeno-measuretrainedwithMCEis,ingeneralbetter,thanallthetheindividualdetectors,witharangeofimprovementbetween0.44%and65.07%.(Table 2-4 )showstheimprovementofMCEovertheLSE.Therangeofimprovementisbetween11.06%and37.51%,withrespecttotheLSEcostfunctions. ItispossiblefortheSugeno-measureandthegeneralmeasuretrainedwithLSEtobeasgoodastheonetrainedbyMCE.Forthistohappen,itisnecessarytohaveasetofcorrectdesiredoutputs.Itisclearthat,dependingoninitialization,thesedesiredoutputscanchange.ThisisalimitationforgeneralmeasuresandSugeno-measuresunderLSEoptimizations,andofcourse,anadvantageofMCEtraining. TheMCEtrainingwasalsoappliedtotheIrisandBreastCancerdataandcomparedtotheresultsshowninXuetal.[ 114 ](Note,theappendicitisdataisnolongerattheMachineLearningwebsite).TheIrisdataisathreeclassproblem,whereastheBreastCancerdataisatwoclassproblem.AsinXuetal.[ 114 ],ten-foldcrossvalidationwasperformed.Wereporttheaverageerrorratesachievedin(Table 2-5 ).TheaverageerrorrateachievedontheIrisdatawas4%,whereastheaverageerrorrateachievedontheBreastCancerdatawas22.7%,whichcomparesfavorablywiththeresultsinXuetal.[ 114 ]. Thecomputationalcomplexityoftheproposedtrainingalgorithmisnothigh.First,thenumberoffreeparametersisonlyn,whereasthenumberoffreeparametersfora 96


The time complexity for the MCE is O(K n \log(n) + H M(n^3 + K n^2)) (Equation 2-27), where H is the number of iterations in the main loop, K is the total number of training samples, and M is the number of classes. In comparison, Sequential Quadratic Optimization, used to solve quadratic problems under constraints, would finish with an exponential time complexity.
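Before the figures and tables, the PD/PFA pairs of (Equation 2-25) and (Equation 2-26) that they report can be computed as in this small sketch; the "confidence greater than or equal to t" convention follows the statement that larger confidences indicate mines.

    import numpy as np

    def roc_points(conf_mines, conf_nonmines, thresholds=None):
        # PD(t): fraction of mine samples with confidence >= t      (Equation 2-25)
        # PFA(t): fraction of non-mine samples with confidence >= t (Equation 2-26)
        conf_mines = np.asarray(conf_mines)
        conf_nonmines = np.asarray(conf_nonmines)
        if thresholds is None:
            thresholds = np.sort(np.concatenate([conf_mines, conf_nonmines]))
        pd = np.array([(conf_mines >= t).mean() for t in thresholds])
        pfa = np.array([(conf_nonmines >= t).mean() for t in thresholds])
        return pd, pfa

Sweeping t over all observed confidences traces one ROC curve; the tables in this section report the PFA achieved at fixed PD levels between 80% and 100%.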


ExamplesofsensitivitytodesiredoutputsforSugenomeasurewhere1and2representthedesiredoutputsforminesandnon-minesrespectively. Examplesofsensitivitytodesiredoutputsforgeneralmeasureswhere1and2representthedesiredoutputsforminesandnon-minesrespectively. 98


ComparisonofPFAforSugeno-measuretrainedwithLSEagainstdierentdetectors. 100.0098.2898.370.09%95.40-3.02%95.07-3.37%98.0032.7957.6543.11%68.1151.85%66.9951.05%96.0024.2130.0319.38%45.2346.47%40.3139.93%94.0018.5124.6925.01%31.0640.39%28.3634.72%92.0013.5619.1529.20%15.4812.41%18.8327.98%90.0010.3216.3236.77%13.6724.51%14.6429.54%88.008.7014.2739.02%12.1328.27%12.1828.54%86.007.1612.1341.02%10.0428.73%11.1135.59%84.005.9610.1841.44%9.0734.23%9.3436.19%82.005.438.3234.79%7.6228.82%7.8130.52%80.004.797.3935.20%7.1633.10%6.6527.95% Continued. 100.0098.2889.31-10.05%76.99-27.66%77.03-27.58%98.0032.7957.4642.93%49.0933.20%35.246.94%96.0024.2135.8032.36%31.9424.19%22.97-5.43%94.0018.5127.2432.05%24.3123.86%13.16-40.70%92.0013.5616.3617.14%15.5312.67%10.93-24.12%90.0010.3212.1314.96%12.6918.70%9.44-9.34%88.008.709.7210.42%10.5117.16%6.79-28.23%86.007.168.6917.68%8.3714.48%5.25-36.23%84.005.968.0025.43%7.7222.74%5.21-14.51%82.005.436.6518.37%6.3714.80%4.60-17.91%80.004.796.0020.13%5.167.18%4.60-4.07% Continued. 100.0098.2881.78-20.18%94.24-4.29%98.0032.7938.7315.32%62.2047.28%96.0024.2124.08-0.54%32.0324.41%94.0018.5118.831.68%26.6930.63%92.0013.5613.580.11%18.4626.53%90.0010.3210.885.15%15.2032.12%88.008.708.32-4.59%13.4435.22%86.007.166.69-6.90%12.2741.69%84.005.965.63-6.00%10.5543.50%82.005.435.30-2.39%9.6743.88%80.004.794.14-15.76%9.2548.23%


ComparisonofPFAforgeneralmeasuretrainedwithLSEagainstdierentdetectorsatdierentthresholds. 100.0091.1798.377.33%95.404.43%95.074.11%98.0030.2457.6547.55%68.1155.61%66.9954.87%96.0021.2830.0329.13%45.2352.95%40.3147.19%94.0015.5424.6937.06%31.0649.97%28.3645.21%92.0012.9119.1532.61%15.4816.62%18.8331.45%90.0010.0416.3238.45%13.6726.52%14.6431.41%88.008.4314.2740.93%12.1330.52%12.1830.78%86.007.1512.1341.04%10.0428.76%11.1135.61%84.006.0610.1840.48%9.0733.15%9.3435.15%82.005.438.3234.79%7.6228.82%7.8130.52%80.004.947.3933.22%7.1631.05%6.6525.75% Continued. 100.0091.1789.31-2.08%76.99-18.42%77.03-18.35%98.0030.2457.4647.38%49.0938.41%35.2414.20%96.0021.2835.8040.54%31.9433.36%22.977.32%94.0015.5427.2442.97%24.3136.10%13.16-18.10%92.0012.9116.3621.12%15.5316.87%10.93-18.15%90.0010.0412.1317.22%12.6920.86%9.44-6.43%88.008.439.7213.23%10.5119.75%6.79-24.22%86.007.158.6917.71%8.3714.51%5.25-36.18%84.006.068.0024.21%7.7221.47%5.21-16.39%82.005.436.6518.37%6.3714.79%4.60-17.91%80.004.946.0017.69%5.164.34%4.60-7.25% Continued. 100.0091.1781.78-11.48%94.243.26%98.0030.2438.7321.92%62.2051.39%96.0021.2824.0811.62%32.0333.55%94.0015.5418.8317.48%26.6941.77%92.0012.9113.584.92%18.4630.06%90.0010.0410.887.67%15.2033.93%88.008.438.32-1.32%13.4437.25%86.007.156.69-6.87%12.2741.71%84.006.065.63-7.73%10.5542.57%82.005.435.30-2.40%9.6743.88%80.004.944.14-19.30%9.2546.64%


ComparisonofPFAinMCEagainstdierentdetectorsatdierentthresholds. 100.0094.6598.373.78%95.400.78%95.070.44%98.0026.8957.6553.35%68.1160.52%66.9959.86%96.0015.8030.0347.40%45.2365.08%40.3160.81%94.0011.0724.6955.14%31.0664.34%28.3660.95%92.008.0719.1557.89%15.4847.90%18.8357.16%90.006.6516.3259.27%13.6751.38%14.6454.62%88.005.4514.2761.81%12.1355.08%12.1855.25%86.004.9612.1359.10%10.0450.58%11.1155.33%84.004.3810.1857.01%9.0751.72%9.3453.16%82.003.938.3252.82%7.6248.51%7.8149.73%80.003.617.3951.10%7.1649.51%6.6545.63% Continued. 100.0094.6589.31-5.99%76.99-22.95%77.03-22.87%98.0026.8957.4653.20%49.0945.23%35.2423.69%96.0015.8035.8055.87%31.9450.54%22.9731.21%94.0011.0727.2459.35%24.3154.46%13.1615.83%92.008.0716.3650.71%15.5348.05%10.9326.17%90.006.6512.1345.23%12.6947.64%9.4429.58%88.005.459.7243.90%10.5148.12%6.7919.69%86.004.968.6942.91%8.3740.69%5.255.53%84.004.388.0045.26%7.7243.28%5.2115.94%82.003.936.6540.94%6.3738.36%4.6014.70%80.003.616.0039.73%5.1629.95%4.6021.46% Continued. 100.0094.6581.78-15.75%94.24-0.44%98.0026.8938.7330.56%62.2056.77%96.0015.8024.0834.40%32.0350.68%94.0011.0718.8341.19%26.6958.50%92.008.0713.5840.58%18.4656.30%90.006.6510.8838.91%15.2056.28%88.005.458.3234.50%13.4459.43%86.004.966.6925.87%12.2759.56%84.004.385.6322.19%10.5558.52%82.003.935.3025.92%9.6759.40%80.003.614.1412.64%9.2560.93%


MeanMCEPFAagainstmeangeneralandSugenoPFA. PDPFAMCEPFAgeneralReductionPFASugenoReductionmeasure 100.0094.6591.17-3.82%98.283.69%98.0026.8930.2411.07%32.7918.01%96.0015.8021.2825.78%24.2134.76%94.0011.0715.5428.73%18.5140.18%92.008.0712.9137.51%13.5640.52%90.006.6510.0433.83%10.3235.59%88.005.458.4335.35%8.7037.37%86.004.967.1530.63%7.1630.65%84.004.386.0627.77%5.9626.59%82.003.935.4327.66%5.4327.65%80.003.614.9426.77%4.7924.54% Table2-5: ComparisonofMCESugenoagainstseveralotherclassiersforirisdataandbreastcancerdata. MethodIrisBreastdata(%)cancer(%) Linear2.0029.00Quadratic2.7034.40Nearest4.0034.00neighborBayes6.7028.20independentBayes16.0034.40quadraticNeuronalnet3.3028.50PVMrule4.0022.90QUAD3.3031.50CLMS4.0027.10HLMS4.7022.60WCIPP4.0026.20MCE4.0022.73 102


A limitation of the sparsity promotion model (1-33) is that it does not constrain the elements \mu_i to satisfy the required properties of fuzzy measures (Definition 1.24):
1. \mu(\emptyset) = 0 and \mu(X) = 1.
2. Given A, B \in 2^X, if A \subseteq B then \mu(A) \leq \mu(B).

If |X| = k, then we seek to estimate the 2^k - 2 parameters \mu(A), A \subset X. To simplify our notation and treat a fuzzy measure as a parameter vector, we order the subsets of 2^X into the succession \{A_1, A_2, \dots, A_{2^k-1}\} and write \mu_j = \mu(A_j). Now, we will impose these relations using the following strategy based on the Gibbs sampler. Given the model for sparsity promotion in (1-24), and taking j = 1, \dots, m = 2^k - 2, we can try to calculate the joint probability of the noise variance, the measure parameters \mu_1, \dots, \mu_m, and their hyper-parameters \tau_1, \dots, \tau_m using the following strategy: consider the power set lattice of X = \{a, b, c\} shown in (Figure 3-1). Starting at the top (i.e., X) and moving downward to the singleton elements, it is easy to observe that the value of a fuzzy measure on a particular set, for example \mu(\{a\}), depends on the previous values for the sets that contain \{a\}, i.e., \mu(\{a,b\}), \mu(\{a,c\}) and \mu(\{a,b,c\}). In fact, we must have \mu(\{a\}) \leq \mu(D) for all D such that \{a\} \subseteq D. Hence, in the general case, we can constrain the value of each measure parameter \mu_j by sampling from a distribution on the interval [0, \min\{\mu(A_s) \mid A_j \subset A_s\}]. One possible modification of the hierarchical models from (Section 1.3.1) is the following one:


(3{1) where: andKjisanintegrationconstanttomakeN(0;j)I[0;minf(As)jAjAsg](j)aPDF.Inaddition,KjN(0;j)I[0;minf(As)jAjAsg](j)denotesthetruncateddistributionfromwhichthesamplesofjaresampled.Inthismodel,therowvectorsHiarecomposedofzerosandthedierencesf(x(i))f(x(i1))correspondtothecorrectpositionsofthevectorT=(1;2;:::;m)t.Inthisway,theoutputoftheChoquetintegralcanbewrittenasaninnerproduct: asshownin(Equation 1{112 )oftheliteraturereview.WerefertomodelsofthistypeastheMonotonicityConstrainedModels(MCM). 2 ]. TheGibbssamplerthatwedevisedtosolvetheMCMmodeliscalledLSESparsity.ThisnamereectsthefactthatGaussiandistributionsoverthedatarepresentanLSE 104


AnotheradvantageofusingaGibbssampleristhatwecanuseproportionalfunctionstodothesampling(i.e.wedonotneedtoknowthevalueofeachconstantofintegrationKj). Now,wearereadytodescribetheGibbssamplerfortheMCMmodel.Givenaninitialpoint((0);2(0);(0)1;:::;(0)m;(0)1;:::;(0)m),andassumingthatthesamplesy1;:::;ynareidenticallyindependent,theGibbssamplerlookslike: Given(t)T=((t)1;:::;(t)m)T,generate: (3{4) 105


Given(t)T=((t)1;:::;(t)m)T,generate: 4. Thenalvalueforeachmeasureparameterjisthemeanofthecollectionofsamplesn(1)j;(2)j;:::;(K)jogeneratedbythepreviousalgorithm. Forthismodel,weonlyneedtodeviseamethodfortheposteriorsamplingofthej's,andselecttheappropriatenon-informativepriorsforand2. Somethingthatweneedtostressistheproblemofusingnon-informativepriorsfortherandomvariablesand2.Thisarisesfromthefactthatnon-informativepriorsarenotrealdistributions[ 1 2 ].So,aposteriorgeneratedbynon-informativepriorscannotbearealprobability. 1{112 ).Then,wecanseefromthemodelin(Section 3.2 )fortheGibbssamplerthat: 106
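Two ingredients recur in every sweep of this sampler: the monotonicity bound for \mu_j and a draw from its truncated Gaussian full conditional. A minimal sketch follows, assuming supersets[j] holds the indices of the proper supersets of A_j and that post_mean and post_std come from the univariate Gaussian full conditional derived below; the truncated draw uses scipy's truncnorm in place of the uniform approximation discussed later.

    import numpy as np
    from scipy.stats import truncnorm

    def upper_bound(j, mu, supersets):
        # min{ mu(A_s) : A_j is a proper subset of A_s }; 1.0 when A_j = X.
        return min((mu[s] for s in supersets[j]), default=1.0)

    def sample_mu_j(post_mean, post_std, ub, rng=None):
        # One Gibbs draw of mu_j from its Gaussian full conditional, truncated to [0, ub].
        a = (0.0 - post_mean) / post_std
        b = (ub - post_mean) / post_std
        return truncnorm.rvs(a, b, loc=post_mean, scale=post_std, random_state=rng)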


3{6 ),I[0;minf(As)jAjAsg](j),truncatesthedistributionp(y1;:::;ynjH;2I)p(jjj)intotheinterval[0;minf(As)jAjAsg].Considerforamomentthedistributionp(y1;:::;ynjH;2I)p(jjj).Since,itisassumedthaty1;:::;ynaremutuallyindependent,itcanbeproventhatthislastequationisanunivariateGaussiandistributionwithrespecttoj.Thus,tosamplefromthisdistribution,weonlyneedtodeviseaclosedformexpressionfortheGaussianandtruncateit.Now,denotethemeanandvarianceofthisunivariateGaussianbyand2.Then,wecanwritethecompleteexpressionofp(jj;2)as: (2)n 2exp(1 22nXi=1(yiHi)2)1 2j2j; whereHiistheithrowinthedesignmatrix: andjisthejthpositioninthevector.Rearrangingthetermsinp(jj;2)leadsto: 22nXi=1((yiHjij)Hijj)21 2j2j);(3{9) whereHji=(H1i;:::;Hj1i;Hj+1i;:::;Hmi)andj=(1;:::;j1;j+1;:::;m)T.(Equation 3{9 )canberearrangedas: 107


Thus,wesimplifythistobeequalto: Wehavenallythat: Therefore,wehavethefollowing,aftercompletingthesquaresandremovingunnecessaryterms: Then,wehavethat: 108


This is illustrated in (Figure 3-2). It is clear that we have a higher probability of sampling near the mean than away from it. If the standard deviation decreases, the probability of sampling values near the mean is higher. Now, if we are trying to sample from an interval that is far away from the mean with respect to the standard deviation, the curve of the Gaussian function tends to be flat. Thus, we can assume that we are sampling from a uniform distribution on the interval of interest. Although this looks like an inefficient solution, our experiments will show that this simple approach works relatively well, but improvements can be made. We need to stress that this is not a theoretically correct solution. The correct solution is based on the following truncated distribution:

    p(\mu_j) = \dfrac{\exp\left(-(\mu_j - \mu^*)^2 / (2\sigma^{*2})\right)}{\int_a^b \exp\left(-(\eta - \mu^*)^2 / (2\sigma^{*2})\right)\, d\eta}, \quad \mu_j \in [a, b],    (3-16)

where \mu^* and \sigma^{*2} are the mean and variance of the univariate Gaussian derived above and [a, b] = [0, \min\{\mu(A_s) \mid A_j \subset A_s\}]. This is the reason why we are looking for more efficient ways to circumvent this problem.

A further issue with the model in (Section 3.1)-(Section 3.2) is that we are trying to promote sparsity using Gaussian distributions. This is clear if we look at the first


22nXi=1(yiHi)2)exp1 2j2j: Itisclearthattherstexponentialrepresentstheminimizationtermfortheclassication,andthesecondexponentialrepresentsthesparsitypromotingterm.Whenwecometounderstandthis,werealizethatgivenj2[0;1],weshallbeusingabettersparsitypromotingdistribution.Figueiredoandothers[ 40 ]suggestedthattheLaplacianisamoresuitablepriorforsparsitypromotionthanaGaussiandistribution,inourcasetheexponentialdistribution,topromotemoresparsityinourmodel.Thus,thenewmodellookslike: (3{19) Thus,thenewmodeleliminatestheintermediate,anddirectlyusesanexponentialdistributionforthemeasurestobelearned.NowtheGibbssamplerlookslike: Given(t)T=((t)1;:::;(t)m)T,generate:


(3{20) 3. Inthisnewmodel,thejwillbesampledfromthedistribution: Anotherpossibleimprovementforthisnewmodelcouldbeassumingnotasingleasarateofsparsityforallmeasurevalues,butoneforeachmeasurevaluetobelearned.Inaddition,wewouldliketobeabletohavefeedbackfromthesparsityratiotothese'stoincreasethelevelofsparsityifitisnecessary.Inordertodothis,weneedtoincludeanextradistributionforthe's,whichcannotbeanon-informativeprior.Zare[ 142 ]proposedmethodsforsolvingtheseproblems. ThisestimatestheaccuracyaftereachiterationoftheGibbssampler.Thisnewmodicationimprovestheclassicationcapabilitiesofthealgorithmbecausewhen 111


Now, for the sparsity promotion term, we use a variation of the idea proposed by Zare [142]. For each term \mu_j, we use a separate rate \lambda_j defined by (Equation 3-23), so that the amount of sparsity can adapt to each measure parameter.

As we have seen in our experiments using the MCE method (Chapter 2), the Shapley indices tend to have a large standard deviation over an n-fold cross validation. This makes it difficult to analyze which algorithms are important for the fusion.

Can we say that the introduction of a sparsity promoting distribution will reduce the standard deviation of the Shapley index in an n-fold cross validation? If this is the case, the introduction of these PDFs allows us to improve our analysis of the importance of each algorithm in the fusion.
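The Shapley indices referred to here (and reported in the tables of this chapter) can be computed directly from a learned measure. A sketch with our own dictionary layout follows, where mu maps frozensets to measure values and mu[frozenset()] = 0 is included.

    from itertools import combinations
    from math import factorial

    def shapley(mu, X):
        # Shapley importance of each information source under the fuzzy measure mu.
        n = len(X)
        phi = {}
        for i in X:
            rest = [x for x in X if x != i]
            total = 0.0
            for r in range(n):
                w = factorial(n - r - 1) * factorial(r) / factorial(n)
                for A in combinations(rest, r):
                    A = frozenset(A)
                    total += w * (mu[A | {i}] - mu[A])
            phi[i] = total
        return phi        # the values sum to mu(X) = 1

A simple sanity check is that the returned indices sum to one, which is what makes the per-feature comparisons in Tables 3-3 and 3-6 meaningful.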


130 ]Inthismethodaquadraticobjectivefunctionunderconstraintsisdenedtoobtaintheoptimalmeasures.Thenumbersofconstraintsandparametersareexponentialfunctionsofthedimensionalityoftheinputandtheyarestoredinamatrix.Therefore,solvingthequadraticprogrammingquicklybecomesintractable.AlthoughtheGibbssamplerhasasimilarlimitation-theexponentialnumberofparameterstobelearned-theGibbssamplerdoesnotneedtostoreasparsematrixofconstraints.Insteaditusesacleverwayofimposingtheconstraintswithinthealgorithm,whichdecreasesthecomplexityincalculatingthesolution. AnextrapropertyisthefactthatsamplesgeneratedbyGibbssamplersareattractedtoregionsofhighprobability.Thisovercomesoneoftheproblemsindeterministicoptimizationwhendealingwithnon-convexfunctionsthatconvergetolocalminima.ThismeansthataGibbssamplerlooksattheglobalpropertiesofaprobabilitywhilegradientbasedtechniqueslookatlocalpropertiesofafunction.Therefore,aGibbssamplerisaglobalstochasticoptimizer. Ithasbeenseenempirically,usingthesamehardwareforbothmethods,thattheGibbssamplerisfasterthanthequadraticprogramming,butacompleterateofcovergenceandcomplexityanalysisneedstobeperformed.Fornow,wewillleavethistofutureresearch. LSESparsitysharesonedrawbackwithquadraticprogramming,theuseofdesiredoutputs.Thismakesitnecessarytodevelopnewmethodsthatdonotdependondesiredoutputs. 113


3-3 ),wecanseetheseparationbetweentheclasses. Theconfusionmatrix(Table 3-1 )showsthatthealgorithmisabletoseparatetheclasses.Thefuzzymeasurevaluesarein(Table 3-2 )andtheShapleyvaluesarein(Table 3-3 ).NotethattheShapleyindicesoftheinformativefeatures(features1and3)areapproximately20timeslargerthanthoseofthenon-informativefeatures(features2and4). 3-4 ).Themeasurevaluesareinthe(Table 3-5 )andtheShapleyvaluesarein(Table 3-6 ). 114


Table 3-1: Confusion matrix for artificial data set case I in Gibbs sampler for LSE sparsity.
CM        class 1   class 2
class 1   1000      0
class 2   0         1000

Table 3-2: Measures for artificial data set case I with mean and standard deviation of the Markov chains in Gibbs sampler for LSE sparsity (columns: Measures, Mean, Standard deviation).

Table 3-3: Shapley values for the features in artificial data set case I in Gibbs sampler for LSE sparsity.
Feature   Shapley value
1         0.462290
2         0.024708
3         0.487550
4         0.025449

Table 3-4: Confusion matrix for artificial data set case II in Gibbs sampler for LSE sparsity.
CM        class 1   class 2
class 1   998       2
class 2   0         1000


Table 3-5: Measures for artificial data set case II with the mean and standard deviation of the Markov chains in Gibbs sampler for LSE sparsity (columns: Measures, Mean, Standard deviation).

Table 3-6: Shapley values for the features in artificial data set case II in Gibbs sampler for LSE sparsity.
Feature   Shapley value
1         0.729930
2         0.029855
3         0.204660
4         0.035554


Exampleofalatticewherethearrowsrepresentthesubsetrelationonit. 117


ThisplotshowstheideathatsamplinginanintervalfarawayfromthemeanofaGaussianwithsmallvarianceissimilartosamplingfromauniformintheinterval. 118


Plotofsamplesforclass1`o'andclass2`+'fortherstthreefeaturesincaseI. 119


Inthischapter,wedevelopanalternativemodelforLSESparsitywhichismoresimilartoFigueiredo'soriginalmodelthantheonein(Chapter 3 ).ThisnewmodelhasbeenproposedasawaytoxtheproblemofusingaGaussiandistributiontosimulateanembeddedLaplaciandistributionforeachjin(Model 3{19 ). 3 )topromotesparsitywhenusingtheGibbssampler(Model 3{4 ).InthisGibbssampler,jisbeingsampledfrom: whichisthelikelihoodof,giventhedataL(jy1;:::;yn)multipliedbyapriorjN(0;j).In(Model 1{33 ),FigueiredousedEMtomaximizetheaposterioriprobabilityforand2[ 40 ]: whichembedsaLaplaciandistributionforinthenalsolution.ThisisdierentfromtheGibbssampler(Model 3{20 )usedtosolve(Model 3{19 ).InthiscasetheGibbssamplertriestomaximizethelikelihoodthat: Therefore,ifwewanttohaveanembeddedLaplaciandistributionasinFigueiredo'ssolution,itisnecessarytodevelopaversionoftheMaximumAPosteriori(MAP)EMfortheGibbssamplerandtheChoquetintegral.Forthis,weuseamodiedversionoftheGibbssamplerovermissingdataasexplainedbyRobertandCasella[ 2 ]. 120
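The hidden-data construction used in this chapter rests on the classical fact that a Laplacian is a scale mixture of Gaussians (Equation 1-34). The following short numerical check uses the parameterization p(\theta) = (\lambda/2)\exp(-\lambda|\theta|) with an arbitrary \lambda = 10 (the dissertation's symbols may differ): \tau is drawn from an exponential distribution with rate \lambda^2/2 and \theta \mid \tau from a zero-mean Gaussian, and two moments of \theta are compared with those of the Laplacian.

    import numpy as np

    rng = np.random.default_rng(0)
    lam = 10.0                                   # Laplacian parameter (arbitrary here)
    n = 200_000

    # Hidden data: tau ~ Exponential with rate lam^2 / 2 ...
    tau = rng.exponential(scale=2.0 / lam**2, size=n)
    # ... then theta | tau ~ N(0, tau).
    theta = rng.normal(0.0, np.sqrt(tau))

    # Marginally, theta is Laplacian(lam): check E|theta| = 1/lam and Var = 2/lam^2.
    print(np.mean(np.abs(theta)), 1.0 / lam)
    print(np.var(theta), 2.0 / lam**2)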


2 ],considerthefollowingtwoBayesianhierarchies.First,onewithanincomplete-datalikelihood: 121


4{7 ),(Equation 4{6 )andassumingthatthej'sareindependentidenticallydistributed(idd),wecandenethedensityforthehiddendataj: Notethatbythewell-knownformula(Equation 1{34 )and(Model 4{5 ),p(jjj)isgivenby Thisdistributionissimply: Inthislastexpression,weknowthatj0.Therefore,wehavethatjhasdistribution: whereCjisaconstantofintegration.Wecangoonestepfurtherbymovingp Thiscanbeconvertedintoagammadistributionifwetakezj=1 withk=1 2and=2 j;(4{14) with=1 2and=2 122


Finally,itisknownthatjneedstobesampledintheinterval[0;minf(As)jAjAsg].Therefore,wehavethat: ThisallowsustodenethefollowingGibbssamplerfortheMAP-EM Given(t)T=((t)1;:::;(t)m)T,generate: Given(t+1)T=(t+1)1=1


Update2(t+1)=1 4{17 )withtheonedevelopedbyFigueiredo[ 40 ].Thisisduetoanexponentialtermfortheprobabilitydistributionfunctions(Equation 4{12 ),2j Therefore: 2whent!1;(4{19) whichisnotthedesiredsolutionforpromotingsparsityintheparametervaluesof.Weproposethefollowingmodicationtoovercomethisproblem. 40 ]thelogarithmoftheposterioris: logp(;2jy;)/nlog2kyHk22 whereT()representstheregularizationtermPj2jontheRidgeregression.Thismeansthat(Equation 4{20 )issimplyaRidgeregression,which,asweknow,doesnotpromotesparsityasmuchas(Equation 1{32 ).However,iftheexpectationistakenforthe 124


Thisallowsonetowriteeachofthetermsintheregularizationequationas: ThislastequationrepresentstheLaplaciandistribution.Therefore,itisclearthatinFigueiredo'sMAP-EM,theexpectationconvertstheRidgeregressionintoTibshirani'ssparsitypromotiondistribution(Equation 1{32 ),whichismaximizedbythemaximizationstep. ThesearetheresonswhywedecidetouseavariationoftheclassicalEMGibbs(Section 1.2.4 ),theslicesampler(Section 1.2.6 )and(Equation 4{15 )toovercometheproblempointedoutin(Section 4.2 ). Considerthenthecompletelikelihoodofthedata(Model 4{18 ): Ifweassumethatgivenitisnotnecessarytoknow,wecanwrite(Equation 4{23 )as: Now,from(Model 4{5 )and(Equation 1{34 ),weknowthatanddependon.Therefore,wecanmakethefollowingassumptions 125


Inasimilarway,wehavetheincompletedatalikelihoodL(;2jy)=p(yj;2). Wethenhavethefollowingdistributionfor: Duetothefactthatweareusingslicesamplers[ 2 ]tosamplethisdistribution,itispossibletoremovetheconstantofintegrationp(j),rememberthatandaregiven.Then,wehave: Ithasbeenshown(Equation 1{34 )thatintegratingtauin(Equation 4{29 )producesthemarginalprobabilityofwhichisaLaplaciandistributionwithparameter. Givenax,itispossibletouseMarkovsamplingtoestimatethemarginalvalueforofp(;j)[ 143 144 ]byobservingthat: Therefore,wehavethefollowingestimationforthemarginalofj: whichconvergesto[ 145 ]: 1 Intuitively,thisistellingusthatifwesampleandthensequentiallytoinnity,wehavethat: 126


4{17 )asfollows: Given(t)T=((t)1;:::;(t)m)T,(t)T=((t)1;:::;(t)m)Tand(t)T=((t)1;:::;(t)m)Tgenerate: Given(t+1)T=(t+1)1;:::;(t+1)m1T,generate: Update(t+1)j=1 4. Update2(t+1)=1 127


InadditiontothenormL2(Euclideandistance)forlearningfunctions,itispossibletousedierenttypesoflearningfunctions.ForexamplethenormL1(Absolutevalue),thelogisticfunctionorGaussianregressionfunction.In(Chapter 5 ),wewillexploretheuseofthelogisticregressionforimplementingtheMCEcriterion. 3 ),wehavedecidedtousetheslicesampler(Section 1{23 ).ForMAP-EMSparsity,itisnecessarytodeveloptwoslicesamplers,oneforthejandtheotherforthej's.First,given(t)T=((t)1;:::;(t)m)T,(t)T=((t)1;:::;(t)m)Tand(t)T=((t)1;:::;(t)m)T,wesample: Forallj=1;:::;m1.Weneedtogeneratethefollowingsets: 128


0j2 0j1 Therstinequality(Equation 4{41 )isduetothefactthat: exp(((t)j)2 Wethenusethesevaluestogeneratethefollowingintervalsforeachj: Thenwesampleuniformlyeach(t+1)1fromtheinterval: Nowthatwehavetheupdated(t+1)1;;(t+1)m,wecansample: foralli=1;;n.Inadditionwesample: and 129


whereHjiistherowvectorHiwithoutthejthpositionHij. Inadditiontotheseintervals,wehaveforrjandrU: and [0;minf(As)jAjAsg]:(4{54) Then,wehavethefollowingset: 1. ItnaturallygeneratestheLaplaciandistributionforeachjbysamplingj,insteadofusingacomplicatedformulationfortheexpectationof1 2. ItusesaprobabilisticnumericalsolutionfortheBayesianhierarchy,whichcanbeconsideredastochasticglobaloptimizer.Figueiredo'ssolutionobtainsafunctionthatismaximizedbyusingderivativeswhichcouldorcouldnotndaglobaloptima. 130


ThenewMAP-EMSparsitycanhandlethemonotonicitypropertyofthefuzzymeasures.Incomparison,Figueiredo'ssolutionneedstobeheavilymodiedtobeabletohandletheserelationsincreasingitscomplexity. Inaddition,ThenewMAP-EMSparsitysolvessomeoftheproblemsthancanbefoundinLSESparsity: 1. Thelackofacorrectdistributionforj. 2. TheproblemoftryingtosimulateanembeddedSparsitypromotingdistributionbyusingaGaussiandistribution. 3. Theuseofanaivesamplerforthetruncateddistributionofj. MAP-EMGibbssolvestheseproblembymaximizingaposteriordistributionandusingslicesamplersforsamplingthetruncateddistribution. AlthoughthisisanimprovementoverLSESparsity,MAP-EMGibbsstilldependsondesiredoutputs.Forthisreason,in(Chapter 5 ),wesolvethisproblemusinglogisticregressionandaconvenientBayesianhierarchicalmodel. 131
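Since every full conditional in this chapter is sampled on a bounded interval, a compact way to realize these updates is the shrinkage form of the slice sampler. The generic sketch below is not the exact interval-intersection scheme derived above, but it targets the same kind of bounded, unnormalized full conditional; logp would be, for example, the log of the truncated Gaussian-times-exponential conditional of \mu_j with [lo, hi] = [0, \min\{\mu(A_s)\}].

    import numpy as np

    def slice_sample(logp, x0, lo, hi, rng, n_steps=1):
        # Shrinkage slice sampler on the bounded interval [lo, hi].
        # logp is the log of the (unnormalized) full conditional.
        x = x0
        for _ in range(n_steps):
            log_u = logp(x) + np.log(rng.uniform())   # vertical level of the slice
            L, R = lo, hi
            while True:
                prop = rng.uniform(L, R)              # horizontal proposal
                if logp(prop) > log_u:
                    x = prop
                    break
                if prop < x:                          # shrink the bracket toward x
                    L = prop
                else:
                    R = prop
        return x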


4-1 ),wecanseetheseparationbetweentheclasses. Theconfusionmatrixisin(Table 4-1 ),thefuzzymeasurevaluesarein(Table 4-2 )andtheShapleyvaluesarein(Table 4-3 ).NotethattheShapleyindicesoftheinformativefeatures(features1and3)areapproximately20timeslargerthanthoseofthenon-informativefeatures(features2and4).ThesampleoutputsoftheChoquetintegralusingthelearnedmeasurecanbeseenin(Figure 4-2 ).Ifwecomparethemwiththeoutputs(Figure 4-5 )oftheMCEunderSugenomeasures(Chapter 2 ),itisnoticeablethatMAP-EMisbetteratseparatingtheclasses.However,ifwelookattheROCcurves,(Figure 4-6 )and(Figure 4-7 ),thetwoalgorithmslooksimilar. Thetracesofthemeasurescanbeseenin(Figure 4-3 ),whicharethevaluesthattheparameterstakeateachiterationwhensampling.Finally,thesampledistributionsofeachparameterarein(Figure 4-4 ).Thenon-importantparametershavesampledistributionsthatlooklikeexponentialdistributions.Inaddition,theimportantparametershavesampledistributionsdierentfromtheexponential. 4-8 ).Therestofthefeaturescontainuniformnoise. Theconfusionmatrixisshownin(Table 4-4 ).Someofthemeasurevaluesarein(Table 4-5 ).TheShapleyvaluesarein(Table 4-6 ).Thesevaluesshowthattheimportantfeaturesarethesecondandthefth. Theoutputsin(Figure 4-9 )andtheconfusionmatrix(Table 4-4 )showthattheMAP-EMSparsityhasdicultyseparatingthesamplesinthisproblem. 132


5 ),producesabetterseparationoftheclasses.Thiscanbeseenin(Table 4-7 ),(Table 4-8 )and(Figure 4-10 ). ThismeansthattheMAP-EMSparsityishighlysensitivetodesiredoutputsbecauseitslearningfunctionisLSEunderaprobabilisticinterpretation(Section 2.5 ).However,whenwecomparewiththeoutputoftheMCEunderSugenomeasuresin(Figure 4-11 )andtheROCcurves(Figure 4-12 ),itisclearthattheMAP-EMseparatestheclassesbetter.Inaddition,interpretingthedensitiesintheMCEunderSugenomeasures(Table 4-9 )isdicult,whencomparingwiththeShapleyindexin(Table 4-8 ). Finally,examplesoftraces(Figure 4-15 )anddistributions(Figure 4-16 )showthatMAP-EMSparsity,atleastforthiscase,isableofproducingdistributionswheretheunimportantparametershaveexponentialdistributions,andtheimportantparametershaveanon-exponentialdistribution. 1.6.4 ).Inthisvariation,twoclasseswithsixfeaturesamplesx=(x1;x2;x3;x4;x5;x6)Twhicharedenedasfollows: 1. ThefthfeatureissampledfromaBernoullidistributionwithparameter1 2. 2. Therstfourfeaturesobeythefollowinglogicalequations: (x1_x2)^(x3_x4)=1ifx2class1; (x1_x2)^(x3_x4)=0ifx2class2; 3. ThesixthfeatureissampledfromaBernoullidistributionwithparameterp=3 4ifx2class1,andp=1 4ifx2class2. Inthiscase,wedecidenottoshowthemeasureparametervalues,buttheShapleyindexes(Table 4-11 ).Fromthesevalues,itisclearthatthetherstfourfeaturesarethemostimportantones.Thesevaluesarecomparabletotheimportanceindexesdevelopedin[ 37 ]. 133


4-10 ),theoutputsoftheChoquetintegralarein(Figure 4-13 ),andtheROCcurveisin(Figure 4-14 ).FromtheROCcurveswecansaythatMAP-EMisbetterthanMCEunderSugeno.Forexample,at90%PDMCEhasaPFAof10.1%whenMAP-EMhas0%. 1. DependingontheproblemtheMAP-EMSparsityalgorithmmaygeneratesparsedistributions(exponentialdistributions)forthenon-importantparameters. 2. ItiseasiertointerpretthemeasureparametervaluesfromtheMAP-EMthantheonesbyMCEunderSugenomeasures. 3. TheMAP-EMSparsityalgorithmisabletoidentifywhichcombinationoffeaturesareimportant. 4. TheMAP-EMSparsityalgorithmissensitivetodesiredoutputs. 5. TheMAP-EMalgorithmisbetteratthetaskofclassicationthanMCEunderSugenomeasures.ThissuggestthatMAP-EMmaybebetteratthefusiontaskthantheMCEunderSugenomeasure. 134


2.5 ).Forhistoricalreasons,dierentsetsofalgorithmsorversionsofalgorithmswereappliedtoeachdataset. TheexactnatureofthealgorithmsisnotusedbytheChoquetintegralandsotheyarenotdescribedhere.Someofthealgorithmsarepublishedandtheirdescriptionscanbefoundin[ 137 139 146 { 150 ].HereweprovidetheindividualperformancesofthealgorithmsandtheperformanceoftheChoquetintegralsusingthemanymethodsdevelopedtooptimizethemeasures.TherelativeperformancesofthesetrainingalgorithmscanbedeterminedusingresultsobtainedfromthesedatasetscollectedfromallovertheU.S.andpartofEuropeinsupportofndingasolutiontotheveryimportantproblemofminedetection. Thenamesgiventothealgorithmsatthedierenttestsitesarein(Table 4-12 ). 4-17 ))andacomparisonwiththeMCEunderSugenomeasuresisin(Figure 4-18 )Inthiscase,MAP-EMSparsityisbetterthantheMCEundertheSugenomeasure.Inaddition,at90%PDMAP-EMreducesthefalsealarmratebyafactorofthreecomparedtothebestalgorithmHMMconf. 4-19 )andacomparisonwiththeMCEunderSugenomeasuresisin(Figure 4-20 ).Inthiscase,MCEundertheSugenomeasureisdoingbetterthanMAP-EM.Webelievethisisduetothefactthatwearealreadyselectingthebestpossiblesubsetofalgorithms.Inaddition,weneedtodoabetterselectionofdesiredoutputsforMAP-EMSparsity. 135


4-21 )andacomparisonwiththeMCEunderSugenomeasuresisin(Figure 4-22 ).Whenlookingattheseresults,MAP-EMSparsityimprovesoverMCEunderSugenomeasures,butstillwehaveahighdependenceonthedesiredoutputs. 1. DependingonthedatasetMAP-EMcanbesuperiorthanMCEunderSugenomeasures. 2. TheMAP-EMSparsityalgorithmisstillsensitivetodesiredoutputs.Itisnecessarytodevelopnewtechniquestoavoidthisproblem. 136


ConfusionmatrixforarticialdatasetcaseIforMAP-EMSparsity. CMclass1class2 class110000class201000 Table4-2: MeasuresarticialdatasetcaseIwithmeanandstandarddeviationoftheMarkovchainsforMAP-EMSparsity. MeasuresMeanStandarddeviation Table4-3: ShapleyindexesforthefeaturesinarticialdatasetcaseIforMAP-EMSparsity. 10.48900020.01366730.46983040.027500 Table4-4: ConfusionmatrixforarticialdatasetcaseIIforMAP-EMSparsity. CMclass1class2 class1897103class268932 137


PartialmeasurevaluesforcaseIIwiththemeanandstandarddeviationoftheMarkovchainsforMAP-EMSparsity. MeasuresMeanStandarddeviation Table4-6: ShapleyindexesforthefeaturesinarticialdatasetcaseIIforMAP-EMSparsity. 10.03936920.40349030.05059340.01926750.40292060.02091770.03221780.031233 138


AbetterconfusionmatrixforcaseII. CMclass1class2 class19982class201000 Table4-8: MoresparseShapleyindexesforcaseII. 10.009547620.457540030.015576040.010317050.473360060.010848070.010181080.0126360 Table4-9: DensitiesfortheMCEunderSugenomeasuresforcaseII. DensitiesValueclass1 Table4-9: Continued. DensitiesValueclass2 139


ConfusionmatrixforarticialdatasetcaseIIIforMAP-EMSparsity. CMclass1class2 class110000class201000 Table4-11: ShapleyindexesforMAP-EMSparsityforcaseIII. 10.3332220.1667730.1667740.3332251.66670e-0561.66670e-05 Table4-12: Dierentdatasetsanddetectionalgorithmsforthelandminefusionproblem. BJ2006BJ2007A2007 EHDHMM-DTXTACNA-EHDConfGFITEHDHMM-DTXTGMRFSCFF1A-EHDConfHMM-DTXTPrescreenerF1V4PrescreenerG1confROCAHMMConfSCFRadialTFCMSCF 140


PlotofsamplesfortherstthreefeaturesincaseIwherethesecondfeaturehasnovalueforclassication. 141


OutputsforMAP-EMSparsityinCaseIwith\o"forclass1and\x".forclass2. 142


TracesforthemeasureparametersamplesgeneratedbyMAP-EMSparsityinCaseI. 143


DistributionsforthemeasureparametersgeneratedbyMAP-EMSparsityinCaseI. 144


OutputsfortheMCEunderSugenomeasuresgradientdescentmethodforcaseI. 145


ROCcurveforMAP-EMSparsityincaseI. 146


ROCcurvefortheMCEunderSugenomeasuresincaseI. 147


PlotofsamplesfortherstandfththreefeaturesincaseII. 148


OutputsforMAP-EMSparsityinCaseIIwith\o"forclass1and\x"forclass2. 149


AbetteroutputseparationforcaseIIforMAP-EMSparsity. 150


OutputsfortheMCEunderSugenomeasuresforcaseII. 151


ROCcurvesfortheMCEunderSugenomeasuresandMAP-EMSparsityforcaseII. 152


ChoquetoutputsforMAP-EMSparsityforcaseIIIwith\o"forclass1and\x"forclass2. 153


ROCcurvesfortheMCEunderSugenomeasuresandMAP-EMSparsityforcaseIII. 154


ExamplesoftracesforthemeasureparameterbyMAP-EMSparsityincaseII. 155


ExamplesofdistributionsforthemeasureparametersbyMAP-EMforcaseII. 156


ROCcurvesforalltheA2007algorithmsandMAP-EMSparsity. 157


ROCcurvesforMCEunderSugenoandMAP-EMSparsityindatasetA2007. 158


ROCcurvesforalltheBJ2007algorithmsandMAP-EMSparsity. 159


ROCcurvesforMCEunderSugenoandMAP-EMSparsityindatasetBJ2007. 160


ROCcurvesforalltheBJ2006algorithmsandMAP-EMSparsity. 161


ROCcurvesforMCEunderSugenoandMAP-EMSparsityindatasetBJ2006. 162


We propose an extension of the previous ideas that combines sparsity promoting Gibbs samplers with the MCE concept. This idea provides an efficient mechanism for learning full measures, without the problem of desired outputs.

The Choquet integral can be written as an inner product (Appendix A) in the form of (Equation 5-1). An immediate problem with applying logistic regression in this context is the fact that the range of the Choquet integral is [0, 1]. Hence, we need to map these values onto a symmetric interval [-\gamma, \gamma]; this can be done with a simple linear mapping (Equation 5-2). Now, given that we would like to fuse information for improving the classification of two classes, we can assume that the class label follows the logistic regression model (Section 1.4.4) for the fusion task. Thus, we have the following probabilities for a two class problem:


Wecanuseprobabilities(Equation 5{3 )and(Equation 5{4 ),andtheexponentialdistributionforsparsitypromotiontodesignthefollowingBayesianhierarchicalmodel: (5{5) 5{6 )as: 164


Therefore,theGibbssamplerfortheonemeasurecaseforthelogisticregressionunderthemonotonicityconstraintis: Given(t)T=((t)1;:::;(t)m)T,generate: ... (5{8) 2. Foreachj,update: 5{8 )canbedicult,forthatreason,weusetheslicesampler. 165
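A sketch of the logistic link just described follows, assuming the linear map of (Equation 5-2) has the form 2\gamma x - \gamma; the dissertation's exact constants may differ, and the value gamma=5.0 below is only illustrative.

    import numpy as np

    def choquet_logistic_prob(C_out, gamma=5.0):
        # P(class 1 | f): Choquet output in [0, 1] mapped to [-gamma, gamma], then squashed.
        z = 2.0 * gamma * np.asarray(C_out) - gamma
        return 1.0 / (1.0 + np.exp(-z))

    def bernoulli_loglik(C_outs, labels, gamma=5.0):
        # Log-likelihood term that the Gibbs / slice updates of this section evaluate.
        p = choquet_logistic_prob(C_outs, gamma)
        y = np.asarray(labels)                      # 1 for class 1, 0 for class 2
        return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))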


5{8 )foreachj.First,wesamplefromuniformdistributionsforeachf!i and Inaddition,weneedasampletheexponentialfactorthatrepresentsthesparsitydistributionforj: Finally,tomaintainthestructureoftheSlicesampler,wesample: Now,weneedtobuildtheset: tosampleuniformlyfromit.InsetA,thefunctionsintheconstraintshavethefollowingdenitions: 166


4.4 ),wehavethefollowingintervalsfromtheinequalitiesfi(j)>wi;i=1;:::;n: 2Hijlogwi 2Hijlogwi whereHijistheposition(i,j)inthedesignmatrixH,HjisthedesignmatrixHwithoutthecolumnj,jisthemeasurevectorwithoutthepositionjandKj=HTjj.Inaddition,weneedthefollowingintervals: and [0;minf(As)jAjAsg](5{20) frominequalitiesfexp(j)wexpandfunif(j)wUrespectively. Finally,assumingthatjClass1j=n1andjClass2j=n2,wehavethefollowinginterval[B1;B2],where 2Hijlogwi (5{21) 2Hijlogwi Then,weobtainasampleforjbysamplinguniformlyfromtheintervalA=[B1;B2]. 167


Thishaspointedoutthatusingauniformdistributionfortheinterval [0;minf(As)jAjAsg] mightbenotthebestpossibledistributiontoenforcethemonotonicitypropertyinourmodel. Thedecisiontoemploytheuniformdistributionover[0;minf(As)jAjAsg]wasduetoitssimplicityandeasyuseforsampling.Itisclearthatadierentprobabilitydistributionisnecessaryforenforcingthemonotonicitypropertyofthefuzzymeasureinourmodel.Forexample,wecouldconsiderahalf-Gaussiandistribution,withmodeatminf(As)jAjAsgtodecreasetheprobabilityofsamplingimprobablemeasures.However,atthismomentwedonothaveaclearanswerforthisquestion,butwehopethattheseobservationswillhelpustoopennewvenuesofresearch. 168
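One way to realize the half-Gaussian suggestion above is to centre a Gaussian at the bound and truncate it to [0, \min\{\mu(A_s)\}], so that values near the bound are favoured over values near zero. The spread parameter in this sketch is our own choice and would need tuning; this is only an illustration of the idea, not a settled design.

    from scipy.stats import truncnorm

    def half_gaussian_draw(ub, spread, rng=None):
        # Draw mu_j from a Gaussian with mode at the bound ub, truncated to [0, ub].
        a = (0.0 - ub) / spread       # standardized lower end
        b = 0.0                       # upper end coincides with the mode ub
        return truncnorm.rvs(a, b, loc=ub, scale=spread, random_state=rng)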


3 )and(Chapter 4 ),wecanthenmodify(Model 5{6 )asfollows: 5{23 )allowsustowritethefollowingGibbssampler: Given(t)T=((t)1;:::;(t)m)T,(t)T=((t)1;:::;(t)m)Tand(t)T=((t)1;:::;(t)m)Tgenerate: Given(t)T=((t)1;:::;(t)m)Tand(t)T=((t+1)1;:::;(t+1)m)T,generate: 169


(5{24) 3. Foreachj,update: 4.4 ).The'scanbesampledbyusingacombinationbetweentheslicesamplersin(Section 4.4 )and(Section 5.1.2 ).Inthisway,wehavetwopossiblemethodsforthecasesinglemeasureunderlogisticregression. (5{25) asafunctionoff!,whichisamodiedversionofthedissimilaritymeasureusedinMCE.(Equation 5{25 )canbewrittenas(Appendix A ): whereH1[f!]andH2[f!]representthedierencesoffunctionsineachChoquetintegral.1and2representthemappingfunctionsforsamplesf!intoclass1andclass2rangesrespectively.Although,(Equation 5{26 )hasarangein[1;1],thiscancauseproblemsin 170


exp1 Thus,wecanusethefollowinghierarchicalmodelforatwomeasurecase,1and2,: (5{29) 5{29 )hasthefollowingstructure: Given(t)T=((t)11;:::;(t)1m;(t)21;:::;(t)2m)T,generate: 171


(5{30) 2. Foreachkandj,update: 2 ).Inaddition,itworksforafullmeasureandnotjusttheSugeno-measure. Now,wecanusetheslicesamplertosamplefromtheposteriordistributions(Equation 5{30 )intheGibbssampler. 5{30 )for1j,wesamplefromthefollowinguniforms: and 172


and Now,bythesamemethodin(Section 4.4 ),wesamplefromthefollowingintervalsfor1j: and whereK=C1(1(f!i))C2(2(f!i))andK1j=C1(1(f!i))C2(2(f!i))H1j1(f!i)1j.Inaddition,wehave: and [0;minfk(Aj)jAjAsg]:(5{38) GivenjClass1j=n1andjClass2j=n2,wehavethefollowinginterval[B1;B2],where: Then,wesimplysample1juniformlyfromtheinterval[B1;B2]. Inasimilarway,wehavefor2j: 173


where Inaddition,wehave: and [0;minfk(Aj)jAjAsg]:(5{46) Therestissimilartothecasefor1j. ThisallowsustoimplementtheMCEinaprobabilisticframeworkbyusingtheGibbssamplertoenforcethemonotonicitycontraintsofthefuzzymeasures.
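A sketch of the two-measure logistic link follows, using the raw Choquet difference as the argument of the sigmoid; the dissertation's version applies class-specific mapping functions and possibly a scale factor, so the constants here are illustrative only.

    import numpy as np

    def mce_logistic_prob(C1, C2):
        # P(class 1 | f) driven by the Choquet difference C_mu1(f) - C_mu2(f).
        return 1.0 / (1.0 + np.exp(-(np.asarray(C1) - np.asarray(C2))))

    def mce_logistic_loglik(C1, C2, labels):
        # Bernoulli log-likelihood that the two-measure Gibbs sampler works with.
        p = mce_logistic_prob(C1, C2)
        y = np.asarray(labels)
        return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))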


Given(t)T=((t)11;:::;(t)1m;(t)21;:::;(t)2m)T,(t)T=((t)11;:::;(t)1m;(t)21;:::;(t)2m)Tand(t+1)T=((t+1)11;:::;(t+1)1m;(t+1)21;:::;(t+1)2m)T,generate: Given(t)T=((t)11;:::;(t)1m;(t)21;:::;(t)2m)Tand(t+1)T=((t+1)11;:::;(t+1)1m;(t+1)21;:::;(t+1)2m)T,generate: (5{48) 3. Foreachkandj,update:


(Section 4.4) and (Section 5.2.2).

Recall that the cost function (Equation 2-5) for the MCE is not a convex function, which in practice means the cost function can have multiple local minima. Further, the only measures that the original MCE could handle are Sugeno \lambda-measures. These are the reasons why the MCE algorithm based on gradient descent has the following drawbacks:
1. The MCE in (Equation 2-5) can only handle Sugeno \lambda-measures. This measure was selected because of its recursive definition.
2. It is necessary to introduce global optimization ideas on top of the gradient descent to deal with the cost function not being convex. This only increases the computational complexity of the solution.
These are the main reasons for developing an MCE in the probabilistic framework.

The original Monte Carlo methods were developed by physicists to integrate high dimensional functions [2, 151, 152]. The method is based on the fact that an integral of a function f over a region D can be interpreted as an expectation (Equation 5-49), where D is the region of integration and V is a bounding box for D (Equation 5-50). In this way, given random samples x_1, \dots, x_N,

    \int_D f(x)\,dx \approx \dfrac{V}{N}\sum_{i=1}^{N} f(x_i).    (5-51)

Compared with methods based on classical integration and differentiation, Monte Carlo tends to be more efficient for solving high dimensional problems.

Given the last discussion, MCE Logistic has the following advantages:


MCELogisticisecientindealingwithhighdimensionalityproblems.ThispointsoutthatanMCELogisticmightbebetteratdealingwiththeexponentialnumberofmeasureparametersthanconventionaloptimizationtechniques. 2. MCELogisticcanuseanytypeoffuzzymeasureratherthanbeinglimitedtotheSugeno-measure. 3. MCELogisticsolutionisattractedtoregionsofhighprobability,makingitabettermethodtondglobaloptima. Althoughthisisbetterthanusinggradientdescentforthecostfunction(Equation 2{5 ),westillhaveonepossibledrawback.Onceasampleisgenerated,itisacceptedwithprobabilityone.Thismightincreasethestandarddeviationofthegeneratedsamples.Inturn,thiscanmakeitdicultfortheMarkovchaintoconverge[ 2 ]. Therefore,itisnecessarytodeviseabetterglobaloptimizationmethodthantheGibbssampler.Forthis,wehavebeenthinkingofusingamodiedGibbssamplerwhereeachmeasureissampledthroughtheuseofaMetropolis-Hastings-likeprobabilityofacceptance.ThiswillallowustocreateabetterglobaloptimizerfortheChoquetintegraltrainingproblem.ThisisknownasMetropolizationoftheGibbssampler.Thisisbeyondthescopeofthisdissertation,butwehopethatthiswillbeafutureresearchsubject. Inaddition,in(Appendix C )wediscussthedierencesbetweenthesenewalgorithmsandthefeatureselectionundermutualinformation(Section 1.6.4 ). andcomparingeachoutputwiththesemeans.Theseweremadebecausewereallydonothavedesiredoutputsforthesetypeofalgorithms. 177


4-1 ).TheGibbssamplerisrunfor1,100iterationswithaburningperiodof100iterations. Themeasureparameterscanbeseenin(Table 5-2 ).Itisveryinterestingtonotethatthemeasuresforindividualfeaturesoneandthreearebothzero,butthemeasureofthepairoffeaturesis0.98,almostone.ThisillustratetheabilityoftheChoquetintegraltofocusontheworthofsubsetsifonformationsources.TheoutputsoftheChoquetintegralcanbeseenon(Figure 5-1 ),theShapleyindicescanbeseenin(Table 5-3 ),theconfusionmatrixcanbeseenin(Table 5-1 ),andtheROCcurveisin( 5-20 ). Thetraces(Figures 5-12 )anddistributions(Figure 5-13 )showthattheLogisticLASSOtendstodoagoodjobofsendnon-importantparameterstozero.Finally,itisclearthattheLogisticLASSOidentiesthebadfeaturesandseparatestheclasseswell. 4.6.1 )isconsidered.Someofthemeasureparametervaluescanbeseenin(Table 5-5 ),theShapleyindicesarein(Table 5-6 ),andtheROCcurvesisin(Figure 5-21 ). WhenlookingattheShapleyindices,itisclearthattheonlyimportantfeaturesarethesecondandthefth.TheoutputsoftheChoquetintegralusingthelearnedparameterscanbeseenin(Figure 5-2 ).Theconfusionmatrixisin(Table 5-4 ). Someofthedistributionsandtracesforthemeasureparametersarein(Figure 5-15 )and(Figure 5-14 )respectively. 178


The data set of case III (Section 4.6.1) is considered. The confusion matrix is in (Table 5-7), the Choquet outputs are in (Figure 5-10), the Shapley indices are in (Table 5-8), and the ROC curve is in (Figure 5-22). Again the Shapley indices tell us that the first four features are the important ones. In addition, it does a good job of separating the two classes, but not as good as MAP-EM Sparsity.

5.4.3.1 Case I

The data set of case I (Section 4.6.1) is considered. The measure parameter values can be seen in (Table 5-10), the Shapley indices are in (Table 5-11), the ROC curve is in (Figure 5-20), and the confusion matrix is in (Table 5-9). It is easy to see that the only important features are the first and third. The outputs of the Choquet integral using the learned parameters can be seen in (Figure 5-3). The distributions and traces for the measure parameters are in (Figure 5-17) and (Figure 5-16), respectively.

If we compare the distributions of MAP-EM Logistic to the direct application of the LASSO in Logistic, it is possible to notice that MAP-EM Logistic tends to be less able to send unimportant parameters to zero. Still, the MAP-EM Logistic LASSO does a good job identifying the bad features.

The data set of case II (Section 4.6.1) is considered. Some of the measure parameter values can be seen in (Table 5-13), the Shapley indices are in (Table 5-14), the ROC curve is in (Figure 5-21), and the confusion matrix is in (Table 5-12).

If we look at the Shapley indices, it is easy to see that the only important features are the second and the fifth. The outputs of the Choquet integral using the parameters learned by MAP-EM Logistic LASSO can be seen in (Figure 5-4).

Some of the distributions and traces for the measure parameters are in (Figure 5-19) and (Figure 5-18), respectively.
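The Choquet-integral outputs shown in these figures are obtained by evaluating the discrete Choquet integral of each sample's confidence vector with respect to the learned measure. A minimal sketch of that evaluation, with the same illustrative dictionary representation assumed above (not the dissertation's implementation):

    def choquet_integral(h, g):
        # Discrete Choquet integral of the values h = {source: confidence}
        # with respect to a fuzzy measure g on the power set of the sources:
        #   C_g(h) = sum_i [h(x_(i)) - h(x_(i-1))] * g(A_(i)),
        # where the values are sorted increasingly and A_(i) = {x_(i), ..., x_(n)}.
        order = sorted(h, key=h.get)
        prev, total = 0.0, 0.0
        for idx, src in enumerate(order):
            total += (h[src] - prev) * g[frozenset(order[idx:])]
            prev = h[src]
        return total

    # Toy measure that rewards agreement between two sources.
    g = {frozenset(): 0.0, frozenset({'a'}): 0.1, frozenset({'b'}): 0.1, frozenset({'a', 'b'}): 1.0}
    print(choquet_integral({'a': 0.9, 'b': 0.7}, g))    # 0.7 * 1.0 + 0.2 * 0.1 = 0.72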


The data set of case III (Section 4.6.1) is considered. The confusion matrix is in (Table 5-15), the Choquet outputs are in (Figure 5-5), the Shapley indices are in (Table 5-16), and the ROC curve is in (Figure 5-22). Again the Shapley indices tell us that the first four features are the important ones. In addition, it does a good job of separating the two classes, but again not as good as MAP-EM Sparsity.

5.4.5.1 Case I

The same case I data set as in (Section 5.4.3.1) is considered. The confusion matrix is in (Table 5-17), the ROC curve is in (Figure 5-20), and the measure parameter values are in (Table 5-18).

We decided not to show the Shapley indices because they do not make that much sense. This tells us that it is necessary to create a new importance index for problems with a measure for each class. The outputs of the Choquet differences can be seen in (Figure 5-6).

It is clear that we have difficulties interpreting the relation between the two measures. In addition, simple subtraction, as in the classic dissimilarity measure (Section 1.5.1), does not seem to be very effective at separating the classes; a schematic sketch of this subtraction-based rule is given below. Therefore, we need to propose a better dissimilarity operator for the MCE, but that is beyond the scope of this dissertation.

The data set of case II is the one in Section 4.6.1. Due to the fact that it is difficult to interpret the parameters in the measures, we prefer to show only the Shapley indices (Table 5-23). Clearly the Shapley indices make sense: if the measure for class one is used, we obtain that the Shapley values for the important features, the second and the fifth, are higher than the rest. In the case of the measure for class two, the reverse happens.
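The subtraction-based rule discussed above compares a sample's Choquet output under the class-one measure with its output under the class-two measure. A schematic sketch of that decision rule is given next, under the assumption that the dissimilarity is simply the difference of the two integrals; the helper and the measure representation are illustrative, not the dissertation's code.

    def choquet_integral(h, g):
        # Same discrete Choquet integral as in the earlier sketch.
        order = sorted(h, key=h.get)
        prev, total = 0.0, 0.0
        for idx, src in enumerate(order):
            total += (h[src] - prev) * g[frozenset(order[idx:])]
            prev = h[src]
        return total

    def classify_by_difference(h, g_class1, g_class2):
        # Subtraction-based dissimilarity: assign class 1 when the sample's
        # Choquet output under the class-1 measure exceeds the one under
        # the class-2 measure, and class 2 otherwise.
        diff = choquet_integral(h, g_class1) - choquet_integral(h, g_class2)
        return 1 if diff > 0 else 2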


The Choquet outputs are in (Figure 5-7), the confusion matrix is in (Table 5-19), and the ROC curve is in (Figure 5-21).

The data set of case III (Section 4.6.1) is considered. The Shapley indices are in (Table 5-24), the Choquet outputs are in (Figure 5-11), the ROC curve is in (Figure 5-22), and the confusion matrix is in (Table 5-20). In this case, the Shapley indices are difficult to interpret. Again, it is necessary to produce a better importance index.

5.4.6.1 Case I

The same case I data set as in (Section 5.4.3.1) is considered. The confusion matrix is in (Table 5-21), the ROC curve is in (Figure 5-20), and the measure parameter values are in (Table 5-22). The outputs of the Choquet differences are in (Figure 5-8).

The data set of case II (Section 4.6.1) is considered. The outputs of the Choquet differences are in (Figure 5-9), the confusion matrix is in (Table 5-25), and the ROC curve is in (Figure 5-21). In this case, the Shapley indices make sense (Table 5-26). If the measure for class one is used, we obtain that the Shapley values for the important features, the second and the fifth, are higher than the rest. In the case of the measure for class two, they are below the rest.

The data set of case III (Section 4.6.1) is considered. The ROC curve is in (Figure 5-22), and the rest of the answers are similar to the ones for the MCE Logistic algorithm.

1. MCE Logistic LASSO is much worse than the Logistic LASSO for case II.

2. The measure parameter values in Logistic LASSO and MAP-EM Logistic LASSO are easier to interpret than the ones in the MCE logistic methods.
3. From these experiments and the ones in (Section 4.6.1), it seems that MCE methods are not the best way to avoid the problem of desired outputs. However, the MCE logistic methods seem to be more efficient than the ones based on the MCE under Sugeno measures.

4. Although the MCE logistic methods improve over the MCE under Sugeno measures, it is clear that it is necessary to investigate better methods than MCE for avoiding desired outputs with multiple measures.

(Section 4.6.3). The names given to the algorithms at the different test sites are in (Table 4-12).

The ROC curves for all the A2007 algorithms and the Logistic LASSO are in (Figure 5-23), and a comparison with all the logistic methods and the MCE under Sugeno measures is in (Figure 5-24). In this case, all the logistic methods are better than the MCE under the Sugeno measure.

The ROC curves for all the BJ2007 algorithms and the Logistic LASSO are in (Figure 5-25), and a comparison with the MCE under Sugeno measures and all the logistic methods is in (Figure 5-26). When looking at these results, it seems that the MCE and MAP-EM Sparsity are similar for this particular case, with MCE Sugeno close to being the best one. A possible reason behind this is that these algorithms are already the best ones for fusion. This means that any fuser will do a similar job.

The ROC curves for all the BJ2006 algorithms and the Logistic LASSO are in (Figure 5-27), and a comparison with all the logistic methods and the MCE under Sugeno measures is in (Figure 5-28). In this case, all the logistic methods are better than the MCE under the Sugeno measure.
1. Depending on the data set, the logistic methods can be superior to the MCE under Sugeno measures.

2. Fusion over a few already good sources of information does not make that much sense (BJ2007).

3. It is necessary to investigate a way to improve the dissimilarity measures to obtain better class separation for the MCE algorithms.

4. We still have problems dealing with the exponential nature of the power set of information sources.
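The ROC curves used throughout these comparisons can be traced directly from the fused confidences and the ground-truth labels by sweeping a decision threshold. A minimal, generic sketch of that computation (not the evaluation code used for these experiments):

    import numpy as np

    def roc_curve(scores, labels):
        # Probability of detection versus probability of false alarm as the
        # decision threshold sweeps over the sorted confidence values.
        scores = np.asarray(scores, dtype=float)
        labels = np.asarray(labels, dtype=int)
        order = np.argsort(-scores)                 # descending confidence
        tp = np.cumsum(labels[order] == 1)          # cumulative true positives
        fp = np.cumsum(labels[order] == 0)          # cumulative false alarms
        pd = tp / max(tp[-1], 1)
        pfa = fp / max(fp[-1], 1)
        return pfa, pd

    pfa, pd = roc_curve([0.9, 0.8, 0.35, 0.7, 0.1], [1, 1, 0, 1, 0])
    print(list(zip(pfa, pd)))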


Table 5-1: Confusion matrix for case I using Logistic LASSO.

    CM        class 1   class 2
    class 1      1000         0
    class 2         0      1000

Table 5-2: Measure parameter values by Logistic LASSO in case I.

    Measures   Mean   Standard deviation

Table 5-3: Shapley indexes for case I by Logistic LASSO.

    Feature   Shapley index
    1         0.4980900
    2         0.0021428
    3         0.4980900
    4         0.0016679

Table 5-4: Confusion matrix for case II using Logistic LASSO.

    CM        class 1   class 2
    class 1      1000         0
    class 2         0      1000
Table 5-5: Measure parameter values by Logistic LASSO in case II.

    Measures   Mean   Standard deviation

Table 5-6: Shapley indexes for case II by Logistic LASSO.

    Feature   Shapley index
    1         0.0020323
    2         0.4943000
    3         0.0026450
    4         0.0020117
    5         0.4943000
    6         0.0012476
    7         0.0017010
    8         0.0017552

Table 5-7: Confusion matrix for case III using Logistic LASSO.

    CM        class 1   class 2
    class 1      1000         0
    class 2         0      1000
Table 5-8: Shapley indexes for case III by Logistic LASSO.

    Feature   Shapley index
    1         0.2527900
    2         0.1683600
    3         0.1663100
    4         0.2531700
    5         0.0785320
    6         0.0808410

Table 5-9: Confusion matrix for case I using MAP-EM Logistic LASSO.

    CM        class 1   class 2
    class 1      1000         0
    class 2         0      1000

Table 5-10: Measure parameter values by MAP-EM Logistic LASSO in case I.

    Measures   Mean   Standard deviation

Table 5-11: Shapley indexes for case I by MAP-EM Logistic LASSO.

    Feature   Shapley index
    1         0.49000
    2         1.95160e-18
    3         0.50667
    4         0.00333
Table 5-12: Confusion matrix for case II using MAP-EM Logistic LASSO.

    CM        class 1   class 2
    class 1      1000         0
    class 2         0      1000

Table 5-13: Some measure parameter values by MAP-EM Logistic LASSO in case II.

    Measures   Mean   Standard deviation

Table 5-14: Shapley indexes for case II by MAP-EM Logistic LASSO.

    Feature   Shapley index
    1         0.0033810
    2         0.4915200
    3         0.0039524
    4         0.0032857
    5         0.4901000
    6         0.0019762
    7         0.0030000
    8         0.0027857

Table 5-15: Confusion matrix for case III using MAP-EM Logistic LASSO.

    CM        class 1   class 2
    class 1      1000         0
    class 2         0      1000
Table 5-16: Shapley indexes for case III by MAP-EM Logistic LASSO.

    Feature   Shapley index
    1         0.2478300
    2         0.1723300
    3         0.1586700
    4         0.2558300
    5         0.0896670
    6         0.0756670

Table 5-17: Confusion matrix for case I using MCE Logistic LASSO.

    CM        class 1   class 2
    class 1      1000         0
    class 2         0      1000

Table 5-18: Measure parameter values by MCE Logistic LASSO in case I.

    Measure (class 1)   Mean   Standard deviation   Measure (class 2)   Mean   Standard deviation
Table 5-19: Case II confusion matrix by the MCE Logistic LASSO algorithm.

    CM        class 1   class 2
    class 1       710       290
    class 2       116       884

Table 5-20: Case III confusion matrix by the MCE Logistic LASSO algorithm.

    CM        class 1   class 2
    class 1      1000         0
    class 2        26       974

Table 5-21: Case I confusion matrix by the MAP-EM MCE Logistic LASSO algorithm.

    CM        class 1   class 2
    class 1      1000         0
    class 2         8       992

Table 5-22: Measure parameter values by MAP-EM MCE Logistic LASSO in case I.

    Measure (class 1)   Mean   Standard deviation   Measure (class 2)   Mean   Standard deviation
Table 5-23: Case II Shapley indexes by MCE Logistic LASSO.

    Feature   Measure class 1   Measure class 2
    1         0.055272          0.161500
    2         0.349770          0.042336
    3         0.056492          0.143170
    4         0.046570          0.142310
    5         0.333170          0.055509
    6         0.053005          0.153240
    7         0.051758          0.150790
    8         0.053964          0.151140

Table 5-24: Case III Shapley indexes by MCE Logistic LASSO.

    Feature   Measure class 1   Measure class 2
    1         0.147600          0.146300
    2         0.228980          0.192820
    3         0.221980          0.199830
    4         0.155220          0.162430
    5         0.231070          0.219650
    6         0.015150          0.078964

Table 5-25: Confusion matrix for case II using MAP-EM MCE Logistic LASSO.

    CM        class 1   class 2
    class 1       672       328
    class 2       270       830

Table 5-26: Shapley indexes for case II by MAP-EM MCE Logistic LASSO.

    Feature   Measure class 1   Measure class 2
    1         0.048179          0.145580
    2         0.306800          0.066623
    3         0.045593          0.141850
    4         0.053748          0.136600
    5         0.387920          0.067783
    6         0.054574          0.147220
    7         0.051310          0.146870
    8         0.051883          0.147490
Figure 5-1: Choquet outputs for the synthetic case I under Logistic LASSO with "o" for class 1 and "x" for class 2.
Figure 5-2: Choquet outputs for the synthetic case II under Logistic LASSO with "o" for class 1 and "x" for class 2.
Figure 5-3: Choquet outputs for the synthetic case I under MAP-EM Logistic LASSO with "o" for class 1 and "x" for class 2.
Figure 5-4: Choquet outputs for the synthetic case II under MAP-EM Logistic LASSO with "o" for class 1 and "x" for class 2.
Figure 5-5: Choquet outputs for the synthetic case III under MAP-EM Logistic LASSO with "o" for class 1 and "x" for class 2.
Figure 5-6: Choquet outputs for the synthetic case I under MCE Logistic LASSO with "o" for class 1 and "x" for class 2.
Figure 5-7: Choquet outputs for the synthetic case II under MCE Logistic LASSO with "o" for class 1 and "x" for class 2.
Figure 5-8: Choquet outputs for the synthetic case I under MAP-EM MCE Logistic LASSO with "o" for class 1 and "x" for class 2.
Figure 5-9: Choquet outputs for the synthetic case II under MAP-EM MCE Logistic LASSO with "o" for class 1 and "x" for class 2.
Figure 5-10: Choquet outputs for the synthetic case III under Logistic LASSO with "o" for class 1 and "x" for class 2.
Figure 5-11: Choquet outputs for the synthetic case III under MCE Logistic LASSO with "o" for class 1 and "x" for class 2.
Figure 5-12: Traces for the measure parameters learned by the Logistic LASSO for case I.
Figure 5-13: Distributions for the measure parameters sampled by the Logistic LASSO for case I.
Figure 5-14: Some of the traces for the measure parameters learned by the Logistic LASSO for case II.
Figure 5-15: Some of the distributions for the measure parameters sampled by the Logistic LASSO for case II.
Figure 5-16: Traces for the measure parameters learned by the MAP-EM Logistic LASSO for case I.
Figure 5-17: Distributions for the measure parameters sampled by the MAP-EM Logistic LASSO for case I.
Figure 5-18: Some of the traces for the measure parameters learned by the MAP-EM Logistic LASSO for case II.
Figure 5-19: Some distributions for the measure parameters sampled by the MAP-EM Logistic LASSO for case II.
Figure 5-20: ROC curves for different algorithms in case I.
Figure 5-21: ROC curves for different algorithms in case II.
Figure 5-22: ROC curves for different algorithms in case III.
Figure 5-23: ROC curves for all the A2007 algorithms and Logistic LASSO.
Figure 5-24: ROC curves for MCE under Sugeno and the logistic methods in data set A2007.
Figure 5-25: ROC curves for all the BJ2007 algorithms and Logistic LASSO.
Figure 5-26: ROC curves for MCE under Sugeno and the logistic methods in data set BJ2007.
Figure 5-27: ROC curves for all the BJ2006 algorithms and Logistic LASSO.
Figure 5-28: ROC curves for MCE under Sugeno and the logistic methods in data set BJ2006.
Learning fuzzy measures has become an important task in recent years for the development of information fusion algorithms. In this dissertation, we addressed several problems in the use of the Choquet integral as an aggregation operator for information fusion.

In minimum classification error under the Sugeno λ-measure, we developed a gradient descent algorithm which allows us to eliminate the necessity of using desired outputs. In addition, it enforces the monotonicity properties of the Sugeno λ-measure. This new algorithm reduces the time complexity of the classic LSE method by Grabisch through the use of the Sugeno λ-measure and the elimination of a matrix of constraints. Other problems with Grabisch's method are that it does not handle general measures and does not have a way to eliminate unnecessary sources of information.

In response to this, we developed an algorithm based on the Gibbs sampler under an LSE criterion, built on a Bayesian hierarchical framework. Using this framework, we were able to introduce a sparsity term into the learning structure for each of the parameters in the fuzzy measure. This algorithm was improved by using a combination of MAP and EM Gibbs. A problem with these algorithms is that they still depend on a series of desired outputs.

Based on the LSE sparsity under a naive sampler, we developed a maximum a posteriori EM algorithm based on Gibbs samplers and a slice sampler to improve over the naive sampler. This algorithm still depends on desired outputs, but it improves the sparsity and regularization of the measure parameters.

To address this problem, we developed several versions of the minimum classification error for the probabilistic framework by using logistic regression as the learning function. In addition, we were able to solve the training of a single measure without desired outputs. These new algorithms eliminate the necessity of desired outputs for problems with one or two measures.
Given the following quadratic cost function, namely the sum of squared errors between the Choquet integral outputs and the desired outputs, and the definition of the Choquet integral (1-111), we can expand the cost in terms of the measure values. This last equation can be simplified by making the following substitutions:

    u = M(g),    Γ_x = (0, ..., f(x_(1)), 0, ..., f(x_(n)) − f(x_(n−1)), ..., 0)^t,

where M is the mapping of the lattice 2^X into a vector, and Γ_x is the vector of differences in which only n positions are different from zero. Thus, the Choquet integral of a sample x becomes the inner product Γ_x^t u, which is equivalent to a cost that is quadratic in u, where

    Γ = Σ_{x∈C_1} Γ_x + ... + Σ_{x∈C_n} Γ_x,
This matrix inequality equation (A-9) has three terms: A, a matrix of zeros, ones, and minus ones; u, the vector of measures; and 0, a vector of zeros. From (Equation A-5) and (Equation A-9), we can obtain the following optimization under constraints:

    min (1/2) u^t D u + Γ^t u,   s.t.  A u + b ≥ 0.    (A-10)
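The constraint matrix A of zeros, ones, and minus ones in (Equation A-9) encodes the monotonicity of the fuzzy measure, g(S) ≤ g(S ∪ {i}) for every subset S and every source i not in S. A minimal sketch of how such a matrix could be assembled follows; the subset-to-index mapping below plays the role of M, and its concrete layout is an illustrative assumption.

    from itertools import combinations
    import numpy as np

    def monotonicity_constraints(sources):
        # Build A so that A u >= 0 encodes g(S) <= g(S U {i}) for every subset S
        # and every source i not in S, where u[k] is the measure value of the
        # k-th subset in the index below.
        subsets = [frozenset(c) for k in range(len(sources) + 1)
                   for c in combinations(sources, k)]
        index = {s: k for k, s in enumerate(subsets)}     # mapping M of the lattice 2^X
        rows = []
        for S in subsets:
            for i in sources:
                if i not in S:
                    row = np.zeros(len(subsets))
                    row[index[S | {i}]] = 1.0             # +1 on the superset
                    row[index[S]] = -1.0                  # -1 on the subset
                    rows.append(row)
        return np.array(rows), index

    A, index = monotonicity_constraints(['x1', 'x2', 'x3'])
    print(A.shape)                                         # one row per (S, i) pair: (12, 8)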


Table B-1: Different characteristics for Choquet algorithms.

    Algorithm                    Measure type       Cost function structure                                                          Optimization method
    LSE                          General measure    Sum of L2 norms                                                                  Different possible optimization methods
    MCE under Sugeno λ-measure   λ-measures         Sum of sigmoid functions with dissimilarity measures                             Different possible optimization methods
    LSE Sparsity                 General measures   Bayesian hierarchy for L2 norms                                                  Naive sampler
    MAP-EM Sparsity              General measures   Bayesian hierarchy for L2 and L1 norms and simulated Laplacian                   Gibbs sampler and slice sampler
    Logistic LASSO               General measures   Bayesian hierarchy for a logistic regression with LASSO                          Gibbs sampler and slice sampler
    MAP-EM Logistic LASSO        General measures   Bayesian hierarchy for a logistic regression with simulated Laplacian            Gibbs sampler and slice sampler
    MCE Logistic LASSO           General measures   Bayesian hierarchy for a logistic regression with MCE and LASSO                  Gibbs sampler and slice sampler
    MAP-EM MCE Logistic LASSO    General measures   Bayesian hierarchy for a logistic regression with MCE and simulated Laplacian    Gibbs sampler and slice sampler
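Several rows of Table B-1 list a slice sampler as part of the optimization method. The following is a minimal univariate slice sampler with stepping-out and shrinkage, written in the generic textbook form rather than as the dissertation's implementation; the step width and the toy target are illustrative assumptions.

    import numpy as np

    def slice_sample_1d(x0, log_f, rng, w=0.5, max_steps=50):
        # One draw from a density proportional to exp(log_f) using slice sampling:
        # pick a vertical level under the curve, step out an interval covering the
        # slice, then shrink until a point lands inside the slice.
        log_y = log_f(x0) + np.log(rng.uniform())          # level defining the slice
        left = x0 - w * rng.uniform()
        right = left + w
        for _ in range(max_steps):                          # step out to the left
            if log_f(left) < log_y:
                break
            left -= w
        for _ in range(max_steps):                          # step out to the right
            if log_f(right) < log_y:
                break
            right += w
        while True:                                          # shrink and accept
            x1 = rng.uniform(left, right)
            if log_f(x1) >= log_y:
                return x1
            if x1 < x0:
                left = x1
            else:
                right = x1

    rng = np.random.default_rng(1)
    log_f = lambda x: -0.5 * x ** 2                          # standard normal target
    x, draws = 0.0, []
    for _ in range(1000):
        x = slice_sample_1d(x, log_f, rng)
        draws.append(x)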


Figure B-1: Relations between the different Choquet algorithms.
In this appendix, we compare the Maximum Output Information (MOI) algorithms [37] and the new algorithms proposed in this dissertation.

Before we list the possible differences between MOI and Bayesian sparsity methods, it is a good idea to list the common characteristics of each collection. The Bayesian sparsity methods share the following characteristics:

1. The Choquet integral and fuzzy measures are used to define general aggregator functions for decision level fusion.

2. A Bayesian hierarchical model that depends on some type of learning function for the data is used.

3. A set of sparsity promoting distributions is chosen for the parameter values of the fuzzy measure.

4. Gibbs and slice samplers solve the problem sequentially.

In the case of MOI methods, we have:

1. SVMs and MLPs are used in classification problems.

2. A series of weights is defined by mutual information estimations over the confusion matrix, geared toward maximizing the mutual information between the desired outputs and the estimated algorithm outputs.

3. A measure of importance for each feature is defined, in the case of SVMs, by partial derivatives over the margin cost function; in the case of the MLP, it is defined by recursion on the weights of the neural network.

4. A heuristic selects from the features after each training of the SVMs and MLPs and the calculation of the importance measure for each feature.

If we compare the two lists, we can see the most obvious differences:

1. MOI methods are based on heuristics applied to SVMs and MLPs seeking to increase the mutual information to reduce the probability of error in classification problems. Instead, Bayesian sparsity methods are based on a Bayesian hierarchy over the parameter values of a fuzzy measure for a Choquet integral. In addition, the Bayesian sparsity methods are geared toward implementing some type of learning function, which is solved by using Gibbs samplers.

2. SVMs and neural networks are used for classification problems. Bayesian sparsity methods are designed to solve decision level fusion problems.
3. In MOI, feature selection is done by greedy selection and direct search. In Bayesian sparsity models, algorithm selection is based on embedding a Laplacian distribution on the parameters of the fuzzy measure, and is performed automatically by the Gibbs sampler.

These are the reasons why the two collections of algorithms are quite different. In addition, instead of going through a complex heuristic to maximize the mutual information between the desired outputs and the estimated outputs, the mutual information can be calculated directly in the Bayesian sparsity methods. For example, we could calculate the mutual information I(y; ŷ) for Logistic LASSO (Appendix D).
Given the Bayesian hierarchy (Model 5-6), we have the following probability distribution for the random variable y: y is a Bernoulli random variable whose success probability is given by the logistic learning function of the fused confidences f_{ω_i} and the measure parameters g. Therefore, we have that ŷ = E(y | f_{ω_i}, g). As can be seen in Sindhwani et al. [37] and in Fano's inequality, we would like to maximize I(y; ŷ) with respect to g. For this, it is necessary to define the joint probability of y and ŷ. This can be done by observing that p(y, ŷ) = p(y | ŷ) p(ŷ), with p(E(y | f_{ω_i}, g)) = K. Now, we can write the mutual information as

    I(y; ŷ) = H(y) − H(y | ŷ).

Since we do not have control over the entropy of y [7], we are forced to minimize H(y | ŷ). We have the following cases:

1. If y and ŷ are independent, then H(y | ŷ) = H(y); because y is a Bernoulli random variable, its space of possible values is Y = {0, 1}, so log2 |Y| = 1. Therefore, we do not have any way to control Pe. Thus, we are only interested in the case when y and ŷ are not independent.

2. If y and ŷ are not independent, then H(y | ŷ) < H(y). Thus, it is necessary to minimize H(y | ŷ) to increase the mutual information between y and ŷ and to decrease Pe. We know that H(y | ŷ) is bounded above by the expected negative log-likelihood of y under the model. Therefore, we need to maximize p(y | f_{ω_i}, g) = p(y | f_{ω_i}), which is the probability of the Bernoulli random variable y.
Therefore, Logistic LASSO maximizes the mutual information between y and ŷ. In addition to this feature, algorithm selection is made in an automatic way by assuming a Laplacian prior distribution for each of the measure parameters. This avoids the use of subset selection by an external algorithm to decide which collection of algorithms is important for the fusion task.

A similar argument can be made for the rest of the Bayesian sparsity methods. Still, we need to stress that the maximization of I(y; ŷ) depends on the learning function used in the Bayesian sparsity model.
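As a complement to this argument, the mutual information I(y; ŷ) between binary labels and thresholded predictions can be estimated directly from their empirical joint distribution. A small plug-in sketch, included purely for illustration:

    import numpy as np

    def mutual_information(y, y_hat):
        # Plug-in estimate of I(y; y_hat) in bits for binary y and y_hat.
        y = np.asarray(y, dtype=int)
        y_hat = np.asarray(y_hat, dtype=int)
        mi = 0.0
        for a in (0, 1):
            for b in (0, 1):
                p_ab = np.mean((y == a) & (y_hat == b))
                p_a, p_b = np.mean(y == a), np.mean(y_hat == b)
                if p_ab > 0:
                    mi += p_ab * np.log2(p_ab / (p_a * p_b))
        return mi

    print(mutual_information([1, 1, 0, 0, 1, 0, 1, 0],
                             [1, 1, 0, 1, 1, 0, 1, 0]))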


[1] G.CasellaandR.L.Berger,StatisticalInference,2nded.PacicGrove:DuxburyPress,2002. [2] C.P.RobertandG.Casella,MonteCarloStatisticalMethods(SpringerTextsinStatistics).Secaucus,NJ,USA:Springer-VerlagNewYork,Inc.,2005. [3] P.Komarek,\Logisticregressionfordataminingandhigh-dimensionalclassication,"Ph.D.dissertation,CarnegieMellonUniversity,2004. [4] B.-H.Juang,W.Chou,andC.-H.Lee,\Minimumclassicationerrorratemethodsforspeechrecognition,"IEEETrans.SpeechAudioProcess.,vol.5,pp.257{266,May1997. [5] S.Katagiri,B.-H.Juang,andC.-H.Lee,\Patternrecognitionusingafamilyofdesignalgorithmsbaseduponthegeneralizedprobabilisticdescentmethod,"Proc.IEEE,vol.86,pp.2345{2372,November1998. [6] M.G.Rahim,B.-H.Juang,andC.-H.Lee,\Discriminativeutterancevericationforconnecteddigitrecognition,"IEEETrans.SpeechAudioProcess.,vol.5,pp.266{277,May1997. [7] T.M.CoverandJ.A.Thomas,ElementsofInformationTheory.NewYork,NY,USA:Wiley-Interscience,1991. [8] M.Grabisch,\Anewalgorithmforidentifyingfuzzymeasuresanditsapplicationtopatternrecognition,"inFourthIEEEInternationalConferenceonFuzzySystems,Yokohama,Japan,March1995,pp.145{150. [9] S.KimandM.Tizhoosh,H.R.andKamel,\Choquetintegral-basedaggregationofimagetemplatematchingalgorithms,"inFuzzyInformationProcessingSociety,2003.NAFIPS2003.22ndInternationalConferenceoftheNorthAmerican,July2003,pp.143{148. [10] G.Choquet,\Theoryofcapacities,"Annalesdel'InstitutFourier,1955,vol5,pp131-295. [11] M.Grabisch,T.Murofushi,andM.Sugeno,FuzzyMeasuresandIntegrals.TheoryandApplications,ser.StudiesinFuzzinessandSoftComputing.Heidelberg:PhysicaVerlag,2000. 229


Z.WangandG.J.Klir,FuzzyMeasureTheory.Norwell,MA,USA:KluwerAcademicPublishers,1993. [13] S.Auephanwirayakul,J.Keller,andP.D.Gader,\GeneralizedChoquetfuzzyintegralfusion,"InformationFusion,vol.3,no.1,pp.69{85,2002. [14] M.CapinskiandP.E.Kopp,Measure,IntegralandProbability.NewYork:Springer,1999. [15] R.L.WheedenandA.Zygmund,MeasureandIntegral.AnIntroductiontoRealAnalysis.MonographsandTextbooksinPureandAppliedMathematics.43.NewYork-Basel:MarcelDekker,Inc.X,274p.SFr.54.00,1977. [16] W.Rudin,Real&ComplexAnalysis,3rded.NewYork,NY:McGraw-Hill,Inc.,1987. [17] ||,PrinciplesofMathematicalAnalysis(InternationalSeriesinPure&AppliedMathematics).NewYork:McGraw-HillPublishingCo.,September1976. [18] A.PapoulisandU.S.Pillai,Probability,RandomVariablesandStochasticProcesses.McGraw-HillScience/Engineering/Math,December2001. [19] H.RaiaandR.Schlaifer,AppliedStatisticalDecisionTheory.Cambridge:DivisionofResearch,GraduateSchoolofBusinessAdministration,HarvardUniversity,1961. [20] H.Jereys,\Aninvariantformforthepriorprobabilityinestimationproblems,"prsla,vol.186,no.1007,pp.453{461,1946. [21] R.O.Duda,P.E.Hart,andD.G.Stork,PatternClassication.NewYork:Wiley-IntersciencePublication,2000. [22] S.Kirkpatrick,C.D.Gelatt,andM.P.Vecchi,\Optimizationbysimulatedannealing,"Science,Number4598,13May1983,vol.220,4598,pp.671{680,1983. [23] E.AartsandJ.Korst,SimulatedAnnealingandBoltzmannMachines:AStochasticApproachtoCombinatorialOptimizationandNeuralComputing.NewYork,NY,USA:JohnWiley&Sons,Inc.,1989. 230


A.P.Dempster,N.M.Laird,andD.B.Rubin,\MaximumlikelihoodfromincompletedataviatheEMalgorithm,"JournaloftheRoyalStatisticalSociety,vol.B39,pp.1{38,1977. [25] S.Borman,\TheExpectationMaximizationalgorithm.Ashorttutorial,"UniversityofNotreDame,Tech.Rep.,July2004,thisreportintroducestheexpectationmaximizationalgorithmincludingaproofofconvergence. [26] S.ChibandE.Greenberg,\Understandingthemetropolis-hastingsalgorithm,"AmericanStatistician,no.4,pp.327{335,Nov1995. [27] G.CasellaandE.I.George,\ExplainingtheGibbssampler,"JournaloftheAmericanStatisticalAssociation,vol.46,no.3,pp.167{174,Aug.1992. [28] J.Bilmes,\AgentletutorialontheEMalgorithmanditsapplicationtoparameterestimationforGaussianmixtureandhiddenMarkovmodels,"UniversityofBerkeley,Tech.Rep.,1997,CSI-TR-97-021. [29] G.C.G.WeiandM.A.Tanner,\AMonteCarloimplementationoftheEMalgorithmandthepoorman'sdataaugmentationalgorithms,"JournaloftheAmericanStatisticalAssociation,TheoryandMethods,vol.85,no.411,pp.699{704,September1990. [30] B.Walsh,\MarkovchainMonteCarloandGibbssampling,"2004. [31] N.Metropolis,A.Rosenbluth,M.Rosenbluth,A.H.Teller,andE.Teller,\Equationsofstatecalculationsbyfastcomputingmachines,"JournalofChem-icalPhysics,vol.21,pp.1087{1092,1953. [32] W.K.Hastings,\MonteCarlosamplingmethodsusingMarkovchainsandtheirapplications,"Biometrika,vol.57,pp.97{109,1970. [33] S.GemanandD.Geman,\Stochasticrelaxation,Gibbsdistributions,andtheBayesianrestorationofimages,"IEEETransactionsonPatternAnalysisandMachineIntelligence,vol.6,no.6,pp.721{741,November1984. [34] W.R.Gilks,N.G.Best,andK.K.C.Tan,\AdaptiverejectionMetropolissamplingwithinGibbssampling,"AppliedStatistics,vol.44,no.4,pp.455{472,1995. 231


P.Domingos,\Auniedbias-variancedecompositionanditsapplications,"inProc.17thInternationalConf.onMachineLearning.MorganKaufmann,SanFrancisco,CA,2000,pp.231{238. [36] R.KohaviandD.Sommereld,\Featuresubsetselectionusingthewrappermethod:Overttinganddynamicsearchspacetopology,"inKDD,1995,conf,pp.192{197. [37] V.Sindhwani,S.Rakshit,D.Deodhare,D.Erdogmus,J.Principe,andP.Niyogi,\Featureselectioninmlpsandsvmsbasedonmaximumoutputinformation,"IEEETransactionsonNeuralNetworks,vol.15,no.4,pp.937{948,2004. [38] R.Tibshirani,\Regressionshrinkageandselectionviathelasso,"JournaloftheRoyalStatisticalSociety:SeriesB(StatisticalMethodology),vol.58,no.1,pp.267{288,1996. [39] P.E.Gill,W.Murray,andM.H.Wright,PracticalOptimization.Burlington,MA:AcademicPress,1981. [40] M.A.T.Figueiredo,\Adaptivesparsenessforsupervisedlearning,"IEEETrans.PatternAnal.Mach.Intell.,vol.25,no.9,pp.1150{1159,2003. [41] C.GwiggnerandG.Lanckriet,\Characteristicsinightdata-estimationwithlogisticregressionandsupportvectormachines,"inProceedingsofthe1stInter-nationalConferenceonResearchinAirTransportation(ICRAT),Zilina,Slovakia,2004. [42] J.Friedman,T.Hastie,andR.Tibshirani,\Additivelogisticregression:astatisticalviewofboosting,"1998. [43] J.Cramer,\Theoriginsoflogisticregression,"TinbergenInstitute,TinbergenInstituteDiscussionPapers02-119/4,Dec.2002,availableathttp://ideas.repec.org/p/dgr/uvatin/20020119.html. [44] Y.So,\Atutorialonlogisticregression,"1999. [45] C.Wen-TsongandP.Gader,\Wordleveldiscriminativetrainingforhandwrittenwordrecognition,"inProceedingsoftheSeventhInternationalWorkshoponFron-tiersinHandwrittenRecognition,September2000,pp.393{402. 232


H.Mizutani,\Discriminativelearningforerrorandminimumrejectclassication,"inFourteenthInternationalConferenceinPatternRecognition,vol.1,1998,pp.136{140. [47] S.TheodoridisandK.Koutroumbas,PatternRecognition,ThirdEdition.NewYork:AcademicPress,February2006. [48] D.XuandJ.Principe,\Learningfromexampleswithquadraticmutualinformation,"inIEEENeuralNetworksforSignalProcessing-ProceedingsoftheIEEEWorkshop,1998,pp.155{164. [49] D.ErdogmusandJ.C.Principe,\Lowerandupperboundsformisclassicationprobabilitybasedonrenyi'sinformation,"J.VLSISignalProcess.Syst.,vol.37,no.2-3,pp.305{317,2004. [50] M.GrabischandM.Sugeno,\Multi-attributeclassicationusingfuzzyintegral,"inIEEEinternationalconferenceonFuzzySystems,March1992,pp.47{54. [51] M.Sugeno,\Theoryoffuzzyintegralsanditsapplications,"DoctoralThesis,TokyoInstituteofTechnology,Tokyo,Japan,1974. [52] P.Gader,W.-H.Lee,andA.Mendez-Vasquez,\ContinuousChoquetintegralswithrespecttorandomsetswithapplicationstolandminedetection,"inProceedingsIEEEInternationalConferenceonFuzzySystems,2004,vol.1,Budapest,Hungary,July2004,pp.523{528. [53] A.P.Dempster,\Ageneralizationofbayesianinference,"JournaloftheRoyalStatisticalSociety(SeriesB),vol.2,no.30,pp.205{247,1968. [54] G.Shafer,AMathematicalTheoryofEvidence.Princeton,NJ:UniversityPress,1976. [55] R.R.Yager,\Onorderedweightedaveragingaggregationoperatorsinmulticriteriadecisionmaking,"IEEETrans.Syst.ManCybern.,vol.18,no.1,pp.183{190,1988. [56] K.Leszczynski,P.Penczek,andW.Grochulzki,\Sugeno'sfuzzymeasureandfuzzyclustering,"FuzzySetsandSystems,vol.15,no.2,pp.147{158,March1985. 233


H.TahaniandJ.Keller,\Informationfusionincomputervisionusingthefuzzyintegral,"IEEETransactionsonSystems,ManandCybernetic,vol.20,no.3,pp.733{741,1990,reprintedasanappendixtoG.KlirandZ.Wang,FuzzyMeasureTheory,PlenumPress,1992. [58] J.M.Keller,P.D.Gader,H.Tahani,J.H.Chiang,andM.Mohamed,\Advancesinfuzzyintegrationforpatternrecognition,"FuzzySetsandSystems,vol.65,no.1,pp.273{283,1994. [59] M.Grabisch,\Fuzzyintegralforclassicationandfeatureextraction,"inFuzzyMeasuresandIntegrals.TheoryandApplications,M.Grabisch,T.Murofushi,andM.Sugeno,Eds.Heidelberg:PhysicaVerlag,2000,pp.348{374. [60] J.WangandZ.Wang,\Usingneuralnetworkstodeterminesugenomeasuresbystatistics,"NeuralNetw.,vol.10,no.1,pp.183{195,1997. [61] P.Gader,A.Mendez-Vasquez,K.Chamberlin,J.Bolton,andA.Zare,\Multi-sensorandalgorithmfusionwiththeChoquetintegral:applicationstolandminedetection,"in2004IEEEInternationalGeoscienceandRemoteSensingSymposium,2004.IGARSS'04.Proceedings,vol.3,September2004,pp.1605{1608. [62] A.Mendez-Vazquez,P.Gader,J.M.Keller,andK.Chamberlin,\MinimumclassicationerrortrainingforChoquetintegralswithapplicationstolandminedetection,"January2007,tobepublished. [63] M.Grabisch,\ModellingdatabytheChoquetintegral,"inInformationFusioninDataMining,V.Torra,Ed.Heidelberg:PhysicaVerlag,2003,pp.135{148. [64] G.M.,\k-orderadditivediscretefuzzymeasuresandtheirrepresentation,"FuzzySetsandSystems,vol.92,pp.167{189,December1997. [65] Z.WangandG.J.Klir,FuzzyMeasureTheory.Norwell,MA,USA:KluwerAcademicPublishers,1993. [66] M.Grabisch,H.Nguyen,andE.Walker,FundamentalsofUncertaintyCalculi,withApplicationstoFuzzyInference.Dordrecht:KluwerAcademicPublishers,1995. [67] G.MayorandT.E,\Ontherepresentationofsomeaggregationfunctions,"inProceedingofISMVL,1986,pp.111{114. 234


S.Ovchinnikov,Aggregationandfusionofimperfectinformation(studiesinfuzzinessandsoftcomputing,12).Lavoisier,1997,ch.AggregationOperatorsforFusionunderFuzziness,pp.3{10. [69] M.R.andK.M.,\Aggregationoperators,"inProceedingoftheXIConferenceonappliedMathematicsPRIM'96,S.K.HercegD.,Ed.InstituteofMathematics,NoviSad,1997,pp.193{211. [70] M.Detyniecki,\Mathematicalaggregationoperatorsandtheirapplicationtovideoquerying,"Ph.D.dissertation,Laboratoired'InformatiquedeParis,2001. [71] L.S.Shapley.,\Avalueforn-persongames,"inContributionstotheTheoryofGamesVolumeII,ser.AnnalsofMathematicalStudies,H.KuhnandA.Tucker,Eds.Princeton,NJ:PrincetonUniversityPress,1953,vol.28,pp.307{317. [72] D.HallandJ.LIinas,\Anintroductiontomultisensordatafusion,"inIn:Proceed-ingsoftheIEEE,vol.85(1),1997,pp.6{23. [73] L.A.Klein,SensorandDataFusionConceptsandApplications.Bellingham,WA,USA:SocietyofPhoto-OpticalInstrumentationEngineers(SPIE),1999. [74] H.Li,B.S.Manjunath,andS.K.Mitra,\Multisensorimagefusionusingthewavelettransform,"GraphicalModelandImageProcessing,vol.57,no.3,pp.235{245,1995. [75] M.Mangolini,\Apportdelafusiond'imagessatellitairesmulticapteursauniveaupixelentldtectionetphotointerprtation,"Ph.D.dissertation,UniversitNice-SophiaAntipolis,France,1994,174p. [76] L.Wald,\Denitionsandtermsofreferenceindatafusion,"InternationalArchivesofPhotogrammetryandRemoteSensing,vol.32,part7-4-3W6,Valladolid,Spain,pp.2-6.,June1999. [77] C.PholandJ.L.V.Genderen,\Multisensorimagefusioninremotesensing:concepts,methodsandapplications,"InternationalJournalofRemoteSensing,vol.19,no.5,pp.823{854,March1998. [78] A.H.GunatilakaandB.A.Baertlein,\Feature-levelanddecision-levelfusionofnoncoincidentlysampledsensorsforlandminedetection,"IEEETrans.PatternAnal.Mach.Intell.,vol.23,no.6,pp.577{589,2001. 235


U.Handmann,G.Lorenz,andW.vonSeelen,\Fusionvonbasisalgorithmenzursegmentierungvonstrasenverkehrsszenen,"inDAGM{Symposium,1998,pp.101{108. [80] G.T.Marika,\Fusionoftexturemeasuresforurbanareacharacterization." [81] E.denBreejen,K.Schutte,andF.Cremer,\Sensorfusionforanti-personallandminedetection,acasestudy,"1999. [82] P.J.ProdanovandA.Drygajlo,\Bayesiannetworksbasedmulti-modalityfusionforerrorhandlinginhuman-robotdialoguesundernoisyconditions."SpeechCommuni-cation,vol.45,no.3,pp.231{248,2005. [83] A.Noore,R.Singh,andM.Vatsa,\Robustmemory-ecientdatalevelinformationfusionofmultimodalbiometricimages,"InformationFusion,vol.8,pp.337{346,October2007. [84] A.Pinz,M.Prantl,H.Ganster,andH.Kopp-Borotschnig,\Activefusionanewmethodappliedtoremotesensingimageinterpretation,"PatternRecogn.Lett.,vol.17,no.13,pp.1349{1359,1996. [85] V.Pavlovic,\Dynamicbayesiannetworksforinformationfusionwithapplicationtohuman-computerinterfaces,"Ph.D.dissertation,UniversityofIllinoisatUrbana-Champaign,January1999. [86] Y.Zhang,Q.Ji,andC.Looney,\Activeinformationfusionfordecisionmakingunderuncertainty,"inProceedingsoftheFifthInternationalConferenceonInforma-tionFusion,2002,vol.1,2002,pp.643{650. [87] F.Fung,K.Laskey,M.Pool,M.Takikawa,andE.Wright,\Plasma:Combiningpredicatelogicandprobabilityforinformationfusionanddecisionsupport,"AAAISpringSymposiumonChallengestoDecisionSupportinaChangingWorld,Febraury2006. [88] P.Valin,F.Rheaume,C.Tremblay,D.Grenier,A.-L.Jousselme,andE.Bosse,\Comparativeimplementationoftwofusionschemesformultiplecomplementaryirimageryclassiers."InformationFusion,vol.7,no.2,pp.197{206,2006. [89] R.E.Neapolitan,LearningBayesianNetworks.PrenticeHall,December2000,northeasternIllinoisUniversity,1stEdition. 236


S.FerrariandA.Vaghi,\Deminingsensormodelingandfeature-levelfusionbybayesiannetworks,"IEEESensorsJournal,vol.6,pp.471{483,April2006. [91] P.deOude,B.Ottens,andG.Pavlin,\Informationfusionwithdistributedprobabilisticnetworks."inArticialIntelligenceandApplications,2005,pp.195{201. [92] D.Heckerman,\ATutorialonLearningBayesianNetworks,"MicrosoftResearch,Tech.Rep.MSR-TR-95-06,March1995. [93] G.F.CooperandE.Herskovits,\ABayesianmethodfortheinductionofprobabilisticnetworksfromdata,"Mach.Learn.,vol.9,no.4,pp.309{347,1992. [94] O.ParsonsandG.A.Carpenter,\Artmapneuralnetworksforinformationfusionanddatamining:mapproductionandtargetrecognitionmethodologies,"NeuralNetw.,vol.16,no.7,pp.1075{1089,2003. [95] B.V.Dasarathy,\Informationfusioninthecontextofhuman-machineinterfaces."InformationFusion,vol.6,no.2,pp.117{118,2005. [96] H.Barbera,A.Skarmeta,M.Izquierdo,andJ.Blaya,\Neuralnetworksforsonarandinfraredsensorsfusion,"inProceedingsoftheThirdInternationalConferenceonInformationFusion,2000.FUSION2000,vol.2,July2000,pp.18{25. [97] S.Haykin,NeuralNetworks:AComprehensiveFoundation.UpperSaddleRiver,NJ,USA:PrenticeHallPTR,1998. [98] E.I.Shubnikov,\Multisensorinformationfusioninneuralnetworksonthebasisofdiractionoptics,"OpticsandSpectroscopy,vol.98,no.2,pp.284{290,2005. [99] Z.WangandY.Ma,\Medicalimagefusionusingm-PCNN,"InformationFusion,April2007,inpress,correctedProof,availableonlineApril192007. [100] L.Yiyao,Y.V.Venkatesh,andC.C.Ko,\Aknowledge-basedneuralnetworkforfusingedgemapsofmulti-sensorimages."InformationFusion,vol.2,no.2,pp.121{133,2001. [101] R.Eckhorn,H.J.Reitboeck,M.Arndt,andP.Dicke,\Featurelinkingviasynchronizationamongdistributedassemblies:Simulationsofresultsfromcatvisualcortex,"NeuralComputation,vol.2,no.3,pp.293{307,1990. 237


J.Wilson,P.Gader,K.Ho,W.-H.Lee,R.Stanley,andT.Glenn,\Regionprocessingofgroundpenetratingradarandelectromagneticinductionforhandheldlandminedetection,"inProceedingsofSPIE,vol.5415.Orlando,FL:SPIE,April2004,pp.933{944. [103] C.J.C.Burges,\Atutorialonsupportvectormachinesforpatternrecognition,"DataMiningandKnowledgeDiscovery,vol.2,no.2,pp.121{167,1998. [104] H.Liu,X.Wang,D.Tan,andL.Wang,\Studyontracinformationfusionalgorithmbasedonsupportvectormachines,"isda,vol.1,pp.183{187,2006. [105] R.Singh,M.Vatsa,andA.Noore,\Hierarchicalfusionofmulti-spectralfaceimagesforimprovedrecognitionperformance,"InformationFusion,August2006,inpress,correctedproof,availableonlineAugust172006. [106] H.Chew,R.Bogner,andC.Lim,\Dualnu-supportvectormachinewitherrorrateandtrainingsizebiasing,"inInternationalConferenceonAcoustics,SpeechandSignalProcessing,ICASSP2001,USA,2001,p.12691272. [107] D.Whitley,\Ageneticalgorithmtutorial,"StatisticsandComputing,vol.4,pp.65{85,1994. [108] I.V.MaslovandI.Gertner,\Multi-sensorfusion:anevolutionaryalgorithmapproach,"InformationFusion,vol.7,no.3,pp.304{330,September2006. [109] Z.Wang,K.Xu,J.Wang,andG.J.Klir,\Usinggeneticalgorithmstodeterminenonnegativemonotonesetfunctionsforinformationfusioninenvironmentswithrandomperturbation,"InternationalJournalofIntelligentSystems,vol.14,no.10,pp.949{962,August1999. [110] E.F.CombarroandP.Miranda,\Identicationoffuzzymeasuresfromsampledatawithgeneticalgorithms,"ComputersandOperationsResearch,vol.33,no.10,pp.3046{3066,2006. [111] G.J.KlirandB.Yuan,FuzzySetsandFuzzyLogic:TheoryandApplications.UpperSaddleRiver,NJ,USA:Prentice-Hall,Inc.,1995. [112] H.Frigui,P.D.Gader,W.-H.Lee,andJ.N.Wilson,\Detectionanddiscriminationoflandminesingroundpenetratingusinganeigenmineandfuzzymembershipfunctionapproach,"inProceedingsoftheSPIEConferenceonDetectionandRemediationTechnologiesforMinesandMinelikeTargetsIX,Orlando,FL,USA,April2004. 238


P.Gader,J.Keller,andJ.Cai,\Afuzzylogicsystemforthedetectionandrecognitionofhandwrittenstreetnumbers,"IEEETransactionsonFuzzySystems,vol.3,no.1,pp.83{95,1995. [114] K.Xu,Z.Wang,P.-A.Heng,andK.-S.Leung,\Classicationbynonlinearintegralprojections,"IEEETransactionsonFuzzySystems,vol.11,no.2,pp.187{2001,2003. [115] W.Wang,Z.Wang,andG.J.Klir,\Geneticalgorithmsfordeterminingfuzzymeasuresfromdata,"JournalofIntelligentandFuzzySystems,vol.6,no.2,pp.171{183,1998. [116] J.HaddadniaandK.Faez,\Hybridn-featureextractionwithfuzzyintegralinhumanfacerecognition,"inVideo/ImageProcessingandMultimediaCommunica-tions4thEURASIP-IEEERegion8InternationalSymposiumonVIPromCom,2002,pp.93{98. [117] K.-C.KwakandW.Pedrycz,\Facerecognitionusingfuzzyintegralandwaveletdecompositionmethod,"IEEETransactionsonSystems,ManandCybernetics,PartB,vol.34,no.4,pp.1666{1675,August2004. [118] Y.Wu,H.Liu,andH.Zha,\Modelingfacialexpressionspaceforrecognition,"2005IEEE/RSJInternationalConferenceonIntelligentRobotsandSystems,2005,(IROS2005),pp.1968{1973,August2005. [119] P.Gader,W.-H.Lee,andX.Zhang,\RenyientropywithrespecttoChoquetcapacities,"in2004IEEEInternationalConferenceonFuzzySystems,2004.Proceedings,vol.1,July2004,pp.529{533. [120] M.A.Schatten,P.D.Gader,J.Bolton,A.Zare,andA.Mendez-Vasquez,\Sensorfusionforairbornelandminedetection,"inProceedingsofSPIEVolume:6217,DetectionandRemediationTechnologiesforMinesandMinelikeTargetsXI,May2006. [121] P.GaderandJ.Keller,\Applicationsoffuzzysettheorytohandwritingrecognition,"inProceedingsoftheThirdIEEEConferenceonFuzzySystems,1994.IEEEWorldCongressonComputationalIntelligence,June1994,pp.910{917. [122] P.D.Gader,M.A.Mohamed,andJ.M.Keller,\Dynamic-programming-basedhandwrittenwordrecognitionusingtheChoquetfuzzyintegralasthematchfunction,"JournalofElectronicImaging,vol.5,no.1,pp.15{24,January1996. 239


M.GrabischandM.Schmitt,\Mathematicalmorphology,orderltersandfuzzylogic,"inProceedingsof1995IEEEInternationalConferenceonFuzzySystems,1995.InternationalJointConferenceoftheFourthIEEEInternationalConferenceonFuzzySystemsandtheSecondInternationalFuzzyEngineeringSymposium,vol.4,March1994,pp.2103{2108. [124] A.HocaogluandP.Gader,\AninterpretationofdiscreteChoquetintegralsinmorphologicalimageprocessing,"The12thIEEEInternationalConferenceonFuzzySystems,2003.FUZZ'03,vol.2,pp.1291{1295,May2003. [125] J.Keller,P.Gader,R.Krishnapuram,X.Wang,A.KoksalHocaoglu,H.Frigui,andJ.Moore,\Afuzzylogicautomatictargetdetectionsystemforladarrangeimages,"inThe1998IEEEInternationalConferenceonFuzzySystemsProceedings,1998.IEEEWorldCongressonComputationalIntelligence,vol.1,May1998,pp.71{76. [126] J.Cao,M.Shridhar,andM.Ahmadi,\Fusionofclassierswithfuzzyintegrals,"inICDAR'95:ProceedingsoftheThirdInternationalConferenceonDocumentAnalysisandRecognition(Volume1).Washington,DC,USA:IEEEComputerSociety,1995,p.108. [127] J.M.KellerandJ.Osborn,\Trainingthefuzzyintegral,"Int.J.Approx.Reasoning,vol.15,no.1,pp.1{24,1996. [128] J.-H.Chiang,\Choquetfuzzyintegral-basedhierarchicalnetworkfordecisionanalysis,"IEEETransactionsonFuzzySystems,vol.7,no.1,pp.63{71,February1999. [129] P.D.Gader,B.Nelson,A.Hocaoglu,S.Auephanwiriyakul,andM.Khabou,\NeuralversusheuristicdevelopmentofChoquetfuzzyintegralfusionalgorithmsforlandminedetection,"inNeuro-fuzzyPatternRecognition,H.BunkeandA.Kandel,Eds.WorldScienticPubl.Co.,2000,pp.205{226. [130] M.GrabischandJ.Nicolas,\Classicationbyfuzzyintegral:Performanceandtests,"FuzzySetsandSystems,vol.65,no.2-3,pp.255{271,1994. [131] M.Grabisch,\Theapplicationoffuzzyintegralsinmulticriteriadecisionmaking,"EuropeanJournalofOperationalResearch,vol.89,pp.445{456,1996. [132] M.S.Bazaraa,H.D.Sherali,andC.M.Shetty,NonlinearProgramming:TheoryandAlgorithms.Hoboken,NJ:JohnWiley&Sons,Inc,1993,classnotesfortheoptimizationclasshetaughtintheECEDept.atUniversityofTexas,Austin. 240


D.Wang,X.Wang,,andJ.Keller,\Determiningfuzzyintegraldensitiesusingageneticalgorithmforpatternrecognition,"inProceedingsofNAFIPS'97,Syracuse,NY,September1997,pp.263{267. [134] T.H.Cormen,C.Stein,R.L.Rivest,andC.E.Leiserson,IntroductiontoAlgo-rithms.NewYork:McGraw-HillHigherEducation,2001. [135] W.H.Press,S.A.Teukolsky,W.T.Vetterling,andB.P.Flannery,NumericalRecipesinC:TheArtofScienticComputing.NewYork,NY,USA:CambridgeUniversityPress,1992. [136] S.BoydandL.Vandenberghe,ConvexOptimization.CambridgeUniversityPress,March2004. [137] P.D.Gader,W.-H.Lee,andJ.N.Wilson,\Detectinglandmineswithgroundpenetratingradarusingfeature-basedrulesorderstatistics,andadaptivewhitening,"IEEETrans.GeoscienceandRemoteSensing,vol.42,no.11,pp.2522{2534,2004. [138] K.C.Ho,L.Carin,P.D.Gader,andJ.N.Wilson,\Onusingthespectralfeaturesfromgroundpenetratingradarforlandmine-clutterdiscrimination,"submittedtoIEEETrans.GeoscienceandRemoteSensing. [139] H.FriguiandP.D.Gader,\Detectionanddiscriminationoflandminesbasedonedgehistogramdescriptorsandfuzzyk-nearestneighbors,"inProceedingsoftheIEEEInternationalConferenceonFuzzySystems,Vancouver,BC,Canada,July2006. [140] P.A.TorrioneandL.M.Collins,\Applicationoftexturefeatureclassicationmethodstolandmine/clutterdiscriminationino-roadGPRdata,"inProc.IGARSS,2004,pp.1621{1624. [141] P.A.Torrione,C.S.Throckmorton,andL.M.Collins,\Performanceofanadaptivefeature-basedprocessorforawidebandgroundpenetratingradarsystem,"IEEETransactionsAerospaceandElectronicSystems,vol.42,no.2,pp.644{659,April2006. [142] A.ZareandP.D.Gader,\Sparsitypromotingiteratedconstrainedendmemberdetectionforhyperspectralimagery,"IEEEGeoscienceandRemoteSensingLetters,2007. 241


M.A.TannerandW.H.Wong,\Thecalculationofposteriordistributionsbydataaugmentation,"JournaloftheAmericanStatisticalAssociation,vol.82,no.398,pp.528{540,1987. [144] M.A.Tanner,ToolsforStatisticalInference:MethodsfortheExplorationofPosteriorDistributionsandLikelihoodFunctions,3rded.,ser.SpringerSeriesinStatistics.SpringerVerlag,1996. [145] L.Tierney,\Markovchainsforexploringposteriordistributions,"TheAnnalsofStatistics,vol.22,no.4,pp.1701{1728,1994. [146] J.N.Wilson,P.D.Gader,W.H.Lee,H.Frigui,andK.C.Ho,\Arigorousevaluationofalgorithmsusinggroundpenetratingradarforlandminedetectionanddiscrimination,"IEEETrans.Geosci.RemoteSensing,no.45,pp.2560{2572,2007. [147] H.Frigui,K.Ho,andP.D.Gader,\Real-timelandminedetectionwithgroundpenetratingradarusingdiscriminativeandadaptivehiddenmarkovmodels,"EURASIPJournalonAppliedSignalProcessing,vol.2005,no.12,pp.1867{1885,July2005. [148] Y.Zhao,P.D.Gader,andY.Z.P.Chen,\Trainingdhmmsofmineandcluttertominimizelandminedetectionerrors,"IEEETrans.GeoscienceandRemoteSensing,,vol.41,no.5,pp.1016{1024,May2003. [149] W.-H.Lee,P.D.Gader,andJ.N.Wilson,\Optimizingtheareaunderareceiveroperatingcharacteristiccurvewithapplicationtolandminedetection,"IEEETrans.GeoscienceandRemoteSensing,vol.45,no.2,pp.389{398,February2007. [150] K.C.Ho,L.Carin,P.D.Gader,andJ.N.Wilson,\Aninvestigationofusingthespectralcharacteristicsfromgroundpenetratingradarforlandmine/clutterdiscrimination,"IEEETrans.GeoscienceandRemoteSensing,acceptedforpublication. [151] A.SmithandG.Roberts,\BayesiancomputationviatheGibbssamplerandrelatedMarkovchainMonte-Carlomethods,"J.R.Statist.Soc.,vol.55,pp.3{23,1993. [152] J.S.Liu,MonteCarloStrategiesinScienticComputing.Springer,October2002. 242


Andres Mendez-Vazquez is a Ph.D. student at the University of Florida. He got his bachelor's degree at the University of Yucatan. His research interests include landmine detection, statistical methods for machine learning, and fuzzy measures and Choquet integration.