
# Piecewise Linear Lattice Based Associative Memories

## Material Information

Title: Piecewise Linear Lattice Based Associative Memories
Physical Description: 1 online resource (148 p.)
Language: english
Creator: Mcelroy, John R
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2007

## Subjects

Subjects / Keywords: associative, lattice, memory
Computer and Information Science and Engineering -- Dissertations, Academic -- UF
Genre: Computer Engineering thesis, Ph.D.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, territorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

## Notes

Abstract: Research into associative memories using linear correlation matrices began with the introduction of the Hopfield model in the 1980s. These networks have a limited capacity in terms of the number of pattern pairs that they can store. During the 1990s, Ritter et al. introduced a new family of associative memories based on lattice algebra instead of linear algebra. These memories provide unlimited storage capacity and, though they have been well discussed in the literature, research in this area is still active. It is possible to generalize the existing lattice based memories into a family of memories using order statistic operators for encoding and decoding. A novel method for encoding memories using all order statistic operators is examined. This new method provides a level of robustness to outliers in the initial data set that is not present in current lattice based models. In addition, this approach treats the lattice based associative memory as a piecewise linear dynamical system. This piecewise linear viewpoint imparts a new set of tools for analyzing the behavior of the canonical lattice based associative memories. Improvement in the retrieval of noisy patterns in the presence of outliers in the initial data is demonstrated using a widely available computer science data set. Using this piecewise linear framework, another class of memories using what are referred to as 'near min-max' encoding and decoding operators is introduced. These memories, along with the order statistics memories, are a subset of a novel set of memories created using Ordered Weighted Averages (OWAs), a process which is also introduced here. This experiment sheds light on an important property of the canonical lattice based associative memories, herein referred to as being 'well behaved in the min-max sense.'
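The canonical lattice based memory the abstract refers to can be sketched briefly. This is a minimal illustration of the min-max (morphological) encoding and max-plus recall described in the literature following Ritter et al., not code from the dissertation itself; the names `encode_min` and `recall_max` are illustrative.

```python
import numpy as np

def encode_min(X, Y):
    """Lattice min-memory W with W[i, j] = min over patterns k of (Y[k, i] - X[k, j]).

    X: (K, n) array of input patterns; Y: (K, m) array of associated outputs.
    """
    # Broadcast to shape (K, m, n), then take the lattice minimum over the K patterns.
    return np.min(Y[:, :, None] - X[:, None, :], axis=0)

def recall_max(W, x):
    """Max-plus recall: y[i] = max over j of (W[i, j] + x[j])."""
    return np.max(W + x[None, :], axis=1)

# Autoassociative example: the min-memory recalls every stored pattern exactly.
X = np.array([[1.0, 2.0, 3.0],
              [4.0, 0.0, 1.0]])
W = encode_min(X, X)
assert np.allclose(recall_max(W, X[0]), X[0])
assert np.allclose(recall_max(W, X[1]), X[1])
```

In the autoassociative case this encoding gives perfect recall of all stored patterns regardless of how many are stored, which is the "unlimited storage capacity" the abstract contrasts with Hopfield-style correlation memories.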
General Note: In the series University of Florida Digital Collections.
General Note: Includes vita.
Bibliography: Includes bibliographical references.
Source of Description: Description based on online resource; title from PDF title page.
Source of Description: This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Statement of Responsibility: by John R Mcelroy.
Thesis: Thesis (Ph.D.)--University of Florida, 2007.

## Record Information

Source Institution: UFRGP
Rights Management: Applicable rights reserved.
Classification: lcc - LD1780 2007
System ID: UFE0021636:00001

20d189414830768c4a0793f60aa95910ee88ae25
975c66530f55490cc6f9ee4c72129d7c
928232219551a47bb7f7953fed724ea51290bf34
11918 F20101118_AAAEDA mcelroy_j_Page_010.QC.jpg
2ee2f886bc31323a99a95ef281dc9bb6
432a5ec453cc73c0cc3ff81454f12a01014fae3e
3070 F20101118_AAAECL mcelroy_j_Page_002.QC.jpg
ffeb39a0614d8721710ec5c7d082c32d
db37ac3c236ccf9097fec15bacf4f13a217a9a91
2037 F20101118_AAAEBX mcelroy_j_Page_134.txt
4d718aa53f1a23549afaa1193902ecee
f71b1929ba75c9b3b8f3103205ac6e03453d6dca
f22df546b8e498cb655b408322ac6b0f
524ca78f9d606ba912c5ef8239457cc40550cbff
cdd940a0b71c6f65af8de450fe1e8cc7
0f9bd74f1dfba2f9e1acf198d6cd7d9a614e552c
22496 F20101118_AAAEDB mcelroy_j_Page_011.QC.jpg
08133ec7b1ae077c8a5b087a7db537efd1531310
3530 F20101118_AAAECM mcelroy_j_Page_003.QC.jpg
c035f746310df1d270c8dc606f1a69d2
38cb50f594aa7c4662b76d797ae75d4873242798
1427 F20101118_AAAEBY mcelroy_j_Page_135.txt
3dd6ec43dc5a77acb5e85ed7d8e0fdc9
203608bd4fd6802bb946425259c02711
2251628dbda6a98c3fb3396a71565c971838e1c9
4e8fcf076a5761f018c0ba7725a9198c
b9b877e105b356752f5002e20e94da42e7c183e5
5491 F20101118_AAAEDC mcelroy_j_Page_011thm.jpg
93b1b16fe335f66a5611470081476f1e
1438 F20101118_AAAECN mcelroy_j_Page_003thm.jpg
d2f5b8cb532f81a8bdee0e44f35b1645
246909d5cc215a6fe16612539c6ff7249d7659eb
1820 F20101118_AAAEBZ mcelroy_j_Page_136.txt
fba8de3fb9cf82b9971037a571fd91c5
cef05d2fabe3263fe520574de42b00998600be0d
fb2b9169f62e4ef423a82d2193026fe8
94ce808ff174e83507ef8ac093d77a5ae795cf05
87e30b65bf9d21391442cffebc8d7999
d2995959fbea4946ee5c7bf0507c313025324f2b
a4abcecb442c71907c1b87212ef4d166
3ec14c4e1f61bc29898e1c067075ff13b7722f98
4478 F20101118_AAAEDD mcelroy_j_Page_012.QC.jpg
11b436d125b9440fb6e7e1fe5524d9de
24583a94493e3099616e4b2a0d2ed351b59e2289
6333 F20101118_AAAECO mcelroy_j_Page_004.QC.jpg
4401ff439246e396c595f23516940000
9b46592ecd6e85f56a3726d0e72e39dcf421f591
63600341008c9efac8559e92c95b21a5dfe45ac8
f35319ba0bfece7ef9f2cb673eb6a1e7
ebcc4b40de82d699646e8cb0ce87de7f81bca400
bd40be0f75403e0e98a0368c477a8e9b
44ff28fae8d0b06b22ee3e09fed14f80556afaf0
1608 F20101118_AAAEDE mcelroy_j_Page_012thm.jpg
b1769f551ef1c844e5972ca489ea783a
7438388e969b9505780c9d42d3542a9857ed4b02
2048 F20101118_AAAECP mcelroy_j_Page_004thm.jpg
4a4bb677f6366858467952408c58dd27
ea4c9544acc46aefd866e09f581e900423639061
f29cbec11d605eee10ff618278c53b9e
e599bec58d4932ca385e468155aedf7486782095
5834eb9365d48de9f7c3db89a79bc6c77519836f
24727 F20101118_AAAEDF mcelroy_j_Page_013.QC.jpg
53499f26fcefcb28cff39c5a571c8c0c
af08c2409423d998db56a6c5257523f7bd81a2ba
22641 F20101118_AAAECQ mcelroy_j_Page_005.QC.jpg
b32de2532fba0db5d67cff4c6992b14b
59ee1bd74c6404f151bc7813a134e11908161a71
75caa4619110e4bfb244ed2ec1f828c6
73d73e24d65e42118e9f36728d164c24e45f1694
4d9e954f96f95dd784ac064070ebc2b8
32f66ccdb52e6ccfd0d7c099af97efcd7808fd09
b6c9592529dabeb2166507744d8a3f86
7e526dc715c03d42ac9abd1bbfc43c55554d9e33
6232 F20101118_AAAEDG mcelroy_j_Page_013thm.jpg
189427e1b4fcc1b47ec964aef94716e4
e0febc12bc6c9778ac30638690b123443603c34e
5382 F20101118_AAAECR mcelroy_j_Page_005thm.jpg
5486407467e17958f05976005b3c5167
b932fd363d2d2d5724f5fe13868d7eef
a3b7315520f36c3f1b29a928c688830bcc26f556
fd57590fbd0c33e3ce194e0a65f0541d
789ee23bbb1eca36aeae5ef511bf850a47c98694
619b9b6f4f0c9de031b2668429c6b6e6
4663 F20101118_AAAEDH mcelroy_j_Page_014thm.jpg
0345c545e67ea8e77ce90e2ef383f1f0
cc16d24c457c63c16fca399c19cbe7669588d370
21265 F20101118_AAAECS mcelroy_j_Page_006.QC.jpg
05c7f475922b63520c194b257e03fe69
02a3c0c025482325a159d95db0294882370bd22c
c7662e2c39ce535e91014f5894bd5448
265b74ab2b611bce17c7c99be961184a
056daa1502992f525aff8f27a9ab8fc23a917800
de206bfeef5a3fa3d4955301fac70afd
a8e49ef569921942810b30d55aeae0124efa7c86
3d6b744a6d57d0008e356df3d74914eb
24739 F20101118_AAAEDI mcelroy_j_Page_015.QC.jpg
3f8cb54158f183af3ca616d009ff78973b663fc7
5219 F20101118_AAAECT mcelroy_j_Page_006thm.jpg
27b8ca0537c7b1f3432a10824aa0533b
2ae2f1429d0fc09116345b69f23c0b76
1b45dac5892f88052e7eb132cda45bd4605fd281
1e75f63a42a4715c809cb1eb8d3cf261
5cde47937fb7a5ff0fec4306494895bbbd4bd74f
ca4b2e74fcb6fe1e24f85311c2e60a3f
d264decff1151ac4d7ece95522a6410792f18173
c1e35fe2229ff9c89484e3f41f8bdb3e
fe1d645f0982158f2b01e7248f9eeec4f281e98b
6217 F20101118_AAAEDJ mcelroy_j_Page_015thm.jpg
4efc80c975528cb53bfb919ec219466d
3613 F20101118_AAAECU mcelroy_j_Page_007.QC.jpg
2b9e939b4c4bc3be80a10b7fe0271780
e9bb9745518b984702cc0932e1e44b8d
1ddd2529fe900b519c381f23650b05d1
b35a36ddbe89d3e3b2f127b29e187b6bcb5f5c75
14473f0153b9a0566a79e0658d2df74d53ae3798
1476 F20101118_AAAECV mcelroy_j_Page_007thm.jpg
74b44e7183e6b5dfb13f9da076218e96
9e4b3967673f9e7dd020c218656653d7c5384052
d934e87ff29109947dd17774863a399a
5efc500bbbdeb3a98a0dbb6839f5cc777d45a5ac
c0ae83123bd2016d052c1d97f4cb4ca9
9c3738c5666d777e82f28c0e086dc3e5fd821507
19867 F20101118_AAAEDK mcelroy_j_Page_016.QC.jpg
192c012760cbfb82a38f58fde905b673
c13c587d521c23b68681906ca6537b980edc4b18
22643 F20101118_AAAECW mcelroy_j_Page_008.QC.jpg
b609b1565320d537961d5100107a6727
9f87e7346fd4be7e195f45448169470b61db4a3b
9b00d147ececd4817258dfc1f31da693
c7a3c848fe82b9e0e33362c74eec7dc31f08efbf
27b427d21e9efec51f581cf1d1bea2bf
7972f71a55007648d23fb0796c2047878b11ecd2
6dc88e25b5d834ffe2bbd648295d3cd6
13ba484e023c2a285a8d854fefcfe2f97595f7ae
19694 F20101118_AAAEEA mcelroy_j_Page_025.QC.jpg
fec461136dea4ca670c2c5365db27125
fdc79899ae299135a8ce46605748240f86c24d52
22554 F20101118_AAAEDL mcelroy_j_Page_017.QC.jpg
7ca8ff2b2a7923355b6c0e26e74a51b4
99ce16e48e5859ec28fee0ca645d04186a38217c
5632 F20101118_AAAECX mcelroy_j_Page_008thm.jpg
7359f2e0df6495e7e6f37152d047c751fe72a1ef
94e3321e1e7b264145b550af9c696118
c8584e9e50522ab6fb824d5fcb508dfac13f059d
8f9eb19764a1458d8f6852a8aea71b83
239921bb64546a908a7097760ffbdb59080f87e0
5092 F20101118_AAAEEB mcelroy_j_Page_025thm.jpg
30460cb0677eaf0dc29369de3448a69a
2f0ab00b50395f597461b11fd6e2a88c378a6f1e
5933 F20101118_AAAEDM mcelroy_j_Page_017thm.jpg
885b93bdd3d7c30d2b3159564ce7d906
7cd32dce7093f5de2de22c8e7d0d129842dbeb8f
22569 F20101118_AAAECY mcelroy_j_Page_009.QC.jpg
facd3b2f4d49478f37412716d14a6f4b
30bd86f6146f4165976858dac69b38e1c7338179
26e76c0872465ee65eb1f1af168f306a
c19ebe9f504e2d1ff8645faebf7193505f836cd4
a7d554069e153856d29688f30e7d5626
2251933ca08f6a7947a6e298d88420075bc5350a
71551c2a92062a68e432d5f1fd7da080
9263f8f242560e7b87f8ed2a1f4b3dfb7f057993
21398 F20101118_AAAEEC mcelroy_j_Page_026.QC.jpg
ae74dfae81a146603ce0eda54f42c72b
03b2af2124aaff61fec68c8b2d55b7ea341f0140
17932 F20101118_AAAEDN mcelroy_j_Page_018.QC.jpg
d16f8cb713748658c7f538e4628777a2cc4ed53f
5723 F20101118_AAAECZ mcelroy_j_Page_009thm.jpg
5774b878ce1c114fd60e832a20e51078
8a8b10ba53548228dae603a97ab6f96a57a7caaa
23aef27fc91c975450deae9d696d7c94
7eb491f012b2cf32a447b083e6e6655472488fce
a95c12c6c94f30522dba9c07a4bb79ae
79992a74f09c08738205c8a4d2e3dd59
b7d46e3e4f69709a8b030128d1fd55cc66be2a8f
53b2a643109d786bdb150eaa7a33e71f
730f206d350af455d2959f192c98645dc185da78
24009 F20101118_AAAEED mcelroy_j_Page_027.QC.jpg
405e47489a19ffe2e219043ec4823256
c10ff54f29616d4b8d2711b9a32fbcfde4968bf1
5340 F20101118_AAAEDO mcelroy_j_Page_018thm.jpg
e121cdab15b3378f5f98100d1e7fc0be
ec7c93a5cc8b6b81e44e73098967ae25
32875788dbda61ba6a8b53e91b2f3bef23964dab
99de682241442b1e92fe429b59320f54
3a001faaf8d0e3cecb6cdb0ece5e5581562bd96e
e130a7df1e839c14bb7995766b44766f
4fd8164b014f6f63816c8b107f30e87f70210079
cd34e6ed5fbe79c1fa6c4e81cc1bc8b9
042c01cc3716ae61e8cd50891564ca171e1ceb8e
6351 F20101118_AAAEEE mcelroy_j_Page_027thm.jpg
1f33451273bb7eacba865666ba456445
599ce0bef9a277dc07e3faa56d77a69cf3e0200d
20550 F20101118_AAAEDP mcelroy_j_Page_019.QC.jpg
5ffd574d30e5af4be7a6529324d22f4a
d636cb9083d31069bc8b7c507a4beaafae2217b8
bab29a1aca07b4fe91e1d9fc4282f4a4
086447e9c4a39917d7e5914885ff8b0ebdbb567b
c5315ec0eb1af56c1ac53e1f6475d335
6267897eed8a06049026070a75f04df1384d906f
eb1726d6c68eded78ba953c3389fee02
47e3b0e7b3834bce6976d61b2438b47d3d1b1619
22437 F20101118_AAAEEF mcelroy_j_Page_028.QC.jpg
034ed6d123075606cf44621123089aec
041ded6da4e9b406c2162c685335b827038ecd31
5623 F20101118_AAAEDQ mcelroy_j_Page_019thm.jpg
d88b799ca1aeb6181b34847f89cdd605
40dfea7fa7084a9cd81f63a103a04d8a
e9af0fdede0bbe33b195daea48d419aba0bc4232
415bbeb3a187ce4f83dedb6b30b5d901
bb8baa11f2713471605eb269ce0c282979d618b9
445ca6ac79a759608a06a9c383265537
c8d36dbdf5a14e0f3acb6fbb52eeac91042bd5d0
0a38b46118ced898631c55454b16cb58f1cf3fc5
5853 F20101118_AAAEEG mcelroy_j_Page_028thm.jpg
ea62a11524c87faf03cfc963b9931fcb
c1c03e473f698c546020d6c1d90c4faa04d03b54
15130 F20101118_AAAEDR mcelroy_j_Page_020.QC.jpg
e178f0c61b8df15e7d77e1cbdaa75778
c4c9ab23bc5c6cebbec458ba763edfefb3fa09af
6a55541005b0b9c52148fc1c71e51b7b
9ed6cde64c853ae03b5af89cc1bc48ec6ea59743
2bda32a8ef7c98101d9379250dc7f39f
68738521089210e3a85fb0f7a58c66540ab2e527
e8963751e5334284b8320a9d231de429
3b80e6325e792c3fc871e1aaf2a72665d8d8866b
eef01af99ab9d8ec0c52869ba379e2a2039d74a8
ecd490b018924d2818075ddc0eca58f9
e7d1775ac9ceef17f000a25f9434ae61c20d27be
F20101118_AAAEEH mcelroy_j_Page_029thm.jpg
61d416e3b72b6d60fd7b0079234e115b
f08cced79494d4f922b37c6994d17eec6ae006fd
4569 F20101118_AAAEDS mcelroy_j_Page_020thm.jpg
df836063d235aeca13b0e67e19137110
c0d8a6afb9d8136b76cf0642e20a7823
97d8ae8a395f1c0d398724f7942a2421bac24c0a
abaeac24fe30c160b4113786b5345132
f54114fd070c7c945d4d2dfa8b9c769bf2fab84b
28eb165ac39f40567c8b8dc738ebebe8
8b05033d026f77549e165ddbc919904d20b41206
22073 F20101118_AAAEEI mcelroy_j_Page_030.QC.jpg
77c56bcdfa798bd67b080d0cb10d7fa9
c8b449a69946a308f4727c3899d05cdb2a00a176
23529 F20101118_AAAEDT mcelroy_j_Page_021.QC.jpg
915508648c930075b6fb1603fae128b2
e32a80388dde59c35499dc2ac59e0041dccc0be8
09f6b66a8f8d09fbd3a480946dded192bed1dba5
1ba7a115ca453e0c833a7da7b92bd110
654cf3379ef1172a09af77f2e528742eb5bba273
5c5d571ed2e0961aa136578b493b2c9dc9f871ec
f325d13c5fef089599e2293ed0fd6371
38ea73919bbe139f808a84c2106636abf19bd59d
6074 F20101118_AAAEEJ mcelroy_j_Page_030thm.jpg
6e75eb74e70a120af9a99742025f3e2e
dd3ac6ee4e3f20c6b79cd1380eae5d5a7dbc9ca2
5885 F20101118_AAAEDU mcelroy_j_Page_021thm.jpg
ec9dc89f3cf954b0c533c25caa95920b
4e2817d5e35b5cffae5ee125b15e03eba49b4a90
afcf05e0556512408f38e647a93793658c6fc795
6fcc61a205c8b8fbd5a1aaa133b2c11f
b34d5290b80008a9808becdd571fd1a489cb29de
8424a07dc298569d9a502147050a13ea
7856ab12258d6729d7f7ca448d3e148d2e6a1fab
fb3edd1935181d4d43ab0dd886a4f2e7
f3577b91be861033b6c1a06dea0f3f824d4434fe
26190 F20101118_AAAEEK mcelroy_j_Page_031.QC.jpg
efc84dfd527c2168c781b983ac02040530b3a835
17528 F20101118_AAAEDV mcelroy_j_Page_022.QC.jpg
faa183276cc03398bf4de3e1bd207b44
8da2b078a4fd4bebb3c8a7e3599fb7e2d15869ae
2a1b9d02c9cef8541461cd34462f23f7
ef277ebf732e8dda8db1ce231c40e1b8626a51bd
a4bef745e50fefd80275625c11af40ace43ce7ea
7c2ca2da405c95b92ffd27118bc534eb
f9efd43a2661201da0867e397d4f462e0a731beb
24792 F20101118_AAAEDW mcelroy_j_Page_023.QC.jpg
67159c8fd7eed9b70694b7e128fd61a5
c320c4e36743e96932755c0873c2248164cd1da0
be0014ea7338b61162544fdbb2901df1
ee037f93e1d9a7224babdd00d196b91d86ed69a0
37120db5e8a8cd0ab51b209d6078cda6
e2eed6892f00b0db3fb9a5bf3a43a530c3a5929b
81392460f7fe34d3730acb589351bc6e
5123 F20101118_AAAEFA mcelroy_j_Page_043thm.jpg
58da7b84fb7bcfe590d41921782bbbd8
82de0b6e5c84d579bcb35ce75f2935d87362dfe1
6371 F20101118_AAAEEL mcelroy_j_Page_031thm.jpg
52f418174e3e7fd90a9ba56e265b860a
cc2544a6c777a8bfafe5853fb01acd4d5c0beebf
6252 F20101118_AAAEDX mcelroy_j_Page_023thm.jpg
90be06d6692c94684419d54cc2f7605e
b1b07b5939a539239f40c66947c544dfc77246d3
93431145e743afae1e48e51de0b59451
aee9e45bcda86848e6ca0560eb9910323476df75
a97d20b89809b125b6c0defa9c8e75b4
45952228299d08109cb3853803a72730cf24e913
986b6e1c2f5c10c224074996768af273
91468f8d9f56557366f835ee4de1fe51920cdd5a
e77e8447076a5af83abefdaf7d9fdbbd
6723dc2661ded4d64e2a3fa545e18f5edf726ed3
25417 F20101118_AAAEFB mcelroy_j_Page_044.QC.jpg
ef4af622c2db4bab9ef96d0d007c75c0
2f4b14c4bacf0255643f6dcca9f0543c24d7c979
20377 F20101118_AAAEEM mcelroy_j_Page_032.QC.jpg
9889e9911a20ecbb055a43980faea8ef
46019dbf5261b88e27e5afb56b1961fc9ca0b1f9
22702 F20101118_AAAEDY mcelroy_j_Page_024.QC.jpg
635c62b26dd8ca39388b049630b13d61
ccdc4889ffcb88b5d0affccd3c02568081453a09
c054bc662cccc2588e27ab977f8e8bd5
3ca10f5d24a0fb64f6a0df915f902036759e0b50
104b4629dde9d0c84a97b7b8e0868f48
d56dbc31839b7880d9d97d751bdd99fbfca71f0f
7dc56b7fc1398a199877542bcbd117df9bdd32ee
81338174bb6c038a5e93a7cd7ce8469d
ffb46c3cc76077149216b2d556286945ca1d4551
18161 F20101118_AAAEFC mcelroy_j_Page_045.QC.jpg
10669dbf3a1c6969109d1309c3187ffd
52ea9b0065aeb82abe1f289a79b5fc06df3dd1ee
5400 F20101118_AAAEEN mcelroy_j_Page_032thm.jpg
a45ee92985dcc9dd07fd9c4239bc8903
7125916fd8b1f02e5172e52f74c4247a0957c784
5982 F20101118_AAAEDZ mcelroy_j_Page_024thm.jpg
88fe1dabbd26c117d4a303c57ffd0135
a49fac1746e8920dfd0821bd8790de57820c5df2
42e8dc5fdcf67866cdf74318acd524d8
2da6e3e23e08cc2447b12e1d0cc2ec210f934c6e
cca9fc65f9212afecfe5f5ec6648957b
cd674c96ce1b513b4faffb0f0afb3ac189bc786c
9c67f849e27d612167462d4587c18e4a
479260da8a3e7892b98e21f9d959112f
1f97d5ba50e9058c623133ab7f3662be09762754
b3b6c16f26e06621a386c460494af667
e8fd859b15cb0a95ebe6fb2ab1b45f5c7f6790f5
4927 F20101118_AAAEFD mcelroy_j_Page_045thm.jpg
3a3615831a38898ae7b377ebc674190e
77acb21bb186368027333fd7b2d51ce5a4e45293
15846 F20101118_AAAEEO mcelroy_j_Page_033.QC.jpg
8532739c0931bc199fd6d5a367f08cc9
1f5ae24a5b5ae6f71ab3735c97aee76136ee5a4d
d3293a244d7febd75c0b28239e38bc47
63f8c049fa88e3c21732efcaf950d80273808c9a
ecdf7474432216470929391d8e2137f6
9d396d99e6a289c376fa9b144dd6a132ab06d7e9
2032b583c388006e2baa8067d547bd84d99a490e
0d1357f2c9f65447e90d32116c806d80
cf1552a68a9f67169d8f81038eab546a078d45c1
68db0797859128c176d8a1a9f9db00fe
0ae04d0cccf38dd66fc129411c2f42d72467231c
6691 F20101118_AAAEFE mcelroy_j_Page_047thm.jpg
ed5df89eabcf66b458f9518b18922d81
b7028aae145e3a629a1146198e476b3bf69b5b26
4357 F20101118_AAAEEP mcelroy_j_Page_034thm.jpg
93b44032f7b4694b9943bc939b4bb4fa
3c4319f55dbc606379153b320065691cf4683d86
7b2d9e31e2488841429848677e804afb
236fb430b718d50093ac1495e3bc940f
be7a8a225807c819f026d2d184278b4fcc830950
d4f84fdce3d3a8b2c18f459b60456dfa
3941663aecbdfb76774ddec1bb549a55f553b36a
b1ca1fe4aebb63002a5e0aabea8eda03
aaba626c7d726ef7bde8e10e0e3b7a3a8c028e03
47bb64f25d7213ac3ae01be6496fab87
20935 F20101118_AAAEFF mcelroy_j_Page_048.QC.jpg
165030a3268a6cb81d09e913458d0ab6
c55884ce83fb3f17de8f973e654fa5bf1c586711
19710 F20101118_AAAEEQ mcelroy_j_Page_037.QC.jpg
3caedb1a009269c10ec9172772d7107b
e414fedc695662ace5e92fd17625abfaed7486af
afdc5dc9eb4b6bf719e421fbb87ab4ac
6d6dc0179d6284b7247b72d39f1237fd9fb874d6
07e55e7e20185c8370d0748c1534e548
6d0d3d5e45c632c37b700ede19f794bfe3af518b
2de8a2ee408ed0c9281fab276f3afe8d
cd15110c4ba2d3a4948b615b3a4b20e03c961978
576497a3359d4b7b2a54253e65a089cd
b7052dca84d250f4c926601e66a51b4ed148f4c2
5629 F20101118_AAAEFG mcelroy_j_Page_048thm.jpg
e404fe9c0bf65135af0a208eebc51713
a7b85bfa66ec217452c28fbbe482a032db6aee79
5359 F20101118_AAAEER mcelroy_j_Page_037thm.jpg
471fa6616b7cca8e65551547e868e221
af9c59dfaffd3fbb7f94e7d5a15b868387224b6e
c630b574cec633ac6fa5cb98454906f6
19d69afff22984cd3349ec726a4f5a3014a5fbba
edee01286e56d51ba87d0d880c2b2a0d
196a8bc4a9fe827732aea5c63cf62a0fa3549173
2b788288601b3fcb5a73b024b96e00d9
8bcf5c7ee470a7761c59609d36398f6d19e054a6
424a20d6877df073f53b03117af53678
e0260e8f24ba5eb3c8109c850360e62c2d67dbb3
22027 F20101118_AAAEFH mcelroy_j_Page_050.QC.jpg
906ab0fe11574a9183507971ff8ab2cd
7d232eebf588104d5bf570a28d23320e91b51519
20553 F20101118_AAAEES mcelroy_j_Page_038.QC.jpg
117b41b87d92a547b49a5094d40a168e
4d1ee2b15d04e6de83bc0fb8ab821e5fc22e3d9a
a45af3e670c24d5d8060214a44094b1f
e65bd412164d2f412848c8654e67c4c41afe2250
a994457ef9420b2e08b33b726a9456d183a1d3a4
e32a38aa32bb3e726ed81f91e3de956c
cdd8bcd9d7d841e642f699c05228a4d2f01f950f
e4767ecf801d2d26c1d8a7ae334cfd55
d2cf18ef95e6a6d677a149959a4d1d9e07c14f42
6122 F20101118_AAAEFI mcelroy_j_Page_050thm.jpg
77561f237333e1b65883b3ab515a7703
727d495a0489ae4d1332b465ea4d9114cf3e18c4
5719 F20101118_AAAEET mcelroy_j_Page_038thm.jpg
3b0fdeb5f481eac3dcc1f914388c3ae2
206f6bb82364d10e9464ca3ee6aac00929bb0054
ec284d06886b4eabe88bd9a040965495
1e1ccff11ef291a3c1b7b1fec62fa52b849dbc33
b3a1583ba16a6067cfb3b92cce3a35ec
b607e338db94bc17af714fd60f9f2e43559a9e88
8a6ebf405727962d0be5d427033205ee
3ba2b85db6ac3bb426abfd8b43c37dbceeb9f2f3
855164f3e498fb42afa85017553c3898
2feeb94870517feca12bbe03c27dde1e5e555788
25185 F20101118_AAAEFJ mcelroy_j_Page_051.QC.jpg
2660745b797fee8951bfc34a7ae782c92fd2ff35
16373 F20101118_AAAEEU mcelroy_j_Page_040.QC.jpg
460d4232d16889ff46956a2233ca4ee0
3bb0d5d40c438e5caa2c2fef98e03d0b91fe05c6
2442 F20101118_AAACXA mcelroy_j_Page_146.txt
175cf504dcb02f05f885197e0636546b
6fdfa47a6ebe54d7202c84de502838e16d98f67d
f42d2f5ebc87f7dc94d8101d612c8795
b78df5a69f49559529dca55478f1208124a9e5e2
bcda79fb6b15fd367403475feb07ccdf
457271404556049ed2781df24041979a5ca384b8
0b03cc255147eb81b04a7eb789a0ac8a
573ff01dd3a72a3d0686f64744e97766b5d8cdf3
6399 F20101118_AAAEFK mcelroy_j_Page_051thm.jpg
0bb66f2cbd0a00db1cafb76f5de0b024
25d3d5b4675a23d682b8c4a3c76e16500f97c146
4845 F20101118_AAAEEV mcelroy_j_Page_040thm.jpg
95fddaa2c3ee5a8d404ea5f70544eb9c
2ef0424559aea6db5fdebb67135687a2a241b931
6243 F20101118_AAACXB mcelroy_j_Page_146thm.jpg
dac04182df7908a4f0699fb9f87f4f4b
3f85a0edf126e20ddba82cdd673d7848a47fd0fd
c4da4c8fbcdb2b01c9d7edd837384f3f42cdec9e
a21115678173ab09bc2cf171fa40bb73
d571ab89e41119bcd2e1e5620a80a440ea6454cf
91669bfcf024cbc4ab3bf89ccc67fe80
923d648336a7eec5ffaabfcf2821dddd1f0eaba4
26853 F20101118_AAAEFL mcelroy_j_Page_052.QC.jpg
dbf88b0e69e2f779dafe4db352fe5b1c
d19ac5b6dc2194eb67679c45002edf3546a2dfbf
25198 F20101118_AAAEEW mcelroy_j_Page_041.QC.jpg
f4ef0c6db75f4b3612c5f988566a40e5
a202f7fcc97d11d3a978348c5974be3c7d1fb93f
F20101118_AAACXC mcelroy_j_Page_039.tif
5ee420c2474973eefa69caba7178b08f
afc15bb85efb58321d70cfe1e8779d2b1320cb56
7583710650177b848e213d004f6a37fb
933a13e99dc0cc1a7e13f9f6a49a52c8555b9e51
edcf8a98a8ec231d59bb6a56c4de3d51
eda212633cddf8b0cb6de21477cf6a5cebfa608b
f59f5ecff898f30652bc8ff2254f65f6
5b7f18c00a09a4fd8383e9c3685e105a73f03b91
5869 F20101118_AAAEGA mcelroy_j_Page_063.QC.jpg
15d4986d77eaab9d159bbe2378b92dde
e440b1b94da69b9a642bb4569cfb0bc04da5003d
21455 F20101118_AAAEEX mcelroy_j_Page_042.QC.jpg
92385574e55753ca75e086c7c8f013a9
b272699e56ffd1044eccb9780790fca7b77b2762
197 F20101118_AAACXD mcelroy_j_Page_012.txt
e2db2cd1d0ba5c8266de5ed8b6aa3cef
b585d6a1a9d2283f62be32fb050ac4a109807589
afa3a821e3eba9bbd1c39b0398c6583c
2fbbcaa27e8486485bb4f64c0d4be9f95f5bdd58
b25e139d5cddf6e81ceba1b3268ea713a4f3bbc6
1862 F20101118_AAAEGB mcelroy_j_Page_063thm.jpg
6c6f2595b2619da3403992c1ab7729ce
866c5ae05f92efdcfde3c8605b2e31a432cb737e
6725 F20101118_AAAEFM mcelroy_j_Page_052thm.jpg
500f290489a4628607695dd01035f564
6018 F20101118_AAAEEY mcelroy_j_Page_042thm.jpg
bbc138c6c05c5663bb1ce4fa44e5d8c73ccdf3ca
19649 F20101118_AAACXE mcelroy_j_Page_062.jpg
3d1e38946dbc1357dd58172d13d9df69
c15bc5337d45653427730a1d0f034945388ae6bc
c3626182aacba693304d4218d13893e3
d3c746a36e53972801c43f63ec126f99f8206ecf
40250288e7c35c16655729f941124b1b
955876f7f45f14ea80ea4b5d51c08db4d6cda9f2
3eea0947ff144284436751b71f7b35e4
e1ee93b55cfa2c6ccfbe0d94411f9522a4c032c6
21618 F20101118_AAAEGC mcelroy_j_Page_064.QC.jpg
7d17f2c4f466c257144d5de48452f0bb
df49d1bb33604838897f44622ef7b9aabef8b624
13805 F20101118_AAAEFN mcelroy_j_Page_053.QC.jpg
a3c0021fbcacb97207baab793d216933
17707 F20101118_AAAEEZ mcelroy_j_Page_043.QC.jpg
54d6e542d140a59f3d39987685607f7e
25c1c982015370d70469686b6dfa04a7563966e8
6150 F20101118_AAACXF mcelroy_j_Page_081.QC.jpg
ea45f1727ecfcb00d5a2b28a5fbbbf28
4bb050fc22d6991d08e25e2e21d598b5a3e7dc1f
f27060e67c4c3bc975c33621cd18d0d5
8d049cfe372d04ae8d6f5185acd967fee0fc8b30
463e8a870dd280dcb6e60aeb5fbea64b5fdce910
23f84fc9718af2278e861cbeccd987ff
462c8fb662044fb954ca3bd3235d021ea821ded9
9775ae0f3bec9a78c6b09eed4a001f37
680e25dbfd2da7c4a3e4fb068e95f2d1830cd824
5776 F20101118_AAAEGD mcelroy_j_Page_064thm.jpg
77f1f35f954b7a244d0259e48a631d27
29e0e441eab842da369d07b763767158b5663c93
3767 F20101118_AAAEFO mcelroy_j_Page_053thm.jpg
acc9a7d3ca28f944b4454c3166db2492
1771c05783a3b350799f158be2a404f4669d24e9
2085 F20101118_AAACXG mcelroy_j_Page_090.txt
a11d1f628050e6a6d35095dda8ab96cb
54d0a6a64688e11fa485897f50b153441617a6ee
6e76fa3e27e8240323bc9bf4d81a2683
1ddb355b3a475af0e87ffd048df309c9003f8de2
8149b4613db7f8089cdc8cf54e0d0e8bcccfa1a1
c870f663711eb7cec9be192196d1fea2
2ca79c7cd5ebed5a59c93a77e0cf1e40d0559368
1d6b571a35f2ec594a66acbda708d24c
274ef23590115e6acff425c1e5b6d2594e496552
13914 F20101118_AAAEGE mcelroy_j_Page_065.QC.jpg
3c9067cc037010d3d188346f64c8ff09
142b744b1c3bec52015454d0b3bb1dc63f483bcd
8049 F20101118_AAAEFP mcelroy_j_Page_055.QC.jpg
d44fd175724c7d02d2c65337e1f52e63
133fe67979e91bc98926dfd6fb1d2ab48d4d005f
1497 F20101118_AAACXH mcelroy_j_Page_014.txt
ef4666e27a5e6283ab950be00dea84b0
95ef3722e98537ec9d318a0a6d24c452851665fe
e8bb57a58ec8f8915c2b60c2282965a4
102e221a5c2e3e2592a92594a5ee8401
36b72446edba93eea5243414dc9feb3d
edf94aed69f420e66bcc757616c6ef2d9fd7290f
25dd9e56e8100e38de03b7913055fb43
4302 F20101118_AAAEGF mcelroy_j_Page_065thm.jpg
b1962878519e72089c0b3712a4f281a4aa380380
8127 F20101118_AAAEFQ mcelroy_j_Page_056.QC.jpg
de9b0309040741923068473aa3100851
15844 F20101118_AAACXI mcelroy_j_Page_034.QC.jpg
a0b829d78538718254a61fbfbcc8eb08
59e05ed4d2c0f914a0069ac2352877eb8611050d
3472be245875c0a960dd61886551448e
16eda86d80c4f247f95a9e2e3cb850208424d191
17c19da7575bec042f8e4b8fa88b6990
1893d15289811eb1dc294cfc169d87738c9586cd
9609973480c715a2864764798ab8fc36
4fbe310d1ec92373627bf0446c49dbb2f91a76da
15624 F20101118_AAAEGG mcelroy_j_Page_066.QC.jpg
cbf416dcb2f8b698bab21de5fdfb73c25428f267
2688 F20101118_AAAEFR mcelroy_j_Page_056thm.jpg
443a0b5bdff1d9e5783270a13d1a34bf
084e577658d44f43453284f4a6d1d2d46e4e89a5
85484 F20101118_AAACXJ mcelroy_j_Page_031.jpg
8fd208393b53669c3997bd105de6f90d
11ab8ef4da4e556e8392218dd2a80ea131fb68bd
2921fc8f8d3e0436ed8090e22297ce66
97b53a6d0d6d432391feb4b725bfd58670d9332b
d2dd71e4edfb855b2bbc7451f095f08d
350bbf62c0f8c920d0d700ceed5ff2ba4c18ab85
271f4d05d7cf040a31eb305413cf27cd
11d5dbbb2eb440ed4563c7090035151e3b5c70f9
4443 F20101118_AAAEGH mcelroy_j_Page_066thm.jpg
a00c2f5065deee3f77f0831b7474171a
1af340c531560e8c99c79c5c478981e1c5e341d8
7008 F20101118_AAAEFS mcelroy_j_Page_057.QC.jpg
084ea82b4637418ccb1644b34e19514b
0254c3edb5a6f707e629b55c114c586e76446347
750831 F20101118_AAACXK mcelroy_j_Page_118.jp2
bafe7dcdb28a5c10243aa23389d77a54b894ec4e
c20e7633299b1edab4b63fbfee98f593
cba9334e095475d36ec6fd9bf925ebeb10b9fb43
4597edccd7f44b88c5c6f51e57befb8e
8e1615b262a83162ebd8a9a98dd94465b9a91954
88e7101c241530614781a6831cc0720b
564b81af885a3d04617073e63bc67510c28bbf44
11937 F20101118_AAAEGI mcelroy_j_Page_068.QC.jpg
fc6a4f31395f1eec01115271312d3cf4
d3740e589db6e8ffc1d22d0c332ac8b6d1518595
F20101118_AAAEFT mcelroy_j_Page_057thm.jpg
dac8bf088a11e9a34996650c09d34246
60674 F20101118_AAACXL mcelroy_j_Page_144.jpg
2b4836c51eebdb95a342cdd9a57b3152
46f9ffacaceefb852c11f6ae3c7a5f68e99a53d4
7fcd872e04424309b217b147183bb16e
7fd104967775a4b3cd915059fe99485a6dfc72c3
6668 F20101118_AAACYA mcelroy_j_Page_049thm.jpg
e57fd687b98c17f5fe6a783463260390
a3c9af879c3d4df6ebce3ab032594d25bc72ec85
cb13b18c0f4061ebec34aa6c77939f00
c951a23002755ecd401447b728b755b9db459327
3593 F20101118_AAAEGJ mcelroy_j_Page_068thm.jpg
f1cc3c002edb804b46f99f761b7bdde4
528c044fe55555cdb08ca01867745c8b9f707354
6618 F20101118_AAAEFU mcelroy_j_Page_058.QC.jpg
3b6ae012e8211a94fedfc724f200fd91
80856f9d2b1a7069dabb0b61d39144be83c14abf
80614 F20101118_AAACXM mcelroy_j_Page_013.jpg
f206960a172e9d9dde3653b5e4c5eeda
9d496d938bf041c577b8025a1fcd5a9bb692663c
F20101118_AAACWY mcelroy_j_Page_002.tif
29837cb12c4c78568376fccd3bf65e4a
79b553dedfca92716d2d5e7dce052fb329cff91b
8cbf55fac04afaaec553aeddc6b7a84a
59d18a51fa0a6e8a26e60b9d5737746fcb81bd89
805c988d741338db386912d9e20d8d47
4dfe7be2fdf90744a1918c8e93ed5d16c56b22b7
F20101118_AAAEGK mcelroy_j_Page_069.QC.jpg
7503b2d1ddce6214a3b341ae02aaa747
ce1d258a94a48de0f466255e1695a018cbe8ab70
7231 F20101118_AAAEFV mcelroy_j_Page_059.QC.jpg
0ed0c8a058307cb63a55d214fb39dac5
f6ddb949002c34311e92d12e64c1f31c48b8259a
79829 F20101118_AAACXN mcelroy_j_Page_015.jpg
f24fe16ee66cd7ab0df168ed645cc976
8ecc23521af14823d2ce187d8971ae6b6ec858ae
1423 F20101118_AAACWZ mcelroy_j_Page_066.txt
8cb881d17878bae236ddd3f56ca14a7b1f36ca28
eaceb443da63c92e372a1be5dd4c1dde
01d1c0f107701afd11395d6aa0aa40c83b110dd6
1663 F20101118_AAACYB mcelroy_j_Page_018.txt
142af8ee7144586ca88ea5788abfc32c
dbc6881e74e6ff6c978a75f4bc181bcd4e7f9654
522bcb175039fef514076b34e83ffb0c
4bd4ec10c53b837bcf672fcbf844a02af8dabdd6
26088 F20101118_AAAEGL mcelroy_j_Page_070.QC.jpg
8e36d6e0f30b70525025eabeab58092d
b1b34b2b45bd6ee818981fabfe153df077be5880
2637 F20101118_AAAEFW mcelroy_j_Page_059thm.jpg
be2effecb1c4aac84580bed69c47fe86
e3a848095b2281fdee0950a9a8f69bc6655a546f
19679 F20101118_AAACXO mcelroy_j_Page_144.QC.jpg
4e68de6bdb1e4106262d626a9899c1ef
53b87f313a63dea3324664fd70abb373b675a20e
054cd468281404e717c401c74424df78
7059f8aa98b87f04034998039de260d50498c924
1949 F20101118_AAACYC mcelroy_j_Page_091.txt
43e24b0c6505043d77b93f90f2f8dc87
e402ca70e054b6e1e9ba256a441533282c930d3c
de02b2419c4b7d85a4cd195410dd349d
8cd999e4488f7332f80a21fde8d7ff51e43215f6
3838 F20101118_AAAEHA mcelroy_j_Page_080thm.jpg
262a0cf86a28e7c092172c22cfafbd63
02e0aca28bed6db708a86ca58bb334102d40ba1e
24682 F20101118_AAAEGM mcelroy_j_Page_071.QC.jpg
5fe97530d491c227ff75b5f6677b942b
8019 F20101118_AAAEFX mcelroy_j_Page_061.QC.jpg
f27744a5f25421d264229370614588b2
6723 F20101118_AAACXP mcelroy_j_Page_107.QC.jpg
8d20bd0f45eee2f0ff5c721d241c8a04e03e7bc5
f55e840a42c59dd39dc908496d8c95b1
1716e5a5b63e1ffa8987da7512851e9b707ba237
52771 F20101118_AAACYD mcelroy_j_Page_080.jpg
7845806f25d154a94207eaba5bffc1b1
da96d1709428cace9e17de61a21cdca8da057caa
a7413ccb23f5b97724e67c14853e68cf
37d36372f6c3e64851a901bb16b537bcdc80b9a8
6229 F20101118_AAAEHB mcelroy_j_Page_082.QC.jpg
0d2cb049f549a282b0a994a9f24011a634ac83f4
6439 F20101118_AAAEFY mcelroy_j_Page_062.QC.jpg
54a3bdbc6740c1c5e80d75f0365ca975
59728c88f6f69cd72d545a25af9c1c59396ba739
6890 F20101118_AAACXQ mcelroy_j_Page_140.QC.jpg
59e11bfccdc13b0137d775b01a537b61
d73e3bfd29ae5ab986642939494a4310bcbd1126
16193 F20101118_AAACYE mcelroy_j_Page_119.pro
dac41c9b5ca8c824ffa8fb9955e84952
960f3f7c3cfe5d007b141956741acd7d1f7c42fb
fe96e5c9512de45104e2bcc7e501bee6
9f55efb92909a8de78bd5d5e9c74f647c7787087
2277 F20101118_AAAEHC mcelroy_j_Page_082thm.jpg
38acb77177ab2ac0b1d7df2b0f488a64
775ae32612e3ac05057b8b3645e7ca89f2257d40
20904 F20101118_AAAEGN mcelroy_j_Page_072.QC.jpg
ab816ee4502614436a3b6e06d45d586e
F20101118_AAAEFZ mcelroy_j_Page_062thm.jpg
869fde8216be435153410762fab85c07
bb6f0100d485de5696a86ec05a34b7f8f7ea7e4d
F20101118_AAACXR mcelroy_j_Page_124.tif
e1f2741e72464baa67e0817f3fd2b545
7d9fc11f13a8479f8047cc7dbb731ecdf33207bc
8467d9be2f679f78588e5181ce780f67
8b98a5f87461670a4c45bb8a03ab3faf2326d03f
4f59a1ca8b6b1d20aa48d889f48a528f
0fbef7c8bda9c7a2208435bbc0a24163a6498942
16694 F20101118_AAACYF mcelroy_j_Page_084.QC.jpg
a97f5f8ea288b608323de7e302492e40
be3ba8a031dd811db6717cd323a1343b7d398eac
4fd0606e423ac461b413c8ab2bf3d843009310da
5110 F20101118_AAAEHD mcelroy_j_Page_083thm.jpg
732ecb7e7503ea768534332c9aa8a3be
36450706476db8f2086f4b1d4090ba2c3f07d5b0
17844 F20101118_AAAEGO mcelroy_j_Page_073.QC.jpg
2486e30cc2c869e9a7f6fd8ae007c151
1f36d6059076fdbb9b63ec28b2711f5e4262a085
6464 F20101118_AAACXS mcelroy_j_Page_044thm.jpg
7dbc8a39d5216a14b3e668db685975ee
b752b66ce248f09b7827c5fbb2b65cedcdfc58aa
7c17f6611fca3323d94e9ac61132d9f6
12a4c24a389b61b1dafe140034377080a55d4273
bd9f1ca001a90f483699c47cf8be4519
f41ce47769dd9aa4c8bcdfe67b4d07648dc5b319
2318 F20101118_AAACYG mcelroy_j_Page_058thm.jpg
021607f9c7220a87d3b59489070e6992
3cff72f5e57d78f81e17ef7613b4894f344cc915
faa1fd65b6cb5304a019aeabff81003e
162992c2a69d773c0a92675e392914c3804e6013
4835 F20101118_AAAEHE mcelroy_j_Page_084thm.jpg
a4d4a21982066bd412fbc933f0828cd1650e385c
5033 F20101118_AAAEGP mcelroy_j_Page_073thm.jpg
6a495ecfa6e25585fc1b56e171e72f6a
733ca4d4e14c3ce93a0ccfd16afc084d65d2fe1d
25964 F20101118_AAACXT mcelroy_j_Page_077.QC.jpg
PIECEWISE LINEAR LATTICE BASED ASSOCIATIVE MEMORIES

By
JOHN McELROY

A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY IN COMPUTER ENGINEERING

UNIVERSITY OF FLORIDA

2007

© 2007 John R. McElroy

To my wife, Veronica Lye-McElroy, and my parents, Sharon and Bob McElroy

ACKNOWLEDGMENTS

I would like to thank my committee chair, Dr. Paul D. Gader, along with the rest of my committee: Dr. Gerhard X. Ritter, Dr. Joseph N. Wilson, Dr. Arunava Banerjee, and Dr. K. Clint Slatton. I would also like to thank my family and three close friends, Jason Jerauld, Philip Whitt, and Travis Storm.

TABLE OF CONTENTS

page

ACKNOWLEDGMENTS ......... 4

LIST OF TABLES ......... 7

LIST OF FIGURES ......... 8

ABSTRACT ......... 11

CHAPTER

1 INTRODUCTION ......... 13

1.1 Statement of the Problem ......... 13
1.2 Objective and Approach ......... 13

2 LITERATURE REVIEW ......... 15

2.1 Linear Dynamical Systems ......... 15
2.1.1 Linear Systems and the State Space Model ......... 15
2.1.2 Existence and Uniqueness of Solutions ......... 16
2.1.3 Equilibrium Points and Stability ......... 17
2.1.4 Change of Basis and Diagonalization ......... 18
2.1.5 Eigenvalues and Stability ......... 20
2.2 Non-Linear Dynamical Systems ......... 21
2.2.1 Linear versus Non-Linear Systems ......... 21
2.2.2 Lyapunov Stability ......... 22
2.3 Fundamentals of Associative Memories ......... 23
2.3.1 The Linear Associator ......... 24
2.3.2 The Hopfield Model ......... 25
2.3.3 The Bidirectional Associative Memory ......... 27
2.3.4 Problems with Linear Associative Memories ......... 30
2.4 Lattice Based Associative Memories ......... 31
2.4.1 Introduction to Lattice Algebra ......... 31
2.4.2 Fundamentals of the Lattice-based Associative Memory ......... 34
2.4.3 Existence of Perfect Recall Memories ......... 35
2.4.4 Fixed Point Sets ......... 36
2.4.5 Geometric Interpretation of the Fixed Point Set ......... 37
2.4.6 Behavior of Non-Fixed Points, Noise Suppression and Convergence ......... 41
2.5 Ordered Weighted Average (OWA) Operators ......... 45
2.5.1 Basics of OWA Operators ......... 46
2.5.2 De Morgan's Law and Duality ......... 46
2.5.3 Yager Measures of Andness and Orness ......... 47
2.6 Chaotic Maps ......... 49

3 PIECEWISE LINEAR ASSOCIATIVE MEMORIES

3.1 Lattice Based Memories as Piecewise Linear Systems
3.1.1 2D Piecewise Analysis of Ritter-Gader Results
3.1.2 Vector Fields for 2D Example
3.2 Generalization of 2D Example to Higher Dimensions
3.2.1 The Number of Cases
3.2.2 Geometry of the Fixed Point Set
3.3 Verification of Canonical Results
3.4 Advantage of Piecewise Linear Framework

4 ENCODING AND DECODING OPERATORS

4.1 Order Statistics for Encoding Operators
4.1.1 Order Statistics-Based Associative Memories
4.1.2 Order Statistics Based Encoding Operators for Outlier Robustness
4.2 The Decoding Process
4.3 Noise and the Narrow Fixed Point Set
4.4 Performance on Real Data
4.4.1 The Data
4.4.2 The First Experiment
4.4.3 The Second Experiment
4.5 Conclusion

5 NEAR-MIN AND NEAR-MAX MEMORIES

5.1 Introduction to the Near-Min and Near-Max Memories
5.2 Comparison with Tent Maps
5.3 Effect of Initialization
5.4 Harmonic Analysis
5.5 Existence of Fixed Points
5.6 Growth in N Dimensions
5.7 Conclusion

6 CONCLUSIONS AND FUTURE WORK

REFERENCES

BIOGRAPHICAL SKETCH

LIST OF TABLES

Table

page

4-1 Data Dimensions ......... 95

LIST OF FIGURES

2-1 Eo(el) for the 2D example
2-2 So(i, 1, a) and So(i, 1, b) for the 2D example
2-3 Half spaces and their intersection, B2
2-4 B2 and the resulting F(X)
2-5 F1(X), F2(X) and the resulting F(X)
2-6 u1, m1 and the resulting B1
2-7 Noise suppression properties of M_XX
2-8 One iteration of a tent map, with p = 0.5
2-9 One iteration of a tent map, with p = 0.1
2-10 Fifteen iterations of a tent map orbit starting at x = 0.4, with p = 0.1
2-11 Fifteen iterations of a tent map orbit starting at x = 0.4, with p = 1.0
2-12 Ten thousand iterations of a tent map orbit starting at x = 0.4
2-13 One hundred iterations of a tent map orbit starting at x = 0.4
2-14 One thousand uniform random numbers for comparison
2-15 Bifurcation diagram for the tent map
2-16 Bifurcation diagram zoomed to show oscillations
2-17 Bifurcation diagram oscillations zoomed further
2-18 Patterns in bifurcation diagram
2-19 Randomness in bifurcation diagram
3-1 Division of the state-space for a 2D example
3-2 Vector field for a 2D example
3-3 Points fixed in dimension pair (i, j)
3-4 Points, some of which are fixed in dimension pair (p, q)
4-1 Narrowing the fixed point set for dimension pair (1, 2)
4-2 Narrowing the fixed point set for dimension pair (1, 3)
4-3 Narrowing the fixed point set for dimension pair (1, 4)
4-4 Narrowing the fixed point set for dimension pair (2, 3)
4-5 Narrowing the fixed point set for dimension pair (2, 4)
4-6 Narrowing the fixed point set for dimension pair (3, 4)
4-7 The fixed point set for the min and next-to-min encoded memories
4-8 Decoding with M_XX, dimension pair (1, 2)
4-9 Decoding with W_XX, dimension pair (1, 2)
4-10 Decoding with M_XX, dimension pair (1, 3)
4-11 Decoding with W_XX, dimension pair (1, 3)
4-12 Decoding with M_XX, dimension pair (2, 3)
4-13 Decoding with W_XX, dimension pair (2, 3)
4-14 Narrowing the fixed point set for dimension pair (i, j)
4-15 Noise and the narrowed fixed point set
4-16 Dimension 1: mpg
4-17 Dimension 2: cylinders
4-18 Dimension 3: displacement
4-19 Dimension 4: horsepower
4-20 Dimension 5: weight
4-21 Dimension 6: acceleration
4-22 Dimension 7: model year
4-23 Dimension 8: origin
4-24 Example boundaries from dimension pair (1, 2)
4-25 Example boundaries from dimension pair (1, 4)
4-26 Max encoding: squared error in the fourth dimension only
4-27 Next-to-max encoding: squared error in the fourth dimension only
4-28 Max encoding: RMSE across all dimensions
4-29 Next-to-max encoding: RMSE across all dimensions
4-30 Max encoding: RMSE across all dimensions
4-31 Next-to-max encoding: RMSE across all dimensions
5-1 Bifurcation diagram for 2D memory
5-2 Canonical behavior in bifurcation diagram
5-3 When w1 = w2 = 0.5 in the bifurcation diagram
5-4 Twenty iterations of a max encoded memory
5-5 An example of the incoherent pseudo-sinusoidal pattern
5-6 Discontinuous jumps in the bifurcation diagram
5-7 Twenty iterations with w1 = 0.1
5-8 Bifurcation diagram with negative initialization
5-9 Bifurcation diagram with positive initialization
5-10 Fitting a sine wave to the bifurcation diagram

Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy in Computer Engineering

PIECEWISE LINEAR LATTICE BASED ASSOCIATIVE MEMORIES

By

John R. McElroy

December 2007

Major: Computer Engineering

Research into associative memories using linear correlation matrices began with the introduction of the Hopfield model in the 1980s. These networks have a limited capacity in terms of the number of pattern pairs that they can store. During the 1990s Ritter et al. introduced a new family of associative memories based on lattice algebra instead of linear algebra. These memories provide unlimited storage capacity and, though they have been well discussed in the literature, research in this area is still active.

It is possible to generalize the existing lattice based memories into a family of memories using order statistic operators for encoding and decoding. A novel method for encoding memories using all order statistic operators is examined. This new method provides a level of robustness to outliers in the initial data set that is not present in current lattice based models. In addition, this approach treats the lattice based associative memory as a piecewise linear dynamical system. This piecewise linear viewpoint imparts a new set of tools for analyzing the behavior of the canonical lattice based associative memories. Improvement in the retrieval of noisy patterns in the presence of outliers in the initial data is demonstrated using a widely available computer science data set.

Using this piecewise linear framework, another class of memories using what are referred to as "near min-max" encoding and decoding operators is introduced. These memories, along with the order statistics memories, are a subset of a novel set of memories created using Ordered Weighted Averages (OWAs), a process which is also introduced here. This experiment sheds light on an important property of the canonical lattice based associative memories, herein referred to as being "well behaved in the min-max sense."

CHAPTER 1
INTRODUCTION

1.1 Statement of the Problem

Associative memories using linear correlation matrices have been well documented.

The introduction of the Hopfield model sparked a period of excitement within the research community in the 1980s. Bart Kosko's bidirectional associative memory energized the community in the early 1990s by extending the Hopfield model to the bidirectional, or

heteroassociative case [1]. Both of these networks have a limited capacity in terms of the

number of pattern pairs that they can store [2] [3]. Though the literature is replete with

updated models providing greater capacity, it is still not possible to achieve perfect recall

on an arbitrarily large set of pattern pairs. This issue is discussed in further detail in

Section 2.3.4.

Also during the 1990s, Ritter et al. introduced a new family of associative memories

based on lattice algebra instead of linear algebra [4]. These memories provide unlimited

storage capacity and, though they have been well discussed in the literature, research in

this area is still active. Morphological neural networks, of which lattice based associative

memories are a subset, were given a special issue of the Journal of Mathematical Imaging

and Computer Vision in 2003 [5].

The lattice algebra based networks are not always robust in the presence of noise,

however [6]. Though the literature focuses on improving such networks' ability to retrieve

clean input patterns when presented with noisy versions, it fails to take into account the

presence of outliers within the initial data. Such outliers are capable of making the fixed

point set arbitrarily wide, which can greatly decrease the performance of the memory

when presented with a distorted version of an initial data point.

1.2 Objective and Approach

A review of the background information necessary for understanding the proposed research is presented. This review covers the basics of linear correlation based associative memories and the theory of lattice based memories, as well as the fundamentals of linear and nonlinear systems analysis.

The piecewise linear nature of lattice based associative memories is explained and a framework is developed for discussing these systems in general. There are an infinite number of piecewise linear encoding and decoding operators to explore, and two scenarios are treated here.

First, a novel category of piecewise linear associative memories is introduced, which makes use of order statistic encoding operators beyond simply the min and the max. This class of memories contains the canonical memories, and is shown to be more robust with respect to outliers within the initial data set. This improvement is described in detail, and improved tolerance to outliers is quantified using a widely available computer science database.

Second, this family of encoding and decoding operators is expanded to include all Ordered Weighted Average (OWA) operators. A subset of these operators referred to as near-min-max operators is studied. This analysis introduces the notion of a memory being "well-behaved in the min/max sense," which is shown to be a property crucial to the application of a generalized OWA based system.

The notion of encoding and decoding with OWA operators generalizes both of these families of associative memories and provides a common framework for both.

CHAPTER 2
LITERATURE REVIEW

2.1 Linear Dynamical Systems

Systems theory is a mathematical discipline derived from calculus, differential and

difference equations, linear algebra, probability and statistics [7]. The abstractions provided

by systems theory apply to physical quantities, such as voltage and velocity, as well as

concepts from ecology, economy, engineering and sociology [8]. A system that evolves over

time is referred to as a dynamical (or often dynamic) system. This section paraphrases

the material found in "Introduction to Dynamic Systems" by David Luenberger, which is

an excellent reference [9]. An understanding of linear dynamical systems is essential for understanding the piecewise linear approach described in Chapter 3.

2.1.1 Linear Systems and the State Space Model

A mathematical model often used for describing and analyzing a dynamic system is

the state-space model. There are three types of variables in such a model: input variables,

output variables, and state variables. Though the first two types are self explanatory, state

variables require some elaboration.

At any given moment the state of the system can be fully described by the values of

the state variables, collectively known as the state vector. With knowledge of the system's

architecture, and the current values in the state vector, it is possible to predict the next

state that the system will take. Geometrically, the state vector defines a point in a space

known as the state-space, or sometimes the phase-space. Time is an independent variable

upon which these state variables depend. For a continuous-time system with n-dimensional

state vector x(t) = (x1(t), ..., xn(t))^T the change in system state over time can be

expressed with a differential equation. The order of the system is the largest derivative in

the equation [9]. In general, for some function f, the continuous-time equation governing

movement within the state-space is

ẋ(t) = f(x(t)), (2-1)

where the overdot denotes differentiation with respect to time. At time t = 0 the value x(0) provides the initial conditions of the system. In the discrete-time case the state vector is x[k] = (x1[k], ..., xn[k])^T and the corresponding equation for calculating the

change of state is

x[k + 1] = f(x[k]). (2-2)

Here the order of the system is the difference between the largest index and the smallest.

At time step k = 0 the value x[0] provides the initial conditions of the system. When these

initial conditions are known, the output of the system at any time can be calculated by

simply iterating through each time step. In both cases f is a vector valued function that

does not depend on time directly. Because its domain is a set of points and its range is a

set of vectors, f defines a vector field in n-dimensional Euclidean space [10]. A constant

vector x is referred to as an equilibrium vector in continuous-time if

ẋ(t) = f(x) = 0. (2-3)

The discrete-time analogy is

f (x) = x. (2-4)

Since the rate of change is now zero, one solution of the system is the constant function

x(t) = x. The next section covers the existence and uniqueness of solutions. For a system

described by a linear transformation the function f(x) can be described by the following

equation

f (x) = Ax + b. (2-5)
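As a concrete illustration of Eq. (2-2) combined with the affine map of Eq. (2-5), the sketch below iterates a discrete-time linear system from a zero initial condition; the matrix A and vector b are arbitrary values chosen only for this example.

```python
import numpy as np

def step(x, A, b):
    """One discrete-time update: x[k+1] = f(x[k]) = A x[k] + b, as in Eq. (2-5)."""
    return A @ x + b

# Illustrative system; both eigenvalues of A lie inside the unit circle.
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])
b = np.array([1.0, 1.0])

x = np.zeros(2)           # initial condition x[0]
for _ in range(50):       # iterate the difference equation
    x = step(x, A, b)
```

After enough iterations the state essentially stops changing; points of this kind are the equilibrium points taken up in Section 2.1.3.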

2.1.2 Existence and Uniqueness of Solutions

First, let us examine the existence and uniqueness theorem for difference equations [9]:

THEOREM 2.1.1. Let a difference equation of the form

x(k + n) + f[x(k + n − 1), ..., x(k), k] = 0, (2-6)

be given, where f is defined over a set of consecutive integer values of k (k = k0, k0 + 1, k0 + 2, ...). The equation has one and only one solution corresponding to each arbitrary specification of the n initial values x(k0), x(k0 + 1), ..., x(k0 + n − 1).

By simply specifying an initial condition and computing the value of f it is possible to iteratively calculate a unique value of x at any time step k. This case is Markovian in the sense that knowledge of the current state is all that is required to compute the next state.

2.1.3 Equilibrium Points and Stability

The state of a dynamic system does not always change with time. Recall that points

in the state-space where the system comes to rest are known as equilibrium points. Once

a system comes to rest at an equilibrium point it will stay at rest. Formally, for the linear

systems described by Eq. (2-5), a point x is an equilibrium point if and only if

x = Ax + b. (2-7)

Note that if this was a homogeneous linear system, i.e. if there was no b vector in Eq.

(2-7), then x would be an eigenvector of A with an associated eigenvalue of A = 1. For a

homogeneous linear dynamical system without an eigenvalue of A = 1, the only equilibrium

point is the origin. As the system is non-homogeneous, the role of the eigenvectors of

the system matrix is somewhat different. It is sometimes possible for x to be solved for

directly, i.e.

x = [I − A]^(-1) b, (2-8)

though it requires that I − A be invertible. This in turn requires that A does not have an eigenvalue equal to one, else the determinant of I − A would be zero, implying that it was singular. In such a case, Eq. (2-7) may have infinitely many solutions or none at all,

depending on whether or not the system of equations is consistent.
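Numerically, Eq. (2-8) amounts to a single linear solve; a minimal check with an arbitrarily chosen A (having no eigenvalue equal to one) and b:

```python
import numpy as np

A = np.array([[0.5, 0.1],
              [0.0, 0.4]])        # illustrative; no eigenvalue of A equals 1
b = np.array([1.0, 1.0])

# Eq. (2-8): solve (I - A) x = b rather than forming the inverse explicitly.
x_eq = np.linalg.solve(np.eye(2) - A, b)

# Verify the equilibrium condition of Eq. (2-7): x = A x + b.
assert np.allclose(x_eq, A @ x_eq + b)
```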

Each equilibrium point x is described as asymptotically stable if for any initial

condition the state vector tends to x as time increases. It is described as unstable if for

some initial condition the state vector tends toward infinity. If it neither tends to zero nor

tends to infinity it is called marginally stable.

At any time step the distance between the state vector and x can be calculated, i.e.

x[k + 1] − x = Ax[k] − Ax + b − b, (2-9)

which implies that

x[k + 1] − x = A(x[k] − x). (2-10)

Letting z[k] = x[k] − x simplifies Eq. (2-10) to the following homogeneous equation

z[k + 1] = Az[k]. (2-11)

In the case that the state vector tends towards x it is equivalent to saying that z[k + 1] tends towards zero. As mentioned previously, the stability of a homogeneous system such as Eq. (2-11) is governed by the eigenvalues of A. This relationship is clearly illustrated by diagonalizing A.

2.1.4 Change of Basis and Diagonalization

Any point in a linear subspace can be described by a linear combination of vectors in a basis for that space. The most common basis for n-dimensional Euclidean space is the set of vectors E = {e1, e2, ..., en}, where

(ei)j = 1 if i = j, and (ei)j = 0 otherwise. (2-12)

This canonical basis is not the only linearly independent set of vectors that spans R^n, and it is possible to represent any point in terms of a different basis. This is known as a change of basis [11].

Let x = (x1, x2, ..., xn)^T be a vector in R^n defined in terms of the canonical basis E. Furthermore, suppose that there is another n × n matrix P whose columns also provide a basis for the space. Let z be the vector of coefficients representing x as a linear combination of the vectors in P. The relationship between the two vectors is given by

x = Pz. (2-13)

Let there exist a matrix A defining a linear transformation in the standard basis such that y = Ax. To calculate the vector w representing y in the new basis we use Eq. (2-13) to determine that w = P^(-1)y. This implies that

w = P^(-1)y = P^(-1)APz. (2-14)

This means that P^(-1)AP is the linear transformation for the new basis, which corresponds to A in the standard basis. This fact will be useful when performing an eigenspace analysis of a linear system.

Another useful analysis tool is one known as diagonalization. If it is the case that all

n eigenvalues of an n x n matrix A are distinct, the corresponding n eigenvectors define a

basis in which the original transformation is represented by a diagonal matrix. Analysis of

a diagonal system can be more straightforward due to the lack of cross-terms.

Note that if A is an n × n matrix with n distinct eigenvalues, then the corresponding set V = {v1, v2, ..., vn} of eigenvectors must be linearly independent. The matrix V of n linearly independent eigenvectors is an important construct known as the modal matrix of A.

Since the vectors in V are linearly independent, any x ∈ R^n can be represented as a linear combination of these vectors. For some coefficients c1, ..., cn it must be true that x = c1v1 + c2v2 + ... + cnvn, and therefore

Ax = λ1c1v1 + λ2c2v2 + ... + λncnvn. (2-15)

Note that there are no cross-terms here. Each of the new coefficients λici is simply a scalar multiple of the previous coefficient. The representation of x in terms of the new basis is

x = Vc, (2-16)

where c is the vector of coefficients. Using Eq. (2-14) and the definition of V it is possible to create a transform corresponding to A in the new basis as follows:

Λ = V^(-1)AV. (2-17)

Eq. (2-15) illustrates that applying A to x with respect to the original basis equates to simply multiplying the corresponding coefficient vector in the new basis by the diagonal matrix of eigenvalues

Λ = diag(λ1, λ2, ..., λn). (2-18)

To solve for A it is possible to rewrite Eq. (2-17) as

A = VΛV^(-1), (2-19)

which is useful when analyzing the stability of a linear dynamical system.
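These relationships are easy to verify numerically; the sketch below builds the modal matrix V and the diagonal Λ for an arbitrary matrix with distinct eigenvalues, then checks Eqs. (2-17) and (2-19).

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])      # illustrative matrix with distinct eigenvalues 2 and 3

eigvals, V = np.linalg.eig(A)   # columns of V are eigenvectors: the modal matrix
Lam = np.diag(eigvals)          # the diagonal matrix of eigenvalues, Eq. (2-18)

# Eq. (2-17): Lam = V^{-1} A V, and Eq. (2-19): A = V Lam V^{-1}.
assert np.allclose(np.linalg.inv(V) @ A @ V, Lam)
assert np.allclose(V @ Lam @ np.linalg.inv(V), A)
```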

2.1.5 Eigenvalues and Stability

For the homogeneous system described in Eq. (2-11), assume that A can be diagonalized. This implies that there is some matrix M such that

A = MΛM^(-1), (2-20)

where Λ is the diagonal matrix of eigenvalues from Eq. (2-18). Consider the matrix A^k, where A is multiplied by itself k times, i.e.

A^k = (MΛM^(-1))(MΛM^(-1)) ... (MΛM^(-1)). (2-21)

Since M^(-1)M = I, this can be simplified to

A^k = MΛ^k M^(-1). (2-22)

Should A^k tend toward zero then Eq. (2-11) will also tend toward zero and the system will be asymptotically stable. This will only happen when |λi| < 1 for every i, which means that the system is asymptotically stable if and only if every eigenvalue lies within the unit circle on the complex plane. Likewise, if there is an eigenvalue with a magnitude greater than one then some initial condition will lead to a solution that grows geometrically with time and the system will be unstable. In the event that none of the eigenvalues are outside the unit circle, but one or more lie directly on the unit circle, the system neither converges to zero nor infinity; it is marginally stable.

All of this assumes that the eigenvalues are distinct. Analysis of a system with non-distinct eigenvalues is possible using the Jordan canonical form of A, which is nearly diagonal, if not actually diagonal.
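The eigenvalue criterion is straightforward to apply in practice; both matrices below are arbitrary examples chosen for illustration.

```python
import numpy as np

def is_asymptotically_stable(A):
    """z[k+1] = A z[k] tends to zero iff every eigenvalue of A lies inside the unit circle."""
    return bool(np.all(np.abs(np.linalg.eigvals(A)) < 1.0))

A_stable   = np.array([[0.5, 0.2],
                       [0.0, 0.3]])
A_unstable = np.array([[1.5, 0.0],
                       [0.0, 0.2]])    # one eigenvalue outside the unit circle

assert is_asymptotically_stable(A_stable)
assert not is_asymptotically_stable(A_unstable)

# Consistent with Eq. (2-22): A^k = M Lambda^k M^{-1} vanishes for the stable case.
assert np.allclose(np.linalg.matrix_power(A_stable, 60), 0.0, atol=1e-9)
```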

2.2 Non-Linear Dynamical Systems

2.2.1 Linear versus Non-Linear Systems

In the study of non-linear systems, certain functions named after the Russian mathematician Aleksandr Lyapunov are used to describe stability. Though an equilibrium point is defined in the same manner for both linear and non-linear systems, the description of such points in non-linear systems is much richer, for two reasons [9]. First, since they represent solutions to the differential (or difference) equations, they are often more difficult to solve for in complicated non-linear systems. Second, a non-linear system may have any number of equilibrium points, from none, to one, to a finite or even infinite set of them, and they may be arranged in almost any locations within the state space. The study of equilibrium points and their stability is therefore much more intricate in non-linear systems than it is in linear systems.

2.2.2 Lyapunov Stability

Originally Lyapunov attempted to model the behavior of the system with a linear

approximation of the behavior of the system near equilibrium points. This is referred

to as Lyapunov's first method and, though it was able to characterize the behavior of

the system in small neighborhoods around asymptotically stable or unstable equilibrium

points it was unable to address marginally stable equilibrium points or to describe

behavior at all outside of those small neighborhoods.

Lyapunov's second method, however, works directly with the non-linear system

without having to approximate it via the linearization over small neighborhoods. The

Lyapunov functions used in this method are often referred to as energy functions when

there is a physical analogy to the system being studied. The general requirements for a

Lyapunov function V defined around an arbitrary domain D of the state space containing

the equilibrium point x are

1. V is continuous

2. V(x) has a unique minimum (not necessarily 0) at x with respect to all other points
in D

3. Along any trajectory in D, V decreases monotonically.

Assume f is continuous. Examining the behavior of the discrete time system

x[k + 1] = f(x[k]) (2-23)

around equilibrium point x, notice that the function V(x) decreases monotonically over

time if the following holds

ΔV(x) = V(f(x)) − V(x) < 0. (2-24)

That is to say that if for every time step k the function ΔV(x) is less than zero, then the function decreases monotonically over time. The function ΔV(x) describes the change

in V(x) from time k to k + 1 by Eq. (2-24). This allows the third condition for a function

to be called a Lyapunov function to be re-written as

The function ΔV(x) = V(f(x)) − V(x) satisfies

ΔV(x) < 0 (2-25)

for all x in D.

The reason behind these conditions is that if a continuous function decreases

monotonically over time and has a unique minimum then it will eventually reach that

minimum and will not be able to move from it, due to the monotonicity requirement and

the fact that every other point will be higher. This is .I-i innind ilc stability. If it is simply

non-inl i. I-;li. but not strictly (.1. ~ I-;li it may stay in an orbit through points near but

not equal to the equilibrium point. This is marginal stability.
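As a concrete instance, for a contractive linear system the squared distance from the equilibrium at the origin is a Lyapunov function; the matrix below is an illustrative example whose largest singular value is less than one, so the decrease condition holds everywhere away from the origin.

```python
import numpy as np

A = np.array([[ 0.6, 0.1],
              [-0.1, 0.5]])        # illustrative stable system z[k+1] = A z[k]

def V(z):
    """Candidate Lyapunov (energy) function: squared distance from the origin."""
    return float(z @ z)

def delta_V(z):
    """Eq. (2-24): the change in V over one time step of the system."""
    return V(A @ z) - V(z)

# Spot-check that V decreases along trajectories at randomly sampled states.
rng = np.random.default_rng(0)
for _ in range(100):
    z = rng.uniform(-1.0, 1.0, size=2)
    assert delta_V(z) < 0
```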

The Lyapunov Theorem and the geometric proof is found in Luenberger [9]:

THEOREM 2.2.1. (Lyapunov's Theorem) If there exists a Lyapunov function V(x) in a spherical region S(x, R0) with center x, then the equilibrium point x is stable. If the function ΔV(x) is strictly negative at every point except x, then the stability is asymptotic.

2.3 Fundamentals of Associative Memories

An associative memory is a type of content addressable memory. In general, an

associative memory M provides a mapping between a set of input queries, often called

keys, and the values we would like to associate with them. Given a set of input pairs

of the form (xeX, yeY) such that X = {xl, X2, ..., Xk) is the set of keys and Y =

{y y2, k} ,Y~ are their corresponding values, we ;?i- that M~ is a perfect recall memory

for X and Y if and only if x" M ~ y", VEe {1, ..., k}).

The associative memory model described above is known as a heteroassociative memory because the set of keys may be different than the set of stored patterns. When these two sets are identical the network is known as an autoassociative memory. In other words, an autoassociative memory associates pattern pairs of the form (x^ξ, x^ξ) such that x^ξ ∈ X. Typically these memories are used for retrieving the clean version of a stored pattern upon the application of a noisy version.

One of the earliest, and simplest, varieties is the linear associator, developed independently by Anderson [12] and Kohonen [13] in 1972. It uses a linear correlation matrix (LCM) to store the input pattern pairs. Other popular LCM-based associative memory architectures include the Hopfield network, developed by John Hopfield in 1982 [14], and Kosko's bidirectional associative memory (BAM), introduced in 1988 [1]. An alternative to the LCM-based architecture is the lattice-based associative memory introduced by Ritter in 1997 [4]. These memories, based on lattice algebra rather than linear algebra, are covered in Chapter 3.

2.3.1 The Linear Associator

Anderson's linear associator is an extremely simple, fully connected, feedforward neural network consisting of m input nodes, n output nodes, and a weight matrix W = [w_ij]_{m×n} such that the weight w_ij represents the synaptic connection between input node i and output node j. This weight matrix is used to store k pattern associations of the form (x^ξ, y^ξ), where x^ξ ∈ {−1, 1}^m and y^ξ ∈ {−1, 1}^n. In order to store the input pattern pairs, a correlation matrix W_ξ is created for each pair and the matrices are summed, i.e.

W = α Σ_{ξ=1}^{k} x^ξ (y^ξ)^T, (2-26)

where α is a normalization constant, usually 1/k, used to keep values in a reasonable range regardless of the number of input vectors being stored.

When presented with a key vector, the weight matrix is applied to the key to generate the system output. Let x be the input to the associative memory, and let z be the output. Pattern z is recalled according to the following formula:

z_j = Σ_{i=1}^{m} w_ij x_i, (2-27)

along with a thresholding process

z_j = 1 if z_j > θ_j, and z_j = −1 if z_j < θ_j, (2-28)

where θ is a vector containing thresholds for each output node, chosen beforehand by the user. This memory is only able to perfectly recall mutually orthogonal patterns. If the patterns are not mutually orthogonal, there will be some cross-talk between the patterns that prevents perfect recall [15].
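A minimal sketch of the linear associator: two mutually orthogonal bipolar keys are stored via the correlation sum of Eq. (2-26) and recalled with the thresholded product of Eqs. (2-27) and (2-28). The patterns and threshold here are illustrative choices, not data from this study.

```python
import numpy as np

X = np.array([[ 1,  1,  1,  1],
              [ 1, -1,  1, -1]])         # keys x^1, x^2 (rows); note X[0] @ X[1] == 0
Y = np.array([[ 1, -1],
              [-1, -1]])                 # associated values y^1, y^2 (rows)

k = X.shape[0]
alpha = 1.0 / k                          # normalization constant

# Eq. (2-26): sum the correlation (outer-product) matrices of the pattern pairs.
W = alpha * sum(np.outer(X[i], Y[i]) for i in range(k))

def recall(x, theta=0.0):
    """Eq. (2-27) followed by the thresholding of Eq. (2-28)."""
    z = x @ W
    return np.where(z > theta, 1, -1)

assert np.array_equal(recall(X[0]), Y[0])
assert np.array_equal(recall(X[1]), Y[1])
```

Because the keys are orthogonal there is no cross-talk term; replacing X[1] with a key that is not orthogonal to X[0] reintroduces it.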

2.3.2 The Hopfield Model

The Hopfield network is a nonlinear recurrent network. It is recurrent because it

includes feedback from the output stream as part of the input stream during computation.

Unlike with the linear associator, computation with the Hopfield Network is a recursive

process and analysis of this process requires an understanding of the state-space model of

a dynamical system, along with a description of the stability of the state space known as

Lyapunov stability [16].

Unlike the linear associator, the Hopfield network does not have a feedforward architecture. The output of each neuron feeds the input of every other neuron, but no neuron is connected to itself. This structure is equivalent to a single, fully connected layer with no cycles. In addition, the weight matrix is symmetric.

The Hopfield network is a dynamical system. The relationship between the state vector at time step k and time step k + 1 is governed by the equation

x_i[k + 1] = S_i( Σ_{j=1}^{n} w_ij x_j[k] + I_i ), (2-29)

where I_i is a constant input value. This should not be confused with the identity vector. The S_i function refers to a thresholding function.

Neurons in a Hopfield network may have outputs in the range [0, 1] or the range [−1, 1]. For neurons of the former type, the following thresholding is applied to the result:

S_i(x_i[k + 1]) = 1 if x_i[k + 1] > θ_i,
                  x_i[k] if x_i[k + 1] = θ_i, (2-30)
                  0 if x_i[k + 1] < θ_i.

The latter type uses the similar formula

S_i(x_i[k + 1]) = 1 if x_i[k + 1] > θ_i,
                  x_i[k] if x_i[k + 1] = θ_i, (2-31)
                  −1 if x_i[k + 1] < θ_i.

The Hopfield network keeps track of a state vector that changes over time and is therefore a dynamical system. Computation occurs in two directions, rather than in one. Because the memory W is symmetric, fixed point stability may occur. In networks where the feedforward and feedbackward connection matrices are different, other forms of behavior such as chaotic wandering and oscillation occur more frequently [17].

Updating of neurons according to the Hopfield model is done asynchronously, or one neuron at a time. The entire network can then be viewed as a vector random process [1], where each element in the vector, a neuron, is a scalar random process. At every time step an arbitrary subset of neurons will change its state; the size of this subset can be reduced to one neuron if the time resolution is fine enough.

In this way the Hopfield model is a specific case of a network known as the bidirectional associative memory. Its behavior is characterized in the following section. Unlike the linear associator, the Hopfield model is capable of perfectly recalling patterns that are not mutually orthogonal, but perfect recall is only guaranteed in the event that the patterns are mutually orthogonal.
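The asynchronous update can be sketched as follows; the stored pattern, threshold, and sweep count are illustrative, and the tie case keeps the previous state as in Eq. (2-31).

```python
import numpy as np

p = np.array([1, -1, 1, -1])            # illustrative bipolar pattern to store
W = np.outer(p, p).astype(float)        # outer-product weights ...
np.fill_diagonal(W, 0.0)                # ... with no neuron connected to itself

def update_async(x, W, theta=0.0, sweeps=5):
    """Update one neuron at a time, per the bipolar thresholding of Eq. (2-31)."""
    x = x.copy()
    for _ in range(sweeps):
        for i in range(len(x)):
            a = W[i] @ x
            if a > theta:
                x[i] = 1
            elif a < theta:
                x[i] = -1               # a == theta leaves x[i] unchanged
    return x

noisy = np.array([1, 1, 1, -1])          # p with its second bit flipped
assert np.array_equal(update_async(noisy, W), p)
```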

2.3.3 The Bidirectional Associative Memory

In order to discuss the behavior of bidirectional associative memories (BAMs), including the Hopfield network, it is necessary to establish a certain technical vocabulary. This will provide a framework for discussing the properties of BAMs as neuronal dynamical systems. Such systems are often described in terms of neuronal fields and the differential equations [18] that govern the synapses therein. The BAM model was created by Kosko [1], though it has been extended through research by Wang et al. [19] and Mathai et al. [20].

Neurons in a field are related by the connections between them. Following the conventions set forth by Kosko [17], the default neuronal field will be denoted as F_X. If a second or third field is necessary, those will be F_Y and F_Z, respectively. Using this notation a three-layer feedforward neural network would be denoted as F_X → F_Y → F_Z, where F_X is the input layer, F_Z is the output layer, and F_Y is a single hidden layer.

With regard to two-layer networks, Kohonen's conventions for dimensionality will be followed [17], so that the input field, F_X, contains n neurons, and the output field, F_Y, contains p neurons. The number of vector pairs (x_i, y_i) that the system associates is m, so that the sets of input patterns are X = {x_1, ..., x_m} and Y = {y_1, ..., y_m}.

For the field F_X the differential equation associated with the activation time function of the ith neuron is

ẋ_i = g_i(F_X, F_Y). (2-32)

Here the overdot signifies the derivative with respect to time. The state of a neuronal dynamical system at any time t can be represented completely by a state vector of the form

x(t) = (x_1(t), ..., x_n(t)). (2-33)

The synaptic weights in a two-layer neuronal dynamical system can be maintained in an n × p connection matrix M such that the ijth synapse m_ij represents the synapse resulting from the ith neuron in F_X to the jth neuron in F_Y, with m_ij positive if the synapse is excitatory and negative if it is inhibitory. This connection matrix is used during the forward projection of the network from field F_X to F_Y, or in both directions in the auto-associative case, when there is only one layer F_X. In the case of a hetero-associative network an additional p × n connection matrix N will be required to handle the backward projections.

A bi-directional network is one in which M = N^T and N = M^T. This implies that m_ij = n_ji. If the dynamics allow for stable behavior these systems are known as bi-directional associative memories, or BAMs. According to Kohonen [21] a one-layer associative memory is an autoassociative memory and a two-layer associative memory is referred to as a heteroassociative memory.

The output of each neuron decides the state of a bidirectional associative memory. According to Kosko [17], the output of a neuron will decay to its resting value in the absence of external input or neuronal stimuli. This means that without input the differential equation corresponding to the change in output for the neuron will be

ẋ_i = −x_i. (2-34)

The solution to Equation 2-34 is x_i(t) = x_i(0)e^(−t), implying that without input, neuronal or external, the neuron output will simply decay to its resting value.

To model different decay rates for different neurons it is possible to add a multiplicative decay rate parameter A_i > 0, so that Equation 2-34 becomes

ẋ_i = −A_i x_i. (2-35)

The solution to Equation 2-35 is x_i(t) = x_i(0)e^(−A_i t). Equation 2-35 is simply a generalization of Equation 2-34, which can be thought of as having a decay rate A_i = 1.

In the autoassociative case of the bidirectional associative memory, there are n first-order differential equations that govern movement within the state space:

ẋ_i = −A_i x_i + Σ_{j=1}^{n} S_j(x_j) m_ji + I_i, (2-36)

where M is the matrix handling the synaptic weights.

This is the general form of the differential equation governing a neuron's output over time, with the addition of a term comprised of the weighted sum of neuronal inputs and an external input source. This vector, I, should not be confused with the identity vector referred to elsewhere in this document. In the heteroassociative case, there are equations for propagating the state changes back and forth between both layers:

ẋ_i = -A_i x_i + Σ_{j=1}^{p} S_j(y_j) n_ji + I_i, (2-37)

ẏ_j = -A_j y_j + Σ_{i=1}^{n} S_i(x_i) m_ij + J_j, (2-38)

where the matrix N handles the synaptic connections in the opposite direction of those handled by the matrix M.

Both Kosko's BAM and the Hopfield model, to which the BAM reduces in the autoassociative case, are discrete bivalent networks, meaning that their activation is modeled by difference equations and that each neuron has either a binary state (0 = off, 1 = on) or a bipolar state (-1 = off, 1 = on).

The discrete equations corresponding to Eq. (2-37) and Eq. (2-38) are as follows:

x_i(k+1) = Σ_{j=1}^{p} S_j(y_j(k)) n_ji + I_i, (2-39)

y_j(k+1) = Σ_{i=1}^{n} S_i(x_i(k)) m_ij + J_j, (2-40)

where the J vector contains constant inputs for the opposite direction. Note that, although the theory allows for different memories for each direction, both the bivalent BAM and Hopfield model simply use one memory M.

The encoding of input pairs in order to create a memory is an area that has been researched heavily. This document will cover the most common method, known as the outer product learning method, though others have been proposed [22] [23] [24].

In order to encode the vector associations (x_k, y_k) from the sets X and Y the following equation is applied:

M = Σ_k y_k x_k^T. (2-41)
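The outer-product rule and one synchronous recall pass can be sketched in a few lines of Python. The bipolar pattern pairs below are hypothetical values chosen only for the sketch, and the helper names (outer_sum, threshold, forward) are our own:

```python
def outer_sum(pairs):
    """Outer-product learning (Equation 2-41): M = sum over pairs of y x^T,
    stored as an n x p list of lists with M[i][j] = sum_k x_k[i] * y_k[j]."""
    n, p = len(pairs[0][0]), len(pairs[0][1])
    M = [[0] * p for _ in range(n)]
    for x, y in pairs:
        for i in range(n):
            for j in range(p):
                M[i][j] += x[i] * y[j]
    return M

def threshold(values):
    """Bipolar threshold: -1 = off, 1 = on."""
    return [1 if v > 0 else -1 for v in values]

def forward(M, x):
    """One synchronous pass from Fx to Fy: y = threshold(x^T M)."""
    return threshold([sum(x[i] * M[i][j] for i in range(len(x)))
                      for j in range(len(M[0]))])

# Hypothetical bipolar pattern pairs with orthogonal keys (illustration only).
pairs = [([1, -1, 1, -1], [1, 1, -1]),
         ([1, 1, -1, -1], [-1, 1, 1])]
M = outer_sum(pairs)
```

Because the two keys here are orthogonal, each forward pass recovers the stored associate exactly; non-orthogonal keys would introduce crosstalk, which is the limitation discussed in Section 2.3.4.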

According to Kosko a BAM system (Fx, Fy, M) is bidirectionally stable "if all inputs converge to fixed-point equilibria" [17], which implies that bidirectional stability is a form of global stability. Since there are two directions in a BAM, overall system stability can be achieved by oscillating between two states, one for Fx and one for Fy.

The energy of a bidirectional associative memory is governed by a Lyapunov function

L = -S(X) M S(Y)^T - S(X)[I - U]^T - S(Y)[J - V]^T, (2-42)

where U and V are the thresholds. The lower bound for the energy function is

L ≥ -Σ_i Σ_j |m_ij| - Σ_i |I_i - U_i| - Σ_j |J_j - V_j|. (2-43)

An interesting outcome of Kosko's research is the Bivalent BAM Theorem, which states that any matrix M is bidirectionally stable. It does not, however, guarantee that the system will converge to a useful fixed point. The equilibrium point might be a spurious memory.

2.3.4 Problems with Linear Associative Memories

Though the linear correlation matrix provides a framework for associating pairs of pattern vectors, it has certain limitations. Regarding the Hopfield network, for instance, it has been shown that when the dimensionality of the key vectors is n it is only possible to recall up to n/(4 log n) pairs perfectly [3]. Furthermore, all samples must be pairwise orthogonal [25].

For Kosko's BAM the capacity is N/(2 log N), where N is equal to the total number of nodes in both layers [2]. Shortly after the BAM was introduced, a number of researchers proposed adjustments to the model that would allow for increased capacity. Leung and Cheung proposed a method known as the Householder Encoding Algorithm [26], and Haines and Hecht-Nielsen proposed an alternate scheme for thresholding the state vector, referring to the resulting model as a Non-Homogeneous BAM [2].

More recent networks proposed for increasing the capacity of the linear correlation-based memories include the Second Order Asymmetric Bidirectional Associative Memory

(SOABAM) introduced by Chang and Cho in 2003 [27] and Wang's Optimal Bidirectional

Associative Memory [28]. Given an arbitrary set of pattern associations, however, perfect

recall with these memories is not guaranteed because their capacity, though increased

beyond what was originally available, is not infinite. Complete recall of an unlimited

number of arbitrary pattern associations is possible, however, with the Lattice Based

Associative Memories of Section 2.4.

2.4 Lattice Based Associative Memories

2.4.1 Introduction to Lattice Algebra

Lattices are mathematical constructs, like groups or rings, that play an important role in the discussion of abstract algebra [29]. Early evidence of lattice theory stems from

Boole's work on the nature of thought in 1854 [30] and that of Dedekind at the turn

of the 20th Century [31]. An understanding of the fundamentals of lattice algebra will

be necessary in order to understand the lattice based associative memory of Ritter and

Gader. The theoretical underpinnings of this document stem from their work.

There are two standard frameworks for discussing lattices. One is in terms of partially

ordered sets (posets), their definitions, and properties. This method allows for a more

geometrical interpretation through a visual tool known as the Hasse diagram, named after the German number theoretician Helmut Hasse.

The other branch of lattice theory is algebraically similar to the theory of groups and

rings. This second methodology will be used extensively in this document.

DEFINITION 2.4.1. A triplet (L, ∨, ∧), where L is a non-empty set and both ∨ and ∧ are binary operators, is called a lattice if

1. x ∨ y = y ∨ x

2. x ∧ y = y ∧ x

3. x ∨ (y ∨ z) = (x ∨ y) ∨ z

4. x ∧ (y ∧ z) = (x ∧ y) ∧ z

5. x ∨ x = x

6. x ∧ x = x

7. x = x ∨ (x ∧ y)

8. x = x ∧ (x ∨ y).

The first two requirements are simply the commutative laws for the ∨ operator (often called the "join") and the ∧ operator (often called the "meet"). The second two requirements are the associative laws. The idempotent and absorption laws are covered by the last two pairs of requirements. For any two elements in L there must be a greatest lower bound, determined by the meet, and a least upper bound, determined by the join [29].

The triplet (ℝ, ∨, ∧) is an example of a lattice, where ℝ denotes the set of real numbers and ∨ and ∧ are defined such that x ∨ y = max{x, y} and x ∧ y = min{x, y}. This triplet, (ℝ, ∨, ∧), defines a distributive lattice, but it is not a complete lattice because there is no smallest element in the set of real numbers, nor is there a largest element. In order to create a complete lattice it is necessary that the set of real numbers be extended to include both a smallest and a largest element. The symbol -∞ will be used to denote the smallest element, and likewise the symbol ∞ will be used to denote the largest. With the set of real numbers extended in this way, a new triplet (ℝ±∞, ∨, ∧) is created. Here, ℝ±∞ is the set ℝ ∪ {-∞, ∞}.

Addition operators may be added to the structure as well. By extending the set of real numbers to contain the elements -∞ and ∞ the matter of addition becomes more complicated, requiring the use of two different addition operators, each with its own properties regarding inputs at the boundary of the extended real numbers.

Let + be an addition operator with the following properties:

a + (-∞) = (-∞) + a = -∞ ∀a ∈ ℝ, (2-44)

a + ∞ = ∞ + a = ∞ ∀a ∈ ℝ, (2-45)

a + 0 = 0 + a = a ∀a ∈ ℝ, (2-46)

(-∞) + ∞ = ∞ + (-∞) = -∞. (2-47)

Also, define +* to be another addition operator with the properties

a +* (-∞) = (-∞) +* a = -∞ ∀a ∈ ℝ, (2-48)

a +* ∞ = ∞ +* a = ∞ ∀a ∈ ℝ, (2-49)

a +* 0 = 0 +* a = a ∀a ∈ ℝ, (2-50)

(-∞) +* ∞ = ∞ +* (-∞) = ∞. (2-51)

The difference between the two addition operators lies in how they handle the case in which -∞ and ∞ are added together. The resultant structure (ℝ±∞, ∨, ∧, +, +*) is known as a bounded l-group, or blog.

Along with the addition operators, it is useful to define two new operators, ⊠ and ⊞, known as the min product and max product. They are defined such that C = A ⊠ B means that c_ij = ⋀_k (a_ik + b_kj). Similarly, C = A ⊞ B implies that c_ij = ⋁_k (a_ik + b_kj). These operators are the lattice algebraic equivalents of the matrix product of linear algebra.
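A direct Python sketch of the two products follows. It handles finite entries only; the -∞/∞ boundary cases, which would require distinguishing + from +*, are omitted, and the function names are our own:

```python
def min_product(A, B):
    """C = A (min product) B, with c_ij = min over k of (a_ik + b_kj)."""
    return [[min(A[i][k] + B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def max_product(A, B):
    """C = A (max product) B, with c_ij = max over k of (a_ik + b_kj)."""
    return [[max(A[i][k] + B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]
```

Note the structural parallel with the ordinary matrix product: multiplication is replaced by addition, and summation is replaced by a minimum or maximum over the inner index.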

The concept of conjugation exists in lattice theory as well. For an element r ∈ ℝ±∞ the conjugate of r is defined by

r* = -r if r ∈ ℝ, (2-52)

r* = ∞ if r = -∞, (2-53)

r* = -∞ if r = ∞. (2-54)

Conjugation in the lattice domain has the properties that (r*)* = r and r ∧ u = (r* ∨ u*)* and r ∨ u = (r* ∧ u*)* for all r, u.

2.4.2 Fundamentals of the Lattice-based Associative Memory

Unlike associative memories based on linear correlation matrices, the lattice-based associative memory uses the maximum (or max) and minimum (or min) operators to store pattern pairs of the form (x^ξ, y^ξ) for ξ = 1, ..., k. The goal is the same as before, i.e. to store the pairs in a memory M such that y^ξ is recalled whenever x^ξ is presented. Symbolically, this can be denoted x^ξ → M → y^ξ. Let the set of input keys be denoted X and the set of output "memories" be denoted Y.

Using the lattice-theoretic framework it is possible to create two different memories for the chosen pairs of pattern associations. One of these memories uses the pointwise minimum and is defined as Wxy such that

w_ij = ⋀_{ξ=1}^{k} (y_i^ξ - x_j^ξ). (2-55)

Likewise there is a memory Mxy created using the pointwise maximum, and it is defined such that

m_ij = ⋁_{ξ=1}^{k} (y_i^ξ - x_j^ξ). (2-56)

Another useful method for defining the memories Wxy and Mxy is through the use of the minimax outer product, the matrix whose ijth entry is y_i + x_j* = y_i - x_j, defined as

y × x* = [y_i + x_j*]. (2-57)

It is often notationally convenient to define the memories in the following manner:

Wxy = ⋀_{ξ=1}^{k} [y^ξ × (x^ξ)*] and Mxy = ⋁_{ξ=1}^{k} [y^ξ × (x^ξ)*]. (2-58)

The memories Wxy and Mxy will be referred to as canonical memories throughout this document to separate them from other proposed memories making use of encoding operators other than the pointwise minimum or maximum. They are also known as morphological memories [6]. Occasionally the subscripts will be removed and the memories will be referred to simply as W and M.

A memory that will recall pattern y^ξ when presented with x^ξ for all ξ ∈ {1, ..., k} is known as a perfect recall memory for (X, Y).
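The encoding rules in Equations 2-55 and 2-56, together with recall via the max and min products, can be sketched as follows. The pattern values are arbitrary illustrative choices (not from the source), and for this particular (X, Y) both canonical memories happen to be perfect recall memories:

```python
def encode_W(X, Y):
    """Min-encoding (Equation 2-55): w_ij = min over patterns of (y_i - x_j)."""
    return [[min(y[i] - x[j] for x, y in zip(X, Y)) for j in range(len(X[0]))]
            for i in range(len(Y[0]))]

def encode_M(X, Y):
    """Max-encoding (Equation 2-56): m_ij = max over patterns of (y_i - x_j)."""
    return [[max(y[i] - x[j] for x, y in zip(X, Y)) for j in range(len(X[0]))]
            for i in range(len(Y[0]))]

def max_prod_vec(A, x):
    """Recall with W via the max product: out_i = max_j (a_ij + x_j)."""
    return [max(a + xj for a, xj in zip(row, x)) for row in A]

def min_prod_vec(A, x):
    """Recall with M via the min product: out_i = min_j (a_ij + x_j)."""
    return [min(a + xj for a, xj in zip(row, x)) for row in A]

# Illustrative pattern pairs; the numbers are arbitrary choices for the sketch.
X = [[1.0, 0.0, 2.0], [0.0, 3.0, 1.0]]
Y = [[2.0, 1.0], [4.0, 0.0]]
W, M = encode_W(X, Y), encode_M(X, Y)
```

Presenting either stored key to its memory returns the associated output pattern, illustrating the perfect recall property discussed next.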

2.4.3 Existence of Perfect Recall Memories

This section examines the existence of perfect recall memories for the lattice based

associative memory. That is, for which sets of input pattern pairs (X, Y) will the

memories Wxy or Mxy behave as perfect recall memories. The material in this section

is based on the work of Ritter, Sussner, and Gader [4], [6], and proofs can be found in those documents.

THEOREM 2.4.1. Let (X, Y) denote the associated set of pattern vector pairs. Whenever there exist perfect recall memories A and B such that A ⊞ x^ξ = y^ξ and B ⊠ x^ξ = y^ξ for ξ = 1, ..., k, then

A ≤ Wxy ≤ Mxy ≤ B and ∀ξ: Wxy ⊞ x^ξ = y^ξ = Mxy ⊠ x^ξ. (2-59)

This is enough information to prove that the max product of an input key with the memory W yields the same output as the min product of the same key with the memory M. Since

y^ξ = A ⊞ x^ξ ≤ Wxy ⊞ x^ξ ≤ y^ξ and y^ξ ≤ Mxy ⊠ x^ξ ≤ B ⊠ x^ξ = y^ξ,

then Wxy ⊞ x^ξ = Mxy ⊠ x^ξ = y^ξ. (2-60)

Equation 2-60 is important as it states that when either of the morphological memories W or M is presented with a key vector it will return the same memorized pattern.

Also, Wxy is the least upper bound of all perfect recall memories involving the max

product and Mxy is the greatest lower bound of all perfect recall memories involving the

min product. Furthermore, if perfect recall memories exist for a set of pattern associations

then both of the canonical memories will be perfect recall memories. Still, it is necessary

to speak to the existence of perfect recall memories in the first place. The following

theorem, and proof, address this question and are taken from Ritter and Gader [6].

THEOREM 2.4.2. Wxy is a perfect recall memory for the pattern pair (x^ξ, y^ξ) if and only if each row of the matrix [y^ξ × (x^ξ)*] - Wxy contains a zero entry. Similarly, Mxy is a perfect recall memory for the pattern pair (x^ξ, y^ξ) if and only if each row of the matrix Mxy - [y^ξ × (x^ξ)*] contains a zero entry.

2.4.4 Fixed Point Sets

As mentioned in Section 2.3, an autoassociative memory is one in which the input keys and output patterns are the same. In other words, an autoassociative memory stores pattern pairs of the type (x^ξ, x^ξ). A fixed point is an input key for which the memory's output is the same vector, i.e. Mxx ⊠ x = x. This discussion of the fixed point sets of lattice-based associative memories will focus entirely on the autoassociative case. Let the fixed point set for either canonical memory be denoted F(X). The material in this section will reveal that the fixed point set is the same for both canonical memories.

Because of Equation 2-60, and the fact that y^ξ is not restricted to being different from x^ξ, it is clear that Wxx ⊞ x^ξ = x^ξ = Mxx ⊠ x^ξ. This means that X ⊆ F(X). Another noteworthy theorem states that for every point x in the fixed point set, the point a + x is also in F(X) for any real number a. The theorems in this section are based on the work of Ritter and Gader [6].

THEOREM 2.4.3. If x ∈ F(X), then (a + x) ∈ F(X), ∀a ∈ ℝ.

This means that membership in the fixed point set is closed under scalar addition

[32].
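Theorem 2.4.3 is easy to confirm numerically. The sketch below builds the autoassociative min-encoded memory for a small set of 2D patterns (illustrative values; the helper names are our own), checks that a stored pattern is a fixed point, and checks that a diagonally shifted copy a + x is also fixed:

```python
def encode_Wxx(X):
    """Autoassociative min-encoding: w_ij = min over patterns of (x_i - x_j)."""
    n = len(X[0])
    return [[min(x[i] - x[j] for x in X) for j in range(n)] for i in range(n)]

def max_prod_vec(A, v):
    """Max product of a matrix with a vector: out_i = max_j (a_ij + v_j)."""
    return [max(a + vj for a, vj in zip(row, v)) for row in A]

# Illustrative 2D patterns (arbitrary values chosen for the sketch).
X = [[3.0, 2.0], [5.5, 0.5], [6.0, 4.5], [8.0, 3.0]]
W = encode_Wxx(X)

x = X[0]
shifted = [xi + 10.0 for xi in x]  # a + x with a = 10
```

Both x and its shifted copy are returned unchanged, illustrating that membership in F(X) is closed under scalar addition.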
2.4.5 Geometric Interpretation of the Fixed Point Sets

In order to visualize, and indeed to fully understand, the fixed point set of an

auto-associative lattice-based associative memory it is useful to describe it in geometric

terms. This section will provide mathematical tools for understanding the geometry of the

fixed point set, as well as a two-dimensional example that puts the tools to use. As with

all of the material in this chapter, it is based on the work of Ritter and Gader [6].

Given a set of input patterns X = {x^ξ ∈ ℝⁿ : ξ = 1, ..., k}, define a function Λ such that

Λ(l) = {1, ..., n} \ {l}. (2-61)

That is to say that Λ(l) returns all of the indices into the set of dimensions except for index l. This will be useful as part of one of the later definitions.

Next, define a set of unit vectors {e^l} such that

e^l_i = 1 if i = l, and e^l_i = 0 if i ≠ l. (2-62)

For each of these n vectors define a plane E₀(e^l) such that

E₀(e^l) = {z ∈ ℝⁿ : z_l = 0}. (2-63)

In other words, E₀(e^l) is the set of points in ℝⁿ with the value zero in the lth dimension. To visualize this, examine Figure 2-1, which shows the possible E₀ planes in a two-dimensional case. Also, for the sake of this discussion, define two points in ℝ², a = (2, 1) and b = (4, 3), both of which are shown in Figure 2-1. Let the set of input patterns X = {a, b}. Note that, in the two-dimensional case, E₀(e¹) is simply the x₂-axis, while E₀(e²) is the x₁-axis.

Let us also define the following set such that for some p ∈ ℝⁿ

S₀(i, l, p) = {z ∈ E₀(e^l) : z_i = p_i}. (2-64)

For the two-dimensional example shown in Figure 2-2, the values of S₀(2, 1, a), S₀(2, 1, b), S₀(1, 2, a), and S₀(1, 2, b) are shown. For instance, the value of S₀(2, 1, a) is simply the point (0, 1) because that is the only point on the x₂-axis with an x₂ value that is equal to a₂.

Note that in the 2D case each of the sets S₀(i, l, p) contains exactly one point, and that it lies on the x_{Λ(l)} axis. Each such point divides the plane into two half planes. For each of these points, take the half plane containing a and b. Let the intersection of these half planes be denoted B_l. That is,

B_l = ⋂_{i∈Λ(l)} [H(i, a) ∩ H(i, b)], (2-65)

where H(i, a) is the half plane created by S₀(i, l, a) containing point b, and similarly for H(i, b). For the 2D case consider specifically the case that l = n and

B_n = ⋂_{i∈Λ(n)} [H(i, a) ∩ H(i, b)], (2-66)

where H(i, a) is the half plane created by S₀(i, n, a) containing point b, and similarly for H(i, b). These points, and the hyperplanes they define, are shown in Figure 2-3. Their intersection, B_n, is the shaded rectangle.

Given these definitions, then, it is possible to define the fixed point set F(X) as a subset of the prism created by translating every point in B_n along a diagonal line, such that

F(X) ⊆ ⋃_{a∈ℝ} (a + B_n). (2-67)

The region corresponding to F(X) in the two-dimensional example is shown in Figure 2-4; it is the region between the two orange lines. This is one way of visualizing the fixed point set. The formulas are for the n-dimensional case, though the example is specifically in two dimensions. There is another useful way of visualizing the same fixed point set. It requires slightly different constructs.

First, let the set F_l(X) represent the image on the plane E₀(e^l) of every point x in F(X). That is, for such a point x = (x₁, ..., xₙ) ∈ F(X) the point (x₁ - x_l, ..., xₙ - x_l) is in F_l(X). The formal definition of F_l(X) is

F_l(X) = {z ∈ E₀(e^l) : ∃ x ∈ F(X) with x(l) = z}, (2-68)

where

{x(l)} = L(x) ∩ E₀(e^l), (2-69)

and here L(x) denotes the diagonal line {a + x : a ∈ ℝ} through x. For any point x there is a point x(l) = a + x for some a ∈ ℝ. In an n-dimensional case there are n possible sets F₁(X), ..., Fₙ(X) and any of them can be used to define the fixed point set. That is, for any l ∈ {1, ..., n}

F(X) = ⋃_{a∈ℝ} [a + F_l(X)]. (2-70)

This is illustrated in Figure 2-5 where two sets of axes are presented, each with orange lines depicting the region F(X). One pair of axes contains F₁(X) and the other contains F₂(X).
The 2D example shown in Figure 2-1 only contained two points. To generalize this result to the case in which X contains more than just the points a and b, consider the definition of x(l) given in Equation 2-69, which can be used to show that F_l(X) is bounded for every l.

Let u^l = ⋀_{ξ=1}^{k} x^ξ(l) and m^l = ⋁_{ξ=1}^{k} x^ξ(l). Note that in the 2D example u^l = a and m^l = b because b > a. Define B_l to be the hyperbox determined by u^l and m^l, i.e. B_l = B_l(u^l, m^l). Figure 2-6 shows an example in 2D of a set of points for which u^l and m^l (shown in blue) form the hyperbox B_l (shown in gray), though they are not part of the set X (points shown in black).

To prove this, let X(l) = {x¹(l), ..., x^k(l)} and for each ξ = 1, ..., k define a_ξ = -x_l^ξ, which is the amount x^ξ would have to be shifted by in order to intersect E₀(e^l). This means that x^ξ(l) = a_ξ + x^ξ.

Letting w(l)_ij = [W_{X(l)X(l)}]_ij and m(l)_ij = [M_{X(l)X(l)}]_ij,

w(l)_ij = ⋀_{ξ=1}^{k} (x_i^ξ(l) - x_j^ξ(l)) = ⋀_{ξ=1}^{k} [(a_ξ + x_i^ξ) - (a_ξ + x_j^ξ)] (2-71)

= ⋀_{ξ=1}^{k} (x_i^ξ - x_j^ξ) = w_ij, (2-72)

and

m(l)_ij = ⋁_{ξ=1}^{k} (x_i^ξ(l) - x_j^ξ(l)) = ⋁_{ξ=1}^{k} [(a_ξ + x_i^ξ) - (a_ξ + x_j^ξ)] (2-73)

= ⋁_{ξ=1}^{k} (x_i^ξ - x_j^ξ) = m_ij. (2-74)

This implies that Wxx = W_{X(l)X(l)} and Mxx = M_{X(l)X(l)}. Now, if x ∈ F_l(X) then for i = 1, ..., n

x_i = [Wxx ⊞ x]_i = [W_{X(l)X(l)} ⊞ x]_i = ⋁_{j=1}^{n} (w(l)_ij + x_j)

≥ w(l)_il + x_l = ⋀_{ξ=1}^{k} (x_i^ξ(l) - x_l^ξ(l)) = ⋀_{ξ=1}^{k} x_i^ξ(l), (2-75)

since x_l^ξ(l) = 0 and x_l = 0. A similar proof yields x_i ≤ ⋁_{ξ=1}^{k} x_i^ξ(l). It follows that ⋀_ξ x^ξ(l) ≤ x ≤ ⋁_ξ x^ξ(l), which shows that x ∈ B_l and F_l(X) ⊆ B_l.

2.4.6 Behavior of Non-Fixed Points, Noise Suppression and Convergence

As mentioned in Section 2.3 one common use for the autoassociative memory is to

recall clean patterns when presented with noisy ones. Lattice-based associative memories

are tolerant of certain types of noise. For a given input vector x^ξ, let the noisy version be denoted x̃^ξ, and define erosive noise as noise resulting in the relationship x̃^ξ ≤ x^ξ, while dilative noise results in the relationship x̃^ξ ≥ x^ξ.

The noise suppression characteristics of the morphological memories have been examined in the literature [6]. Digital images, both binary and real-valued, were transformed into input vectors via a row-scan. These vectors were stored in a morphological memory. Perfect recall is possible for each of the original images, as shown in Section 2.4.3.

When images corrupted with randomly located dilative noise were applied to the memory Mxx the original images were again produced. Upon applying the same randomly corrupted images to the memory Wxx, however, the result was total recall failure.

In the same report, a similar experiment was carried out by corrupting the same proportion of the pixels in each image, again randomly chosen, with erosive noise. Once again, the corrupted images were applied to the memory Mxx, and this time it was unable to recall any of the corrupted patterns. The memory Wxx, on the other hand, was able to recall the original images perfectly when presented with the inputs that contained the erosive noise.

Further experiments were performed, including the non-random removal of up to 75% of a pattern. Even in such an extreme case of missing data the patterns could be recalled perfectly by Wxx. This displays a remarkable tolerance of uncertainty. In real-world situations, where images and other signals from physical sensors are being processed, missing data can often be a significant problem. A pattern recognition system capable of recognizing as little as 25% of a target pattern would be extremely useful.

The following theorems formalize the noise suppression characteristics of the lattice

based associative memory. They are the result of work done by Ritter, Gader, and Dias de

Leon [32].

THEOREM 2.4.4. Suppose that x̃^γ denotes an eroded version of x^γ. The equation Wxx ⊞ x̃^γ = x^γ holds if and only if for each row index i ∈ {1, ..., n} there exists a column index j_i ∈ {1, ..., n} such that

x̃_{j_i}^γ = x_{j_i}^γ ∨ ⋁_{ξ≠γ} (x_i^γ - x_i^ξ + x_{j_i}^ξ). (2-76)

Similarly, if x̃^γ denotes a dilated version of x^γ, then the equation Mxx ⊠ x̃^γ = x^γ holds if and only if for each row index i ∈ {1, ..., n} there exists a column index j_i ∈ {1, ..., n} such that

x̃_{j_i}^γ = x_{j_i}^γ ∧ ⋀_{ξ≠γ} (x_i^γ - x_i^ξ + x_{j_i}^ξ). (2-77)

The following example was inspired by one given in Ritter and Gader [32]. The original demonstrated the noise suppression characteristics of the memory Wxx, while this example assumes that the memory Mxx was used instead. All of the vectors from the original are used here, with the addition of a new vector, y, that is useful in highlighting the characteristics of Mxx. Let X = {x¹, x², x³, x⁴} such that x¹ = (3, 2), x² = (5.5, 0.5), x³ = (6, 4.5), and x⁴ = (8, 3).

The resulting lattice-based associative memory, using the max operator during encoding, becomes

Mxx = [  0  5
        -1  0 ]. (2-78)

Using the same corrupted vectors as in the original example, plus the additional vector y that is exclusive to this example, means that x̃¹ = (0.5, 2), x̂¹ = (1, 1.5), x̃² = (5.8, 1), x̃³ = (3, 4.5), x̃⁴ = (8.5, 2), and y = (3, 6). Then, calculating the min product of the corrupted vectors with the memory yields the following decoded vectors:

Mxx ⊠ x̃¹ = (0.5, -0.5), (2-79)

Mxx ⊠ x̂¹ = (1, 0), (2-80)

Mxx ⊠ x̃² = (5.8, 1), (2-81)

Mxx ⊠ x̃³ = (3, 2), (2-82)

Mxx ⊠ x̃⁴ = (7, 2), (2-83)

Mxx ⊠ y = (3, 2). (2-84)
This information is summarized in Figure 2-7, which is similar to the figure accompanying the original example from Ritter and Gader [32]. The major difference is that the points outside of the fixed point set F(X) move differently. In the original, points above F(X) moved to the nearest point on the border of F(X) in a strictly horizontal manner, i.e. to the right. In this example, using the max-encoded morphological memory Mxx, these points still move directly to the border of F(X), but they do so in a strictly vertical direction, i.e. down. Similarly, in the original example points below F(X) moved to the nearest point on the border of F(X) in a strictly vertical manner, i.e. up. In this example, using Mxx, these points still move directly to the border of F(X), but they do so in a strictly horizontal direction, i.e. left. This is the outcome of using Mxx instead of Wxx, and comparing this example with the original is useful when first learning about the behavior of morphological memories with respect to noise.

An interesting side-effect of this change in direction is that the point x̃¹ was originally mapped to the point x¹ but now it converges downward to a different point that is actually further away. The added point, y, a distorted version of x¹, maps directly to the clean version in this example, whereas it would have mapped horizontally to a different point entirely had it been included in the original example. The point x̃² is already inside the fixed point set and therefore does not move at all in either example. The point x̃⁴ displays interesting behavior in both examples. Because x⁴ has been corrupted in both dimensions it cannot make it back to the original, cleaner version with either memory, because the dynamics of the system restrict the movement of non-fixed points to one direction only. In the original example x̃⁴ was moved to the nearest point on the border of F(X) in a vertical manner, i.e. upward; in this example it moves to the left. Neither direction alone will suffice because x̃⁴ has non-zero noise components in both dimensions.

Furthermore, it is interesting that distorted points within F(X) will never be moved at all, regardless of how corrupted they may be. That is, the number of spurious memories is infinite, since F(X) contains infinitely many points. On the other hand, if a point inside F(X), but not on the border, is corrupted it will never be perfectly recalled. This is true because, if the corrupted version is still within the fixed point set it will not move at all, and if it is outside the fixed point set it will move towards the border, stopping at the first fixed point that it reaches; it will therefore never be able to penetrate the border of F(X) to reach the clean version.

Theorem 2.4.4 can be used to explain the difference in the behavior of the corrupted versions of x¹ in both examples. For instance, Theorem 2.4.4 states that in order for Wxx ⊞ x̃¹ = x¹ there must be, for each row index i, some column index j_i such that

x̃_{j_i}¹ = x_{j_i}¹ ∨ ⋁_{ξ≠1} (x_i¹ - x_i^ξ + x_{j_i}^ξ). (2-85)

Letting i = 1, it is clear that when j_i = 2 then

2 = 2 ∨ (3 + 0.5 - 5.5) ∨ (3 + 4.5 - 6) ∨ (3 + 3 - 8), (2-86)

and when i = 2 and j_i = 2

2 = 2 ∨ (2 + 0) ∨ (2 + 0) ∨ (2 + 0), (2-87)

so for each i an appropriate j_i exists and therefore Wxx ⊞ x̃¹ = x¹.

On the other hand, to prove that Wxx ⊞ x̂¹ ≠ x¹ it is enough to examine the same condition,

x̂_{j_i}¹ = x_{j_i}¹ ∨ ⋁_{ξ≠1} (x_i¹ - x_i^ξ + x_{j_i}^ξ). (2-88)

Letting i = 1 and j_i = 1,

1 ≠ 3 ∨ (3 + 0) ∨ (3 + 0) ∨ (3 + 0), (2-89)

and letting i = 1 and j_i = 2,

1.5 ≠ 2 ∨ (3 + 0.5 - 5.5) ∨ (3 + 4.5 - 6) ∨ (3 + 3 - 8). (2-90)

Therefore, there is no appropriate j_i for i = 1 and therefore Wxx ⊞ x̂¹ ≠ x¹ by Theorem 2.4.4.

2.5 Ordered Weighted Average (OWA) Operators

In 1988 Ronald Yager introduced an aggregation operator known as the Ordered Weighted Average, or OWA, operator [33]. Knowledge of OWA operators is necessary for understanding the material proposed in Chapter 3 and Chapter 4.

Both the min and the max are special cases of this operator, as are all of the order statistics. The behavior of the min is described as an "anding" behavior, meaning that it behaves like a logical AND. Likewise, the max operator is said to have an "oring" behavior, like the logical OR operator. The OWA operators allow for a specific degree of anding and oring behavior, known as the "andness" and "orness" of the operator. Though they were briefly defined earlier, they are described in detail here.

2.5.1 Basics of OWA Operators

Let I be the interval [0, 1]. Yager defines an OWA operator of dimension a to be a

mappmng

F : I" Ii, (2-91)

with an associated weighting vector W,

W = 2(2-92)

such that W1 e (0, 1) and C, W1 = 1. The action of F on a sequence of a values

al, ..., a, is defined such that F(al, ..., a,) = wlapl), ..., wla(,), where asi is the ith

largest of the values in the sequence, assuming that the sorted items are listed from least

to greatest.

In this case, the min operator has an associated weighting vector of

,, = if = 1(2-93)

while the max operator is associated with the weighting vector

,, 1if i = n -4
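A minimal Python sketch of the OWA mapping follows, sorting least to greatest as assumed throughout this document; the min, max, and arithmetic-mean weight vectors shown are the standard special cases (variable names are our own):

```python
def owa(weights, values):
    """OWA aggregation (Equation 2-91): weighted sum of the values
    after sorting them from least to greatest."""
    assert abs(sum(weights) - 1.0) < 1e-9  # weights must sum to one
    return sum(w * a for w, a in zip(weights, sorted(values)))

vals = [4.0, 1.0, 3.0]
min_w = [1.0, 0.0, 0.0]     # Equation 2-93: all weight on the smallest value
max_w = [0.0, 0.0, 1.0]     # Equation 2-94: all weight on the largest value
mean_w = [1/3, 1/3, 1/3]    # the arithmetic mean as another special case
```

With these weight vectors, owa reproduces the min, the max, and the mean of the input sequence, illustrating how a single parameterized operator spans the whole and/or spectrum.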

2.5.2 De Morgan's Law and Duality

The min and max operators have a dualistic relationship in the sense that ⋁(a₁, ..., aₙ) = -⋀(-a₁, ..., -aₙ). This is because a_(1) ≤ a_(2) ≤ ... ≤ a_(n) implies that -a_(1) ≥ -a_(2) ≥ ... ≥ -a_(n). This relationship was crucial to Theorem 2.4.2.

In Section 4.1.2 the duality of order statistics based operators was discussed. This

definition could be extended to all of the OWA operators. This definition, however, only

applies when both of the dual operators are applied to the same number of elements. This

did not present a problem in the previous section, because duality was only considered

for pairs of encoding operators, or pairs of decoding operators. In either case, both

operated over the same number of elements: the number of vectors in X in the case of the

encoding operators, and the dimensionality of the input vectors in the case of the decoding

operators.

In the general study of piecewise linear associative memories, however, it is useful to

discuss the case in which each of the dual operators apply to a different number of items.

For instance, in the canonical lattice based associative memory the decoding operator

is the dual of the decoding operator. One is ah-- .--s the max and the other the min. As

described above, these may not apply to the same number of elements. These particular

operators are relatively tolerant to a change in the number of elements. For instance, a

min operator over a sequence of three elements has the weights (1, 0, 0) where as its dual,

the max, has the weights (0, 0,1i). C'I..III I the number of elements from three to seven

simply means that the weights for the max become (0, 0, 0, 0, 0, 0, 1). This is handled by

simply prepending four zeros to the beginning of the weight vector.

For a general OWA operator with the weights (.3, .4, .3), however, it is unclear how to create a dual operator that applies to seven elements. Section 2.5.3 describes a method, developed by Ronald Yager, for handling precisely this problem.

2.5.3 Yager Measures of Andness and Orness

In 1993, Yager defined measures of andness and orness for OWA operators [34]. Note

that the definitions given here have been adjusted to account for the fact that Yager

ordered his elements from greatest to least when sorting, and this document assumes an

ordering from least to greatest.

The andness of an OWA operator is a real number in the interval [0, 1] quantifying the degree of similarity between the given OWA operator and the min operator. Formally, for a set of weights W,

andness(W) = (1/(n-1)) Σ_{i=1}^{n} (n - i) w_i. (2-95)

From this definition it is clear that the andness of the min operator is

andness((1, 0, ..., 0)) = (1/(n-1)) [(n - 1)·1 + 0 + ... + 0] = (n-1)/(n-1) = 1, (2-96)

while the andness of the max operator is

andness((0, 0, ..., 1)) = (1/(n-1)) [0 + 0 + ... + (n - n)·1] = 0. (2-97)

The orness of an OWA operator is also a real number in the interval [0, 1]. Orness

quantifies the degree of similarity between the given OWA operator and the max operator.

The formal definition of orness is simply orness(W) = 1 - andness(W). Therefore, the orness of the min operator is simply 1 - 1 = 0 and the orness of the max operator is 1 - 0 = 1.
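Both measures are direct to compute. The sketch below uses the least-to-greatest ordering adopted in this document (function names are our own):

```python
def andness(weights):
    """Andness (Equation 2-95), assuming the weights are applied to values
    sorted from least to greatest: (1/(n-1)) * sum of (n - i) * w_i."""
    n = len(weights)
    return sum((n - i) * w for i, w in enumerate(weights, start=1)) / (n - 1)

def orness(weights):
    """Orness is the complement of andness."""
    return 1.0 - andness(weights)
```

The min weights (1, 0, 0) give an andness of 1, the max weights (0, 0, 1) give 0, and the median weights (0, 1, 0) fall exactly at 0.5, matching the intended interpolation between AND-like and OR-like behavior.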

Duality, when defined in terms of Yager's andness and orness, is simple. For a given

operator A a dual is any operator B such that andness(B) = orness(A) or, equivalently,

orness(B) = andness(A).

The power to determine the andness and orness of an operator is useful, but by itself does not allow one to create a dual operator of arbitrary length as in the example in Section 2.5.2. It is, however, possible to specify a particular OWA operator given only the andness or orness of the operator [35].

Chaos Theory is often used to model the behavior of non-linear dynamical systems, and has been an integral part of dynamical systems theory since as early as the 1960s [36]. A chaotic system is a deterministic system that has been shown to exhibit seemingly random behavior, also known as strange behavior, in certain circumstances [37]. Similarities between the definition of the piecewise linear associative memory and that of a simple chaotic system known as the tent map motivate a comparison between the two, and a discussion of the possibility for chaotic behavior in piecewise linear associative memories.

2.6 Chaotic Maps

Chaotic behavior, also called chaotic wandering, is one of several types of steady state behavior exhibited by dynamical systems.

Recall that at an equilibrium point x[k + 1] = x[k]. The neighborhood of the state

space in which an equilibrium point acts as an attractor is known as a basin of attraction

for that equilibrium point. In this neighborhood all trajectories approach the equilibrium

point.

Often dynamical systems have periodic solutions, known as periodic points [38]. For

some number of time steps T > 0, known as the period, x[k + T] = x[k]. This means that

every T steps the orbit will return to the same spot in the state space. Such a trajectory

is known as a limit cycle. If all neighboring paths approach such a cycle it is known as a

periodic attractor [39]. Fourier analysis of periodic orbits shows a fundamental component

as well as harmonic components at integer multiples of the fundamental [37].

Quasi-periodic trajectories, in which the orbit is the result of summing several

periodic orbits, are also possible [37]. Fourier analyses of quasi-periodic orbits are similar

to those of periodic trajectories, but contain sidebands [37].

Aperiodic, yet bounded, steady state behavior is possible outside of quasi-periodic

trajectories [40]. Chaotic behavior is characterized by trajectories that seem impossible

to predict, but that remain bounded within a particular neighborhood. While Fourier

analysis of periodic and quasi-periodic orbits shows distinct harmonic components,

those of chaotically wandering orbits have a random appearance closer to a uniform

distribution [37].

Chaotic behavior is seen in nature, from fluid dynamics [41] to plate tectonics [42].

Though the output generated by some of the functions is quite complicated, the actual

functions themselves are often simple. Several popular chaotic functions include Arnold's

cat map, the circle map, the Poincare map, and the tent map. The governing equation for

the tent map is similar to that of the piecewise linear associative memory.

A map f : R^n → R^n is one example of a dynamical system. The corresponding state

equation is

x[k + 1] = f(x[k]). (2-98)

At an equilibrium point f(x[k]) = x[k].

A tent map is defined as

x[k + 1] = μ x[k]       for x[k] < 1/2,
x[k + 1] = μ (1 − x[k]) for x[k] ≥ 1/2, (2-99)

where μ is a parameter set by the user. Though this equation looks simple, it can produce

quite complicated output.
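The tent map of Eq. (2-99) is a one-line function, and its orbits can be traced with a short loop. The sketch below reproduces the qualitative regimes examined in this section; the helper names are illustrative.

```python
def tent_map(x, mu):
    """One iteration of the tent map, Eq. (2-99)."""
    return mu * x if x < 0.5 else mu * (1.0 - x)

def orbit(x0, mu, steps):
    """Trace an orbit for the given number of iterations."""
    xs = [x0]
    for _ in range(steps):
        xs.append(tent_map(xs[-1], mu))
    return xs

# mu = 0.1: the orbit from x[0] = 0.4 decays monotonically toward zero
decay = orbit(0.4, 0.1, 15)
assert all(a > b for a, b in zip(decay, decay[1:]))

# mu = 1.0: every point below 0.5 is a fixed point
assert tent_map(0.4, 1.0) == 0.4

# mu = 1.3: the orbit stays bounded but does not settle down
wander = orbit(0.4, 1.3, 10_000)
assert all(0.0 <= x <= 1.0 for x in wander)
```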

Consider the plot in Figure 2-8. A single iteration of a tent map with μ = 0.5 is

shown for initial values between zero and one. The horizontal axis represents the initial

system state value x[0]. Because the tent map is one dimensional the state vector reduces

to a scalar value. The triangular shape of the plot is the reason that this function is called

a tent map. The value at the peak of the tent map is equal to x[1] = μ/2 and is located at

the halfway point x[0] = 1/2. With these parameters the system displays simple piecewise
behavior.

In Figure 2-9 a similar case is plotted. Again only one iteration of the tent map

function was calculated, this time with the setting μ = 0.1. The horizontal and vertical

axes represent the input and output values, respectively, as in Figure 2-8. The peak

of the triangle is now at x[1] = 0.05, which is still equal to μ/2, and it is located at the

halfway point along the horizontal axis as well.

A single orbit, traced for fifteen iterations, is displayed in Figure 2-10. The initial

parameters were x[0] = 0.4 and μ = 0.1. The value of x decreases monotonically towards

zero. In fact, zero is a stable equilibrium point for this system for any initial value in the

range [0, 1]. To prove this, first note that when x[k] = 0 then x[k + 1] = μx[k] = 0. Next,

if x[k] < 0.5 then x[k + 1] = 0.1x[k] < x[k], and if x[k] ≥ 0.5 then 1 − x[k] ≤ x[k] and

x[k + 1] = 0.1(1 − x[k]) ≤ 0.1x[k] < x[k]. So the fixed point at zero is an attractor for

points beginning in the range [0, 1].

Figure 2-11 illustrates another orbit beginning at the point x[0] = 0.4, this time with

μ = 1.0. The horizontal line indicates that 0.4 is a fixed point of the system. This is true

of all points x[k] < 0.5 because x[k + 1] = μx[k] = x[k], since μ = 1. Simply by changing

the μ parameter from 0.1 to 1.0, then, the behavior of the system changed such that there

no longer existed a single attractor located at zero; rather, the entire range of points

x[0] ∈ [0, 0.5) are all equilibrium points. Though the piecewise linear equation given in

Eq. (2-99) looks simple, its behavior can change drastically by simply adjusting the μ

parameter.

Though the change from μ = 0.1 to μ = 1.0 was interesting because of the alteration

it caused in the system's behavior, the actual behavior itself was not that complicated.

The behavior shown in Figure 2-12, however, is not so simple. Here, the system is again

initialized to x[0] = 0.4, but with μ = 1.3. Ten thousand iterations were performed to

allow for the emergence of periodic patterns, if they exist. It is evident that the system

maps the range [0.455, 0.65] back onto itself. The individual points chosen during the

mapping, though, seem entirely random.

A plot of the same trajectory is shown in Figure 2-13, this time after only 100

iterations. It is easier to observe the behavior of the system from iteration to iteration

in this figure. By simple inspection it is not possible to predict the next output value.

Though this, once again, does not prove that no pattern exists, it is useful for an intuitive

understanding of the behavior.

For comparison, Figure 2-14 shows 10,000 random elements generated from a uniform

distribution. Note that the sequence in Figure 2-14 does not appear much different than

the one in Figure 2-12. The trajectory shows no more significant frequency content than

the randomly generated noise.

So far, it has been demonstrated that the simple tent map is capable of both complex

changes in behavior coinciding with changes in the μ parameter and complex, seemingly

random behavior for a given initialization.

Figure 2-15 contains a bifurcation diagram for the tent map. This diagram shows

the values of x taken during several hundred iterations of the tent map for 191 values of

the μ parameter distributed evenly between 0.1 and 2.0. For each simulation the first 100

iterations are discarded, in order to remove the initial transient data and focus on the

steady state behavior of the system. Though the system was tested at numerous initial

value settings, and the output values were different for each initialization, the overall

patterns in the system behavior were the same each time. Each of the figures used in this

section was the result of a system initialized to x[0] = 0.4, and the behavior noted was

seen in every simulation.
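The bifurcation diagram can be reproduced with a few lines of code. The sketch below follows the procedure just described, 191 evenly spaced values of μ with the first 100 iterations discarded; the number of retained iterations per μ value is an arbitrary choice.

```python
import numpy as np

def tent(x, mu):
    """One iteration of the tent map, Eq. (2-99)."""
    return mu * x if x < 0.5 else mu * (1.0 - x)

def bifurcation_points(x0=0.4, n_mu=191, transient=100, keep=300):
    """Collect (mu, x) pairs for a tent-map bifurcation diagram."""
    points = []
    for mu in np.linspace(0.1, 2.0, n_mu):
        x = x0
        for _ in range(transient):       # discard the initial transient
            x = tent(x, mu)
        for _ in range(keep):            # record steady-state values
            x = tent(x, mu)
            points.append((mu, x))
    return points

pts = bifurcation_points()

# below mu = 1 the steady state has collapsed onto the attractor at zero
low = [x for mu, x in pts if mu < 0.9]
assert max(low) < 1e-4
```

Plotting pts as a scatter plot reproduces the overall structure of Figure 2-15.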

Note that the stable attractor at x[k] = 0 is present up until around μ = 1, at

which point the system behavior alters dramatically. At first a vertical shift brings the

system state up to approximately x[k] = 0.5. From there, the state oscillates between two

regions, the upper and lower bounds of which display a sort of "fanning out" pattern. As

these regions fan out they merge into one region at around μ = 1.4. This region continues

to expand, and at this point there is no pattern of behavior discernible to the human eye:

the system wanders chaotically in the region x[k] ∈ [0, 1].

Figure 2-16 contains a version of the bifurcation diagram from Figure 2-15 zoomed in

to the region μ ∈ [1, 1.4], approximately. Here it is easier to observe the two regions in the

state space as they merge. Note that between approximately μ = 1.04

and μ = 1.16 each of the regions is broken down into two smaller regions, both of which

fan out to merge in the same way as the larger regions do. Figure 2-17 contains a plot of

this region, zoomed in further to show the interval μ ∈ [1, 1.16].

Recall that in the previous simulation, the output of which was shown in Figure 2-12,

no periodic behavior was found and it seemed possible that the system was behaving

chaotically at this point. A close-up of the bifurcation diagram, shown in Figure 2-18,

reveals that the values are actually moving between the two regions, and that this is part

of the larger overall pattern of the regions expanding and merging. That is to say, this

behavior is not truly random. The behavior is not periodic, however, as state values do

not repeat. There are unique values for the state at every iteration, though the values are

constrained by the region boundaries.

Once the two regions merge, however, such patterns are no longer discernible and

the system behavior appears random, except for the expanding of the boundaries for

the region in which the state wanders. Figure 2-19 contains a section of the bifurcation

diagram zoomed in to the region μ ∈ [1.78, 1.92] where the behavior seems to wander

chaotically.

Figure 2-1: Eo(e1), Eo(e2), and the points a and b.

Figure 2-2: So(i, j, a) and So(i, j, b) lines for 2D example.

Figure 2-3: Half spaces and their intersection, B2.

Figure 2-4: B2 and the resulting F(X).

Figure 2-6: u1, m1 and the resulting B1.

Figure 2-5: Fi(X), Fj(X) and the resulting F(X).

Figure 2-7: Noise Suppression Properties of Mxx.

Figure 2-8: One iteration of a tent map, with μ = 0.5.

Figure 2-9: One iteration of a tent map, with μ = 0.1.

Figure 2-10: Fifteen iterations of a tent map orbit starting at x = 0.4, with μ = 0.1.

Figure 2-11: Fifteen iterations of a tent map orbit starting at x = 0.4, with μ = 1.0.

Figure 2-12: Ten thousand iterations of a tent map orbit starting at x = 0.4, with μ = 1.3.

Figure 2-13: One hundred iterations of a tent map orbit starting at x = 0.4, with μ = 1.3.

Figure 2-14: Ten thousand uniform random numbers for comparison.

Figure 2-15: Bifurcation diagram for the tent map.

Figure 2-16: Bifurcation diagram zoomed to show oscillations.

Figure 2-17: Bifurcation diagram oscillations zoomed further.

Figure 2-18: Patterns in bifurcation diagram.

Figure 2-19: Randomness in bifurcation diagram.

CHAPTER 3
PIECEWISE LINEAR ASSOCIATIVE MEMORIES

Though the lattice based associative memories discussed in Section 2.4 are non-linear,

they can be treated as piecewise linear systems. This section introduces a piecewise linear

analysis framework for discussing these memories. This piecewise linear viewpoint allows
for the extension of the lattice based model to include other order statistic operators,

and other OWA operators in general, beyond the min and the max. This extension is the

subject of Chapter 4.
3.1 Lattice Based Memories as Piecewise Linear Systems

Given a set of input patterns X = {x1, x2, …, xk} such that xξ ∈ R^2 for all ξ, create a

lattice based associative memory

#xx = [ φ11  φ12 ]
      [ φ21  φ22 ],   (3-1)

where #xx can be either Wxx or Mxx. Treating the lattice based associative memory

as a dynamical system, let x[k] denote the state vector, i.e.

x[k + 1] = #xx ⊙ x[k], (3-2)

where ⊙ stands for the ∧ product or the ∨ product, depending on what is appropriate for the encoding

operator. This new notation will be used in the next two chapters, when the definitions

of #xx and ⊙ are extended to include different encoding and decoding operators.
Expanding the right-hand side of Eq. (3-2) yields

xi[k + 1] = (φi,πi(1) + xπi(1)[k]) w1 + (φi,πi(2) + xπi(2)[k]) w2, (3-3)

where πi(j) is a function returning the index of the jth element after sorting the ith

sequence of values shown in Eq. (3-3); the sort occurs after the addition is performed. In
this case the w values are determined by whether #xx is encoded using the min or max

operator. The weights corresponding to the min operator are (wl = 1, w2 = 0), while

the weights from the max operator are (wl = 0, w2 = 1). This concept will be covered

in more detail in Section 3.1.1. Algebraically, Eq. (3-3) is equivalent to the affine

transformation

x[k + 1] = Aix[k] + bi (3-4)

Here the elements of Ai and bi depend on the sort, for which there are four possible

outcomes:

Case1 : π1(1) = 1, π1(2) = 2; π2(1) = 1, π2(2) = 2 (3-5)

Case2 : π1(1) = 2, π1(2) = 1; π2(1) = 1, π2(2) = 2 (3-6)

Case3 : π1(1) = 1, π1(2) = 2; π2(1) = 2, π2(2) = 1 (3-7)

Case4 : π1(1) = 2, π1(2) = 1; π2(1) = 2, π2(2) = 1 (3-8)

Each of the four sorting outcomes creates a unique Ai matrix. These matrices each

define a unique linear system, and the input domain of each system is constrained by the

conditions of the sort. For example, in Case1 Eq. (3-3) becomes

x1[k + 1] = (φ11 + x1[k]) w1 + (φ12 + x2[k]) w2, (3-9)

x2[k + 1] = (φ21 + x1[k]) w1 + (φ22 + x2[k]) w2. (3-10)

This means that Eq. (3-4) becomes

x[k + 1] = A1x[k] + b1, (3-11)

where

A1 = [ w1  w2 ]
     [ w1  w2 ]   (3-12)

and

b1 = [ φ11 w1 + φ12 w2 ]
     [ φ21 w1 + φ22 w2 ].   (3-13)

Assuming that the sorted elements are ordered from least to greatest, the constraints placed

on the input domain are

(φ11 + x1[k]) < (φ12 + x2[k]), (3-14)

(φ21 + x1[k]) < (φ22 + x2[k]). (3-15)

In general, for i ∈ {1, 2, 3, 4} the element of Ai in row j and column πj(l) is wl for l ∈ {1, 2}, with all other elements zero, (3-16)

and

bi = [ φ1,π1(1) w1 + φ1,π1(2) w2 ]
     [ φ2,π2(1) w1 + φ2,π2(2) w2 ],   (3-17)

while the general rule for the constraints in each case is given by

(φ1,π1(1) + xπ1(1)[k]) < (φ1,π1(2) + xπ1(2)[k]), (3-18)

(φ2,π2(1) + xπ2(1)[k]) < (φ2,π2(2) + xπ2(2)[k]). (3-19)

This effectively turns the non-linear system described in Eq. (3-3) into four linear

systems, only one of which may operate at a given time step.
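The sort-then-weight decoding of Eq. (3-3) can be sketched directly, with the 2D memory represented as a plain matrix of φ values and the OWA weights selecting which linear sub-system fires; the function name is illustrative.

```python
import numpy as np

def pwl_step(phi, x, w):
    """One decoding step in the style of Eq. (3-3).

    For each row i the sums (phi[i][j] + x[j]) are sorted from least
    to greatest, then combined with the OWA weights w = (w1, w2).
    Weights (1, 0) give min decoding, (0, 1) give max decoding.
    """
    out = np.empty_like(x, dtype=float)
    for i in range(len(x)):
        sums = np.sort(phi[i] + x)   # the sort happens after the addition
        out[i] = w[0] * sums[0] + w[1] * sums[1]
    return out

phi = np.array([[0.0, 2.0],
                [-2.0, 0.0]])
x = np.array([1.0, 3.0])

# min decoding picks the smaller sum in each row, max the larger
assert np.allclose(pwl_step(phi, x, (1, 0)), [1.0, -1.0])
assert np.allclose(pwl_step(phi, x, (0, 1)), [5.0, 3.0])
```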

In each of the four cases the state of the lattice based associative memory at any time

is defined by a system of two linear first-order state equations, the unknowns of which

are the two dimensions of the previous state vector. For the moment, consider each of the

four systems alone rather than as part of a larger, non-linear system. Though the system

matrices Ai are each unique, they are constant with respect to time. That is to say, any of

the four systems by itself has constant coefficients, making them each time-invariant [9].

3.1.1 2D Piecewise Analysis of Ritter-Gader Results

Assume that the memory #xx was created using the max operator during encoding,

i.e.

φij = ∨_{ξ=1}^{k} (x_i^ξ − x_j^ξ). (3-20)
This is equivalent to using an Ordered Weighted Average (OWA) of the form

φij = Σ_{p=1}^{k} w_p (x_i^{(p)} − x_j^{(p)}), (3-21)

where w is a vector of OWA weights and the index (p) refers to the index of the pth

largest difference. All of the weights are equal to zero except the one corresponding to the

maximum element, i.e. w_(k). This means that decoding will employ the min operator,

which, in the 2D case, uses the OWA weights (1, 0), assuming that the sorted elements are

reported from least to greatest.

The four system matrices resulting from piecewise linearization, along with their b

vectors and constraints, become

A1 = [ 1  0 ]
     [ 1  0 ],   (3-22)

b1 = [ φ11 ]
     [ φ21 ].   (3-23)

The constraints on the input domain for Case1 are

x1[k] < φ12 + x2[k], (3-24)

φ21 + x1[k] < x2[k]. (3-25)

A2 = [ 0  1 ]
     [ 1  0 ],   (3-26)

b2 = [ φ12 ]
     [ φ21 ],   (3-27)

and the constraints on the input domain for Case2 are

x1[k] > φ12 + x2[k], (3-28)

φ21 + x1[k] < x2[k]. (3-29)

A3 = [ 1  0 ]
     [ 0  1 ],   (3-30)

b3 = [ φ11 ]
     [ φ22 ],   (3-31)

and the constraints on the input domain for Case3 are

x1[k] < φ12 + x2[k], (3-32)

φ21 + x1[k] > x2[k]. (3-33)

A4 = [ 0  1 ]
     [ 0  1 ],   (3-34)

b4 = [ φ12 ]
     [ φ22 ],   (3-35)

and the constraints on the input domain for Case4 are

x1[k] > φ12 + x2[k], (3-36)

φ21 + x1[k] > x2[k]. (3-37)

The result of the constraints corresponding to each sort is that the Euclidean plane
is divided into three regions, each of which encompasses the domain of one of the
sub-systems. The reason that there are only three, and not four, is that the constraints for
Case2 are contradictory and therefore no region of the plane can satisfy them.

To see this, it is helpful to define the two lines

L1 : x2 = x1 + φ21, (3-38)

L2 : x2 = x1 − φ12. (3-39)

The two orange lines in Figure 3-1 represent the lines L1 and L2. Though it is possible

that they may be equal, it is usually the case that one is above the other. The operators

chosen for encoding decide which line is the top line, and it is simple to prove that the top

line is always L1 in the case that the max operator is used for encoding.

L1 ≥ L2 ⟺ x1 + φ21 ≥ x1 − φ12, (3-40)

which implies that

φ21 ≥ −φ12. (3-41)

Combining Eq. (3-1) and Eq. (3-20) yields

φ21 = ∨_{ξ=1}^{k} (x_2^ξ − x_1^ξ) and −φ12 = −∨_{ξ=1}^{k} (x_1^ξ − x_2^ξ). (3-42)

Applying De Morgan's Law to the right-hand side shows that

−∨_{ξ=1}^{k} (x_1^ξ − x_2^ξ) = ∧_{ξ=1}^{k} (x_2^ξ − x_1^ξ) ≤ ∨_{ξ=1}^{k} (x_2^ξ − x_1^ξ) = φ21, (3-43)

which is true by definition of the min and max operators. Therefore, L1 is always above

L2 when the max operator was used to encode the input vectors, and so the top-most of

the orange lines in Figure 3-1 corresponds to L1. Likewise, when the min operator was

used to encode, L2 ≥ L1.

Note that L1 and L2 divide the state-space into three regions, labeled RA, RB,

and RC. The region that is above both L1 and L2 is region RA, which corresponds to

the constraints of Case1. The region RC exists below both lines, and represents the

constraints of Case4. In between L1 and L2 lies RB, which corresponds to Case3. As

mentioned, Case2 constrains the range of state vectors to be both above L1 and below L2,

which is impossible in this case.

Each of these three regions defines the range over which one of the linear systems

operates. That is, it defines the input range for a non-homogenous system of the form in

Eq. (3-4), where A and b are defined by the sorting outcome, or case, operating in the

region. The lines L1 and L2 form the boundaries for these regions.

Considering only state vectors from a particular region as inputs, only the appropriate

linear sub-system need be considered. The output of such a sub-system may, however,

lie outside the region under which it operates. This makes a strictly linear analysis

of the entire system impossible. Calculating the value of A^k as in Eq. (2-22) for some

Case is therefore useless unless it can be proved that the system stays in one region and

never crosses a boundary. Should the output of the system cross one of the boundaries

into another region at any time step the next Ai and bi will be different. It will not

be possible, therefore, to analyze the entire system using linear systems theory alone,

though much of that body of literature will be useful in analyzing any one of the three

sub-systems.

3.1.2 Vector Fields for 2D Example

Considering the system as a whole, it is necessary to use non-linear analysis tools to

describe its behavior over time. One well understood analysis tool is the state-space vector

field.

The input domain of Eq. (3-4) spans all points in the state-space, or R^2, and the

output corresponds to the next state. This can be described by plotting arrows with their

tails rested upon a certain sampling of input points and their heads resting upon the

corresponding output point. This graphically illustrates the movement of the state vector

from one point to the other in one time step. By exercising good judgement in the choice

of input samples an informative description of the system's evolution over time is created.

This is subjective, however, and does not substitute for a rigorous algebraic analysis.

Figure 3-2 contains an example vector field for a 2D lattice based associative memory

generated using max encoding across 100 randomly generated input vectors. The blue

lines correspond to the boundaries between input domains for the three linear sub-systems,

i.e. L1 and L2. In this case the lines are not equal, so L1 > L2. Figure 3-2 was plotted in

Matlab using the function arrow.m, available from the Matlab Central File Exchange. The

arrow tails correspond to locations that have been sampled as inputs to the system. The

arrow heads represent the output produced by the system for each corresponding input.

Examine the region between the boundaries, RB. Each of the vectors consists of

simply an arrow head oriented upwards. This is the result of plotting an arrow with

zero length using arrow.m. This implies that for each of the regularly sampled input

points in RB the difference between the output state and the input state was zero. That

is to say, each of these points is an equilibrium point as described in Eq. (2-7). Though

the input points were spaced so as to provide a close approximation to the dynamic

characteristics of the system, this does not provide a solid proof that every point in RB is an

equilibrium point. For that we need the following theorem.

THEOREM 3.1.1. For every point x[k] in RB, x[k] = A3x[k] + b3.

According to Eq. (3-30) and Eq. (3-31),

A3x[k] + b3 = x[k] + (φ11, φ22)^T = x[k], (3-44)

since the diagonal elements φ11 and φ22 are always zero. So, all points between lines L1 and L2 are equilibrium points, as suggested by Figure 3-2.

Examining the region RA, which corresponds to the input domain above both L1 and

L2, all of the sampled input points map to the point on L1 reached by strictly vertical

downwards movement. This is indicative of the overall behavior of points in RA, as

characterized by the following theorem.

THEOREM 3.1.2. For every point x[k] in RA, A1x[k] + b1 = (x1[k], x1[k] + φ21)^T.

The proof for Theorem 3.1.2 relies on Eq. (3-22) and Eq. (3-23), which imply that

A1x[k] + b1 = (x1[k], x1[k] + φ21)^T. (3-45)

Figure 3-2, therefore, captures the evolution of points in RA well. For region RC, existing

below both L1 and L2, the vector field is similar to that of RA with the exception that the

points move strictly horizontally to a point on L2, as described in the following theorem.

THEOREM 3.1.3. For every point x[k] in RC, A4x[k] + b4 = (x2[k] + φ12, x2[k])^T.

The proof for this is similar to the previous proof, using Eq. (3-34) and Eq. (3-35):

A4x[k] + b4 = (x2[k] + φ12, x2[k])^T. (3-46)

Note that the condition x1 = x2 + φ12 is equivalent to the condition x2 = x1 − φ12. This

implies that points in RC are indeed mapped horizontally to the boundary L2.

This shows that the fixed point set of this lattice based associative memory is the area

corresponding to RB. Furthermore, any point outside of this area converges to the fixed

point set, specifically to the closest point on the nearest boundary, in one iteration. This
result echoes that of Ritter and Gader.
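This one-iteration convergence is easy to verify numerically. The sketch below max-encodes 100 random 2D patterns as in Eq. (3-20), decodes with the min operator, and checks both that stored patterns are fixed and that an arbitrary point lands in the fixed point set in a single step; the helper names are illustrative.

```python
import numpy as np

def encode_max(X):
    """Max encoding: phi_ij is the max over patterns of (x_i - x_j)."""
    phi = np.full((X.shape[1], X.shape[1]), -np.inf)
    for x in X:
        phi = np.maximum(phi, np.subtract.outer(x, x))
    return phi

def decode_min(phi, x):
    """Min decoding: row-wise min of (phi_ij + x_j)."""
    return (phi + x).min(axis=1)

rng = np.random.default_rng(0)
X = rng.random((100, 2))           # 100 random 2D input patterns
phi = encode_max(X)

# every stored pattern is a fixed point
for x in X:
    assert np.allclose(decode_min(phi, x), x)

# an arbitrary point reaches the fixed point set in one iteration
y1 = decode_min(phi, np.array([10.0, -10.0]))
assert np.allclose(decode_min(phi, y1), y1)
```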

3.2 Generalization of 2D Example to Higher Dimensions

Though the power of the piecewise linear approach to describe the behavior of

a lattice based associative memory in two dimensions has been demonstrated, it is

necessary to carry out such an analysis in higher dimensions. We first consider the case of

3 dimensions.

Let Wxx be a canonical lattice based associative memory created to store a set of

patterns X = {x1, …, xk} such that xξ ∈ R^3 for all ξ ∈ {1, …, k}. This implies that

decoding will be handled as follows:

[Wxx ∨ x]_i = ∨_{j=1}^{3} (wij + xj) = (wi1 + x1) ∨ (wi2 + x2) ∨ (wi3 + x3). (3-47)

This yields

xi[k + 1] = (wi,πi(1) + xπi(1)[k]) w1 + (wi,πi(2) + xπi(2)[k]) w2 + (wi,πi(3) + xπi(3)[k]) w3,

where (w1, w2, w3) = (0, 0, 1). In order for x to be a fixed point, xi[k + 1] = (wii + xi[k])

for all i = 1, …, 3. This means that (wi,πi(3) + xπi(3)[k]) = (wii + xi[k]) for all i. This, in

turn, implies that (wii + xi[k]) ≥ (wip + xp[k]) and (wii + xi[k]) ≥ (wiq + xq[k]), where p and q

are the two dimensions other than dimension i.

For each row i we can define the hyperplanes xi = wip + xp and xi = wiq + xq.

The three dimensional state space, then, is divided by the following six boundaries:

x1 = w12 + x2 ⟺ x2 = x1 − w12, (3-48)

x2 = w21 + x1 ⟺ x1 = x2 − w21, (3-49)

x1 = w13 + x3 ⟺ x3 = x1 − w13, (3-50)

x3 = w31 + x1 ⟺ x1 = x3 − w31, (3-51)

x2 = w23 + x3 ⟺ x3 = x2 − w23, (3-52)

x3 = w32 + x2 ⟺ x2 = x3 − w32. (3-53)

For each pair of dimensions (1, 2), (1, 3), and (2, 3) the state space can be divided up into

three regions, in a manner similar to the two dimensional case. For instance, for the

dimension pair (1, 2) the state space is divided into one region for which x2 is above both

boundaries x1 − w12 and x1 + w21, a second region which is below both boundaries, and

a third that exists between them. The dimension pairs (1, 3) and (2, 3) can be similarly

partitioned. As with the two-dimensional example from Section 3.1.2, the placement of

points within these boundaries is governed by the outcome of the sort described in Eq.

(3-2). The following theorem describes the behavior of points inside the boundaries for
all dimension pairs.

THEOREM 3.2.1. Given a memory Wxx encoding input patterns of the form xξ ∈ R^n, a

vector x[k] is a fixed point if and only if xj[k] lies between the boundaries xi[k] + wji and

xi[k] − wij, for all i and j.

Proof:

[Wxx ∨ x[k]]_i = xi[k] (3-54)

⟺ xi[k] = ∨_{λ=1}^{n} (wiλ + xλ[k]) (3-55)

⟺ xi[k] ≥ wiλ + xλ[k], ∀λ (3-56)

⟹ xj[k] ≤ xi[k] − wij (3-57)

and

[Wxx ∨ x[k]]_j = xj[k] (3-58)

⟺ xj[k] = ∨_{λ=1}^{n} (wjλ + xλ[k]) (3-59)

⟺ xj[k] ≥ wjλ + xλ[k], ∀λ (3-60)

⟹ xj[k] ≥ xi[k] + wji. (3-61)

Recall from Section 3.1.1 that xi[k] − wij ≥ xi[k] + wji when the min operator is used to

encode, and so xj[k] is between the two boundaries. Q.E.D.

THEOREM 3.2.2. Given a memory Mxx encoding input patterns of the form xξ ∈ R^n, a

vector x[k] is a fixed point if and only if xj[k] lies between the boundaries xi[k] − mij and

xi[k] + mji, for all i and j.

Proof:

[Mxx ∧ x[k]]_i = xi[k] (3-62)

⟺ xi[k] = ∧_{λ=1}^{n} (miλ + xλ[k]) (3-63)

⟺ xi[k] ≤ miλ + xλ[k], ∀λ (3-64)

⟹ xj[k] ≥ xi[k] − mij (3-65)

and

[Mxx ∧ x[k]]_j = xj[k] (3-66)

⟺ xj[k] = ∧_{λ=1}^{n} (mjλ + xλ[k]) (3-67)

⟺ xj[k] ≤ mjλ + xλ[k], ∀λ (3-68)

⟹ xj[k] ≤ xi[k] + mji. (3-69)

Recall from Section 3.1.1 that xi[k] − mij ≤ xi[k] + mji when the max operator is used to

encode, and so xj[k] is between the two boundaries. Q.E.D.
3.2.1 The Number of Cases

Only one dimension pair, (1, 2), was analyzed in the two-dimensional example. In
three dimensions, there were three dimension pairs, (1, 2), (1, 3), and (2, 3). The number

of dimension pairs for n dimensions is n(n − 1)/2. This means that analysis by the geometry of

dimension pairs does not scale well. For 20 dimensions the number of dimension pairs is
190!

Luckily, by examining the system equation the description of the state space is

simplified. Let

Aij be the matrix whose entries are 1 where the row and column indices match and 0 elsewhere, i.e. the identity matrix, (3-70)

and let

bij = 0, the zero vector. (3-71)

Equations 3-70 and 3-71 lead to the identity transform Aijx[k] + bij = x[k], ∀x[k].

Of course, this is only one of many system equations that could lead to an identity, i.e. a

fixed point, for some x[k]. In order to fully describe the fixed point set of an n-dimensional

lattice based associative memory using this linear systems approach it is necessary to

describe all such pairs that might yield a fixed point. This is actually simpler than it

might appear. The following theorem addresses the power of the identity transform in

describing the fixed point set.

THEOREM 3.2.3. The system equation of the linear subsystem operating within the

fixed point set is Aijx[k] + bij = x[k], where Aij is the identity matrix and bij is the

zero vector, provided that ties during decoding are broken in favor of the diagonal element.

Proof:

For the n-dimensional lattice based associative memory using max encoding, the

decoding of the ith row of the output vector is handled in the following manner:

xi[k + 1] = ∧_{λ=1}^{n} (miλ + xλ[k]). (3-72)

In the event of a fixed point, i.e. when x[k + 1] = x[k], then

∧_{λ=1}^{n} (miλ + xλ[k]) = xi[k]. (3-73)

The case described above, in which Aij is the identity matrix and bij is the zero vector, is

only one case in which a fixed point could occur. It is also possible that, for some j / i,

(mij + xj[k]) = xi[k], which is described by a different system equation.

Because the diagonal elements of the Mxx matrix are always zero, however, it is true

that (mii + xi[k]) = xi[k] as well. This means that there is a tie during the sort between

(mii + xi[k]) and (mij + xj[k]). In fact, there is an even stronger implication, which is

that every element between (mii + xi[k]) and (mij + xj[k]) (after the sort is performed) is

equal.

This means that the value of xi[k + 1] = xi[k] can be calculated in at least two

ways. In one case, (mii + xi[k]) = xi[k], in which case the ith row of Aij is all zeros save

for a 1 in the ith location and [bij]i = mii = 0; in the other, (mij + xj[k]) = xi[k], in which

case the 1 falls in the jth location and [bij]i = mij. Since the same value is reached either way,

any system equation that yields a fixed point will result in the identity

transformation if ties are broken in favor of the diagonal elements (mii + xi).

Similar logic will show that this is true for min-encoded memories as well. Q.E.D.

3.2.2 Geometry of the Fixed Point Set

The geometry of the n-dimensional fixed point set is more complicated. For each of

the n(n − 1)/2 dimension pairs it is necessary for a fixed point to reside within the inner region

defined by the two boundaries. Simply verifying that a point is within the bounds of a

particular dimension pair is not sufficient, however, as some of the points in the inner

region of dimension pair (i, j) may be in an outer region of the state space with respect to

dimension pair (p, q), and so forth. The intersection of the n(n − 1)/2 sets of points, each existing

within the middle region of a certain dimension pair, is the fixed point set. Points outside

this intersection might be fixed with respect to one or more dimension pairs, but will not

be fixed points of the entire n-dimensional system.

The following two figures provide an example. The points in Fig. (3-3) are within

the state space boundaries, and are therefore fixed with respect to dimension pair (i, j).

Examining dimension pair (p, q) in Fig. (3-4) shows that many of the potential fixed

points are ruled out. Those points in the outside regions are not fixed in dimension p and

q and, therefore, these points cannot be fixed points of the overall system. Furthermore,

points that are within the inner regions of both figures may still be outside the fixed point

region of another dimension pair, and so even they are not guaranteed to be fixed points.
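The dimension-pair characterization can be checked against direct decoding. The following sketch, with illustrative helper names, verifies for a max-encoded memory that a point satisfies xj ≥ xi − mij for every ordered dimension pair exactly when min decoding leaves it unchanged.

```python
import numpy as np

def encode_max(X):
    """m_ij = max over stored patterns of (x_i - x_j), Eq. (3-20)."""
    M = np.full((X.shape[1], X.shape[1]), -np.inf)
    for x in X:
        M = np.maximum(M, np.subtract.outer(x, x))
    return M

def decode_min(M, x):
    return (M + x).min(axis=1)

def in_fixed_point_set(M, x, tol=1e-12):
    """x is fixed iff x_j >= x_i - m_ij for every dimension pair (i, j)."""
    n = len(x)
    return all(x[j] >= x[i] - M[i, j] - tol
               for i in range(n) for j in range(n) if i != j)

rng = np.random.default_rng(1)
X = rng.random((50, 5))            # 50 stored patterns in R^5
M = encode_max(X)

# stored patterns pass the pairwise test and are fixed under decoding
for x in X:
    assert in_fixed_point_set(M, x)
    assert np.allclose(decode_min(M, x), x)

# an arbitrary probe gives the same verdict under either test
probe = rng.random(5) * 4 - 2
assert in_fixed_point_set(M, probe) == np.allclose(decode_min(M, probe), probe)
```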

3.3 Verification of Canonical Results

The results of the previous section match the results presented in Ritter and Gader

[6]. A point x is a fixed point of a min-encoded memory if and only if, for every dimension pair (i, j), xj is in the

region between the hyperplanes xj = xi − wij and xj = xi + wji. Similarly, for a

max-encoded memory xj must be in the region between xj = xi − mij and xj = xi + mji.

This subject is revisited in Section 4.1.

For an n-dimensional lattice based associative memory, let l be the remaining

dimension, such that {d1} ∪ {d2} ∪ ⋯ ∪ {l} ∪ ⋯ ∪ {dn} = {1, 2, …, n}; then the

intersection Eo(el), as described in Section 2.4.5, is the set of points z such that zl = 0, i.e.

Eo(el) = {z ∈ R^n : zl = 0}. (3-74)

Every point x in the fixed point set defines a set of lines {x(l) = {a + x}} for all a ∈ R.

Any point y ∈ x(l) is also in the fixed point set, since adding the same constant a to each

dimension will not change the relationship between any of the dimensions. Therefore, if y

is a fixed point of the system then (y − yl) ∈ Eo(el) is in the fixed point region of every

dimension pair (i, j). The set of points (y − yl) fits the definition of Fl(X) given in Section

2.4.5:

Fl(X) = {(y − yl) ∈ Eo(el) : ∃ y ∈ F(X) with y(l) = (y − yl)}. (3-75)

And as before the set of points corresponding to ∪_{a ∈ R} [a + Fl(X)] describes the fixed point

set of the system.
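The translation invariance used in this argument, that shifting a fixed point by the same constant a in every dimension yields another fixed point, is also easy to confirm numerically (a minimal sketch with illustrative names):

```python
import numpy as np

def encode_max(X):
    """m_ij = max over stored patterns of (x_i - x_j)."""
    M = np.full((X.shape[1], X.shape[1]), -np.inf)
    for x in X:
        M = np.maximum(M, np.subtract.outer(x, x))
    return M

def decode_min(M, x):
    return (M + x).min(axis=1)

rng = np.random.default_rng(2)
X = rng.random((20, 4))
M = encode_max(X)

# a stored pattern is fixed, and so is any uniform shift of it, since
# adding the same constant to every dimension preserves the pairwise
# relationships that define the fixed point region
x = X[0]
for a in (-3.0, 0.5, 10.0):
    assert np.allclose(decode_min(M, x + a), x + a)
```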

3.4 Advantage of Piecewise Linear Framework

This framework is not intended to replace the current one, but rather to augment

it. Using techniques and terminology common to most engineers, it provides a novel

and widely accessible toolbox to supplement the current lattice algebra based method.

In addition to providing an additional entry point into the study of lattice based

associative memories, for those who may prefer a more familiar theoretical foundation,

the introduction of a novel language with which to describe the system may lead to unique

advancements. One such advancement is the generalization of lattice based memories to

those encoded with other order statistics, which is the subject of the next chapter.

Figure 3-1: The three regions defined by the 2D piecewise linear memory using max
encoding

Figure 3-2: Vector field for each of the three regions defined by an example using max
encoding

Figure 3-3: These points are fixed in dimension pair (i, j), but may not be fixed points.

Figure 3-4: Some of these points are fixed in dimension pair (p, q), but may not be fixed
points.

CHAPTER 4
ENCODING AND DECODING OPERATORS

Viewing the canonical lattice based associative memories as the result of encoding key

vectors using a specific order statistics based operator, either the min or the max, opens

the door to using other order statistics operators during the encoding process as well.

Generalizing the encoding operator in this way is shown to provide control over the width of the fixed point set, in terms of the distance between its boundaries, which makes it possible to trim a particular class of outliers from the initial input vectors. It is shown

that without this control, outliers in the initial input data can make the fixed point set

arbitrarily wide. This means that the fixed point set might not fit the majority of the data

points closely and, on average, the associative memory may perform quite poorly when

presented with noisy versions of initial data points. The improved performance gained

from having control over the width of the fixed point set is quantified and tested using a

widely available computer science data set.

4.1 Order Statistics for Encoding Operators

For the purposes of this discussion the process of creating a memory using the bth order statistic will be denoted

φij = (xi^ξ - xj^ξ)_(b), (4-1)

where (b) refers to the bth element of (xi^1 - xj^1, ..., xi^k - xj^k) when sorted from least to greatest.
This means that the canonical method of encoding a set of k vectors using the min and max operators, respectively, becomes

wij = (xi^ξ - xj^ξ)_(1), ξ ∈ {1, ..., k}, (4-2)

and

mij = (xi^ξ - xj^ξ)_(k), ξ ∈ {1, ..., k}. (4-3)

The vector of OWA weights used for this encoding would only differ from the k-dimensional

zero vector at the bth index, which would contain a 1.
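The encoding rule just described can be sketched in a few lines of NumPy. The `encode` helper below is a hypothetical illustration, not code from this work; it builds the full n × n memory by taking the bth smallest of the k pairwise differences for every dimension pair.

```python
import numpy as np

def encode(X, b):
    """Encode the n x k key matrix X with the b-th order statistic (1-indexed).

    Entry (i, j) of the memory is the b-th smallest of the k differences
    x_i - x_j across the keys, so b = 1 yields the canonical min memory
    W_XX and b = k yields the canonical max memory M_XX.
    """
    # diffs[i, j, :] holds the k differences x_i - x_j, one per key vector.
    diffs = X[:, None, :] - X[None, :, :]
    # Sort along the key axis and keep the b-th smallest element.
    return np.sort(diffs, axis=2)[:, :, b - 1]

# Two 3-dimensional keys; b = 1 and b = k recover the canonical memories.
X = np.array([[0.0, 2.0],
              [3.0, 1.0],
              [0.0, -1.0]])
W = encode(X, 1)           # canonical min-encoded memory
M = encode(X, X.shape[1])  # canonical max-encoded memory
```

For the canonical cases the sort is unnecessary (`diffs.min(axis=2)` and `diffs.max(axis=2)` suffice); the sort is what makes the intermediate order statistics available.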

For decoding, which takes place over the n dimensions of the state vector, let the decoding operator for the bth order statistic be denoted ⊡_b, which means that the min operator will be denoted ⊡_1 and the max operator will be denoted ⊡_n. For a key vector x and memory Wxx, then, the canonical decoding process would be written

Wxx ∨ x = Wxx ⊡_n x, (4-4)

and likewise

Mxx ∧ x = Mxx ⊡_1 x. (4-5)

For a system decoded using the max product a fixed point is described as

xi[k + 1] = [Wxx ∨ x[k]]i = xi[k], ∀i ∈ {1, ..., n}. (4-6)

Likewise, for a system decoded using the min product operator a fixed point is described as

xi[k + 1] = [Mxx ∧ x[k]]i = xi[k], ∀i ∈ {1, ..., n}. (4-7)

Recall that, when the max operator is used during encoding, a vector x[k] is a fixed point if and only if for every dimension i ∈ {1, ..., n}, [Mxx ∧ x[k]]i = xi[k + 1] = xi[k]. This means that for every dimension pair (i, j),

[Mxx ∧ x[k]]i = xi[k], (4-8)

and

[Mxx ∧ x[k]]j = xj[k]. (4-9)

This leads to the interesting result that the difference xi[k] - xj[k] must be less than or equal to the largest difference xi^ξ - xj^ξ, and the difference xj[k] - xi[k] must be less than or equal to the largest difference xj^ξ - xi^ξ, over all vectors x^ξ ∈ X. This comes from the fact that

xi[k] ≤ mij + xj[k], (4-10)

xi[k] - xj[k] ≤ mij, (4-11)

and

xj[k] ≤ mji + xi[k], (4-12)

xj[k] - xi[k] ≤ mji. (4-13)

Equations (4-10) and (4-12) come from Section 3.3, which states that a point x is a fixed point if and only if for every dimension pair (i, j), xj is in the region between the hyperplanes xj = xi - mij and xj = xi + mji. Similar logic will show that when min-encoding is used, x is a fixed point if and only if the difference xi[k] - xj[k] is greater than or equal to the smallest difference xi^ξ - xj^ξ, and the difference xj[k] - xi[k] is greater than or equal to the smallest difference xj^ξ - xi^ξ, over all vectors x^ξ ∈ X.
Section 4.1.1 will discuss how this is handled when encoding using order statistics

other than the min and the max.

4.1.1 Order Statistics-Based Associative Memories

Though associative memories based on the first and last order statistics (min and

max) have been explored, no literature exists describing the use of any other order statistic
in the encoding process. This section describes the behavior of such memories. Again,

elements are assumed to be sorted in increasing order.

Encoding a lattice based associative memory with the bth order statistic yields a memory Φxx such that

φij = (xi^ξ - xj^ξ)_(b). (4-14)

Decoding using the min operator, then, means that

xi[k + 1] = [Φxx ∧ x[k]]i = ⋀_{j=1}^{n} (φij + xj[k]), (4-15)

and decoding using the max operator implies

xi[k + 1] = [Φxx ∨ x[k]]i = ⋁_{j=1}^{n} (φij + xj[k]). (4-16)

THEOREM 4.1.1. When using the bth order statistic for encoding and the first order statistic for decoding, a vector x is a fixed point if and only if for all dimension pairs (i, j) the difference xi[k] - xj[k] is less than or equal to the bth difference (xi^ξ - xj^ξ)_(b), that is,

xi[k] - xj[k] ≤ φij. (4-17)

Proof: For all i, j ∈ {1, ..., n},

xi[k] - xj[k] ≤ (xi^ξ - xj^ξ)_(b) = φij (4-18)

⇔ xi[k] ≤ xj[k] + φij (4-19)

⇔ xi[k] = ⋀_{j=1}^{n} (φij + xj[k]) (since φii = 0 the minimum over j is attained at j = i) (4-20)

⇔ xi[k] = [Φxx ∧ x[k]]i. (4-21)

Q.E.D.
THEOREM 4.1.2. When using the bth order statistic for encoding and the last order statistic for decoding, a vector x is a fixed point if and only if for all dimension pairs (i, j) the difference xi[k] - xj[k] is greater than or equal to the bth difference (xi^ξ - xj^ξ)_(b).

Proof: For all i, j ∈ {1, ..., n},

xi[k] - xj[k] ≥ (xi^ξ - xj^ξ)_(b) = φij (4-22)

⇔ xi[k] ≥ xj[k] + φij (4-23)

⇔ xi[k] = ⋁_{j=1}^{n} (φij + xj[k]) (since φii = 0 the maximum over j is attained at j = i) (4-25)

⇔ xi[k] = [Φxx ∨ x[k]]i. (4-26)

Q.E.D.
THEOREM 4.1.3. When the bth order statistic is used to encode a set of keys and the

first order statistic (min) is used for decoding, the system will converge in one iteration.

Proof: By Equations (4-14) and (4-15), for any vector that is not fixed in some dimension i, xi is replaced with the minimum value of (φij + xj) across all of the dimensions j ∈ {1, ..., n}. This vector will then be fixed in this dimension because xi ≤ φij + xj for all j. Since this is true for all i, if a vector x is not a fixed point originally, it will be after one iteration. Q.E.D.

THEOREM 4.1.4. When the bth order statistic is used to encode a set of keys and the last order statistic (max) is used for decoding, the system will converge in one iteration.

Proof: By Equations (4-14) and (4-16), for any vector that is not fixed in some dimension i, xi is replaced with the maximum value of (φij + xj) across all of the dimensions j ∈ {1, ..., n}. This vector will then be fixed in this dimension because xi ≥ φij + xj for all j. Since this is true for all i, if a vector x is not a fixed point originally, it will be after one iteration. Q.E.D.
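The one-step convergence claimed by these theorems can be checked numerically; the sketch below uses hypothetical helper names and illustrative data, and exercises the b = k instance, i.e. canonical max encoding followed by min-product decoding.

```python
import numpy as np

def decode_min(Phi, x):
    """One min-product decoding step: x_i <- min_j (phi_ij + x_j)."""
    return np.min(Phi + x[None, :], axis=1)

# Three 4-dimensional keys, encoded with the last (max) order statistic.
X = np.array([[0.0, 2.0, 1.0],
              [3.0, 1.0, 2.0],
              [0.0, -1.0, 0.0],
              [1.0, 0.0, 2.0]])
M = (X[:, None, :] - X[None, :, :]).max(axis=2)

x0 = np.array([5.0, -4.0, 0.0, 2.0])  # an arbitrary non-fixed state
x1 = decode_min(M, x0)                # one decoding step
x2 = decode_min(M, x1)                # a second step changes nothing
```

After one step `x1` already satisfies xi ≤ mij + xj for every dimension pair, so the second step returns `x1` unchanged.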

At this point the reader may wonder what happens when an order statistics operator

outside of the min or the max is used during decoding. Though memories using such

decoding operators were not proposed as a topic for this research, they might provide an

interesting research topic in the future.

For now, consider a case in which the min operator has been used during encoding and the bth order statistic is used during decoding. In order for a vector x to be a fixed point, [Wxx ⊡_b x]i = xi for every dimension i. This means that the bth smallest of the values (wij + xj[k]) must equal xi. What this means is that out of the n dimensions in state vector x, there are precisely b-1 dimensions j such that (wij + xj) < xi and n-b dimensions p such that (wip + xp) > xi. This means that there are exactly b-1 cases in which wij < xi - xj and n-b cases in which wip > xi - xp.

Likewise, in the event that the max operator has been used during encoding and the bth order statistic is used during decoding, x is a fixed point if and only if [Mxx ⊡_b x]i = xi for every dimension i. This means that the bth smallest of the values (mij + xj) must equal xi. Therefore, out of the n dimensions in state vector x, there are precisely b-1 dimensions j such that (mij + xj) < xi and n-b dimensions p such that (mip + xp) > xi. This means that there are exactly b-1 cases in which mij < xi - xj and n-b cases in which mip > xi - xp.

These can be extended to cases in which non-max and non-min order statistics are used for both encoding and decoding. As can be seen, the fixed point sets, if they exist, are defined by sets of inequalities that depend on the result of sorting the data, and may provide an interesting topic for future research. Section 4.1.2 illustrates the usefulness of encoding with such order statistics, however, and Chapter 5 discusses a family of memories using non-min and non-max decoding in detail.

4.1.2 Order Statistics Based Encoding Operators for Outlier Robustness

The beauty of encoding a set of vectors using generalized order statistics based operators is that it allows for a certain amount of robustness to outliers in the input keys. Encoding a memory using the bth order statistic implies that for any dimension pair (i, j) the upper bound for the difference (xi - xj) for any fixed point x is the bth smallest difference found between dimensions i and j among the original key vectors x^ξ ∈ X.

This means that the original patterns in X are not required to be free of impulse

noise, as they are in the canonical min/max lattice based associative memories.
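A two-line check makes this robustness concrete. The numbers below are illustrative, not taken from this work: across five keys, one impulse-corrupted difference dominates the canonical max bound, while the next-to-max bound simply trims it.

```python
import numpy as np

# Differences x_i - x_j for one dimension pair across five keys;
# the last key carries impulse noise (illustrative values).
diffs = np.array([1.0, 2.0, 3.0, 2.5, 100.0])

m_ij = np.sort(diffs)[-1]    # canonical max bound, dominated by the outlier
phi_ij = np.sort(diffs)[-2]  # next-to-max bound, the outlier is trimmed

print(m_ij, phi_ij)  # 100.0 3.0
```

The canonical boundary line for this pair would sit 100 units away, while the next-to-max boundary stays near the bulk of the data.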

As an example, let X be a set of key vectors in four dimensions. Encoding X with the max operator yields the canonical memory Mxx (4-28), while encoding with the second-largest order statistic (next-to-max) yields the memory Φxx (4-29).

Examining the dimension pair (1, 2), as illustrated in Figure 4-1, the width of the fixed point set is clearly less when using the memory Φxx than the width when using memory Mxx. Here, the first two dimensions of each vector in X are plotted along with the lines L1, representing x2 = x1 + m21, and L2, representing x2 = x1 - m12. These lines, shown in orange, represent the boundaries of the fixed point set with respect to memory Mxx. This is the traditional fixed point set of the canonical lattice based associative memories.

On the other hand, the lines L3, representing x2 = x1 + φ21, and L4, representing x2 = x1 - φ12, are shown as dashed black lines. These represent the boundaries of the fixed

point set with respect to the memory encoded with the next-to-max operator. Notice that

this fixed point set is narrower, trimming out the most extreme point in either direction.

In Figure 4-2 each of the points in X is plotted for dimension pair (1, 3), with similar

results. The fixed point set is narrowed to exclude the most extreme elements on either

side.

Figure 4-3 shows a case where the narrower fixed point set collapses into a single

line. This happens when all of the remaining points are collinear and is more likely when

the number of elements in the set X is small, or the number of outliers trimmed is large.

Evidence of this behavior is shown again in Figure 4-5, using dimension pair (2, 4). Figure 4-4 and Figure 4-6 contain the same plot for dimension pairs (2, 3) and (3, 4), respectively.

Note that the same fixed point set would be created by encoding with the min

operator, in the canonical memory case, or with the second order statistic (next-to-min) in

the case of the order statistics based memory. In this case the canonical memory would be

Wxx =
     0   0  -1  -1
    -6   0  -4  -2
    -7  -6   0  -5
    -4  -1  -3   0     (4-30)

Likewise, the next-to-min encoded memory, labeled Φxx just like its predecessor, would be

Φxx =
     0   1   1   2
    -3   0  -3   1
    -3   1   0   2
    -2  -1  -1   0     (4-31)

Note that Φxx is a general name given to a memory encoded using any order statistics or OWA based operator. This Φxx has been encoded with the next-to-min operator, unlike the previous one, which had been encoded with the next-to-max. In the future, whenever this ambiguity is possible, a memory encoded using the bth order statistic operator will carry a superscripted b, like Φ^b_xx, in order to distinguish it from a memory encoded using a different order statistic operator.

Figure 4-7 shows the boundaries for dimension pair (1, 2). Note that this picture is extremely similar to Figure 4-1 except for the fact that the boundary lines have changed places. That is to say, lines L1 and L2 have switched places, as have lines L3 and L4. This does not change the fixed point sets themselves, however. Theorem 4.1.5 explains why this happens.

THEOREM 4.1.5. Let Φ^b_xx be the associative memory encoded with the bth order statistic, using some set of k input vectors X in n dimensions. Let Φ^(k+1-b)_xx be the memory encoded with the order statistic k+1-b, using the same set of input vectors X. Then φ^b_ij = -φ^(k+1-b)_ji.

Proof: In order to create the memory element φ^b_ij, the list (xi^1 - xj^1, xi^2 - xj^2, ..., xi^k - xj^k) is sorted to become ((xi - xj)_(1), (xi - xj)_(2), ..., (xi - xj)_(k)). At this point the bth element of the sorted list is chosen and φ^b_ij = (xi - xj)_(b). In order to create the memory element φ^(k+1-b)_ji, the list (xj^1 - xi^1, xj^2 - xi^2, ..., xj^k - xi^k) is sorted and element k+1-b is chosen. Negating every element of a list reverses its sorted order, so (xj - xi)_(k+1-b) = -(xi - xj)_(b), which means that φ^(k+1-b)_ji = -φ^b_ij. Q.E.D.

Since φ^b_ij = -φ^(k+1-b)_ji, the lines xj = xi - φ^b_ij and xj = xi + φ^(k+1-b)_ji are the same, as are the lines xj = xi + φ^b_ji and xj = xi - φ^(k+1-b)_ij. Therefore the fixed point sets are the same, but the boundaries have switched places.
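Because negating a list reverses its sorted order, the bth smallest of the differences xi - xj is the negative of the bth largest of the differences xj - xi. This antisymmetry between complementary order statistics can be verified numerically; the snippet below uses random data and hypothetical variable names.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 6))  # n = 4 dimensions, k = 6 keys
k = X.shape[1]

diffs = X[:, None, :] - X[None, :, :]
sorted_diffs = np.sort(diffs, axis=2)

for b in range(1, k + 1):
    Phi_b = sorted_diffs[:, :, b - 1]  # b-th smallest of x_i - x_j
    Phi_c = sorted_diffs[:, :, k - b]  # complementary order statistic
    # Entrywise, the b-th smallest of (x_i - x_j) equals the negative of
    # the complementary entry of the transposed memory.
    assert np.allclose(Phi_b, -Phi_c.T)
```

The b = 1 and b = k cases recover the familiar relation between the canonical min and max memories.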

This section illustrates the ability of order statistics encoded memories to change the width of the fixed point set, but it does not explain the benefit of having such control. This benefit will become clear in Section 4.2, where this behavior is extended to the n-dimensional case and the application of this technique is explained.

4.2 The Decoding Process

Recall that in Section 3.2.1 it was shown that for any canonical lattice based associative memory the A, b pair associated with a fixed point was always the identity matrix and the zero vector, assuming that ties in the decoding sort were always broken in favor of the diagonal element (φii + xi).

Since this discussion dealt only with the decoding process it was not limited to

memories created with the canonical encoding operators, it is a property of all memories

that are decoded with canonical operators. In fact, since the order statistics based

memories presented here use canonical, i.e. either min or max, decoding operators the

property detailed in Theorem 3.2.3 is true of the order statistics based memories as well.

Consider a simple three-dimensional example in which

X =
     0   2   1  -1   1
     3   1   2   0   3
     0  -1   0  -3   1     (4-32)

In this case, the canonical max-encoded memory is

Mxx =
     0   1   3
     3   0   3
     0  -2   0     (4-33)

The memory encoded with the next-to-max operator is

Φxx =
     0  -1   2
     2   0   3
     0  -2   0     (4-34)

Next, consider the noisy vector x̃ = (0, -4, 0). Figures 4-8 through 4-13 show the result of decoding x̃ using Mxx and Φxx. In general, points below both of the boundaries move to the left to the nearest boundary, and points above both boundaries move downward to the nearest boundary. The only exception is that the movement of a point in dimension pair (i, j) can be affected by movement in any dimension pair sharing either dimension i or dimension j. An example of this behavior is given in Figures 4-10 and 4-11. Since x̃ is on the boundary in this case it is fixed in dimension pair (1, 3), but it still moves as a result of being outside of the fixed point set in dimension pairs (1, 2) and (2, 3).

This example is not meant to illustrate the usefulness of general order statistic

encoding operators, but simply to show the difference between the canonical and general

memories. The usefulness of such memories will become clear in the next section.

4.3 Noise and the Narrow Fixed Point Set

For an n-dimensional example, consider the case where X is an n-dimensional matrix containing 11 vectors. For the sake of this example, only two rows need to be defined. Let row i be

[X]i = (2, 3.5, 4, 4.5, 5, 5.5, 6, 7, 7.5, 3.5, -8), (4-36)

and let row j be defined such that

[X]j = (6, 4, 5, 6, 3, 6, 5.5, 3, 5, -5, 9). (4-37)

Without specifying all of the elements in X it is impossible to calculate an entire memory, but only the entries created from dimensions i and j are necessary for this example. Here, mij = 8.5, mji = 17, φij = 4, and φji = 4.
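The four memory entries for this dimension pair can be computed directly from the two rows; a short script (hypothetical variable names) recovers the max entries 8.5 and 17 and the next-to-max entries of 4 in both directions.

```python
import numpy as np

row_i = np.array([2, 3.5, 4, 4.5, 5, 5.5, 6, 7, 7.5, 3.5, -8])
row_j = np.array([6, 4, 5, 6, 3, 6, 5.5, 3, 5, -5, 9])

d = np.sort(row_i - row_j)   # the 11 differences x_i - x_j, ascending
m_ij, phi_ij = d[-1], d[-2]  # max and next-to-max of x_i - x_j
m_ji, phi_ji = -d[0], -d[1]  # max and next-to-max of x_j - x_i

print(m_ij, m_ji, phi_ij, phi_ji)  # 8.5 17.0 4.0 4.0
```

The reversal used here, max(xj - xi) = -min(xi - xj), is the same antisymmetry between complementary order statistics noted in Theorem 4.1.5.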


Figure 4-14 contains a plot of each point in X, along with the lines that form the boundaries for the fixed point set of each memory, as before. Note that points x^10 and x^11 are extreme outliers for this data set, and for this example they are presumed to be the result of some sort of noise in the original input set X.

This causes the boundaries of the fixed point set for the canonical lattice based associative memory Mxx to be much wider than necessary for the rest of the data points. The implications of this widening are twofold. First, points in X can be greatly distorted while still remaining in the fixed point set. Secondly, a distorted point, once pushed outside of the fixed point set, cannot be brought back as close to its original value because the boundary is too far away.

An example of this problem is given in Figure 4-15. Here, distorted versions of the

initial data points are shown as black diamonds. The two example points shown here

illustrate two different problems with the canonical set of encoding operators.

Assume that the point located at (-3, 8) is a noisy version of any of the points in X, except the outliers. Because of the extreme outliers in the initial data, and the over-widening of the fixed point set that they produce, this noisy point is still within the fixed point set of the canonical lattice based associative memory. This noisy point will therefore not be changed and will remain just as distant from its original value. If the next-to-max encoding operator had been used, the fixed point set would have better fit the initial data and the noisy sample would be moved closer to its original value.

Another consideration is the treatment of points that are located outside of an over-wide fixed point set. Again considering Figure 4-15, assume that the point located at (7, -3) is a distorted version of one of the points in the initial set, other than the outliers. Though the noisy point is outside of the fixed point set, and therefore will be moved to the fixed point set in one iteration, it will not be able to move as closely to its original value because the boundaries are too far removed from most of the points.

Using the next-to-max encoding operator trims the outliers, so the fixed point set is more descriptive of the remaining data points. Of course, in the event that the outliers are legitimate data points and not the result of noise, it may be optimal to use the canonical encoding memories rather than trimming them. Given their distance from the rest of the points, however, it may be better overall to trim them so that the majority of the points can be brought closer to their original values if distorted.

4.4 Performance on Real Data

In order to quantify the improvement that order statistics based encoding can have in the presence of outliers in the initial inputs, two experiments were carried out on a set of real-world data.

4.4.1 The Data

The data set chosen for this experiment is the auto-mpg data from the Carnegie Mellon University Statistics Library. It contains 398 records for the city-cycle fuel consumption of various automobiles [43]. The auto-mpg data was originally used to test graphical analysis packages at the 1983 American Statistical Association Exposition and is currently available from the University of California, Irvine Machine Learning Repository [44].

The auto-mpg data is intended for predicting the fuel efficiency of an automobile using 4 continuous variables and 3 discrete variables. Included in each data point is the fuel efficiency of each car, in miles per gallon. The order of the variables is shown in Table 4-1. Histograms of the values taken in each of the dimensions are shown in Figures 4-16

Table 4-1: Data Dimensions
Dimension Variable Type
1 mpg continuous
2 cylinders discrete
3 displacement continuous
4 horsepower continuous
5 weight continuous
6 acceleration continuous
7 model year discrete
8 origin discrete

through 4-23. From Figure 4-16 it is clear that the fuel efficiency of the various vehicles covers a broad range, from 10 mpg to 45 mpg, though the bulk of these lie between 10 and 35. Primarily four-cylinder cars were used, as shown in Figure 4-17, though 6-cylinder and 8-cylinder models were used as well. No vehicle had as much as 250 horsepower, though their weights varied greatly (from 1,500 lbs to 5,000 lbs). All vehicles were made between 1970 and 1982.

4.4.2 The First Experiment

The intent of these experiments is to compare the behavior of lattice based associative memories with both the canonical encoding weights and the more general order statistics weights. For the initial experiment, outliers were only considered in a single dimension, for the sake of simplicity. The fourth dimension (horsepower) was chosen because there are luxury cars that fit within the appropriate model years which have horsepower ratings far in excess of the 250 horsepower limit found here.

First, a single element from the database was chosen randomly and its horsepower rating was changed to an extreme value of 300. For the purposes of this experiment, this change is assumed to be the result of noise in the initial data. In general such a change could also be considered simply an outlying point that is still valid. In this case, however, since the element was chosen randomly it might not make sense to believe that the vehicle chosen has 300 horsepower and so, strictly speaking, this data point should be considered noisy. In point of fact, this is not a very important distinction for this experiment since the purpose is to explore the behavior of lattice based memories in the presence of such outliers and not to concentrate on anything related to automobiles specifically.

Both the canonical memory Mxx and the next-to-max memory Φxx are created using this slightly distorted version of the input data. Figure 4-24 shows the boundaries for both memories with respect to dimension pair (1, 2). The boundaries for the fixed point set of Mxx are the wider ones, shown in orange. The boundaries for Φxx are the narrower ones, shown as dashed black lines. Since Φxx was encoded using the next-to-max operator, exactly one data point is trimmed off of each boundary. Figure 4-25 is similar, but for dimension pair (1, 4). Here, the lower boundaries of the two memories are so close as to be almost indistinguishable from one another, but the top boundaries are quite different. The canonical memory Mxx must include the outlying point in its fixed point set while Φxx leaves it out.

Once the two memories were created, a random vector was picked from the data set and its 4th dimension was given a new value randomly picked from between 300 and 500, a reasonable upper bound for the time period. This distorted vector was then given as input to the system and the output was computed. The squared error between the fourth dimension of the output vector and that of the original vector was computed for both memories. This test was performed 1,000 times. The histogram of errors obtained for the memory Mxx is shown in Figure 4-26 and that for memory Φxx is given in Figure 4-27. Note that although the histograms from both memories overlap somewhat, the error for Φxx is significantly smaller, on average, than that of Mxx.

For each of the 1,000 tests the squared error was computed for each of the remaining dimensions, and the error in dimensions 1-8 was averaged for each sample. Figure 4-28 shows a histogram of the rmse values for the output of Mxx while Figure 4-29 shows the same for Φxx. This benefits Mxx, as its output has no error in any dimension other than 4, while Φxx might incur some error in any dimension if one of the points chosen was one of the points left out during the encoding process. Still, memory Φxx outperforms its canonical counterpart.

4.4.3 The Second Experiment

In the second experiment a single vector is chosen at random and dimension 2 (cylinders), dimension 4 (horsepower), and dimension 7 (model year) are all distorted as dimension 4 was in the first experiment. This noise vector is given 10 cylinders, 350 horsepower, and a model year of 1983. Though a car with these specs may or may not exist in real life, each individual specification is realistic and, again, the point of this exercise is to quantify the difference between the two memories as it pertains to data in general.

Next, each of the vectors is chosen in turn and distorted in dimensions 2, 4, and 7. The distorted vector is applied to each memory as input and the output is compared to the original, clean version of the vector. The root mean squared error between the output vector and the original vector is computed and stored. This entire process is repeated 1,000 times. Figure 4-30 shows a histogram for the rmse values pertaining to Mxx. Figure 4-31 shows a histogram for the rmse values pertaining to Φxx. Again, the error is substantially lower for Φxx.

4.5 Conclusion

Effective associative memories can be created using order statistic operators other than the min and the max. These encoding operators provide control over the width of the fixed point set, which in turn provides a degree of robustness to outliers and noise in the initial input data. In light of this fact, choosing the canonical encoding operators constitutes a choice on the part of the operator to include all input points in the fixed point set, which may have a negative impact on the error between the output of distorted vectors and their original, clean values.

Two experiments were carried out using a common computer science data set. These trials show that if even a single outlier is contained within the input data, the width of the fixed point set may be made arbitrarily large. This widening was shown to increase the error between the output of distorted vectors and their original, clean values. When using the bth order statistic to encode and the min to decode, k - b outliers will be trimmed from each boundary of each dimension pair (i, j).

Figure 4-1: Narrowing the fixed point set for dimension pair (1, 2).

Figure 4-2: Narrowing the fixed point set for dimension pair (1, 3).


Figure 4-3: Narrowing the fixed point set for dimension pair (1, 4).

Figure 4-4: Narrowing the fixed point set for dimension pair (2, 3).

Figure 4-5: Narrowing the fixed point set for dimension pair (2, 4).


Figure 4-6: Narrowing the fixed point set for dimension pair (3, 4).

Figure 4-7: The fixed point set for the min and next-to-min encoded memories.

Figure 4-8: Decoding with Mz,, dimension pair (1, 2).

Figure 4-9: Decoding with #z,, dimension pair (1, 2).

Figure 4-10: Decoding with Mz,, dimension pair (1, 3).

Figure 4-11: Decoding with #z,, dimension pair (1, 3).

Figure 4-12: Decoding with Mz,, dimension pair (2, 3).


Figure 4-13: Decoding with #z,, dimension pair (2, 3).

Figure 4-14: Narrowing the fixed point set for dimension pair (i, j).

Figure 4-15: Noise and the narrowed fixed point set.

Figure 4-16: Dimension 1: mpg.

Figure 4-17: Dimension 2: cylinders.

Figure 4-18: Dimension 3: displacement.

Figure 4-19: Dimension 4: horsepower.

Figure 4-20: Dimension 5: weight.

Figure 4-21: Dimension 6: acceleration.

Figure 4-22: Dimension 7: model year.

Figure 4-23: Dimension 8: origin.

Figure 4-24: Example boundaries from dimension pair (1, 2).

Figure 4-25: Example boundaries from dimension pair (1, 4).

Figure 4-26: Max encoding: squared error in the fourth dimension only.

Figure 4-27: Next-to-max encoding: squared error in the fourth dimension only.


Figure 4-28: Max encoding: rmse across all dimensions.

Figure 4-29: Next-to-max encoding: rmse across all dimensions.

Figure 4-30: Max encoding: rmse across all dimensions.

Figure 4-31: Next-to-max encoding: rmse across all dimensions.


CHAPTER 5
NEAR-MIN AND NEAR-MAX MEMORIES

An interesting pair of associative memories, known as near-min and near-max memories, is described here. Unlike the canonical memories, these do not converge in one iteration. In the future, this behavior will be described as not being "well behaved in the min-max sense."

Near-min and near-max memories, referred to collectively as near-min-max memories, look similar to a family of chaotic maps known as tent maps. The behavior of these near-min-max memories, which do not exhibit chaotic behavior, is the subject of this chapter.

5.1 Introduction to the Near-Min and Near-Max Memories

In the canonical min-encoded memories the encoding weight vector is w^E = (w^E_1, w^E_2, ..., w^E_k), where w^E_i = 0 for i ∈ {2, ..., k} and w^E_1 = 1. As always, sorted elements are assumed to appear ordered from least to greatest. The decoding weight vector for a min-encoded memory uses the max operator, where the weight vector is w^D = (w^D_1, w^D_2, ..., w^D_n). Here w^D_i = 0 for i ∈ {1, ..., n-1} and w^D_n = 1. Likewise, in the canonical max-encoded memories the encoding vector is w^E = (w^E_1, w^E_2, ..., w^E_k), where w^E_i = 0 for i ∈ {1, ..., k-1} and w^E_k = 1. This max-encoded memory is decoded using the min operator, where the weight vector is w^D = (w^D_1, w^D_2, ..., w^D_n), with w^D_i = 0 for i ∈ {2, ..., n} and w^D_1 = 1.

We can similarly define a near-min memory in which the encoding vector is closer to the min-encoded memory than it is to the max-encoded memory. That is to say, the encoding weight vector is defined such that w^E_i = 0 for i ∈ {2, ..., k-1}, w^E_1 ∈ [1/2, 1], and w^E_k = 1 - w^E_1. Such a memory is decoded using a weight vector where w^D_i = 0 for i ∈ {2, ..., n-1}, w^D_1 = w^E_k, and w^D_n = w^E_1 = 1 - w^D_1. A near-max memory, then, is a memory that is encoded with a vector closer to the max encoding vector than it is to the min encoding vector. Such an encoding vector is defined so that w^E_i = 0 for i ∈ {2, ..., k-1}, w^E_1 ∈ [0, 1/2], and w^E_k = 1 - w^E_1.

Encoding with a near-min-max weight vector produces a memory Φxx such that

φij = w^E_1 ⋀_{ξ=1}^{k} (xi^ξ - xj^ξ) + w^E_k ⋁_{ξ=1}^{k} (xi^ξ - xj^ξ). (5-1)

In this section, it is important to differentiate between the number of input vectors,

usually denoted k, and the current time step, which is also typically k. The solution will

be to temporarily use the variable t to denote the current time step. Decoding with a

memory using the near-min product or near-max product, denoted ,1 is defined such~

that

xi[L + 1] = [@txx x[t]]; = w? (4 t)+tDV(a+x[] (5-2)
=1 \Yh I=1h~l r

Here, the nature of the decoding process (near-min product or near-max product) will be

specified to avoid ambiguity.
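The decoding step in Equation (5-2) is just an OWA of the min and max products of the memory with the current state. A minimal NumPy sketch (the array shapes and variable names are illustrative assumptions, not the author's code):

```python
import numpy as np

def near_min_max_decode(theta, x, w1D):
    # One decoding step: x_i[t+1] = w1D * min_j(theta_ij + x_j)
    #                             + (1 - w1D) * max_j(theta_ij + x_j).
    S = theta + x[None, :]                 # S[i, j] = theta_ij + x_j[t]
    return w1D * S.min(axis=1) + (1.0 - w1D) * S.max(axis=1)

theta = np.array([[0.0, 0.3],
                  [-0.7, 0.0]])           # a hypothetical 2-D memory
x = np.zeros(2)
# w1D = 1 recovers the canonical min product; w1D = 0 the max product.
assert np.allclose(near_min_max_decode(theta, x, 1.0), (theta + x).min(axis=1))
assert np.allclose(near_min_max_decode(theta, x, 0.0), (theta + x).max(axis=1))
```

Setting w1D strictly between 0 and 1 produces the near-min and near-max products discussed below.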

Theorem 5.1.1 will demonstrate that a near-min-max memory is simply a weighted

average of the two canonical memories Wxx and Mxx.

THEOREM 5.1.1. For a near-min-max associative memory encoded with the weight vector w^E = (w_1^E, 0, ..., 0, w_n^E)^T, the memory Θ_XX is such that Θ_XX = w_1^E W_XX + w_n^E M_XX.

Proof: The canonical memories, W_XX and M_XX, are discussed in Chapter 2. They are defined such that w_ij = ∧_{ξ=1}^k (x_i^ξ - x_j^ξ) and m_ij = ∨_{ξ=1}^k (x_i^ξ - x_j^ξ). For the near-min-max memory described in Theorem 5.1.1,

θ_ij = w_1^E ∧_{ξ=1}^k (x_i^ξ - x_j^ξ) + w_n^E ∨_{ξ=1}^k (x_i^ξ - x_j^ξ)   (5-3)

     = w_1^E w_ij + w_n^E m_ij.   (5-4)

This means that Θ_XX = w_1^E W_XX + w_n^E M_XX. Q.E.D.
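Theorem 5.1.1 is easy to check numerically. The sketch below (NumPy, with assumed variable names) encodes a memory with the OWA weight vector (w_1^E, 0, ..., 0, w_n^E) applied to the sorted differences and confirms it equals the stated blend of the canonical memories W_XX and M_XX:

```python
import numpy as np

def owa_encode(X, w1E):
    # X: n x k, columns are the k stored patterns.
    # Sort the differences x_i^xi - x_j^xi over the patterns (ascending),
    # then apply the weight vector (w1E, 0, ..., 0, 1 - w1E), per (5-1).
    D = np.sort(X[:, None, :] - X[None, :, :], axis=2)
    return w1E * D[:, :, 0] + (1.0 - w1E) * D[:, :, -1]

rng = np.random.default_rng(0)
X = rng.random((4, 6))                      # n = 4 dimensions, k = 6 patterns
D = X[:, None, :] - X[None, :, :]
W_XX, M_XX = D.min(axis=2), D.max(axis=2)   # canonical memories

w1E = 0.8                                   # a near-min encoding weight
Theta = owa_encode(X, w1E)
# Theorem 5.1.1: Theta_XX = w1E * W_XX + wnE * M_XX
assert np.allclose(Theta, w1E * W_XX + (1 - w1E) * M_XX)
```

At the extremes w1E = 1 and w1E = 0 the blend reduces to W_XX and M_XX respectively.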

When there is no chance of ambiguity, the superscripts "E" and "D" are dropped from the weight vectors and their components in order to simplify the notation.

Recall that in Equation (3-40) we defined two lines that form the boundaries of the

fixed point set for the two-dimensional piecewise linear lattice based associative memory.

We will now refine this concept in order to discuss the properties of the near-min-max

associative memory.

For each dimension pair (i, j) let the line forming the top-most boundary of the inside case for (i, j) be known as L_T, and let the line forming the bottom-most boundary be known as L_B. Define L_T such that for all t

L_T : x_j[t] = x_i[t] - θ_ij if w_1^D ∈ [0, 1/2]
      x_j[t] = x_i[t] + θ_ji if w_1^D ∈ [1/2, 1].   (5-5)

It is convenient to introduce two new notational elements, N_0 ∈ [0, 1/2] and N_1 ∈ [1/2, 1], where N_0 + N_1 = 1. Here N_0 is "near zero" and N_1 is "near one". Note that w_1^D ∈ [0, 1/2] can then be written w_1^D = N_0 and w_1^D ∈ [1/2, 1] can then be written w_1^D = N_1. The same can be done for w_n^D, w_1^E and w_n^E. Also note that with the near-min-max lattice based associative memory w_n^E = 1 - w_1^E, w_n^D = 1 - w_1^D and, because near-min encoding is always paired with near-max decoding and vice versa, w_1^D = 1 - w_1^E = w_n^E. Equation (5-5) can then be re-written

L_T : x_j[t] = x_i[t] - (N_1 ∧_{ξ=1}^k (x_i^ξ - x_j^ξ) + N_0 ∨_{ξ=1}^k (x_i^ξ - x_j^ξ)) if w_1^D ∈ [0, 1/2]
      x_j[t] = x_i[t] + N_0 ∧_{ξ=1}^k (x_j^ξ - x_i^ξ) + N_1 ∨_{ξ=1}^k (x_j^ξ - x_i^ξ) if w_1^D ∈ [1/2, 1]   (5-6)

for all t. Since

-∧_{ξ=1}^k (x_i^ξ - x_j^ξ) = ∨_{ξ=1}^k (x_j^ξ - x_i^ξ)   (5-7)

and

-∨_{ξ=1}^k (x_i^ξ - x_j^ξ) = ∧_{ξ=1}^k (x_j^ξ - x_i^ξ),   (5-8)

the first branch of Equation (5-6) becomes

x_j[t] = x_i[t] + N_0 ∧_{ξ=1}^k (x_j^ξ - x_i^ξ) + N_1 ∨_{ξ=1}^k (x_j^ξ - x_i^ξ),   (5-9)

which is identical to the second branch. Equation (5-6) can therefore be simplified to

L_T : x_j[t] = x_i[t] + N_0 ∧_{ξ=1}^k (x_j^ξ - x_i^ξ) + N_1 ∨_{ξ=1}^k (x_j^ξ - x_i^ξ) for any w_1^D ∈ [0, 1].   (5-10)

Likewise, for all t let

L_B : x_j[t] = x_i[t] + θ_ji if w_1^D ∈ [0, 1/2]
      x_j[t] = x_i[t] - θ_ij if w_1^D ∈ [1/2, 1].   (5-11)

This means that

L_B : x_j[t] = x_i[t] + N_1 ∧_{ξ=1}^k (x_j^ξ - x_i^ξ) + N_0 ∨_{ξ=1}^k (x_j^ξ - x_i^ξ) if w_1^D ∈ [0, 1/2]
      x_j[t] = x_i[t] + N_0 ∨_{ξ=1}^k (x_j^ξ - x_i^ξ) + N_1 ∧_{ξ=1}^k (x_j^ξ - x_i^ξ) if w_1^D ∈ [1/2, 1]   (5-12)

for all t. Using our previous logic we can conclude that, for all t,

L_B : x_j[t] = x_i[t] + N_1 ∧_{ξ=1}^k (x_j^ξ - x_i^ξ) + N_0 ∨_{ξ=1}^k (x_j^ξ - x_i^ξ).   (5-13)

THEOREM 5.1.2. After one iteration, every state vector of an n-dimensional near-min-max memory lies within the boundaries L_T and L_B for each dimension pair (i, j).

Proof: Since the case when w_1^D = 1 represents the canonical min-decoded (max-encoded) memory M_XX, which is known to converge to the middle case for every dimension pair in one epoch, we know that

x_i[t] + N_1 ∧_{ξ=1}^k (x_j^ξ - x_i^ξ) + N_0 ∨_{ξ=1}^k (x_j^ξ - x_i^ξ) ≤ ∧_{h=1}^n (m_jh + x_h[t])   (5-14)

≤ x_i[t] + N_0 ∧_{ξ=1}^k (x_j^ξ - x_i^ξ) + N_1 ∨_{ξ=1}^k (x_j^ξ - x_i^ξ).   (5-15)

Likewise, since the case when w_1^D = 0 represents the canonical max-decoded (min-encoded) memory W_XX, which is also known to converge to the middle case for every dimension pair in one epoch, we know that

x_i[t] + N_1 ∧_{ξ=1}^k (x_j^ξ - x_i^ξ) + N_0 ∨_{ξ=1}^k (x_j^ξ - x_i^ξ) ≤ ∨_{h=1}^n (w_jh + x_h[t])   (5-16)

≤ x_i[t] + N_0 ∧_{ξ=1}^k (x_j^ξ - x_i^ξ) + N_1 ∨_{ξ=1}^k (x_j^ξ - x_i^ξ).   (5-17)

Therefore, for a near-min-max memory with w_1^D ∈ [0, 1] and w_n^D = 1 - w_1^D, the update

x_j[t + 1] = w_1^D ∧_{h=1}^n (θ_jh + x_h[t]) + w_n^D ∨_{h=1}^n (θ_jh + x_h[t])

is a weighted average of quantities that each lie between the same two bounds, so

x_i[t] + N_1 ∧_{ξ=1}^k (x_j^ξ - x_i^ξ) + N_0 ∨_{ξ=1}^k (x_j^ξ - x_i^ξ) ≤ x_j[t + 1] ≤ x_i[t] + N_0 ∧_{ξ=1}^k (x_j^ξ - x_i^ξ) + N_1 ∨_{ξ=1}^k (x_j^ξ - x_i^ξ).

This, in turn, means that

L_B ≤ x_j[t + 1] ≤ L_T.   (5-26)

Q.E.D.

5.2 Comparison with Tent Maps

Recall that in Eq. (3-3), from Section 3.1,

x_1[k + 1] = (θ_{1,σ(1)} + x_{σ(1)}[k]) w_1 + (θ_{1,σ(2)} + x_{σ(2)}[k]) w_2
x_2[k + 1] = (θ_{2,σ(1)} + x_{σ(1)}[k]) w_1 + (θ_{2,σ(2)} + x_{σ(2)}[k]) w_2.   (5-27)

Keeping in mind the manner in which the sorting permutation σ is defined, this can be re-written

x_1[k + 1] = (θ_11 + x_1[k]) w_1 + (θ_12 + x_2[k])(1 - w_1) if (θ_11 + x_1[k]) ≤ (θ_12 + x_2[k])
             (θ_12 + x_2[k]) w_1 + (θ_11 + x_1[k])(1 - w_1) if (θ_11 + x_1[k]) > (θ_12 + x_2[k]).   (5-28)

Immediately certain similarities between the tent map and the two-dimensional lattice based associative memory are apparent. Here, thanks to the constraints on the sum of the OWA weights, the value w_1 resembles the μ parameter. Also, the overall system behavior is decided by two linear sub-systems. It is important, then, to determine whether or not the lattice based associative memory displays chaotic behavior as well.

A similar bifurcation diagram was created for the two-dimensional lattice based associative memory. A set of random vectors was chosen from a uniform distribution and a series of 201 values for w_1 were generated; these values were evenly distributed throughout the interval [0, 1]. For each value of w_1, 200 iterations of the lattice based associative memory were performed and the values of each dimension of the state vector were plotted separately. For each value of w_1 the trajectory begins at a different initial point, which is generated from a uniform distribution over the interval [0, 1]. The decision to generate

independent bifurcation diagrams for each of the two dimensions of the state vector was

made to facilitate easy comparison to the bifurcation diagrams of the tent map. The result

of this simulation is shown in Figure 5-1.
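The simulation just described can be sketched as follows. This is a reconstruction under assumed conventions (decoding weights (w_1, w_2) on the sorted min/max, encoding weights (w_2, w_1), and a uniform random initial point), not the author's original code. For a canonical setting such as w_1 = 0 the orbit settles immediately, while for intermediate w_1 it drifts monotonically, as Theorem 5.5.2 later proves:

```python
import numpy as np

def orbit_first_dim(X, w1, iters=200, seed=0):
    # Encode with OWA weights (w2, w1) on (min, max) of the pattern
    # differences; decode each step with weights (w1, w2) on (min, max).
    rng = np.random.default_rng(seed)
    D = np.sort(X[:, None, :] - X[None, :, :], axis=2)
    theta = (1 - w1) * D[:, :, 0] + w1 * D[:, :, -1]
    x = rng.random(2)                      # random start in [0, 1]^2
    trace = []
    for _ in range(iters):
        S = theta + x[None, :]
        x = w1 * S.min(axis=1) + (1 - w1) * S.max(axis=1)
        trace.append(x[0])
    return np.array(trace)

rng = np.random.default_rng(1)
X = rng.random((2, 5))                     # k = 5 stored 2-D patterns
canonical = orbit_first_dim(X, 0.0)        # at most two distinct values
assert len(np.unique(np.round(canonical, 9))) <= 2
assert orbit_first_dim(X, 0.3)[-1] < orbit_first_dim(X, 0.3)[0]   # drifts down
assert orbit_first_dim(X, 0.7)[-1] > orbit_first_dim(X, 0.7)[0]   # drifts up
```

Sweeping w_1 over a grid and plotting every trace value against w_1 reproduces a bifurcation diagram of the kind shown in Figure 5-1.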

Note that there is a clear global pattern that emerges across the entire diagram; it is almost sinusoidal in nature. The range of values taken by the first dimension of the state vector changes with the value of the w_1 parameter, first forming a negative curve and then a positive curve of the same amplitude.

The first value taken for w_1 is zero. This creates a vector of OWA weights (0, 1) that corresponds to a canonical, max-encoded lattice based associative memory. As such, the number of unique values taken by the first dimension of the state vector is at most two. In the event that the first point is a fixed point, or if it is in a region of the state space where movement towards the fixed point set is purely within the second dimension, there will only be one unique value taken. If the initial point exists in a region where movement to the fixed point set happens within the first dimension then there will be unique values for both the initial point and the fixed point to which it moves in one iteration, as discussed in Chapter 2. Figure 5-2 has been zoomed in to show the single unique value taken by the first dimension of the state vector in 201 iterations.

A similar graph is shown in Figure 5-3, where the system is set to w_1 = 0.5. This case represents the event that w_1 = w_2 = 0.5, which was discussed in Section ??. This is another case in which the system is well behaved in the min-max sense. That is to say, a point outside the fixed point set (which has now reduced to a single line L_1 = L_2) will move to the fixed point set in one iteration. Therefore there are again at most two values taken during the 201 iterations. In this example there were two values.
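The claim that the w_1 = w_2 = 0.5 system reaches its fixed point set in one iteration can be checked directly. In two dimensions the evenly weighted OWA of the min and max products is simply their average, so one further step changes nothing (same assumed NumPy setup as before):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.random((2, 5))                        # k = 5 stored 2-D patterns
D = np.sort(X[:, None, :] - X[None, :, :], axis=2)
theta = 0.5 * D[:, :, 0] + 0.5 * D[:, :, -1]  # w1 = w2 = 0.5

def step(x):
    # Evenly weighted OWA: 0.5*min + 0.5*max is just the average.
    S = theta + x[None, :]
    return 0.5 * S.min(axis=1) + 0.5 * S.max(axis=1)

x1 = step(rng.random(2))   # one iteration from a random start
x2 = step(x1)              # further iterations change nothing
assert np.allclose(x1, x2)
```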

An example of one of these cases is illustrated in Figure 5-4. Twenty iterations were calculated for a canonical lattice based associative memory using max encoding. As was described in Chapter 2, the system converges to a fixed point in one iteration. Note that the data shown in Figure 5-4 is the result of a separate simulation from the data presented in Figure 5-2. This means that the random initialization is different, and therefore the number of unique values taken by the first dimension of the state vector may be different as well. In fact, the first data set generated a fixed point immediately, and therefore only one unique point in 200 iterations, whereas the second simulation did not initially generate a fixed point, and so there are two unique values.

For all but the canonical cases, and the special w_1 = w_2 = 0.5 case, there were 201 unique values taken by the state variable throughout the 201 iterations. This means that every iteration produced a unique value, i.e., there were no fixed points. The range over which these 201 values were spread depended upon the value of w_1, as did the direction in which the state variable moved. Increasing or decreasing the number of iterations between simulations increased or decreased, respectively, the amplitude of the pseudo-sinusoidal pattern but did not change the pattern in any other way.

The simulation that generated the bifurcation diagram shown in Figure 5-1 was repeated close to 30 times to observe variations caused by random initialization. Though the actual values taken during the 200 iterations calculated for each setting of the w_1 parameter differed, the pseudo-sinusoidal pattern in global system behavior was present in all of the simulations. The coherence of the pseudo-sinusoidal pattern was not consistent, though. That is, the pattern was often noisy. Figure 5-5 shows the results of one of the least coherent, i.e., one of the noisiest, simulations. Note that the pseudo-sinusoidal pattern is still discernible, but it is not nearly as clean. This noise is the result of having different randomly created initial values for the system at each point on the w_1 axis. If the initial point is always zero, or if the initial point is subtracted from the value of the state vector before plotting, the wave form is always completely coherent.

Also noticeable in Figure 5-5 are the single points that appear outside of the

pseudo-sinusoidal pattern. These are the initial points and there is only one such point for

each of the values of w_1. From the initial point there is a gap as the trajectory jumps

from an outside case to the inside case.

Figure 5-6 shows the same data as that in Figure 5-5, but zoomed in to give a better glimpse of these discontinuous jumps. Across all of the simulations that were run in this experiment the size of these discontinuous jumps was always inversely proportional to the coherence of the pseudo-sinusoidal pattern. That is, the smaller the jumps the more coherent the pattern. For each value of w_1 the initial point of the orbit is different. Subtracting off this initial point, or simply starting at x_i[0] = 0, yields a coherent pseudo-sine.

An example of one of these trajectories is shown in Figure 5-7. As with Figure 5-4, twenty iterations were plotted. Unlike its predecessor, however, in this case there is no fixed point. The system parameter was initialized to w_1 = 0.1, and so this is not a canonical lattice based associative memory.

5.3 Effect Of Initialization

The pseudo-sinusoidal pattern displayed in the previous examples does not depend upon the initial set X of input vectors. Previously the vectors were initialized to values in [0, 1]^n. From Figure 5-8 it is apparent that changing the initialization to x^ξ ∈ [-200, -100] does not disrupt this pattern. In fact, in Figure 5-8 the pseudo-sinusoidal pattern is extremely clear. Perhaps the amplitude has been affected, but not the pattern itself.

A similar test was performed on positive values outside of the range [0, 1]. For this experiment the system was initialized such that x^ξ ∈ [50, 100]. It is clear from Figure 5-9 that the pattern still appears.

Another test was constructed to test the behavior of the system when the values of x^ξ cross zero, in this case x^ξ ∈ [-30, 30]. Again, the pseudo-sinusoidal pattern held.

For each of these experiments the system was tested with 10 different sets X for each range of initial values. The images shown are representative of those produced during the simulations.

In order to study the effect of the initial conditions of a particular trajectory several other simulations were constructed. In these experiments the set of initial vectors X was kept constant throughout. At first, a random initial point was chosen such that x[0] ∈ [-5, 5]. Again, the pseudo-sinusoidal pattern was evident. In fact, the pattern was evident in two further experiments: one in which x[0] ∈ [-25, -5] and one in which x[0] ∈ [5, 25]. Each simulation was performed 10 times.

This suggests that the pseudo-sinusoidal pattern displayed in the bifurcation diagram for the generalized OWA version of the lattice based associative memory is invariant to the initialization of the input vectors.
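This invariance has a simple algebraic source: both the memory and the update depend on the inputs only through differences. Adding a constant c to every component of every input vector leaves Θ_XX unchanged, and shifting the state by c shifts the next state by exactly c, so orbits translate rigidly. A quick numerical check (assumed NumPy setup, not the author's code):

```python
import numpy as np

def encode(X, w1):
    # OWA encoding with weights (1 - w1, w1) on (min, max) of differences.
    D = np.sort(X[:, None, :] - X[None, :, :], axis=2)
    return (1 - w1) * D[:, :, 0] + w1 * D[:, :, -1]

def step(theta, x, w1):
    S = theta + x[None, :]
    return w1 * S.min(axis=1) + (1 - w1) * S.max(axis=1)

rng = np.random.default_rng(3)
X = rng.random((2, 5))
w1, c = 0.3, -150.0
theta = encode(X, w1)
assert np.allclose(theta, encode(X + c, w1))   # memory is shift-invariant
x = rng.random(2)
assert np.allclose(step(theta, x + c, w1), step(theta, x, w1) + c)  # rigid shift
```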

5.4 Harmonic Analysis

To determine the source of the pseudo-sinusoidal pattern exhibited in the bifurcation diagram for the piecewise linear associative memory it is necessary to examine the system equation itself. To simplify this, note that the envelope of the bifurcation diagram is formed by the values taken at the last iteration. Assuming that the number of iterations is greater than one, this means that the system will be between the two boundaries in state space, in the inner case mentioned in Section 3.1.1. Therefore, we can refine our system function to the following simpler version:

x_1[k + 1] = (1 - w_1)x_1 + w_1(θ_12 + x_2) if w_1 ∈ [0, 1/2]
             w_1 x_1 + (1 - w_1)(θ_12 + x_2) if w_1 ∈ [1/2, 1].   (5-29)

In order to perform the harmonic analysis using Fourier series approximation it was necessary to change the variable from w_1, which is defined over the interval [0, 1], to some variable p, which is defined over the interval [-T, T], for some T. To achieve that, let p = w_1 - 1/2. This means that w_1 = (p + 1/2) and that (1 - w_1) = (1/2 - p). Define the function f(p) such that

f(p) = (1/2 - p)x_1 + (p + 1/2)(θ_12 + x_2) if p ∈ [-1/2, 0]
       (p + 1/2)x_1 + (1/2 - p)(θ_12 + x_2) if p ∈ [0, 1/2].   (5-30)

The variable p is then defined on the interval [-1/2, 1/2].

For any piecewise continuous function f on the interval [-T, T] the Fourier series of f is the trigonometric series

f(p) ~ a_0/2 + Σ_{n=1}^∞ {a_n cos(nπp/T) + b_n sin(nπp/T)},   (5-31)

where a_n and b_n are defined by

a_n = (1/T) ∫_{-T}^{T} f(p) cos(nπp/T) dp, n = 0, 1, 2, ...,   (5-32)

b_n = (1/T) ∫_{-T}^{T} f(p) sin(nπp/T) dp, n = 1, 2, 3, ...,   (5-33)

and the symbol ~ is used to highlight the fact that the series represents f(p), but may not converge to it [18].

Assume a simple example in which x is initialized to the zero vector, and the initial set of vectors X is set to the identity matrix. Letting T = 1/2,

f(p) ~ a_0/2 + Σ_{n=1}^∞ {a_n cos(2nπp) + b_n sin(2nπp)},   (5-34)

a_n = 2 ∫_{-1/2}^{1/2} f(p) cos(2nπp) dp, n = 0, 1, 2, ...,   (5-35)

b_n = 2 ∫_{-1/2}^{1/2} f(p) sin(2nπp) dp, n = 1, 2, 3, ....   (5-36)

Furthermore, recall that when the decoding vector is (w_1, w_2)^T the encoding vector is (w_2, w_1)^T, and the definition of θ is such that

θ_12 = w_2 ∧_{ξ=1}^k (x_1^ξ - x_2^ξ) + w_1 ∨_{ξ=1}^k (x_1^ξ - x_2^ξ).   (5-37)

This means explicitly that θ_12 can be written as a function of w_1, since w_2 = 1 - w_1. Converting this to a function of p,

θ_12 = (1/2 - p) ∧_{ξ=1}^k (x_1^ξ - x_2^ξ) + (p + 1/2) ∨_{ξ=1}^k (x_1^ξ - x_2^ξ).   (5-38)

For a fixed set of input vectors X, the values ∧_{ξ=1}^k (x_1^ξ - x_2^ξ) and ∨_{ξ=1}^k (x_1^ξ - x_2^ξ) remain fixed. Let m = ∧_{ξ=1}^k (x_1^ξ - x_2^ξ) and M = ∨_{ξ=1}^k (x_1^ξ - x_2^ξ). Therefore, using (5-30), we can redefine f(p) to be

f(p) = (1/2 - p)x_1 + (p + 1/2)x_2 + (p + 1/2)(1/2 - p)m + (p + 1/2)² M if p ∈ [-1/2, 0]
       (p + 1/2)x_1 + (1/2 - p)x_2 + (1/2 - p)² m + (1/2 - p)(p + 1/2)M if p ∈ [0, 1/2].   (5-39)
Since f(p) is a piecewise function, the Fourier coefficient integral can be written

a_n = 2 ∫_{-1/2}^{1/2} f(p) cos(2nπp) dp = 2 ∫_{-1/2}^{0} f(p) cos(2nπp) dp + 2 ∫_{0}^{1/2} f(p) cos(2nπp) dp.   (5-40)

Substituting Eq. (5-39) into Eq. (5-40) yields

a_n = 2 ∫_{-1/2}^{0} ((1/2 - p)x_1 + (p + 1/2)x_2 + (1/4 - p²)m + (p + 1/2)² M) cos(2nπp) dp
    + 2 ∫_{0}^{1/2} ((p + 1/2)x_1 + (1/2 - p)x_2 + (1/2 - p)² m + (1/4 - p²)M) cos(2nπp) dp.   (5-41)
(5-41)
This can be attempted using integration by parts. Let u = f(p). This implies that

du = (-x_1 + x_2 - 2pm + (2p + 1)M) dp for p ∈ [-1/2, 0]
     (x_1 - x_2 + (2p - 1)m - 2pM) dp for p ∈ [0, 1/2].   (5-42)

Let dv = cos(2nπp) dp. This implies that

v = sin(2nπp) / (2nπ),   (5-43)

which is not defined when n = 0. From Equation (5-35), however, it is clear that when n = 0

a_0 = 2 ∫_{-1/2}^{1/2} f(p) cos(0) dp = 2 ∫_{-1/2}^{0} f(p) dp + 2 ∫_{0}^{1/2} f(p) dp.   (5-44)

Therefore,

a_0 = (3/2)x_1 + (1/2)x_2 + (1/4)m + (1/4)M.   (5-45)

For the cases when n ≠ 0, integration by parts shows that

a_n = [2(cos(nπ) + 2nπ sin(nπ) - 1)x_1 - (cos(nπ) - 1)(2x_2 + M + m)] / (2n²π²).   (5-46)

One can similarly calculate b_n,

b_n = -(M - m)(2cos(nπ) + nπ sin(nπ) - 2) / (2n³π³).   (5-47)
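Closed forms like (5-45) through (5-47) are easy to sanity-check numerically. The sketch below computes a_n and b_n on [-1/2, 1/2] by trapezoidal quadrature; the function passed in is whatever f(p) one wishes to analyze (the spot check here uses a function with known coefficients, not the memory's f):

```python
import numpy as np

def fourier_coeffs(f, n_max, samples=20001):
    # a_n = 2 * integral of f(p) cos(2*pi*n*p) over [-1/2, 1/2], per (5-35);
    # b_n likewise with sin, per (5-36).
    p = np.linspace(-0.5, 0.5, samples)
    y = f(p)
    quad = lambda g: float(np.sum(0.5 * (g[1:] + g[:-1]) * np.diff(p)))
    a = [2 * quad(y * np.cos(2 * np.pi * n * p)) for n in range(n_max + 1)]
    b = [2 * quad(y * np.sin(2 * np.pi * n * p)) for n in range(1, n_max + 1)]
    return np.array(a), np.array(b)

# Spot check on a function whose coefficients are known exactly:
a, b = fourier_coeffs(lambda p: np.cos(2 * np.pi * p), 3)
assert abs(a[1] - 1.0) < 1e-6 and abs(a[0]) < 1e-6 and abs(b[0]) < 1e-6
```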

5.5 Existence of Fixed Points

The existence of fixed points revolves around the amplitude of the pseudo-sinusoid as

the number of iterations approaches infinity. That is, does the pseudo-sinusoid reach some

sort of limiting sine wave, and therefore identify a fixed point, or does it simply increase in

amplitude?

THEOREM 5.5.1. The amplitude of the envelope of the bifurcation diagram for a near-min-max lattice based associative memory grows without bound as the number of iterations increases, for w_1 ∈ (0, 1/2) ∪ (1/2, 1).

Proof: First, note that when w_1 > 1/2,

x_1[k] = T_1[k] + T_21[k] + T_12[k] + T_211[k],   (5-48)

where

T_1[k] = x_1[0] Σ_{j ∈ J_1} g(k, j),   (5-49)

T_21[k] = x_2[0] Σ_{j ∈ J_21} g(k, j),   (5-50)

T_12[k] = w_2 θ_12 Σ_{i=0}^{k-1} Σ_{j ∈ J_12} g(i, j),   (5-51)

T_211[k] = 0 if k = 1; w_2 θ_21 Σ_{i=1}^{k-1} Σ_{j ∈ J_211} g(i, j) if k > 1.   (5-52)

Here,

g(a, b) = c(a, b) w_1^(a-b) w_2^b,   (5-53)

c(a, b) = a! / (b!(a - b)!),   (5-54)

J_1 = {0, 2, ..., k-1} if k is odd; {0, 2, ..., k} if k is even,   (5-55)

J_21 = {1, 3, ..., k} if k is odd; {1, 3, ..., k-1} if k is even,   (5-56)

J_12 = {0, 2, ..., i-1} if i is odd; {0, 2, ..., i} if i is even,   (5-57)

and

J_211 = {1, 3, ..., i} if i is odd; {1, 3, ..., i-1} if i is even.   (5-58)

Likewise,

x_2[k] = T_2[k] + T_22[k] + T_212[k] + T_221[k],   (5-59)

where

T_2[k] = x_1[0] Σ_{j ∈ J_2} g(k, j),   (5-60)

T_22[k] = x_2[0] Σ_{j ∈ J_22} g(k, j),   (5-61)

T_212[k] = 0 if k = 1; w_2 θ_12 Σ_{i=1}^{k-1} Σ_{j ∈ J_212} g(i, j) if k > 1,   (5-62)

T_221[k] = w_2 θ_21 Σ_{i=0}^{k-1} Σ_{j ∈ J_221} g(i, j).   (5-63)

Here,

J_2 = {1, 3, ..., k} if k is odd; {1, 3, ..., k-1} if k is even,   (5-64)

J_22 = {0, 2, ..., k-1} if k is odd; {0, 2, ..., k} if k is even,   (5-65)

J_212 = {1, 3, ..., i} if i is odd; {1, 3, ..., i-1} if i is even,   (5-66)

and

J_221 = {0, 2, ..., i-1} if i is odd; {0, 2, ..., i} if i is even.   (5-67)
These formulas can be derived intuitively by calculating each term at steps k = 1, 2, 3, .... The terms corresponding to x_1[k] for time steps k = 1 and k = 2 are given below as an example:

T_1[1] = x_1[0] c(1, 0) w_1 = w_1 x_1[0],   (5-68)

T_21[1] = x_2[0] c(1, 1) w_2 = w_2 x_2[0],   (5-69)

T_12[1] = w_2 θ_12 c(0, 0) = w_2 θ_12,   (5-70)

T_211[1] = 0.   (5-71)

T_1[2] = x_1[0] [g(2, 0) + g(2, 2)]   (5-72)
       = x_1[0] [c(2, 0) w_1² + c(2, 2) w_2²]   (5-73)
       = x_1[0] [w_1² + w_2²],   (5-74)

T_21[2] = x_2[0] [g(2, 1)]   (5-75)
        = x_2[0] c(2, 1) w_1 w_2   (5-76)
        = x_2[0] [2 w_1 w_2],   (5-77)

T_12[2] = w_2 θ_12 [g(0, 0) + g(1, 0)]   (5-78)
        = w_2 θ_12 [c(0, 0) + c(1, 0) w_1]   (5-79)
        = w_2 θ_12 [1 + w_1],   (5-80)

T_211[2] = w_2 θ_21 [g(1, 1)]   (5-81)
         = w_2 θ_21 c(1, 1) w_2   (5-82)
         = w_2² θ_21.   (5-83)

This verifies the fact that

x_1[1] = w_1 x_1[0] + w_2 x_2[0] + w_2 θ_12,   (5-84)

x_2[1] = w_2 x_1[0] + w_1 x_2[0] + w_2 θ_21.   (5-85)

Also,

x_1[2] = w_1 x_1[1] + w_2 x_2[1] + w_2 θ_12   (5-86)
       = w_1² x_1[0] + w_1 w_2 x_2[0] + w_1 w_2 θ_12 + w_2² x_1[0] + w_1 w_2 x_2[0] + w_2² θ_21 + w_2 θ_12   (5-87)
       = (w_1² + w_2²) x_1[0] + 2 w_1 w_2 x_2[0] + (w_1 w_2 + w_2) θ_12 + w_2² θ_21.   (5-88)

These formulas can be proven correct for all k > 2 using induction. Since the basic mechanism is the same for each of the terms in x_1[k] and x_2[k], with only slight algebraic differences, only the proof for T_1[k] will be shown here.

Since the case when k = 1 was proven to be correct, it will be the base case. It is then necessary to show that for k > 1,

x_1[k - 1] = T_1[k-1] + T_21[k-1] + T_12[k-1] + T_211[k-1]   (5-89)

and

x_2[k - 1] = T_2[k-1] + T_22[k-1] + T_212[k-1] + T_221[k-1]   (5-90)

imply that

x_1[k] = T_1[k] + T_21[k] + T_12[k] + T_211[k].   (5-91)

Because x_1[k] = w_1 x_1[k-1] + w_2 x_2[k-1] + w_2 θ_12, Equation (5-91) is true if and only if

x_1[k] = w_1 (T_1[k-1] + T_21[k-1] + T_12[k-1] + T_211[k-1])   (5-92)
       + w_2 (T_2[k-1] + T_22[k-1] + T_212[k-1] + T_221[k-1]) + w_2 θ_12,   (5-93)

which would mean that

T_1[k] = w_1 T_1[k-1] + w_2 T_2[k-1],   (5-94)

which is true iff

x_1[0] Σ_{j ∈ J_1(k)} g(k, j) = w_1 x_1[0] Σ_{j ∈ J_1(k-1)} g(k-1, j) + w_2 x_1[0] Σ_{j ∈ J_2(k-1)} g(k-1, j).   (5-95)

In the event that k is odd, this is true iff

g(k, 0) + g(k, 2) + ... + g(k, k-1) = w_1 (g(k-1, 0) + g(k-1, 2) + ... + g(k-1, k-1))
                                    + w_2 (g(k-1, 1) + g(k-1, 3) + ... + g(k-1, k-2)).

This is true iff

c(k, 0)w_1^k + c(k, 2)w_1^(k-2)w_2² + ... + c(k, k-1)w_1 w_2^(k-1)
  = w_1 (c(k-1, 0)w_1^(k-1) + c(k-1, 2)w_1^(k-3)w_2² + ... + c(k-1, k-1)w_2^(k-1))
  + w_2 (c(k-1, 1)w_1^(k-2)w_2 + c(k-1, 3)w_1^(k-4)w_2³ + ... + c(k-1, k-2)w_1 w_2^(k-2)),

which holds iff

[c(k, 0) - c(k-1, 0)] w_1^k + [c(k, 2) - c(k-1, 1) - c(k-1, 2)] w_1^(k-2)w_2² + ...
  + [c(k, k-1) - c(k-1, k-1) - c(k-1, k-2)] w_1 w_2^(k-1) = 0.   (5-96)

Note that [c(k, 0) - c(k-1, 0)] = 1 - 1 = 0 and, by Pascal's rule,

c(k, j) - c(k-1, j-1) - c(k-1, j) = k!/((k-j)!j!) - j(k-1)!/((k-j)!j!) - (k-j)(k-1)!/((k-j)!j!)
                                  = [k! - (j + (k-j))(k-1)!]/((k-j)!j!)
                                  = 0,   (5-97)

so every bracketed coefficient vanishes. Therefore, if k is odd, then T_1[k] = w_1 T_1[k-1] + w_2 T_2[k-1]. If k is even, however, Equation (5-91) is true if and only if

c(k, 0)w_1^k + c(k, 2)w_1^(k-2)w_2² + ... + c(k, k)w_2^k
  = w_1 (c(k-1, 0)w_1^(k-1) + c(k-1, 2)w_1^(k-3)w_2² + ... + c(k-1, k-2)w_1 w_2^(k-2))
  + w_2 (c(k-1, 1)w_1^(k-2)w_2 + c(k-1, 3)w_1^(k-4)w_2³ + ... + c(k-1, k-1)w_2^(k-1)).

This is true if and only if

[c(k, 0) - c(k-1, 0)] w_1^k + [c(k, 2) - c(k-1, 1) - c(k-1, 2)] w_1^(k-2)w_2² + ...
  + [c(k, k) - c(k-1, k-1)] w_2^k = 0,

which holds by the same identities. Since k is always either odd or even, this is always true. This concludes the proof for T_1[k]. For all other terms, the inductive proof is extremely similar.
THEOREM 5.5.2. For a two-dimensional lattice based associative memory, when 1/2 < w_1 < 1, lim_{k→∞} x_1[k] = ∞. When 0 < w_1 < 1/2, lim_{k→∞} x_1[k] = -∞.

Proof: First assume that 1/2 < w_1 < 1. Note that Σ_{j=0}^k g(k, j) = (w_1 + w_2)^k, thanks to the binomial theorem. Also, note that the even-indexed terms are given by (1/2)[(a + b)^k + (a - b)^k] because

(a + b)^k = c(k, 0)a^k + c(k, 1)a^(k-1)b + c(k, 2)a^(k-2)b² + ... + c(k, k)b^k,   (5-102)

(a - b)^k = c(k, 0)a^k + c(k, 1)a^(k-1)(-b) + c(k, 2)a^(k-2)(-b)² + ... + c(k, k)(-b)^k   (5-103)
          = c(k, 0)a^k - c(k, 1)a^(k-1)b + c(k, 2)a^(k-2)b² - ...,   (5-104)

where c(k, k)(-b)^k = c(k, k)b^k when k is even and c(k, k)(-b)^k = -c(k, k)b^k when k is odd. This means that

(a + b)^k + (a - b)^k = 2c(k, 0)a^k + 2c(k, 2)a^(k-2)b² + 2c(k, 4)a^(k-4)b⁴ + ...,   (5-106)

so half of this yields the sum of the even-indexed terms. Likewise the odd-indexed terms are given by (1/2)[(a + b)^k - (a - b)^k] because

(a + b)^k - (a - b)^k = 2c(k, 1)a^(k-1)b + 2c(k, 3)a^(k-3)b³ + ...,   (5-107)

and so half of this yields the sum of the odd-indexed terms. Furthermore, since k (and therefore i and j), w_1 and w_2 are always non-negative, the sums over J_1 and J_21 differ from these half-sums by at most the final term. This leads to

lim_{k→∞} T_1[k] ≤ lim_{k→∞} (1/2) x_1[0] [(w_1 + w_2)^k + (w_1 - w_2)^k]   (5-109)
                 = lim_{k→∞} (1/2) x_1[0] [1 + (w_1 - w_2)^k]   (5-110)
                 = (1/2) x_1[0],   (5-111)

since |w_1 - w_2| < 1. The equation for T_21[k] is extremely similar to that of T_1[k], and it is likewise bounded. On the other hand,

lim_{k→∞} T_12[k] ≥ lim_{k→∞} (1/2) w_2 θ_12 Σ_{i=1}^{k-1} [(w_1 + w_2)^(i-1) + (w_1 - w_2)^(i-1)]   (5-112)
                  = lim_{k→∞} (1/2) w_2 θ_12 Σ_{i=1}^{k-1} [1 + (w_1 - w_2)^(i-1)].   (5-113)

The equation for T_211[k] is similar to that of T_12[k], with [1 - (w_1 - w_2)^(i-1)] in place of [1 + (w_1 - w_2)^(i-1)].

Since T_1[k] and T_21[k] are bounded, the growth of x_1[k] is therefore determined by T_12[k] and T_211[k]. Since both lim_{k→∞} Σ_{i=1}^{k-1} [1 + (w_1 - w_2)^(i-1)] = ∞ and lim_{k→∞} Σ_{i=1}^{k-1} [1 - (w_1 - w_2)^(i-1)] = ∞, we have x_1[k] → ∞ as long as (θ_12 + θ_21) > 0. To see that it is, let w_1^E and w_n^E denote the weights used during encoding, and w_1^D and w_n^D denote the weights used during decoding. Furthermore, since we have assumed that w_1^D > 1/2, it must be true that w_1^E < 1/2. Let m_ij = ∧_{ξ=1}^k (x_i^ξ - x_j^ξ) and M_ij = ∨_{ξ=1}^k (x_i^ξ - x_j^ξ). Recall that m_ij = -M_ji. Then

θ_12 + θ_21 > 0   (5-114)

⟺ w_1^E m_12 + w_n^E M_12 + w_1^E m_21 + w_n^E M_21 > 0   (5-115)

⟺ w_1^E m_12 + w_n^E M_12 - w_1^E M_12 - w_n^E m_12 > 0   (5-116)

⟺ (w_n^E - w_1^E) M_12 - (w_n^E - w_1^E) m_12 > 0   (5-117)

⟺ (w_n^E - w_1^E) [M_12 - m_12] > 0   (5-118)

⟺ M_12 > m_12,   (5-119)

which holds whenever the input patterns are not identical in this dimension pair. Therefore, when w_1 > 1/2 on decoding, lim_{k→∞} x_1[k] = ∞. A similar proof shows that when w_1 < 1/2 during decoding, lim_{k→∞} x_1[k] = -∞. This means that the magnitude of the pseudo-sine wave grows without bound. So, for the two-dimensional case the only fixed points occur when w_1 ∈ {0, 1/2, 1}.
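The even/odd splitting of the binomial sums used in the proof of Theorem 5.5.2 is a standard identity and can be verified numerically (arbitrary a, b, k; the values chosen below are illustrative):

```python
from math import comb

def even_terms(a, b, k):
    # Sum of c(k, j) * a^(k-j) * b^j over even j.
    return sum(comb(k, j) * a**(k - j) * b**j for j in range(0, k + 1, 2))

def odd_terms(a, b, k):
    # Sum of c(k, j) * a^(k-j) * b^j over odd j.
    return sum(comb(k, j) * a**(k - j) * b**j for j in range(1, k + 1, 2))

a, b = 0.7, 0.3        # e.g. decoding weights w1, w2
for k in (1, 2, 5, 8):
    assert abs(even_terms(a, b, k) - ((a + b)**k + (a - b)**k) / 2) < 1e-12
    assert abs(odd_terms(a, b, k) - ((a + b)**k - (a - b)**k) / 2) < 1e-12
```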
5.6 Growth in N Dimensions

The behavior for an n-dimensional system is the same. The near-min-max lattice based associative memory is defined such that the weight vector is w = (w_1, 0, ..., 0, w_n). Theorem 5.6.1 will show that when w_1 = w_n = 1/2 the system has a fixed point.

THEOREM 5.6.1. For an n-dimensional near-min-max lattice based associative memory, when w_1 = w_n = 1/2, x_i[k + 1] = x_i[k] for all points in the inside case.

Proof: In n dimensions, a vector x is in the inside case if and only if x_i - θ_ij ≤ x_j ≤ x_i + θ_ji for all i, j ∈ {1, ..., n}. Also, since w_1 = 1/2, then w_n = 1 - w_1 = 1/2 as well. Furthermore,

θ_ij = (1/2) ∧_{ξ=1}^k (x_i^ξ - x_j^ξ) + (1/2) ∨_{ξ=1}^k (x_i^ξ - x_j^ξ)   (5-120)

     = -(1/2) ∨_{ξ=1}^k (x_j^ξ - x_i^ξ) - (1/2) ∧_{ξ=1}^k (x_j^ξ - x_i^ξ)   (5-121)

     = -((1/2) ∧_{ξ=1}^k (x_j^ξ - x_i^ξ) + (1/2) ∨_{ξ=1}^k (x_j^ξ - x_i^ξ))   (5-122)

     = -θ_ji.   (5-123)

This means that x_i - θ_ij ≤ x_j ≤ x_i + θ_ji = x_i - θ_ij, so x_j = x_i + θ_ji. Taking into account that θ_ii = 0 for all i, this means that θ_i1 + x_1 = ... = θ_ij + x_j = ... = θ_in + x_n for all i, j. Since every term equals θ_ii + x_i,

x_i[k + 1] = w_1 ∧_{j=1}^n (θ_ij + x_j[k]) + w_n ∨_{j=1}^n (θ_ij + x_j[k])   (5-124)

           = (1/2)(θ_ii + x_i[k]) + (1/2)(θ_ii + x_i[k])   (5-125)

           = θ_ii + x_i[k]   (5-126)

           = x_i[k].   (5-127)

Q.E.D. A 3-dimensional example follows. In the inside case,

x_2 = x_1 + θ_21,   (5-128)

x_3 = x_2 + θ_32,   (5-129)

x_1 = x_3 + θ_13.   (5-130)

Therefore,

x_3 + θ_33 = x_1 + θ_31 = x_2 + θ_32,   (5-131)

x_2 + θ_22 = x_1 + θ_21 = x_3 + θ_23,   (5-132)

x_1 + θ_11 = x_2 + θ_12 = x_3 + θ_13.   (5-133)
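The antisymmetry θ_ij = -θ_ji that drives Theorem 5.6.1 and the 3-dimensional example above holds because M_XX = -W_XX^T, so the evenly weighted blend is skew-symmetric with zero diagonal. A quick numerical confirmation (assumed NumPy setup):

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.random((5, 7))                     # n = 5 dimensions, k = 7 patterns
D = X[:, None, :] - X[None, :, :]
W_XX, M_XX = D.min(axis=2), D.max(axis=2)

assert np.allclose(M_XX, -W_XX.T)          # m_ij = -w_ji
Theta = 0.5 * (W_XX + M_XX)                # w1 = wn = 1/2
assert np.allclose(Theta, -Theta.T)        # theta_ij = -theta_ji
assert np.allclose(np.diag(Theta), 0.0)    # theta_ii = 0
```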

Lemmas 5.6.1 and 5.6.2 are necessary in order to address the existence of non-fixed-point values for w_1.

LEMMA 5.6.1. If x[k] represents a state vector in the inside case, then x_i[k] = ∧_{j=1}^n (x_j[k] + θ_ij).

Proof: Since x is in the inside case, x_i - θ_ij ≤ x_j[k] ≤ x_i + θ_ji for all j, and so x_i[k] = x_i[k] + θ_ii ≤ x_j[k] + θ_ij for all j. Q.E.D.

LEMMA 5.6.2. If x[k] represents a state vector in the inside case, then x_i[k + 1] ≥ x_i[k].

Proof: By Lemma 5.6.1, x_i[k] = ∧_{j=1}^n (x_j[k] + θ_ij), which means that

x_i[k] ≤ ∨_{j=1}^n (x_j[k] + θ_ij)   (5-134)

⟹ w_n ∨_{j=1}^n (x_j[k] + θ_ij) - w_n x_i[k] ≥ 0   (5-135)

⟹ w_n ∨_{j=1}^n (x_j[k] + θ_ij) + (w_1 - 1) x_i[k] ≥ 0   (5-136)

⟹ w_1 x_i[k] + w_n ∨_{j=1}^n (x_j[k] + θ_ij) ≥ x_i[k],   (5-137)

and since, by Lemma 5.6.1, w_1 x_i[k] = w_1 ∧_{j=1}^n (x_j[k] + θ_ij), the left-hand side of (5-137) is exactly the update, so

x_i[k + 1] ≥ x_i[k].   (5-138)

Equality only holds in the event that ∧_{j=1}^n (x_j[k] + θ_ij) = ∨_{j=1}^n (x_j[k] + θ_ij). Q.E.D.
5.7 Conclusion

It has been shown that although the near-min-max lattice based associative memory looks algebraically similar to a chaotic map known as the tent map, it does not display chaotic behavior. Furthermore, a fixed point outside of the canonical fixed points at w_1 = 0 and w_1 = 1 exists, namely the average, at w_1 = 1/2. When the value of w_1 ∈ (0, 1/2) ∪ (1/2, 1), however, the system does not contain any fixed points. This system is not well behaved in the min-max sense, as described in Chapter 4.

Figure 5-1: Bifurcation diagram for 2D memory. Only the first dimension of the state space is shown.

Figure 5-2: Canonical behavior in bifurcation diagram.

Figure 5-3: When w_1 = w_2 = 0.5 in the bifurcation diagram.

Figure 5-4: Twenty iterations of a max encoded memory.

Figure 5-5: An example of the incoherent pseudo-sinusoidal pattern.

Figure 5-6: Discontinuous jumps in the bifurcation diagram.

Figure 5-7: Twenty iterations with w_1 = 0.1.

Figure 5-8: Bifurcation diagram initialized to x^ξ ∈ [-200, -100].

Figure 5-9: Bifurcation diagram initialized to x^ξ ∈ [50, 100].

Figure 5-10: Fitting a sine wave to the bifurcation diagram.

CHAPTER 6
CONCLUSIONS AND FUTURE WORK

The non-linear lattice based associative memory introduced by Ritter, et al. was analyzed as a piecewise linear system using a particular subset (the min and the max) of the set of ordered statistics, which is in turn a subset of the set of all ordered weighted average (OWA) operators.

This piecewise linear viewpoint allows for the extension of the lattice based model to include other order statistic operators, and other OWA operators in general, beyond the min and the max. This extension was the subject of Chapter 4.

First, a method for reducing the effect of outliers in the initial data set was described using the general ordered statistic operators for encoding memories and canonical min-max operators for decoding. A class of memories known as near-min-max memories was introduced, which led to the description of an important property known as being "well behaved in the min-max sense."

Order statistics encoding was shown to provide the ability to choose the width of the fixed point set. This ability, in turn, provides robustness to outliers in the initial data set. Without this ability, using only the canonical min or max operators to create the associative memory, a single outlier within the initial data can make a memory's fixed point set arbitrarily large. When the fixed point set is widened, the memory's ability to retrieve a stored pattern given noisy input is reduced. Viewing the canonical encoding operators as a subset of the larger set of order statistics based operators allows for the memory to trim outliers for better performance on noisy input patterns. This improvement was quantified using a widely available computer science data set.

The near-min-max memories, on the other hand, were compared to the algebraically similar tent map in order to test for chaotic behavior. Though no chaotic behavior was discovered, it was found that the system displayed an unexpected, almost sine-wave-like pattern in the envelope of the bifurcation diagram. Formulas were derived to give a harmonic analysis of this pattern in order to better understand where it comes from. The near-min-max system was shown to be without fixed points except for when the encoding and decoding operators were the canonical operators, or the average between the min and max operators.

Lattice based associative memories have shown promise in the field of artificial neural networks. In light of the material presented in this document, the continued use of the canonical memories constitutes a conscious decision to choose the min and max operators for encoding and decoding out of the infinite assortment of available OWA operators. This decision greatly affects the behavior of the system, including the shape of the fixed point set.

The use of encoding operators to shape the fixed point set is an excellent area for future research. A fantastic beginning would be the development of a learning algorithm that could decide on the optimal order statistics encoding for a particular data set, with respect to the retrieval of patterns given noisy inputs. Also, other encoding schemes should be examined in hopes that further control over the shape of the fixed point set, beyond simply its width, can be gained.

Also, the field of Control Theory contains a framework for exploring piecewise

linear dynamical systems. In the future, an attempt should be made to find a method

for exerting external control over the dynamics of piecewise linear associative memories

that are not well behaved in the min-max sense. This might yield a system that is well

behaved, while having unique properties due to its encoding and decoding operators.

Lastly, the study of the entire set of OWA based associative memories will require further research. Exploration of memories outside of those presented herein will be necessary.


REFERENCES

[1] B. K~osko, "Bidirectional associative memories," IEEE Transactions on S;8i; and C;l,I~ ,,..I. :. vol. 18, pp. 49-60, January 1988.

[2] K(. Haines and R. Hecht-Nieson, "A bam with increased information storage capacity,"
Proc IEEE ICNN-88, vol. 1, pp. 181-190, 1988.

[3] R. J. McEliece, E. C. Posner, E. R. Rodemich, and S. S. Venkatesh, "The capacity of
the hopfield associative memory," IEEE Transactions On Information The ..<;; vol. 33
(4), pp. 461-482, July 1987.

[4] G. X. Ritter and P. Sussner, "Associative memories based on lattice algebra," in 1997
IEEE International Conference on Comp~utational C;l,I~ ,... 1.:. 4 and Simulation, vol. 4,
(Orlando, Florida, USA), pp. 3570-3575, Systems, Man, and Cybernetics, October
1997.

[5] P. Sussner and M. Grana, "Guest editorial: Special issue on morphological neural
networks," Journal of M~athematical Imaging and V/ision, vol. 19 (2), pp. 79-80,
September 2003.

[6] G. Ritter and P. Gader, Advances in Imaging and Electron Ph;, vol. 144,
ch. Fixed Points of Lattice Transforms and Lattice Associative Memories,
pp. 165-238. Academic Press, September 2006.

[7] J. Lewis, A,:ale;,: of Linear D;, ..7 .ii SI,! iii Illinois: Matrix Publishers,
Incorporated, 1977.

[8] J. Johnson, Linear S;,/.ii A .rl; : New York: The Ronald Press Company, 1975.

[9] D. Luenberger, Introduction to Dynamic Systems. New York: John Wiley and Sons, 1979.

[10] J. Stewart, Calculus: Early Transcendentals. New Jersey: Gary W. Ostedt, 1999.

[11] H. Schneider and G. P. Barker, Matrices and Linear Algebra. New York: Dover, 1973.

[12] A. Anderson, "A simple neural network generating an interactive memory," Mathematical Biosciences, vol. 14, pp. 197-220, 1972.

[13] T. Kohonen, "Correlation matrix memories," IEEE Transactions on Computers, vol. C-21 (4), pp. 353-359, 1972.

[14] J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities," Proceedings of the National Academy of Sciences of the USA, vol. 79, pp. 2554-2558, April 1982.

[15] D. Valentin, H. Abdi, and B. Edelman, "Connectionist face off: Different algorithms

[16] S. Haykin, Neural Networks: A Comprehensive Foundation. New Jersey: Prentice Hall, 1999.

[17] B. Kosko, Neural Networks and Fuzzy Systems. New Jersey: Prentice Hall, 1992.

[18] R. K. Nagle, Differential Equations and Boundary Value Problems. New York:

[19] Y. Wang, J. Cruz, and J. H. Mulligan, "Two coding strategies for bidirectional associative memory," IEEE Transactions on Neural Networks, vol. 1 (1), pp. 81-92, March 1990.

[20] G. Mathai and B. Upadhyaya, "Performance analysis and application of the bidirectional associative memory to industrial spectral signatures," in 1989 IEEE International Joint Conference on Neural Networks, vol. 1, pp. 33-37, June 1989.

[21] T. Kohonen, Self-Organization and Associative Memory. New York: Springer-Verlag, 1988.

[22] W. G. Wee, "Generalized inverse approach to adaptive multiclass pattern classification," IEEE Transactions on Computers, vol. 17 (12), pp. 262-269, December 1968.

[23] W. G. Wee, "Generalized inverse approach to clustering, feature detection, and classification," IEEE Transactions on Computers, vol. 17 (3), pp. 1157-1164, May 1968.

[24] F. H. Kishi, Advances in Control Systems: Theory and Applications, ch. On-Line Computer Control Techniques. New York: Academic Press, 1964.

[25] J. Zhang and T. Zhurii lr, "A novel approach to design weight matrix of Hopfield network," in Proceedings of the 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference, (Shanghai, China), pp. 1556-1558, September 2005.

[26] C. S. Leung and K. F. Cheung, "Householder encoding for discrete bidirectional associative memory," IEEE Proc. IJCNN-91, vol. 1, pp. 237-241, November 1991.

[27] J.-Y. Chang and C.-W. Cho, "Second-order asymmetric BAM design with a maximal basin of attraction," IEEE Transactions on Systems, Man, and Cybernetics, vol. 33, pp. 421-428, 2003.

[28] Z.-O. Wang, "A bidirectional associative memory based on optimal linear associative memory," IEEE Transactions on Computers, vol. 45 (10), pp. 1171-1179, October 1996.

[29] S. Burris and H. Sankappanavar, A Course in Universal Algebra. New York: Springer-Verlag, 1981.

[30] S. Burris, "The laws of Boole's thought." Web reference pre-print, April 2000.

[31] E. S. Lyapin, Semigroups (Translations of Mathematical Monographs), vol. 3. Rhode Island: American Mathematical Society, June 1978.

[32] G. X. Ritter, P. Sussner, and J. L. D. de Leon, "Morphological associative memories,"
IEEE Transactions on Neural Networks, vol. 9 (2), pp. 281-293, January 1998.

[33] R. Yager, "On ordered weighted averaging aggregation operators in multicriteria decisionmaking," IEEE Transactions on Systems, Man, and Cybernetics, vol. 18, pp. 183-190, 1988.

[34] R. Yager, "Families of OWA operators," Fuzzy Sets and Systems, vol. 57, pp. 125-148, 1993.

[35] B. S. Ahn, "On the properties of OWA operator weights functions with constant level of orness," IEEE Transactions on Fuzzy Systems, vol. 14, pp. 511-515, 2006.

[36] K. Alligood, T. Sauer, and J. Yorke, Chaos: An Introduction to Dynamical Systems. New York: Springer-Verlag, 1997.

[37] T. Parker and L. O. Chua, "Chaos: A tutorial for engineers," Proceedings of the IEEE, vol. 75 (8), pp. 982-1008, August 1987.

[38] W. de Melo and S. van Strien, One-Dimensional Dynamics. New York: Springer, 1996.

[39] S. H. Strogatz, Nonlinear Dynamics and Chaos. New York: Perseus Books Group, 2001.

[40] D. Ruelle, Chaotic Evolution and Strange Attractors. Cambridge: Cambridge University Press, 1989.

[41] G. S. Duane and J. J. Tribbia, "Synchronized chaos in geophysical fluid dynamics," Phys. Rev. Lett., vol. 86, pp. 4298-4301, May 2001.

[42] D. L. Turcotte, Fractals and Chaos in Geology and Geophysics. Cambridge: Cambridge University Press, 1997.

[43] J. R. Quinlan, "Combining instance-based and model-based learning," in Proceedings of the Tenth International Conference on Machine Learning, (Amherst, Massachusetts), pp. 236-243, Morgan Kaufmann, 1993.

[44] A. Asuncion and D. Newman, "UCI machine learning repository," 2007.

BIOGRAPHICAL SKETCH

John McElroy was born and raised in Florida. He holds a Ph.D. and an M.S. in

computer engineering and a bachelor's degree in fine arts.

PAGE 1

1

PAGE 2

2

PAGE 3

3

PAGE 4

PAGE 5

page ACKNOWLEDGMENTS ................................. 4 LISTOFTABLES ..................................... 7 LISTOFFIGURES .................................... 8 ABSTRACT ........................................ 11 CHAPTER 1INTRODUCTION .................................. 13 1.1StatementoftheProblem ........................... 13 1.2ObjectiveandApproach ............................ 13 2LITERATUREREVIEW .............................. 15 2.1LinearDynamicalSystems ........................... 15 2.1.1LinearSystemsandtheStateSpaceModel .............. 15 2.1.2ExistenceandUniquenessofSolutions ................. 16 2.1.3EquilibriumPointsandStability .................... 17 2.1.4ChangeofBasisandDiagonalization ................. 18 2.1.5EigenvaluesandStability ........................ 20 2.2Non-LinearDynamicalSystems ........................ 21 2.2.1LinearversusNon-LinearSystems ................... 21 2.2.2LyapunovStability ........................... 22 2.3FundamentalsofAssociativeMemories .................... 23 2.3.1TheLinearAssociator ......................... 24 2.3.2TheHopeldModel ........................... 25 2.3.3TheBidirectionalAssociativeMemory ................. 27 2.3.4ProblemswithLinearAssociativeMemories ............. 30 2.4LatticeBasedAssociativeMemories ...................... 31 2.4.1IntroductiontoLatticeAlgebra .................... 31 2.4.2FundamentalsoftheLattice-basedAssociativeMemory ....... 34 2.4.3ExistenceofPerfectRecallMemories ................. 35 2.4.4FixedPointSets ............................. 36 2.4.5GeometricInterpretationoftheFixedPointSets ........... 37 2.4.6BehaviorofNon-FixedPoints,NoiseSuppressionandConvergence 41 2.5OrderedWeightedAverage(OWAOperators) ................ 45 2.5.1BasicsofOWAOperators ........................ 46 2.5.2DeMorgan'sLawandDuality ..................... 46 2.5.3YagerMeasuresofAndnessandOrness ................ 47 2.6ChaoticMaps .................................. 49 5

PAGE 6

............... 64 3.1LatticeBasedMemoriesasPiecewiseLinearSystems ............ 64 3.1.12DPiecewiseAnalysisofRitter-GaderResults ............ 67 3.1.2VectorFieldsfor2DExample ..................... 70 3.2Generalizationof2DExampletoHigherDimensions ............ 72 3.2.1TheNumberofCases .......................... 75 3.2.2GeometryoftheFixedPointSet .................... 77 3.3VericationofCanonicalResults ....................... 78 3.4AdvantageofPiecewiseLinearFramework .................. 78 4ENCODINGANDDECODINGOPERATORS .................. 83 4.1OrderStatisticsforEncodingOperators ................... 83 4.1.1OrderStatistics-BasedAssociativeMemories ............. 85 4.1.2OrderStatisticsBasedEncodingOperatorsforOutlierRobustness 88 4.2TheDecodingProcess ............................. 92 4.3NoiseandtheNarrowFixedPointSet .................... 93 4.4PerformanceonRealData ........................... 95 4.4.1TheData ................................. 95 4.4.2TheFirstExperiment .......................... 96 4.4.3TheSecondExperiment ........................ 97 4.5Conclusion .................................... 98 5NEAR-MINANDNEAR-MAXMEMORIES ................... 115 5.1IntroductiontotheNear-MinandNear-MaxMemories ........... 115 5.2ComparisonwithTentMaps .......................... 120 5.3EectOfInitialization ............................. 123 5.4HarmonicAnalysis ............................... 124 5.5ExistenceofFixedPoints ............................ 127 5.6GrowthinNDimensions ............................ 134 5.7Conclusion .................................... 136 6CONCLUSIONSANDFUTUREWORK ...................... 143 REFERENCES ....................................... 145 BIOGRAPHICALSKETCH ................................ 148 6

PAGE 7

Table page 4-1DataDimensions ................................... 95 7

PAGE 8

Figure page 2-1E0(el)forthe2Dexample. .............................. 54 2-2S0(i;l;a)andS0(i;l;b)for2Dexample. ...................... 54 2-3Halfspacesandtheirintersecion,B2. ........................ 55 2-4B2andtheresultingF(X). ............................. 55 2-5F1(X),F2(X)andtheresultingF(X). ....................... 56 2-6ul,mlandtheresultingBl. ............................. 56 2-7NoiseSuppressionPropertiesofMXX. ....................... 57 2-8Oneiterationofatentmap,with=0:5 ..................... 57 2-9Oneiterationofatentmap,with=0:1 ..................... 58 2-10Fifteeniterationsofatentmaporbitstartingatx=0:4,with=0:1 ..... 58 2-11Fifteeniterationsofatentmaporbitstartingatx=0:4,with=1:0 ..... 59 2-12Tenthousanditerationsofatentmaporbitstartingatx=0:4,with=1:3 .. 59 2-13Onehundrediterationsofatentmaporbitstartingatx=0:4,with=1:3 .. 60 2-14Onethousanduniformrandomnumbersforcomparison. ............. 60 2-15Bifurcationdiagramforthetentmap. ....................... 61 2-16Bifurcationdiagramzoomedtoshowoscillations. ................. 61 2-17Bifurcationdiagramoscillationszoomedfurther. .................. 62 2-18Patternsinbifurcationdiagram. ........................... 62 2-19Randomnessinbifurcationdiagram. ........................ 63 3-1Divisionofthestate-spacefora2Dexample. .................... 80 3-2Vectoreldfora2Dexample. ............................ 80 3-3Pointsxedindimensionpair(i;j) ......................... 81 3-4Points,someofwhicharexedindimensionpair(p;q) .............. 82 4-1Narrowingthexedpointsetfordimensionpair(1,2). .............. 99 4-2Narrowingthexedpointsetfordimensionpair(1,3). .............. 99 8

PAGE 9

.............. 100 4-4Narrowingthexedpointsetfordimensionpair(2,3). .............. 100 4-5Narrowingthexedpointsetfordimensionpair(2,4). .............. 101 4-6Narrowingthexedpointsetfordimensionpair(3,4). .............. 101 4-7Thexedpointsetfortheminandnext-to-minencodedmemories. ....... 102 4-8DecodingwithMxx,dimensionpair(1,2). ..................... 102 4-9Decodingwithxx,dimensionpair(1,2). ..................... 103 4-10DecodingwithMxx,dimensionpair(1,3). ..................... 103 4-11Decodingwithxx,dimensionpair(1,3). ..................... 104 4-12DecodingwithMxx,dimensionpair(2,3). ..................... 104 4-13Decodingwithxx,dimensionpair(2,3). ..................... 105 4-14Narrowingthexedpointsetfordimensionpair(i,j). .............. 105 4-15Noiseandthenarrowedxedpointset. ....................... 106 4-16Dimension1:mpg. .................................. 106 4-17Dimension2:cylinders. ................................ 107 4-18Dimension3:displacement. ............................. 107 4-19Dimension4:horsepower. .............................. 108 4-20Dimension5:weight. ................................. 108 4-21Dimension6:acceleration. .............................. 109 4-22Dimension7:modelyear. .............................. 109 4-23Dimension8:origin. ................................. 110 4-24Exampleboundariesfromdimensionpair(1,2). .................. 110 4-25Exampleboundariesfromdimensionpair(1,4). .................. 111 4-26Maxencoding:squarederrorinthefourthdimensiononly. ............ 111 4-27Next-to-maxencoding:squarederrorinthefourthdimensiononly. ....... 112 4-28Maxencoding:rmseacrossalldimensions. ..................... 112 4-29Next-to-maxencoding:rmseacrossalldimensions. ................ 113 9

PAGE 10

..................... 113 4-31Next-to-maxencoding:rmseacrossalldimensions. ................ 114 5-1Bifurcationdiagramfor2Dmemory. ........................ 138 5-2Canonicalbehaviorinbifurcationdiagram. ..................... 138 5-3Whenw1=w2=0:5inthebifurcationdiagram. .................. 139 5-4Twentyiterationsofamaxencodedmemory. ................... 139 5-5Anexampleoftheincoherentpseudo-sinusoidalpattern. ............. 140 5-6Discontinuousjumpsinthebifurcationdiagram. .................. 140 5-7Twentyiterationswithw1=0:1. .......................... 141 5-8Bifurcationdiagramwithnegativeinitialization. .................. 141 5-9Bifurcationdiagramwithpositiveinitialization. .................. 142 5-10Fittingasinewavetothebifurcationdiagram. ................... 142 10

PAGE 11

11

PAGE 12

12

PAGE 13

1 ].Bothofthesenetworkshavealimitedcapacityintermsofthenumberofpatternpairsthattheycanstore[ 2 ][ 3 ].Thoughtheliteratureisrepletewithupdatedmodelsprovidinggreatercapacity,itisstillnotpossibletoachieveperfectrecallonanarbitrarilylargesetofpatternpairs.ThisissueisdiscussedinfurtherdetailinSection 2.3.4 .Alsoduringthe1990'sRitter,etal.introducedanewfamilyofassociativememoriesbasedonlatticealgebrainsteadoflinearalgebra[ 4 ].Thesememoriesprovideunlimitedstoragecapacityand,thoughtheyhavebeenwelldiscussedintheliterature,researchinthisareaisstillactive.Morphologicalneuralnetworks,ofwhichlatticebasedassociativememoriesareasubset,weregivenaspecialissueoftheJournalofMathematicalImagingandComputerVisionin2003[ 5 ].Thelatticealgebrabasednetworksarenotalwaysrobustinthepresenceofnoise,however[ 6 ].Thoughtheliteraturefocusesonimprovingsuchnetworks'abilitytoretrievecleaninputpatternswhenpresentedwithnoisyversions,itfailstotakeintoaccountthepresenceofoutlierswithintheinitialdata.Suchoutliersarecapableofmakingthexedpointsetarbitrarilywide,whichcangreatlydecreasetheperformanceofthememorywhenpresentedwithadistortedversionofaninitialdatapoint. 13

PAGE 14

14

PAGE 15

7 ].Theabstractionsprovidedbysystemstheoryapplytophysicalquantities,suchasvoltageandvelocity,aswellasconceptsfromecology,economy,engineeringandsociology[ 8 ].Asystemthatevolvesovertimeisreferredtoasadynamical(oroftendynamic)system.Thissectionparaphrasesthematerialfoundin"IntroductiontoDynamicSystems"byDavidLuenberger,whichisanexcellentreference[ 9 ].AnunderstandingoflineardynamicalsystemsisessentialforunderstandingpiecewiselinearapproachdescribedinChapter 3 9 ].Ingeneral,forsomefunctionf,thecontinuous-timeequationgoverningmovementwithinthestate-spaceis _x(t)=f(x(t));(2{1) 15

PAGE 16

10 ].Aconstantvector _x(t)=f( 9 ]: 16

PAGE 17

2{5 ),apoint x+b:(2{7)Notethatifthiswasahomogeneouslinearsystem,i.e.iftherewasnobvectorinEq.( 2{7 ),then 2{7 )mayhaveinnitelymanysolutionsornoneatall,dependingonwhetherornotthesystemofequationsisconsistent. 17

PAGE 18

x+bb;(2{9)whichimpliesthat 2{10 )tothefollowinghomogeneousequation 2{11 )isgovernedbytheeigenvaluesofA.ThisrelationshipisclearlyillustratedbydiagonalizingA. 11 ]. 18

PAGE 19

2{13 )todeterminethatw=P1y.Thisimpliesthat 19

PAGE 20

2{14 )andthedenitionofVitispossibletocreateatransformcorrespondingtoAinthenewbasisasfollows: 2{15 )illustratesthattheapplicationofAtoxwithrespecttotheoriginalbasisequatestosimplymultiplyingthecorrespondingvectorinthenewbasisbyadiagonalmatrixofeigenvalues 2{17 )as 2{11 ),assumethatAcanbediagonalized.ThisimpliesthatthereissomematrixMsuchthat 2{22 )istrue.ConsiderthematrixAk,whereAismultipliedbyitselfktimes,i.e. 20

PAGE 21

2{11 )willalsotendtowardzeroandthesystemwillbeasymptoticallystable.Thiswillonlyhappenwhenjij<1foreveryi,whichmeansthatthesystemisasymptoticallystableifandonlyifeveryeigenvaluelieswithintheunitcircleonthecomplexplane.Likewise,ifthereisaneigenvectorwithamagnitudegreaterthanonethensomeinitialconditionwillleadtoasolutionthatgrowsgeometricallywithtimeandthesystemwillbeunstable.Intheeventthatnoneoftheeigenvaluesareoutsidetheunitcircle,butoneormoreliedirectlyontheunitcirclethesystemneitherconvergestozeronorinnity;itismarginallystable.Allofthisassumesthattheeigenvaluesaredistinct.Analysisofasystemwithnon-distincteigenvaluesispossibleusingtheJordanCanonicalformofAwhichisnearlydiagonal,ifnotactuallydiagonal. 2.2.1LinearversusNon-LinearSystemsInthestudyofnon-linearsystemscertainfunctionsnamedafterRussianmathematicianAleksandrLyapunovareusedtodescribestability.Thoughanequilibriumpointisdenedinthesamemannerforbothlinearandnon-linearsystemsthedescriptionofsuchpointsinnon-linearsystemsismuchricherfortworeasons[ 9 ].First,sincetheyrepresentsolutionstothedierential(ordierence)equationstheyareoftenmorediculttosolveforincomplicatednon-linearsystems.Second,anon-linearsystemmayhaveanynumberofequilibriumpointsfromnonetoaonetoaniteoreveninnitesetofthemandtheymaybearrangedinalmostanylocationswithinthestatespace.Thestudyofequilibrium 21

PAGE 22

1. 2. AlonganytrajectoryinD,Vdecreasesmonotonically.Assumefiscontinuous.Examiningthebehaviorofthediscretetimesystem 22

PAGE 23

2{24 ).ThisallowsthethirdconditionforafunctiontobecalledaLyapunovfunctiontobere-writtenasThefunction4V(x)V(f(x))V(x)satises 9 ]: 23

PAGE 24

12 ]andKohonen[ 13 ]in1972.Itusesalinearcorrelationmatrix(LCM)tostoretheinputpatternpairs.OtherpopularLCM-basedassociativememoryarchitecturesincludetheHopeldNetwork,developedbyJohnHopeldin1982[ 14 ],andKosko'sbi-driectionalassociativememory(BAM),introducedin1988[ 1 ].AnalternativetotheLCM-basedarchitectureisthelattice-basedassociativememoryintroducedbyRitterin1997[ 4 ].Thesememories,basedonlatticealgebraratherthanlinearalgebra,arecoveredinChapter3. 24

PAGE 25

PAGE 26

17 ].UpdatingofneuronsaccordingtotheHopeldmodelisdoneasynchronously,oroneneuronatatime.Theentirenetworkcanthenbeviewedasavectorrandomprocess[ 1 ],whereeachelementinthevectorisaneuron-ascalarrandomprocess.Ateverytimestepanarbitrarysubsetofneuronswillchangeitsstate;thesizeofthissubsetcanbereducedtooneneuronifthetimeresolutionisneenough.InthiswaytheHopeldmodelisspeciccaseofanetworkknownastheBidirectionalAssociativeMemory.It'sbehaviorischaracterizedinthefollowingsection.UnliketheLinearAssociator,theHopeldModeliscapableofperfectlyrecallingpatternsthatarenotmutuallyorthogonal,butperfectrecallisonlyguaranteedintheeventthatthepatternsaremutuallyorthogonal. 26

PAGE 27

18 ]thatgovernthesynapsestherein.TheBAMmodelwascreatedbyKosko[ 1 ],thoughithasbeenextendedthroughresearchbyWang,etal.[ 19 ]andMathai,etal.[ 20 ].Neuronsinaeldarerelatedbytheconnectionsbetweenthem.FollowingtheconventionssetforthbyKosko[ 17 ]thedefaultneuronaleldwillbedenotedasFX.Ifsecondorthirdeldisnecessary,thosewillbeFYandFZ,respectively.Usingthisnotationathree-layerfeedforwardneuralnetworkwouldbedenotedasFX!FY!FZ,whereFXistheinputlayer,FZistheoutputlayer,andFYisasinglehiddenlayer.Withregardtotwolayernetworks,Kohonen'sconventionsfordimensionalitywillbefollowed[ 17 ],sothattheinputeld,FX,containsnneurons,andtheoutputeld,FY,containspneurons.Thenumberofvectorspairs(xi;yi)thatthesystemassociatesismsothatthesetsofinputpatternsareX=fx1;:::;xmgandY=fy1;:::;ymg.FortheeldFXthedierentialequationassociatedwiththeactivationtimefunctionoftheithneuronis _x1=g1(FX;FY):(2{32)Heretheoverdotsigniesthederivativewithrespecttotime.Thestateofaneuronaldynamicalsystematanytimetcanberepresentedcompletelybyastatevectoroftheform 27

PAGE 28

21 ]aone-layerassociativememoryisaheteroassociativememoryandatwo-layerassociativememoryisreferredtoasanautoassociativememory.Theoutputofeachneurondecidesthestateofabidirectionalassociativememory.AccordingtoKosko[ 17 ],theoutputofaneuronwilldecaytoitsrestingvalueintheabsenceofexternalinputorneuronalstimuli.Thismeansthatwithoutinputthedierentialequationcorrespondingtothechangeinoutputfortheneuronwillbe _xi=xi:(2{34)ThesolutionforEquation 2{34 isxi(t)=xi(0)et,implyingthatwithoutinput,neuronalorexternal,theneuronoutputwillsimplydecaytoitsrestingvalue.TomodeldierentdecayratesfordierentneuronsitispossibletoaddamutliplicativedecayrateparameterAi>0sothatEquationdecayDieqbecomes _xi=Aixi:(2{35)ThesolutionforEquation 2{34 isxi(t)=xi(0)eAit.Equation 2{35 issimplyageneralizationofEquation 2{34 ,whichcanbethoughtofashavingadecayrateAi=1.Intheautoassociativecaseofthebidirectionalassociativememory,therearenrst-orderdierentialequationsthatgovernmovementwithinthestatespace: 28

PAGE 29

_xi=Aixi+pXj=1Sj(yj)nji+Ii;(2{37) _yj=Ajyj+nXi=1Si(xi)mij+Jj;(2{38)wherethematrixNhandlesthesynapticconnectionsintheoppositedirectionofthosehandledbythematrixM.BothKosko'sBAMandtheHopeldmodel,towhichtheBAMreducesintheautoassociativecase,arediscretebivalentnetworks,meaningthattheiractivationismodeledbydierenceequationsandthateachneuronhaseitherabinarystate(0=o,1=on)orabimodalstate(-1=o,1=on).ThediscreteequationscorrespondingtoEq.( 2{37 )andEq.( 2{38 )areasfollows 29

PAGE 30

22 ][ 23 ][ 24 ].Inordertoencodethevectorassociations(xi;yi)fromthesetsXandYthefollowingequationisapplied 17 ],whichimpliesthatbidirectionalstabilityisaformofglobalstability.SincetherearetwodirectionsinaBAMoverallsystemstabilitycanbeachievedbyoscillatingbetweentwostates,oneforFXandoneforFY.TheenergyofabidirectionalassociativememoryisgovernedbyaLyapunovfunction 3 ].Furthermore,allsamplesmustbepairwiseorthogonal[ 25 ]. 30

PAGE 31

2 ].ShortlyaftertheBAMwasintroduced,anumberofresearchersproposedadjustmentstothemodelthatwouldallowforincreasedcapacity.LeungandCheungsuggestedamethodknownastheHouseholderEncodingAlgorithm[ 26 ],HainesandHecht-Nielsonproposedanalternateschemeforthresholdingthestatevector,referringtotheresultingmodelasaNon-HomogeneousBAM[ 2 ].Morerecentnetworksproposedforincreasingthecapacityofthelinearcorrelation-basedmemories,includetheSecondOrderAsymmetricBidirectionalAssociativeMemory(SOABAM)introducedbyChangandChoin2003[ 27 ]andWang'sOptimalBidirectionalAssociativeMemory[ 28 ].Givenanarbitrarysetofpatternassociations,however,perfectrecallwiththesememoriesisnotguaranteedbecausetheircapacity,thoughincreasedbeyondwhatwasoriginallyavailable,isnotinnite.Completerecallofanunlimitednumberofarbitrarypatternassociationsispossible,however,withtheLatticeBasedAssociativeMemoriesofSection 2.4 2.4.1IntroductiontoLatticeAlgebraLatticesaremathematicalconstructs,likegroupsorrings,thatplayanimportantroleinthediscussionofabstractalgebra[ 29 ].EarlyevidenceoflatticetheorystemsfromBoole'sworkonthenatureofthoughtin1854[ 30 ]andthatofDedekindattheturnofthe20thCentury[ 31 ].AnunderstandingofthefundamentalsoflatticealgebrawillbenecessaryinordertounderstandthelatticebasedassociativememoryofRitterandGader.Thetheoreticalunderpinningsofthisdocumentstemfromtheirwork.Therearetwostandardframeworksfordiscussinglattices.Oneisintermsofpartiallyorderedsets(posets),theirdenitions,andproperties.ThismethodallowsforamoregeometricalinterpretationthroughavisualtoolknownastheHassediagramnamedaftertheGermannumbertheoreticianHelmutHasse. 31

PAGE 32

1. 29 ].Thetriplet(<;_;^)isanexampleofalattice,where
PAGE 33

PAGE 34

Conjugationinthelatticedomainhasthepropertiesthat(r)=randr^u=(r^u)=(r_u)forallr,u. 34

PAGE 35

ThememoriesWXYandMXYwillbereferredtoascanonicalmemoriesthroughoutthisdocumenttoseparatethemfromotherproposedmemoriesmakinguseofencodingoperatorsotherthanthepointwiseminimumormaximum.Theyarealsoknownasmorphologicalmemories[ 6 ].OccasionallythesubscriptswillberemovedandthememorieswillbereferredtosimplyasWandM.Amemorythatwillrecallpatternywhenpresentedwithyforallf1;:::;kgisknownasaperfectrecallmemoryfor(X;Y). 4 ],[ 6 ]andproofscanbefoundinthosedocuments. 35

PAGE 36

PAGE 37

2{60 ,andthefactthatyisnotrestrictedtobeingdierentfromxitisclearthatWXX 6 ]. 32 ]. 6 ].GivenasetofinputpatternsX=fx
PAGE 38

2-1 whichshowsthepossibleE0planesinatwo-dimensionalcase.Also,forthesakeofthisdiscussion,denetwopointsin<2,a=(2;1)andb=(4;3),bothofwhichareshowninFigure 2-1 .LetthesetofinputpattersX=fa;bg.Notethat,inthetwo-dimensionalcase,E0(e1)issimplythex2-axis,whileE0(e2)isthex1-axis.Letusalsodenethefollowingsetsuchthatforsomep
PAGE 39

2-4 .Itistheregionbetweenthetwoorangelines.Thisisonewayofvisualizingthexedpointset.Theformulasareforthen-dimensionalcase,thoughtheexampleisspecicallyintwo-dimensions.Thereisanotherusefulwayofvisualizingthesamexedpointset.Itrequiresslightlydierentconstructs.First,letthesetFl(X)representtheimageontheplaneE0(el)foreverypointzinF(X).Thatis,forsuchapointz=(z1;:::;zn);;F(X)thepoint(z1zl;:::;znzl)isinFl(X).TheformaldenitionofFl(X)is 2-5 wheretwosetsofaxisarepresented,eachwithorangelinesdepictingtheregionF(X).OnepairofaxiscontainsF1(X)andtheothercontainsF2(X).The2DexampleshowninFigure 2-1 onlycontainedtwopoints.TogeneralizethisresulttothethecaseinwhichXcontainsmorethanjustpointsaandbconsiderthe 39

PAGE 40

2-6 showsanexamplein2Dofasetofpointsforwhichulandml(showninblue)formthehyperboxBn(showningray),thoughtheyarenotpartofthesetX(pointsshowninblack).Toprovethis,letX(l)=fx1(l);:::;xk(l)gandforeach=1;:::;kdenea=xl,whichisthevaluexiwouldhavetobeshiftedbyinordertointersectE0(el).Thismeansthatx(l)=a+x.Lettingw(l)ij=[WX(l)X(l)]ijandm(l)ij=[MX(l)X(l)]ij =k^=1hxi(l)xj(l)i=w(l)ij and =k_=1hxi(l)xj(l)i=m(l)ij: ThisimpliesthatWXX=WX(l)X(l)andMXX=MX(l)X(l).Now,ifxFl(X)thenfori=1;:::;n xi=WXX 40

PAGE 41

2.3 onecommonusefortheautoassociativememoryistorecallcleanpatternswhenpresentedwithnoisyones.Lattice-basedassociativememoriesaretolerantofcertaintypesofnoise.Foragiveninputvectorxletthenoisyversionxbedenotedasexanddenetheprocessoferosivenoiseasonethatresultsintherelationshipx
PAGE 42

PAGE 43

PAGE 44

2.4.4 canbeusedtoexplaindierenceinthebehaviorofpointex1inbothexamples.Forinstance,Theorem 2.4.4 statesthatinorderforWXX 2=2_(3+0:55:5)_(3+4:56)_(3+38);(2{86) 44

PAGE 45

2=2_(2+0)_(2+0)_(2+0);(2{87)soforeachianappropriatedjiexistsandthereforeWXX 0:56=0:5^(3+0)^(3+0)^(3+0);(2{89)andlettingi=1andji=2 26=2^(3+0)^(3+0)^(3+0):(2{90)Therefore,thereisnoappropriatejifori=1andthereforeWXX 2.4.4 33 ].KnowledgeofOWAoperatorsisnecessaryforunderstandingthematerialproposedinChapter 3 andChapter 4 .Boththeminandthemaxarespecialcasesofthisoperator,asarealloftheorderstatistics.Thebehavioroftheminisdescribedasan\anding"behavior,meaningthatitbehaveslikealogicalAND.Likewise,themaxoperatorissaidtohavean\oring"behaviorlikethelogicalORoperator.TheOWAoperatorsallowforaspecicdegreeofandingandoringbehavior,knownasthe\andness"and\orness"oftheoperator.Thoughtheywerebrieydenedearlier,theyaredescribedindetailhere. 45

PAGE 46

2.4.2 .InSection 4.1.2 thedualityoforderstatisticsbasedoperatorswasdiscussed.ThisdenitioncouldbeextendedtoalloftheOWAoperators.Thisdenition,however,only 46

PAGE 47

PAGE 48

2.5.2 .ItispossibletospecifyaparticularOWAoperatorgivenonlytheandnessorornessoftheoperator[ 35 ].ChaosTheoryisoftenusedtomodelthebehaviorofnon-lineardynamicalsystems,andhasbeenanintegralpartofdynamicalsystemstheorysinceasearlyasthe1960's[ 36 ].Achaoticsystemisadeterministicsystemthathasbeenshowntoexhibitrandombehavior,alsoknownasstrangebehavior,incertaincircumstances[ 37 ].Similaritiesbetweenthedentionofthepiecewiselinearassociativememoryandthatofasimple 48

PAGE 49

38 ].ForsomenumberoftimestepsT>0,knownastheperiod,x[k+T]=x[k].ThismeansthateveryTstepstheorbitwillreturntothesamespotinthestatespace.Suchatrajectoryisknownasalimitcycle.Ifallneighboringpathsapproachsuchacycleitisknownasaperiodicattractor[ 39 ].Fourieranalysisofperiodicorbitsshowsafundamentalcomponentat1 37 ].Quasi-periodictrajectories,inwhichtheorbitistheresultofsummingseveralperiodicorbits,arealsopossible[ 37 ].Fourieranalysisofquasi-periodicorbitsaresimilartothoseofperiodictrajectories,butcontainsidebands[ 37 ].Aperiodic,yetbounded,steadystatebehaviorispossibleoutsideofquasi-periodictrajectories[ 40 ].Chaoticbehaviorischaracterizedbytrajectoriesthatseemimpossibletopredict,butthatremainboundedwithinaparticularneighborhood.WhileFourieranalysisofperiodicandquasi-periodicorbitsshowdistinctharmoniccomponents,thoseofchaoticallywanderingorbitshavearandomappearanceclosertoauniformdistribution[ 37 ].Chaoticbehaviorisseeninnaturefromuiddynamics[ 41 ]toplatetectonics[ 42 ].Thoughtheoutputgeneratedbysomeofthefunctionsisquitecomplicated,theactualfunctionsthemselvesareoftensimple.SeveralpopularchaoticfunctionsincludeArnold's 49

PAGE 50

2(1x[k])forx[n]1 2;(2{99)whereisaparametersetbytheuser.Thoughthisequationlookssimple,itcanproducequitecomplicatedoutput.ConsidertheplotinFigure 2-8 .Asingleiterationofatentmapwith=0:5isshownforinitialvaluesbetweenzeroandone.Thehorizontalaxisrepresentstheinitialsystemstatevaluex[0].Becausethetentmapisonedimensionalthestatevectorreducestoascalarvalue.Thetriangularshapeoftheplotisthereasonthatthisfunctioniscalledatentmap.Thevalueatthepeakofthetentmapisequaltox[1]= 2.Withtheseparametersthesystemdisplayssimplepiecewisebehavior.InFigure 2-9 asimilarcaseisplotted.Againonlyoneiterationofthetentmapfunctionwascalculated,thistimewiththesetting=0:1.Thehorizontalandverticalaxesrepresenttheinputandoutputvalues,respectively,asinFigureFigure 2-8 .Thepeakofthetriangleisnowatx[1]=0:05whichisstillequaltox[1]= 2-10 .Theinitialparameterswerex[0]=0:4and=0:1.Thevalueofxdecreasesmonotonicallytowardszero.Infact,zeroisastableequilibriumpointforthissystemforanyinitialvalueinthe 50

PAGE 51

2-11 illustratesanotherorbitbeginningatthepointx[0]=0:4,thistimewith=1:0.Thehorizontallineindicatesthat0.4isaxedpointofthesystem.Thisistrueofallpointsx[k]<0:5becausex[k+1]=x[n]=x[n],since=1.Simplybychangingtheparameterfrom0:1to1:0,then,thebehaviorofthesystemchangedsuchthattherenolongerexistedasingleattractorlocatedatzero,butrathertheentirerangeofpointsx[0][0;0:5)areallequilibriumpoints.ThoughthepiecewiselinearequationgiveninEq.( 2{99 )lookssimple,itsbehaviorcanchangedrasticallybysimplyadjustingtheparameter.Thoughthechangefrom=0:1to=1:0wasinterestingbecauseofthealterationitcausedinthesystem'sbehavior,theactualbehavioritselfwasnotthatcomplicated.ThebehaviorshowninFigure 2-12 ,however,isnotsosimple.Here,thesystemisagaininitializedtox[0]=0:4,butwith=1:3.Tenthousanditerationswereperformedtoallowfortheemergenceofperiodicpatterns,iftheyexist.Itisevidentthatthesystemmapstherange[0:455;0:65]backontoitself.Theindividualpointschosenduringthemapping,though,seemsentirelyrandom.AplotofthesametrajectoryisshownifFigure 2-13 ,thistimeafteronly100iterations.Itiseasiertoobservethebehaviorofthesystemfromiterationtoiterationinthisgure.Bysimpleinspectionitisnotpossibletopredictthenextoutputvalue.Thoughthis,onceagain,doesnotprovethatnopatternexists,itisusefulforanintuitiveunderstandingofthebehavior.Forcomparison,Figure 2-14 shows1,0000randomelementsgeneratedfromauniformdistribution.NotethatthesequenceinFigure 2-14 doesnotappearmuchdierentthan 51

PAGE 52

PAGE 53

2-12 ,noperiodicbehaviorwasfoundanditseemedpossiblethatthesystemwasbehavingchaoticallyatthispoint.AcloseupofthebifurcationdiagramshowningureFigure 2-18 revealsthatthevaluesareactuallymovingbetweenthetworegions,andthatthisispartofthelargeroverallpatternoftheregionsexpandingandmerging.Thatistosay,thisbehaviorisnottrulyrandom.Thebehaviorisnotperiodic,however,asstatevaluesdonotrepeat.Thereareuniquevaluesforthestateateveryiteration,thoughthevaluesareconstrainedbytheregionboundaries.Oncethetworegionsmerge,however,suchpatternsarenolongerdiscernableandthesystembehaviorappearsrandom,exceptfortheexpandingoftheboundariesfortheregioninwhichthestatewanders.Figure 2-19 containsasectionofthebifurcationdiagramzoomedintotheregion[1:78;1:92]wherethebehaviorseemstowanderchaotically. 53

Figure 2-2.
Figure 2-3: Half spaces and their intersection, B2.
Figure 2-4.
Figure 2-6.
Figure 2-7: Noise suppression properties of MXX.
Figure 2-8: One iteration of a tent map, with μ = 0.5.
Figure 2-9: One iteration of a tent map, with μ = 0.1.
Figure 2-10: Fifteen iterations of a tent map orbit starting at x = 0.4, with μ = 0.1.
Figure 2-11: Fifteen iterations of a tent map orbit starting at x = 0.4, with μ = 1.0.
Figure 2-12: Ten thousand iterations of a tent map orbit starting at x = 0.4, with μ = 1.3.
Figure 2-13: One hundred iterations of a tent map orbit starting at x = 0.4, with μ = 1.3.
Figure 2-14: One thousand uniform random numbers for comparison.
Figure 2-15: Bifurcation diagram for the tent map.
Figure 2-16: Bifurcation diagram zoomed to show oscillations.
Figure 2-17: Bifurcation diagram oscillations zoomed further.
Figure 2-18: Patterns in bifurcation diagram.
Figure 2-19: Randomness in bifurcation diagram.

Though the memories described in Section 2.4 are non-linear, they can be treated as piecewise linear systems. This section introduces a piecewise linear analysis framework for discussing these memories. This piecewise linear viewpoint allows for the extension of the lattice based model to include other order statistic operators, and other OWA operators in general, beyond the min and the max. This extension is the subject of Chapter 4.

Expanding Eq. (3-2) yields Eq. (3-3); the sort occurs after the addition is performed. In this case the w values are determined by whether the memory is encoded using the min or the max operator, as described in Section 3.1.1. Algebraically, Eq. (3-3) is equivalent to the affine transformation defined in Eqs. (3-5) through (3-8), so that Eq. (3-3), and in turn Eq. (3-4), can be rewritten in affine form.

In two dimensions, the max-product decoding step compares the terms (w11 + x1[k]) and (w12 + x2[k]) with the terms (w21 + x1[k]) and (w22 + x2[k]) (Eq. (3-14)); each comparison can resolve in two ways, giving the cases i ∈ {1, 2, 3, 4}. This effectively turns the non-linear system described in Eq. (3-3) into four linear systems, only one of which may operate at a given time step. In each of the four cases the state of the lattice based associative memory at any time is defined by a system of two linear first-order state equations, the unknowns of which are the two dimensions of the previous state vector. For the moment, consider each of the four systems alone rather than as part of a larger, non-linear system.
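The way the decoding step selects one of four affine sub-systems can be made concrete. The sketch below (Python; the two-dimensional memory values are hypothetical) performs one max-product decoding step and reports the (A, b) pair of the linear sub-system that was active for that input; ties are broken here simply by taking the first maximal index, which is a simplification of the tie-breaking rule used in the text.

```python
import numpy as np

def max_product_decode(W, x):
    """One decoding step x[k+1]_i = max_j (W[i, j] + x[k]_j), together with the
    (A, b) pair of the affine sub-system active for this input (the 'case')."""
    n = len(x)
    winners = [int(np.argmax(W[i] + x)) for i in range(n)]
    A = np.zeros((n, n))
    b = np.zeros(n)
    for i, j in enumerate(winners):
        A[i, j] = 1.0          # the winning column selects one previous-state entry
        b[i] = W[i, j]         # ...shifted by the corresponding memory entry
    return A @ x + b, (A, b)

# A small hypothetical 2D memory, for illustration only.
W = np.array([[0.0, 2.0],
              [-1.0, 0.0]])
x_next, (A, b) = max_product_decode(W, np.array([0.0, 5.0]))
```

For this input both rows are won by the second column, so A is a 0/1 selection matrix and A·x + b reproduces the max-product output exactly; a different input can activate a different (A, b) pair, which is why the overall system is only piecewise linear.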

The two orange lines in Figure 3-1 represent the lines L1 and L2. Though it is possible that they may be equal, it is usually the case that one is above the other. The operators chosen for encoding decide which line is the top line, and it is simple to prove that the top line is always L1 in the case that the max operator is used for encoding. Combining Eq. (3-1) and Eq. (3-20) yields the result that the top line in Figure 3-1 corresponds to L1. Likewise, when the min operator was used to encode, L2 > L1. Note that L1 and L2 divide the state-space into three regions, labeled RA, RB and RC. The region that is above both L1 and L2 is region RA, which corresponds to the constraints of Case 1. The region RC exists below both lines, and represents the constraints of Case 4. In between L1 and L2 lies RB, which corresponds to Case 3.

Each sub-system is described by Eq. (3-4), where A and b are defined by the sorting outcome, or case, operating in the region. The lines L1 and L2 form the boundaries for these regions. Considering only state vectors from a particular region as inputs, only the appropriate linear sub-system need be considered. The output of such a sub-system may, however, lie outside the region under which it operates. This makes a strictly linear analysis of the entire system impossible. Calculating the value of A_i^k as in Eq. (2-22) for some Case i is therefore useless unless it can be proved that the system stays in one region and never crosses a boundary. Should the output of the system cross one of the boundaries into another region at any time step, the next A_i and b_i will be different. It will not be possible, therefore, to analyze the entire system using linear systems theory alone, though much of that body of literature will be useful in analyzing any one of the three sub-systems.

The function in Eq. (3-4) spans all points in the state-space, or R^2, and the output corresponds to the next state. This can be described by plotting arrows with their tails rested upon a certain sampling of input points and their heads resting upon the corresponding output points. This graphically illustrates the movement of the state vector from one point to the other in one time step. By exercising good judgement in the choice of input samples, an informative description of the system's evolution over time is created. This is subjective, however, and does not substitute for a rigorous algebraic analysis.

Figure 3-2 contains an example vector field for a 2D lattice based associative memory generated using max encoding across 100 randomly generated input vectors. The blue lines correspond to the boundaries between input domains for the three linear sub-systems, i.e. L1 and L2. In this case the lines are not equal, so L1 > L2. Figure 3-2 was plotted in Matlab using the function arrow.m, available from the Matlab Central File Exchange. The arrow tails correspond to locations that have been sampled as inputs to the system. The arrow heads represent the output produced by the system for each corresponding input. Examine the region between the boundaries, or RB. Each of the vectors is comprised of simply an arrowhead oriented upwards. This is the result of plotting an arrow with zero length using arrow.m. This implies that for each of the regularly sampled input points in RB the difference between the output state and the input state was zero. That is to say, each of these points is an equilibrium point as described in Eq. (2-7). Though the input points were spaced so as to provide a close approximation to the dynamic characteristics of the system, this does not provide a solid proof that every point in RB is an equilibrium point. For that we need the following theorem, which rests on Eq. (3-30) and Eq. (3-31) and whose conclusion is visible in Figure 3-2. Examining the region RA, which corresponds to the input domain above both L1 and L2, all of the sampled input points map to the point on L1 reached by strictly vertical downward movement. This is indicative of the overall behavior of points in RA, as characterized by the following theorem.

The proof of Theorem 3.1.3 relies on Eq. (3-22) and Eq. (3-23), which imply the behavior just described; Figure 3-2, therefore, captures the evolution of points in RA well. For region RC, existing below both L1 and L2, the vector field is similar to that of RA, with the exception that the points move strictly horizontally to a point on L2, as described in the following theorem. The proof is analogous, relying on Eq. (3-22) and Eq. (3-23) instead.

For each pair of dimensions (1,2), (1,3) and (2,3) the state space can be divided up into three regions, in a manner similar to the two dimensional case. For instance, for the dimension pair (1,2) the state space is divided into one region for which x2 is above both boundaries x1 − w12 and x1 + w21, a second region which is below both boundaries, and a third that exists between them. The dimension pairs (1,3) and (2,3) can be similarly

divided. As described in Section 3.1.2, the placement of points within these boundaries is governed by the outcome of the sort described in Eq. (3-2). The following theorem describes the behavior of points inside the boundaries for both dimension pairs. The proof proceeds through Eqs. (3-54) through (3-57) and Eqs. (3-58) through (3-61). Recall from Section 3.1.1 that xi[k] − wij ≤ xj[k] ≤ xi[k] + wji when the min operator is used to encode, and so xj[k] is between the two boundaries. Q.E.D.

Similarly, Eqs. (3-62) through (3-65) and Eqs. (3-66) through (3-69) establish the max case. Recall from Section 3.1.1 that xi[k] − wij ≤ xj[k] ≤ xi[k] + wji when the max operator is used to encode, and so xj[k] is between the two boundaries. Q.E.D.

There are n(n−1)/2 dimension pairs, which means that analysis by the geometry of dimension pairs does not scale well. For 20 dimensions the number of dimension pairs is 190! Luckily, by examining the system equation the description of the state space is simplified.

For each of the n(n−1)/2 dimension pairs it is necessary for a fixed point to reside within the inner region defined by the two boundaries. Simply verifying that a point is within the bounds of a particular dimension pair is not sufficient, however, as some of the points in the inner region of dimension pair (i,j) may be in an outer region of the state space with respect to dimension pair (p,q), and so forth. The intersection of the n(n−1)/2 sets of points existing within the middle region of a certain dimension pair is the fixed point set. Points outside this intersection might be fixed with respect to one or more dimension pairs, but will not be a fixed point of the entire n-dimensional system. The following two figures provide an example. The points in Fig. (3-3) are within the state space boundaries, and are therefore fixed with respect to dimension pair (i,j). Examining dimension pair (p,q) in Fig. (3-4) shows that many of the potential fixed points are ruled out. Those points in the outside regions are not fixed in dimensions p and q and, therefore, these points cannot be fixed points of the overall system.
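The intersection condition can be checked directly by testing every dimension pair at once. The sketch below (Python, with a small hypothetical pattern set) encodes a memory with the min operator and tests fixedness under max-product decoding via the pairwise condition x_i − x_j ≥ w_ij, which is equivalent to the point lying inside the boundary region for all n(n−1)/2 dimension pairs.

```python
import numpy as np

def encode_min(X):
    """Min encoding: W[i, j] = min over stored patterns of x_i - x_j.
    X holds one pattern per column."""
    return np.min(X[:, None, :] - X[None, :, :], axis=2)

def max_product(W, x):
    """Max-product decoding: y_i = max_j (W[i, j] + x_j)."""
    return np.max(W + x[None, :], axis=1)

def is_fixed_point(W, x, tol=1e-9):
    """x is fixed under max-product decoding iff x_i - x_j >= W[i, j] for
    every dimension pair, i.e. x lies between the boundary hyperplanes."""
    diffs = x[:, None] - x[None, :]
    return bool(np.all(diffs >= W - tol))

# Two hypothetical stored 3D patterns (columns).
X = np.array([[1.0, 4.0],
              [2.0, 2.0],
              [0.0, 3.0]])
W = encode_min(X)
for col in range(X.shape[1]):
    p = X[:, col]
    assert is_fixed_point(W, p)              # every stored pattern is fixed
    assert np.allclose(max_product(W, p), p)
```

Because w_ii = 0, the decoded output can never fall below the input in any dimension, so the pairwise inequality is both necessary and sufficient for fixedness under this decoding.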

A point x is a fixed point if and only if, for every dimension pair (i,j), xj is in the region between the hyperplanes xj = xi − wij and xj = xi + wji [6]. Similarly, for a max-encoded memory, xj must be in the region between xj = xi − mij and xj = xi + mji. This subject is revisited in Section 4.1. For an n-dimensional lattice based associative memory, let l be the remaining dimension, such that {d1} ∪ {d2} ∪ … ∪ {l} ∪ … ∪ {dn} = {1, 2, …, n}; then the intersection E0(el), as described in Section 2.4.5, is the set of points z such that zl = 0.


Figure 3-1: The three regions defined by the 2D piecewise linear memory using max encoding.
Figure 3-2: Vector field for each of the three regions defined by an example using max encoding.

Figure 3-3: These points are fixed in dimension pair (i,j), but may not be fixed points.

Figure 3-4: Some of these points are fixed in dimension pair (p,q), but may not be fixed points.


which means that the 1st and the nth order statistics are used. For a key vector x and memory WXX, then, the canonical decoding process would be written as in Eqs. (4-1) and (4-2). This leads to the interesting result that, for all vectors x ∈ X, the difference xi[k] − xj[k] must be less than or equal to the difference xi − xj, and the difference xj[k] − xi[k] must be less than or equal to the difference xj − xi. This comes from the

inequalities in Eqs. (4-10) and (4-12). Equations (4-10) and (4-12) come from Section 3.3, which states that a point x is a fixed point if and only if, for every dimension pair (i,j), xj is in the region between the hyperplanes xj = xi − mij and xj = xi + mji, for all j. Similar logic will show that when min-encoding is used, x is a fixed point if and only if, for all vectors x ∈ X, the difference xi[k] − xj[k] is greater than or equal to the difference xi − xj, and the difference xj[k] − xi[k] is greater than or equal to the difference xj − xi. Section 4.1.1 will discuss how this is handled when encoding using order statistics other than the min and the max.

and decoding using the max operator implies the relation in Eq. (4-20). Q.E.D.

From Eq. (4-25) and Eqs. (4-14) and (4-15), for any vector that is not fixed in some dimension i, xi is replaced with the minimum value of (λij + xj) across all of the dimensions j ∈ {1, …, n}. This vector will then be fixed in this dimension because xi ≤ λij + xj, for all j. Since this is true for all i, if a vector x is not a fixed point originally, it will be after one iteration. Q.E.D.

Likewise, by Eqs. (4-14) and (4-16), for any vector that is not fixed in some dimension i, xi is replaced with the maximum value of (λij + xj) across all of the dimensions j ∈ {1, …, n}. This vector will then be fixed in this dimension because xi ≥ λij + xj, for all j. Since this is true for all i, if a vector x is not a fixed point originally, it will be after one iteration. Q.E.D.

At this point the reader may wonder what happens when an order statistics operator other than the min or the max is used during decoding. Though memories using such decoding operators were not proposed as a topic for this research, they might provide an interesting research topic in the future.
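The one-iteration convergence property is easy to observe numerically. The following sketch (Python; a hypothetical min-encoded memory decoded with the max product, the mirror image of the min-decoded case above) distorts one dimension of a stored pattern and shows that the first decoding step already lands on a fixed point, so a second step produces no further movement.

```python
import numpy as np

def encode_min(X):
    # W[i, j] = min over stored patterns (columns of X) of x_i - x_j
    return np.min(X[:, None, :] - X[None, :, :], axis=2)

def max_product(W, x):
    # Max-product decoding: y_i = max_j (W[i, j] + x_j)
    return np.max(W + x[None, :], axis=1)

# Hypothetical stored patterns; distort one dimension heavily.
X = np.array([[1.0, 4.0],
              [2.0, 2.0],
              [0.0, 3.0]])
W = encode_min(X)
noisy = np.array([0.0, 9.0, 0.0])
y1 = max_product(W, noisy)    # first iteration: lands on a fixed point
y2 = max_product(W, y1)       # second iteration: no further movement
assert np.allclose(y1, y2)
```

The output of the first iteration already satisfies y_i − y_j ≥ w_ij for every dimension pair, which is exactly the fixed point condition, so all subsequent iterations are idle.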

Section 4.1.2 illustrates the usefulness of encoding with such order statistics, however, and Chapter 5 discusses a family of memories using non-min and non-max decoding in detail.


In Figure 4-2 each of the points in X for dimension pair (1,3) is plotted, with similar results. The fixed point set is narrowed to exclude the most extreme elements on either side. Figure 4-3 shows a case where the narrower fixed point set collapses into a single line. This happens when all of the remaining points are collinear, and is more likely when the number of elements in the set X is small or the number of outliers trimmed is large. Evidence of this behavior is shown again in Figure 4-5, using dimension pair (2,4). Figure 4-4 and Figure 4-6 contain the same plot for dimension pairs (2,3) and (3,4), respectively. Note that the same fixed point set would be created by encoding with the min operator, in the canonical memory case, or with the second order statistic (next-to-min) in the case of the order statistics based memory. In this case the memory would be

    ΛXX = [0 1 1 2; 3 0 3 1; 3 1 0 2; 2 1 1 0].   (4-31)

Note that ΛXX is a general name given to a memory encoded using any order statistics or OWA based operator. This ΛXX has been encoded with the next-to-min operator, unlike the previous one, which had been encoded with the next-to-max.
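Encoding with an arbitrary order statistic amounts to sorting the pattern differences and selecting one rank instead of always taking the extreme. A minimal sketch (Python; the two-dimensional, three-pattern data with a single outlier is hypothetical):

```python
import numpy as np

def encode_order_stat(X, r):
    """Order-statistic encoding: entry (i, j) is the r-th smallest of the
    differences x_i - x_j across the stored patterns (columns of X).
    r = 0 gives the canonical min encoding; r = k - 1 gives the max."""
    diffs = X[:, None, :] - X[None, :, :]          # shape (n, n, k)
    return np.sort(diffs, axis=2)[:, :, r]

X = np.array([[0.0, 1.0, 9.0],      # third pattern is an outlier in dimension 0
              [1.0, 2.0, 3.0]])
W_min  = encode_order_stat(X, 0)    # canonical min encoding
L_next = encode_order_stat(X, 1)    # next-to-min: robust to one outlier
```

Here the canonical entry W_min[1, 0] is dragged to −6 by the single outlier, while the next-to-min entry L_next[1, 0] stays at 1, which is why trimming narrows the fixed point set around the bulk of the data.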

Figure 4-7 shows the boundaries for dimension pair (1,2). Note that this picture is extremely similar to Figure 4-1, except for the fact that the boundary lines have changed places. That is to say, lines L1 and L2 have switched places, as have lines L3 and L4. This does not change the fixed point sets themselves, however. Theorem 4.1.5 explains why this happens. This behavior is extended to the n-dimensional case in Section 4.2, where the application of the technique is explained.

In Section 3.2.1 it was shown that for any canonical lattice based associative memory the A, b pair associated with a fixed point was always the identity matrix and the zero vector, assuming that ties in the decoding sort were always broken in favor of the diagonal element (λii + xi). Since this discussion dealt only with the decoding process it was not limited to memories created with the canonical encoding operators; it is a property of all memories that are decoded with canonical operators. In fact, since the order statistics based memories presented here use canonical (i.e., either min or max) decoding operators, the property detailed in Theorem 3.2.3 is true of the order statistics based memories as well. Consider a simple three-dimensional example in which

    ΛXX = [0 1 2; 1 0 2; 0 2 0].   (4-34)

Next, consider the noisy vector x̃1 = (0, 4, 0). Figures 4-8 through 4-13 show the result of decoding x̃1 using MXX and ΛXX. In general, points below both of the boundaries move to the left to the nearest boundary, and points above both boundaries move downward to

the nearest boundary, as in Figures 4-10 and 4-11. Since x̃1 is on the boundary in this case it is fixed in dimension pair (i,j), but it still moves as a result of being outside of the fixed point set in dimension pairs (1,2) and (2,3). This example is not meant to illustrate the usefulness of general order statistic encoding operators, but simply to show the difference between the canonical and general memories. The usefulness of such memories will become clear in the next section.

Let row i of X be defined such that

    [X]i = (2, 3.5, 4, 4.5, 5, 5.5, 6, 7, 7.5, 3.5, 8),   (4-36)

and let row j be defined such that

    [X]j = (6, 4, 5, 6, 3, 6, 5.5, 3, 5, 5, 9).   (4-37)

Without specifying all of the elements in X it is impossible to calculate an entire memory, but only the entries created from dimensions i and j are necessary for this example. Here, Mij = 8.5, Mji = 17, λij = 4, and λji = 4.

The data set used is the auto-mpg data [43]. The auto-mpg data was originally used to test graphical analysis packages at the 1983 American Statistical Association Exposition and is currently available from the University of California, Irvine Machine Learning Repository [44]. The auto-mpg data is intended for predicting the fuel efficiency of an automobile using 4 continuous variables and 3 discrete variables. Included in each data point is the fuel efficiency of each car, in miles per gallon. The order of the variables is shown in Table 4-1. Histograms of the values taken in each of the dimensions are shown in Figures 4-16 through 4-23.

Table 4-1: Data dimensions

  Dimension  Variable      Type
  1          mpg           continuous
  2          cylinders     discrete
  3          displacement  continuous
  4          horsepower    continuous
  5          weight        continuous
  6          acceleration  continuous
  7          model year    discrete
  8          origin        discrete

Figure 4-25 is similar, but for dimension pair (1,4). Here, the lower boundaries of the two memories are so close as to be almost indistinguishable from one another, but the top boundaries are quite different. The canonical memory MXX must include the outlying point in its fixed point set, while ΛXX leaves it out. Once the two memories were created, a random vector was picked from the data set and its 4th dimension was given a new value randomly picked from between 300 and 500, a reasonable upper bound for the time period. This distorted vector was then given as input to the system and the output was computed. The squared error between the fourth dimension of the output vector and that of the original vector was computed for both memories. This test was performed 1,000 times. The histogram of errors obtained for the memory MXX is shown in Figure 4-26 and that for memory ΛXX is given in Figure 4-27. Note that although the histograms from both memories overlap somewhat, the error for ΛXX is significantly smaller, on average, than that of MXX. For each of the 1,000 tests the squared error was computed for each of the remaining dimensions, and the error in dimensions 1-8 was averaged for each sample. Figure 4-28 shows a histogram of the rmse values for the output of MXX, while Figure 4-29 shows the same for ΛXX. This benefits MXX, as its output has no error in any dimension other than 4, while ΛXX might incur some error in any dimension if one of the points chosen was one of the points left out during the encoding process. Still, memory ΛXX outperforms its canonical counterpart.
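The structure of this experiment can be sketched in a few lines. The data below is synthetic (the real test used the auto-mpg set): a max-encoded memory and a next-to-max-encoded memory are built from data containing a single outlier, the second dimension of a clean vector is distorted upward, and the squared retrieval errors of the two memories under min-product decoding are compared.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_order_stat(X, r):
    # entry (i, j) = r-th smallest of x_i - x_j over stored patterns (columns)
    return np.sort(X[:, None, :] - X[None, :, :], axis=2)[:, :, r]

def min_product(M, x):
    # Min-product decoding, as used with max-encoded memories.
    return np.min(M + x[None, :], axis=1)

# Synthetic stand-in for the auto-mpg test: 2D "clean" data plus one outlier.
X = np.vstack([rng.normal(20, 3, 50), rng.normal(100, 10, 50)])
X[1, 0] = 400.0                          # a single outlying horsepower-like value

M_max  = encode_order_stat(X, X.shape[1] - 1)   # canonical max encoding
M_trim = encode_order_stat(X, X.shape[1] - 2)   # next-to-max: drops the outlier

clean = X[:, 5].copy()
noisy = clean.copy()
noisy[1] = 450.0                          # distort the second dimension upward
err_max  = (min_product(M_max,  noisy)[1] - clean[1]) ** 2
err_trim = (min_product(M_trim, noisy)[1] - clean[1]) ** 2
```

The canonical memory's boundary is stretched by the outlier, so it barely corrects the distortion, while the trimmed memory pulls the noisy value back near the bulk of the data; err_trim comes out far smaller than err_max.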

Figure 4-30 shows a histogram for the rmse values pertaining to MXX. Figure 4-31 shows a histogram for the rmse values pertaining to ΛXX. Again, the error is substantially lower for ΛXX.

Figure 4-1: Narrowing the fixed point set for dimension pair (1,2).
Figure 4-2: Narrowing the fixed point set for dimension pair (1,3).
Figure 4-3: Narrowing the fixed point set for dimension pair (1,4).
Figure 4-4: Narrowing the fixed point set for dimension pair (2,3).
Figure 4-5: Narrowing the fixed point set for dimension pair (2,4).
Figure 4-6: Narrowing the fixed point set for dimension pair (3,4).
Figure 4-7: The fixed point set for the min and next-to-min encoded memories.
Figure 4-8: Decoding with MXX, dimension pair (1,2).
Figure 4-9: Decoding with ΛXX, dimension pair (1,2).
Figure 4-10: Decoding with MXX, dimension pair (1,3).
Figure 4-11: Decoding with ΛXX, dimension pair (1,3).
Figure 4-12: Decoding with MXX, dimension pair (2,3).
Figure 4-13: Decoding with ΛXX, dimension pair (2,3).
Figure 4-14: Narrowing the fixed point set for dimension pair (i,j).
Figure 4-15: Noise and the narrowed fixed point set.
Figure 4-16: Dimension 1: mpg.
Figure 4-17: Dimension 2: cylinders.
Figure 4-18: Dimension 3: displacement.
Figure 4-19: Dimension 4: horsepower.
Figure 4-20: Dimension 5: weight.
Figure 4-21: Dimension 6: acceleration.
Figure 4-22: Dimension 7: model year.
Figure 4-23: Dimension 8: origin.
Figure 4-24: Example boundaries from dimension pair (1,2).
Figure 4-25: Example boundaries from dimension pair (1,4).
Figure 4-26: Max encoding: squared error in the fourth dimension only.
Figure 4-27: Canonical encoding: squared error in the fourth dimension only.
Figure 4-28: Max encoding: rmse across all dimensions.
Figure 4-29: Next-to-max encoding: rmse across all dimensions.
Figure 4-30: Max encoding: rmse across all dimensions.
Figure 4-31: Next-to-max encoding: rmse across all dimensions.

A near-min memory is encoded with a weight vector where wEi = 0 for i ∈ {2, …, n−1}, wE1 ∈ (1/2, 1], and wEn = 1 − wE1. Such a memory is decoded using a weight vector where wDi = 0 for i ∈ {2, …, n−1}, wD1 = wEn, and wDn = wE1 = 1 − wD1. A near-max memory, then, is a memory that is encoded with a vector closer to the max encoding vector than it is to the min encoding vector. Such an encoding vector is defined so that wEi = 0 for i ∈ {2, …, n−1}, wE1 ∈ [0, 1/2), and wEn = 1 − wE1.

The encoding operation is defined in Eq. (5-1). In this section it is important to differentiate between the number of input vectors, usually denoted k, and the current time step, which is also typically k. The solution will be to temporarily use the variable t to denote the current time step. Decoding with a memory uses the near-min product or near-max product; the nature of the decoding process (near-min product or near-max product) will be specified to avoid ambiguity. Theorem 5.1.1 will demonstrate that a near-min-max memory is simply a weighted average of the two canonical memories WXX and MXX, which were introduced in Chapter 2. They are defined such that wij = ⋀_{ξ=1..k}(xi − xj) and mij = ⋁_{ξ=1..k}(xi − xj). For the near-min-max memory described in Theorem 5.1.1,

    λij = wE1 wij + wEn mij.   (5-3)

This means that ΛXX = wE1 WXX + wEn MXX. Q.E.D.

When there is no chance of ambiguity, the superscripts "E" and "D" are dropped from the weight vectors and their components in order to simplify the notation.
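Theorem 5.1.1 can be checked numerically. In the sketch below (Python; the random patterns are hypothetical) the OWA encoding places weight w1 on the smallest sorted difference and wn on the largest, and the resulting memory equals the corresponding weighted average of the canonical min and max memories.

```python
import numpy as np

def owa_encode(X, w1, wn):
    """OWA encoding with weight w1 on the smallest difference and wn on the
    largest (all interior weights zero); w1 + wn is assumed to be 1."""
    d = np.sort(X[:, None, :] - X[None, :, :], axis=2)
    return w1 * d[:, :, 0] + wn * d[:, :, -1]

rng = np.random.default_rng(1)
X = rng.normal(size=(4, 6))               # six random hypothetical 4D patterns
W = owa_encode(X, 1.0, 0.0)               # canonical min encoding, WXX
M = owa_encode(X, 0.0, 1.0)               # canonical max encoding, MXX
L = owa_encode(X, 0.7, 0.3)               # a near-min memory
assert np.allclose(L, 0.7 * W + 0.3 * M)  # Theorem 5.1.1: weighted average
```

Because the interior OWA weights are zero, only the first and last order statistics of the sorted differences survive, which is exactly the min and max entries of the canonical memories.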

In Eq. (3-40) we defined two lines that form the boundaries of the fixed point set for the two-dimensional piecewise linear lattice based associative memory. We will now refine this concept in order to discuss the properties of the near-min-max associative memory. For each dimension pair (i,j) let the line forming the top-most boundary of the inside case for (i,j) be known as LTij, and let the line forming the bottom-most boundary be known as LBij. Define LTij such that, for all t,

    LTij:  xj[t] = xi[t] − λij,  if wD1 ∈ [0, 1/2];
           xj[t] = xi[t] + λji,  if wD1 ∈ [1/2, 1].   (5-5)

It is convenient to introduce two new notational elements, N0 ∈ [0, 1/2] and N1 ∈ [1/2, 1], where N0 + N1 = 1. Here, N0 is "near zero" and N1 is "near one". Note that wD1 ∈ [0, 1/2] can then be written wD1 = N0, and wD1 ∈ [1/2, 1] can then be written wD1 = N1. The same can be done for wDn, wE1 and wEn. Also note that with the near-min-max lattice based associative memory wE1 = 1 − wEn, wD1 = 1 − wDn and, because near-min encoding is always paired with near-max decoding and vice versa, wD1 = 1 − wE1 = wEn. Equation (5-5) can then be re-written

    LTij:  xj[t] = xi[t] − N1 ⋀_{ξ=1..k}(xi − xj) − N0 ⋁_{ξ=1..k}(xi − xj),  if wD1 ∈ [0, 1/2];
           xj[t] = xi[t] + N0 ⋀_{ξ=1..k}(xj − xi) + N1 ⋁_{ξ=1..k}(xj − xi),  if wD1 ∈ [1/2, 1],   (5-6)

for all t. By Eqs. (5-7) and (5-8),

Eq. (5-6) can be simplified to Eq. (5-9). Likewise, for all t, let

    LBij:  xj[t] = xi[t] + λji,  if wD1 ∈ [0, 1/2];
           xj[t] = xi[t] − λij,  if wD1 ∈ [1/2, 1].   (5-11)

This means that

    LBij:  xj[t] = xi[t] + N1 ⋀_{ξ=1..k}(xj − xi) + N0 ⋁_{ξ=1..k}(xj − xi),  if wD1 ∈ [0, 1/2];
           xj[t] = xi[t] − N0 ⋀_{ξ=1..k}(xi − xj) − N1 ⋁_{ξ=1..k}(xi − xj),  if wD1 ∈ [1/2, 1],   (5-12)

for all t. Using our previous logic we can conclude that, for all t, Eqs. (5-13) and (5-14) hold. Likewise, the case when wD1 = 0 represents the canonical max-decoded (min-encoded) memory WXX, which is known to converge to the middle case for every dimension pair in

one iteration, as expressed in Eqs. (5-15) and (5-16). Therefore, for a near-min-max memory with wD1 ∈ [0, 1] and wDn = 1 − wD1,

    xj[t+1] = wD1 ( xi[t] + N1 ⋀_{ξ=1..k}(xj − xi) + N0 ⋁_{ξ=1..k}(xj − xi) )
            + wDn ( xi[t] + N0 ⋀_{ξ=1..k}(xj − xi) + N1 ⋁_{ξ=1..k}(xj − xi) ).   (5-20)

Since wD1 + wDn = 1, Eqs. (5-23) and (5-24) follow, which, in turn, means that the result in Eq. (5-25) holds. Q.E.D.

The system iterated is that of Eq. (3-3), from Section 3.1; the resulting bifurcation diagram is shown in Figure 5-1. Note that there is a clear global pattern that emerges across the entire diagram: it is almost sinusoidal in nature.

Figure 5-2 has been zoomed in to show the single unique value taken by the first dimension of the state vector in 201 iterations. A similar graph is shown in Figure 5-3, where the system is set to w1 = 0.5. This case represents the event that w1 = w2 = 0.5, which was discussed in Section ??. This is another case in which the system is well behaved in the min-max sense. That is to say, a point outside the fixed point set (which has now reduced to a single line L1 = L2) will move to the fixed point set in one iteration. Therefore there are again at most two values taken during the 201 iterations. In this example there were two values. An example of one of these cases is illustrated in Figure 5-4. Twenty iterations were calculated for a canonical lattice based associative memory using max encoding. As was described in Chapter 2, the system converges to a fixed point in one iteration. Note that the data shown in Figure 5-4 is the result of a separate simulation from the data presented in Figure 5-2. This means that the random initialization is different, and therefore the number of unique values taken by the first dimension of the state vector may be different as well. In fact, the first data set generated a fixed point immediately, and therefore only one unique point in 200 iterations, whereas the second simulation did not initially generate a fixed point, and so there are two unique values.

The simulation in Figure 5-1 was repeated close to 30 times to observe variations caused by random initialization. Though the actual values taken during the 200 iterations calculated for each setting of the w1 parameter differed, the pseudo-sinusoidal pattern in global system behavior was present in all of the simulations. The coherence of the pseudo-sinusoidal pattern was not consistent, though. That is, the pattern was often noisy. Figure 5-5 shows the results of one of the least coherent, i.e. one of the noisiest, simulations. Note that the pseudo-sinusoidal pattern is still discernable, but it is not nearly as clean. This noise is the result of having different randomly created initial values for the system at each point on the w1 axis. If the initial point is always zero, or if the initial point is subtracted from the value of the state vector before plotting, the waveform is always completely coherent. Also noticeable in Figure 5-5 are the single points that appear outside of the pseudo-sinusoidal pattern. These are the initial points, and there is only one such point for each of the values of w1. From the initial point there is a gap as the trajectory jumps from an outside case to the inside case. Figure 5-6 shows the same data as that in Figure 5-5, but zoomed in to give a better glimpse of these discontinuous jumps. Across all of the simulations that were run in this experiment, the size of these discontinuous jumps was always inversely proportional to the coherence of the pseudo-sinusoidal pattern. That is, the smaller the jumps, the more coherent the pattern. For each value of w1 the initial point of the orbit is different.

The orbit is shown in Figure 5-7. As with Figure 5-4, twenty iterations were plotted. Unlike with its predecessor, however, in this case there is no fixed point. The system parameter was initialized to w1 = 0.1, and so this is not a canonical lattice based associative memory. From Figure 5-8 it is apparent that changing the initialization to x ∈ [100, 200] does not disrupt this pattern. In fact, in Figure 5-8 the pseudo-sinusoidal pattern is extremely clear. Perhaps the amplitude has been affected, but not the pattern itself. A similar test was performed on positive values outside of the range [0, 1]. For this experiment the system was initialized such that x ∈ [50, 100]. It is clear from Figure 5-9 that the pattern still appears. Another test was constructed to test the behavior of the system when the values of x cross zero; in this case x ∈ [−30, 30]. Again, the pseudo-sinusoidal pattern held. For each of these experiments the system was tested with 10 different sets X for each range of initial values. The images shown are representative of those produced during the simulations. In order to study the effect of the initial conditions of a particular trajectory, several other simulations were constructed. In these experiments the set of initial vectors X was kept constant throughout. At first, a random initial point was chosen such that x[0] ∈ [−5, 5]. Again, the pseudo-sinusoidal pattern was evident. In fact, the pattern was evident in two further experiments: one in which x[0] ∈ [−25, 5] and one in which x[0] ∈ [−5, 25]. Each simulation was performed 10 times.

as shown in Section 3.1.1. Therefore, we can refine our system function to the following simpler version:

    x1[t+1] = (1 − w1) x1 + w1 x2 + w1 λ12,            if w1 ∈ [0, 1/2];
    x1[t+1] = w1 x1 + (1 − w1) x2 + (1 − w1) λ12,      if w1 ∈ [1/2, 1].   (5-29)

In order to perform the harmonic analysis using a Fourier series approximation it was necessary to change the variable from w1, which is defined over the interval [0, 1], to some variable p, which is defined over the interval [−T, T], for some T. To achieve that, let p = w1 − 1/2. This means that w1 = (p + 1/2) and that (1 − w1) = (1/2 − p). Define the function f(p) such that

    f(p) = (1/2 − p) x1 + (p + 1/2) x2 + (p + 1/2) λ12,   if p ∈ [−1/2, 0];
    f(p) = (p + 1/2) x1 + (1/2 − p) x2 + (1/2 − p) λ12,   if p ∈ [0, 1/2].   (5-30)

The variable p is then defined on the interval [−1/2, 1/2]. For any piecewise continuous function f on the interval [−T, T], the Fourier series of f is the trigonometric series

    f(x) ~ a0/2 + Σ_{n=1..∞} ( an cos(nπx/T) + bn sin(nπx/T) ),   (5-31)

where the coefficients are

    an = (1/T) ∫_{−T}^{T} f(p) cos(nπp/T) dp,  n = 0, 1, 2, …,   (5-32)
    bn = (1/T) ∫_{−T}^{T} f(p) sin(nπp/T) dp,  n = 1, 2, 3, …,   (5-33)

and the symbol ~ is used to highlight the fact that the series represents f(p), but may not converge to it [18]. Assume a simple example in which x is initialized to the zero vector, and the initial set of vectors X is set to the identity matrix. Letting T = 1/2,

    an = 2 ∫_{−1/2}^{1/2} f(p) cos(2nπp) dp,  n = 0, 1, 2, …,   (5-35)
    bn = 2 ∫_{−1/2}^{1/2} f(p) sin(2nπp) dp,  n = 1, 2, 3, ….   (5-36)

Furthermore, recall that when the decoding vector is (w1, w2)′ the encoding vector is (w2, w1)′, and the definition of λ is such that

    λij = (1/2 − p) ⋀_{ξ=1..k}(xi − xj) + (1/2 + p) ⋁_{ξ=1..k}(xi − xj).   (5-38)

For a fixed set of input vectors X, the values ⋀_{ξ=1..k}(xi − xj) and ⋁_{ξ=1..k}(xi − xj) will remain fixed. Let m = ⋀_{ξ=1..k}(xi − xj) and M = ⋁_{ξ=1..k}(xi − xj).
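The coefficient integrals in Eqs. (5-35) and (5-36) can also be approximated numerically, which is useful for sanity-checking closed-form results. A minimal trapezoidal-rule sketch (Python; the test function f(p) = p is only a known-answer check, not the memory's f):

```python
import math

def fourier_coeffs(f, n_max, T=0.5, samples=20000):
    """Approximate the Fourier coefficients a_n, b_n of f on [-T, T]
    (as in Eqs. (5-32)/(5-33)) with a trapezoidal rule."""
    h = 2 * T / samples
    ps = [-T + i * h for i in range(samples + 1)]
    a, b = [], []
    for n in range(n_max + 1):
        ca = [f(p) * math.cos(n * math.pi * p / T) for p in ps]
        cb = [f(p) * math.sin(n * math.pi * p / T) for p in ps]
        a.append((sum(ca) - 0.5 * (ca[0] + ca[-1])) * h / T)
        b.append((sum(cb) - 0.5 * (cb[0] + cb[-1])) * h / T)
    return a, b

# Known-answer check: f(p) = p on [-1/2, 1/2] is odd, so all a_n vanish
# and b_1 = 1/pi.
a, b = fourier_coeffs(lambda p: p, 3)
```

Swapping in the piecewise f(p) of Eq. (5-30) for the test function gives a direct numerical check of the coefficients derived below.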

Therefore, using Eq. (5-30), f(p) can be written

    f(p) = (1/2 − p) x1 + (p + 1/2) x2 + (1/4 − p^2) m + (p^2 + p + 1/4) M,   if p ∈ [−1/2, 0];
    f(p) = (p + 1/2) x1 + (1/2 − p) x2 + (p^2 − p + 1/4) m + (1/4 − p^2) M,   if p ∈ [0, 1/2].   (5-39)

Since f(p) is a piecewise function, the Fourier coefficient integral can be written

    an = 2 ∫_{−1/2}^{0} f(p) cos(2nπp) dp + 2 ∫_{0}^{1/2} f(p) cos(2nπp) dp.   (5-40)

Substituting Eq. (5-39) into Eq. (5-40), the integral can be attempted using integration by parts. Let u = f(p); then v contains a factor of 1/(2nπ), which is not defined when n = 0. From Equation (5-35), however, it is clear that when n = 0

    a0 = 2 ∫_{−1/2}^{0} f(p) dp + 2 ∫_{0}^{1/2} f(p) dp.   (5-44)

Therefore,

    a0 = (3/2) x1 + (1/2) x2 + (1/4)(m + M).   (5-45)

For the cases when n ≠ 0, integration by parts yields the closed form in Eq. (5-46).

Equation (5-47) gives the corresponding closed form for bn.

Theorem: For w1 ∈ [0, 1/2) ∪ (1/2, 1), the system does not contain any fixed points. Proof: First, note that when w1 > 1/2,

    x1[k] = T11[k] + T12[k] + T112[k] + T121[k],   (5-48)

where the four terms are defined in Eqs. (5-49) through (5-52).


Evaluating the terms at k = 2 (Eqs. (5-72) through (5-83)):

    T11[2]  = x1[0] [ c(2,0) w1^2 w2^0 + c(2,2) w1^0 w2^2 ] = x1[0] (w1^2 + w2^2)
    T12[2]  = x2[0] [ c(2,1) w1 w2 ] = 2 w1 w2 x2[0]
    T112[2] = w2 λ12 [ c(0,0) w1^0 w2^0 + c(1,0) w1 ] = w2 λ12 (1 + w1)
    T121[2] = w2 λ21 [ c(1,1) w1^0 w2 ] = w2^2 λ21.

Collecting terms for k = 2,

    x1[2] = (w1^2 + w2^2) x1[0] + 2 w1 w2 x2[0] + (w1 w2 + w2) λ12 + w2^2 λ21.

These formulas can be proven correct for all k > 2 using induction. Since the basic mechanism is the same for each of the terms in x1[k] and x2[k], with only slight algebraic differences, only the proof for T11[k] will be shown here. Since the case when k = 1 was proven to be correct, it will be the base case. It is then necessary to show that, for k > 1, Eq. (5-91) holds:

    T11[k] = w1 T11[k−1] + w2 T21[k−1].   (5-91)

In the event that k is odd, this is true if and only if

    [c(k,0) − c(k−1,0)] w1^k
    + [c(k,2) − c(k−1,1) − c(k−1,2)] w1^(k−2) w2^2
    + …
    + [c(k,k−1) − c(k−1,k−1) − c(k−1,k−2)] w1 w2^(k−1) = 0.

Each bracketed coefficient vanishes. For the first,

    c(k,0) − c(k−1,0) = 1 − 1 = 0,   (5-96)

and for the general term, Pascal's rule gives

    c(k,j) − c(k−1,j−1) − c(k−1,j)
      = k!/[(k−j)! j!] − (k−1)!/[(k−j)! (j−1)!] − (k−1)!/[(k−1−j)! j!]
      = k!/[(k−j)! j!] − j (k−1)!/[(k−j)! j!] − (k−j) (k−1)!/[(k−j)! j!]
      = [k − j − (k−j)] (k−1)!/[(k−j)! j!]
      = 0.   (5-100)

Therefore, if k is odd, then T11[k] = w1 T11[k−1] + w2 T21[k−1]. If k is even, however, Equation (5-91) is true if and only if

    [c(k,0) − c(k−1,0)] w1^k + [c(k,2) − c(k−1,1) − c(k−1,2)] w1^(k−2) w2^2 + … + [c(k,k) − c(k−1,k−1)] w2^k = 0.

By Equations (5-96) through (5-100), this is true. Since k is always either odd or even, this is always true. This concludes the proof for T11[k]. For all other terms, the inductive proof is extremely similar.
Note that Σ_{j=0..k} c(k,j) a^(k−j) b^j = (a + b)^k, thanks to the binomial theorem. Also, note that the even terms are given by (1/2)[(a+b)^k + (a−b)^k], because

    (a+b)^k = c(k,0) a^k + c(k,1) a^(k−1) b + c(k,2) a^(k−2) b^2 + … + c(k,k) b^k,
    (a−b)^k = c(k,0) a^k − c(k,1) a^(k−1) b + c(k,2) a^(k−2) b^2 + … + c(k,k) (−b)^k,

where c(k,k)(−b)^k = c(k,k) b^k when k is even and c(k,k)(−b)^k = −c(k,k) b^k when k is odd. This means that

    (a+b)^k + (a−b)^k = 2 c(k,0) a^k + 2 c(k,2) a^(k−2) b^2 + 2 c(k,4) a^(k−4) b^4 + …,

so half of this yields the sum of the even terms. Likewise the odd terms are given by (1/2)[(a+b)^k − (a−b)^k], because

    (a+b)^k − (a−b)^k = 2 c(k,1) a^(k−1) b + 2 c(k,3) a^(k−3) b^3 + …,

and so half of this yields the sum of the odd terms. Furthermore, since k (and therefore i and j), w1 and w2 are always non-negative, the sum of the first k−1 terms in Equations (5-49) through (5-52) is less than the sum of the first k terms. This leads to

    lim_{k→∞} T11[k] ≤ (1/2) lim_{k→∞} x1[0] [ (w1+w2)^k + (w1−w2)^k ]
                     = (1/2) lim_{k→∞} x1[0] [ 1 + (w1−w2)^k ]
                     = (1/2) x1[0].

Similarly,

    lim_{k→∞} T112[k] ≤ w2 λ12 lim_{k→∞} Σ_{i=1..k−1} (1/2) [ 1 + (w1−w2)^(i−1) ].

The equation for T121[k] is similar to that of T112[k]. Since T11[k] and T12[k] are bounded, the growth of x1[k] is therefore determined by T112[k] and T121[k]. Since both lim_{k→∞} Σ_{i=1..k−1} [1 + (w1−w2)^(i−1)] and lim_{k→∞} Σ_{i=1..k−1} [1 − (w1−w2)^(i−1)] diverge, x1[k] grows without bound as long as w2(λ12 + λ21)/2 > 0. To see that it is, let wE1 and wE2 denote the weights used during encoding, and wD1 and wD2 denote the weights used during decoding. Furthermore, since we have assumed that wD1 > 1/2, it must be true that wE1 < 1/2. Let mij = ⋀_{ξ=1..k}(xi − xj) and Mij = ⋁_{ξ=1..k}(xi − xj); recall that mij = −Mji. Working through Eqs. (5-114) through (5-118) establishes the claim. Therefore, when w1 > 1/2 on decoding, lim_{k→∞} x1[k] = ∞. A similar proof shows that when w1 < 1/2 during decoding, lim_{k→∞} x1[k] = −∞. This means that the magnitude of the pseudo-sine wave grows without bound. So, for the two-dimensional case the only fixed points occur when w1 ∈ {0, 1/2, 1}. Lemma 5.6.1 will show that when w1 = wn = 1/2 the system has a fixed point.

Lemma: when w1 = wn = 1/2, xi[k+1] = xi[k] for all points in the inside case. Proof: In n dimensions, a vector x is in the inside case if and only if xi − λij ≤ xj ≤ xi + λji, for i, j ∈ {1, …, n}. Also, since w1 = 1/2, then wn = 1 − w1 = 1/2 as well. Furthermore,

    −λij = −( (1/2) ⋀_{ξ=1..k}(xi − xj) + (1/2) ⋁_{ξ=1..k}(xi − xj) )
         = −(1/2) ⋀_{ξ=1..k}(xi − xj) − (1/2) ⋁_{ξ=1..k}(xi − xj)
         = (1/2) ⋁_{ξ=1..k}(xj − xi) + (1/2) ⋀_{ξ=1..k}(xj − xi)
         = λji.

This means that xi − λij = xj = xi + λji. Taking into account that λii = 0 for all i, this means that λi1 + x1 = … = λii + xi = … = λin + xn for all i. Since

    xi[k+1] = (1/2) ⋀_{j=1..n}(λij + xj[k]) + (1/2) ⋁_{j=1..n}(λij + xj[k])
            = (1/2) xi[k] + (1/2) (λin + xn[k])
            = (1/2) xi[k] + (1/2) xi[k]
            = xi[k],

the point is fixed. Q.E.D. A 3-dimensional example follows:
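The example can be given as a short numerical sketch (Python; the two stored 3D patterns are hypothetical): with w1 = wn = 1/2 the encoded entries satisfy λij = −λji, the boundary lines collapse, and a point on the collapsed boundary is returned unchanged by the OWA decode.

```python
import numpy as np

def owa_encode(X, w1, wn):
    # OWA encoding: weight w1 on the smallest difference, wn on the largest.
    d = np.sort(X[:, None, :] - X[None, :, :], axis=2)
    return w1 * d[:, :, 0] + wn * d[:, :, -1]

def owa_decode(L, x, w1, wn):
    # OWA decoding: w1 times the min product plus wn times the max product.
    t = L + x[None, :]
    return w1 * np.min(t, axis=1) + wn * np.max(t, axis=1)

# Two hypothetical 3D patterns (columns); with two patterns and w1 = wn = 1/2
# the encoded entries are the average differences, so the collapsed
# boundaries are mutually consistent across all dimension pairs.
X = np.array([[0.0, 4.0],
              [1.0, 3.0],
              [2.0, 7.0]])
L = owa_encode(X, 0.5, 0.5)
assert np.allclose(L, -L.T)                  # lambda_ij = -lambda_ji

# A point in the (collapsed) inside case: x_j = x_1 + lambda_j1 for all j.
x = np.array([5.0, 5.0 + L[1, 0], 5.0 + L[2, 0]])
assert np.allclose(owa_decode(L, x, 0.5, 0.5), x)   # fixed, as in the lemma
```

On the collapsed boundary every term λij + xj equals xi, so the min product and max product agree and their average leaves the point unchanged.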

Lemmas 5.6.1 and 5.6.2 are necessary in order to address the existence of non-fixed-point values for w1. By Lemma 5.6.1, the argument proceeds through Eqs. (5-134) through (5-138), with equality only holding in the event that ⋀_{j=1..n}(xj[k] + λij) = ⋁_{j=1..n}(xj[k] + λij). Q.E.D.

For w1 ∈ [0, 1/2) ∪ (1/2, 1), however, the system does not contain any fixed points. This system is not well behaved in the min-max sense, as described in Chapter 4.

Figure 5-1: Bifurcation diagram for the 2D memory. Only the first dimension of the state space is shown.
Figure 5-2: Canonical behavior in the bifurcation diagram.
Figure 5-3: When w1 = w2 = 0.5 in the bifurcation diagram.
Figure 5-4: Twenty iterations of a max-encoded memory.
Figure 5-5: An example of the incoherent pseudo-sinusoidal pattern.
Figure 5-6: Discontinuous jumps in the bifurcation diagram.
Figure 5-7: Twenty iterations with w1 = 0.1.
Figure 5-8: Bifurcation diagram initialized to x ∈ [100, 200].
Figure 5-9: Bifurcation diagram initialized to x ∈ [50, 100].
Figure 5-10: Fitting a sine wave to the bifurcation diagram.


[1] B. Kosko, "Bidirectional associative memories," IEEE Transactions on Systems, Man and Cybernetics, vol. 18, pp. 49-60, January 1988.

[2] K. Haines and R. Hecht-Nielsen, "A BAM with increased information storage capacity," Proc. IEEE ICNN-88, vol. 1, pp. 181-190, 1988.

[3] R. J. McEliece, E. C. Posner, E. R. Rodemich, and S. S. Venkatesh, "The capacity of the Hopfield associative memory," IEEE Transactions on Information Theory, vol. 33(4), pp. 461-482, July 1987.

[4] G. X. Ritter and P. Sussner, "Associative memories based on lattice algebra," in 1997 IEEE International Conference on Computational Cybernetics and Simulation, vol. 4, (Orlando, Florida, USA), pp. 3570-3575, Systems, Man, and Cybernetics, October 1997.

[5] P. Sussner and M. Grana, "Guest editorial: Special issue on morphological neural networks," Journal of Mathematical Imaging and Vision, vol. 19(2), pp. 79-80, September 2003.

[6] G. Ritter and P. Gader, Advances in Imaging and Electron Physics, vol. 144, ch. Fixed Points of Lattice Transforms and Lattice Associative Memories, pp. 165-238. Academic Press, September 2006.

[7] J. Lewis, Analysis of Linear Dynamic Systems. Illinois: Matrix Publishers, Incorporated, 1977.

[8] J. Johnson, Linear Systems Analysis. New York: The Ronald Press Company, 1975.

[9] D. Luenberger, Introduction to Dynamic Systems. New York: John Wiley and Sons, 1979.

[10] J. Stewart, Calculus: Early Transcendentals. New Jersey: Gary W. Ostedt, 1999.

[11] H. Schneider and G. P. Barker, Matrices and Linear Algebra. New York: Dover, 1973.

[12] A. Anderson, "A simple neural network generating an interactive memory," Mathematical Biosciences, vol. 14, pp. 197-220, 1972.

[13] T. Kohonen, "Correlation matrix memories," IEEE Transactions on Computers, vol. C-21(4), pp. 353-359, 1972.

[14] J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities," Proceedings of the National Academy of Sciences of the USA, vol. 79, pp. 2554-2558, April 1982.

[15] D. Valentin, H. Abdi, and B. Edelman, "Connectionist face-off: Different algorithms for different tasks."

[16] S. Haykin, Neural Networks: A Comprehensive Foundation. New Jersey: Prentice Hall, 1999.

[17] B. Kosko, Neural Networks and Fuzzy Systems. New Jersey: Prentice Hall, 1992.

[18] R. K. Nagle, Differential Equations and Boundary Value Problems. New York: Addison-Wesley, 2000.

[19] Y. Wang, J. Cruz, and J. H. Mulligan, "Two coding strategies for bidirectional associative memory," IEEE Transactions on Neural Networks, vol. 1(1), pp. 81-92, March 1990.

[20] G. Mathai and B. Upadhaya, "Performance analysis and application of the bidirectional associative memory to industrial spectral signatures," in 1989 IEEE International Joint Conference on Neural Networks, vol. 1, pp. 33-37, June 1989.

[21] T. Kohonen, Self-Organization and Associative Memory. New York: Springer-Verlag, 1988.

[22] W. G. Wee, "Generalized inverse approach to adaptive multiclass pattern classification," IEEE Transactions on Computers, vol. 17(12), pp. 262-269, December 1968.

[23] W. G. Wee, "Generalized inverse approach to clustering, feature detection, and classification," IEEE Transactions on Computers, vol. 17(3), pp. 1157-1164, May 1968.

[24] F. H. Kishi, Advances in Control Systems: Theory and Applications, ch. On-Line Computer Control Techniques. New York: Academic Press, 1964.

[25] J. Zhang and T. Zhuang, "A novel approach to design weight matrix of Hopfield network," in Proceedings of the 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference, (Shanghai, China), pp. 1556-1558, September 2005.

[26] C. S. Leung and K. F. Cheung, "Householder encoding for discrete bidirectional associative memory," IEEE Proc. IJCNN-91, vol. 1, pp. 237-241, November 1991.

[27] A. Anderson, "Second-order asymmetric BAM design with a maximal basin of attraction," IEEE Transactions on Systems, Man, and Cybernetics, vol. 33, pp. 421-428, 2003.

[28] Z. ou Wang, "A bidirectional associative memory based on optimal linear associative memory," IEEE Transactions on Computers, vol. 45(10), pp. 1171-1179, October 1996.

[29] S. Burris and H. Sankappanavar, A Course in Universal Algebra. New York: Springer-Verlag, 1981.

[30] S. Burris, "The laws of Boole's thought." Web reference pre-print, April 2000.

[31] E. Ljapin, Semigroups (Translations of Mathematical Monographs), vol. 3. Rhode Island: American Mathematical Society, June 1978.

[32] G. X. Ritter, P. Sussner, and J. L. D. de Leon, "Morphological associative memories," IEEE Transactions on Neural Networks, vol. 9(2), pp. 281-293, January 1998.

[33] R. Yager, "On ordered weighted averaging aggregation operators in multi-criteria decision making," IEEE Transactions on Systems, Man and Cybernetics, vol. 18, pp. 183-190, 1988.

[34] R. Yager, "Families of OWA operators," Fuzzy Sets and Systems, vol. 57, pp. 125-148, 1993.

[35] B. S. Ahn, "On the properties of OWA operator weights functions with constant level of orness," IEEE Transactions on Fuzzy Systems, vol. 14, pp. 511-515, 2006.

[36] K. Alligood and T. Sauer, Chaos: An Introduction to Dynamical Systems. New York: Springer-Verlag, 1997.

[37] T. Parker and L. O. Chua, "Chaos: A tutorial for engineers," Proceedings of the IEEE, vol. 75(8), pp. 982-1008, August 1987.

[38] W. de Melo and S. van Strien, One-Dimensional Dynamics. New York: Springer, 1996.

[39] S. H. Strogatz, Nonlinear Dynamics and Chaos. New York: Perseus Books Group, 2001.

[40] D. Ruelle, Chaotic Evolution and Strange Attractors. Cambridge: Cambridge University Press, 1989.

[41] G. S. Duane and J. J. Tribbia, "Synchronized chaos in geophysical fluid dynamics," Phys. Rev. Lett., vol. 86, pp. 4298-4301, May 2001.

[42] D. L. Turcotte, Fractals and Chaos in Geology and Geophysics. New Jersey: Gary W. Ostedt, 1997.

[43] J. R. Quinlan, "Combining instance-based and model-based learning," in Proceedings of the Tenth International Conference on Machine Learning, (Amherst, Massachusetts), pp. 236-243, Morgan Kaufmann, 1993.

[44] A. Asuncion and D. Newman, "UCI Machine Learning Repository," 2007.

John McElroy was born and raised in Florida. He has a Ph.D. and an M.S. in computer engineering and a bachelor's degree in fine arts.