
Development of a Geospatial Data-Sharing Method for Unmanned Vehicles Based on the Joint Architecture for Unmanned Systems (JAUS)

DAITSS ingest report — IEID: E20110320_AAAADM; ingest time: 2011-03-20T20:26:26Z; package: UFE0009320_00001; account: UF; project: UFDC.
54cad1694979aa459db53cb1ebc786a6
517349247dc5cd88db1f7eb38b4a778b3d372516
110346 F20110320_AACBYJ evans_c_Page_043.jp2
1a78afed2334902749a58538c454e8f4
7a0d39d232966814eb3e06ba21d56b37fe924886
124 F20110320_AACAUT evans_c_Page_002.txt
deb045d882c89b61a722166fff552b81
39095df0840f0ef55721d2bc534423fabfe42434
884435 F20110320_AACBXV evans_c_Page_024.jp2
4a444fd4f2a4e88de1a909e26abdc6bf
4bdae2b9818627600fc967aa106ed655eb83d29f
97354 F20110320_AACBBC evans_c_Page_130.jp2
8626ae23640b86e5ec290f954c8c4d9b
ee7c20b352d8bc9b58642e1793196a753957880a
25554 F20110320_AACCDQ evans_c_Page_125thm.jpg
477c75a3b9119927e5cec094a3baaa9e
2ebe888e540b0304dd823dd2ecb3c56e82d34607
21029 F20110320_AACBAO evans_c_Page_066thm.jpg
b5871db03485b4f5a428874617df8ff3
ab971b370895ec0fd5911c0742cede57932ba7ac
15775 F20110320_AACAVI evans_c_Page_013.pro
1deacdb9635df7835fcac069fcdabcb4
5fa40c54c2d4ef1fda5ec734a587db2d898a8313
101171 F20110320_AACBYK evans_c_Page_044.jp2
e5b68e7e6f027e597e01b55e871b0376
0b5423d7ba083e0c2c30a74ebd271027074545fb
69475 F20110320_AACAUU evans_c_Page_126.QC.jpg
9de135d7a95ebf027338d94d3cd3c441
69e150d8404328a315ecee806a48dbc81bfe5f0c
106219 F20110320_AACBXW evans_c_Page_026.jp2
9ba986cb95b6c312b3a50709c3d57b15
b1741f2e89a2bd3ad907fb57cdfc810930c0ae58
20177 F20110320_AACBBD evans_c_Page_114thm.jpg
a425e1edca79199c173a462a85a92c6e
116734ffca801de7532b20d1ad987cbc8b50178c
22409 F20110320_AACCDR evans_c_Page_126thm.jpg
c91f6ce917a0df177db037bf43b4479b
a3462aef172f3df5ece59dada7450919faf6dce5
751381 F20110320_AACBAP evans_c_Page_068.jp2
2d784745fc8b3c66c77d8d4b57c2c1a1
3dd437d520f3ef9dd3c5ea377c2c97135282bb8a
F20110320_AACAVJ evans_c_Page_081.tif
fff833aec8c9bbd16fcbcffe7eddea51
947c230922f672fc62e98f9d1db9a0252b859a53
100809 F20110320_AACBYL evans_c_Page_046.jp2
b35586d8e2a1c42520e25d8ec1a16b63
d1dbeb7dbc6928e2df249986f9700b7af2a6da9d
92129 F20110320_AACAUV evans_c_Page_124.jp2
ef957233037f42a28f305bffc95598aa
e05e7ee3f52c706e28d79b90c44bb626916eb787
106691 F20110320_AACBXX evans_c_Page_027.jp2
cd88963e6a3811eb05e4411fbdb3969b
03ff7c82a7d70023a9ed29ef18e046f6f81b2d70
83577 F20110320_AACBBE evans_c_Page_082.QC.jpg
2fa97ec1f6ec20360f2d4fa494746abe
35cefbc553e6b394cea610d1a83f1cf4491bd8de
22871 F20110320_AACCDS evans_c_Page_127thm.jpg
de8caade4aed2b2404cf0ff3e8e68b5e
5311c411e30dd4406a802c865bd4d90be3889b60
99584 F20110320_AACBZA evans_c_Page_071.jp2
fd1775a1c7e045279b6ec62d43af6fb2
bf8a62ba73ff8cb6b294ee32a7e9dc71c655faa9
F20110320_AACBAQ evans_c_Page_079.tif
bbddbfc9f110978614a95b03642908c6
611495c191a450df891f2fad7854de993c8b402c
238283 F20110320_AACAVK evans_c_Page_019.jpg
110d8283a5baef3cea905c81bab255ca
bda6c0688b726901a2b7e431b746f02acb4c1937
112471 F20110320_AACBYM evans_c_Page_047.jp2
c04632bf294b09acfd1a0ad705821418
e41bdd061caba0646a1e8889a639bfbef985c714
2001 F20110320_AACAUW evans_c_Page_072.txt
511fdc2ac0a4a418e4ba23dcc8e644a9
03e64e824dfa61a93c387d54f41c6b0e7cda347f
107429 F20110320_AACBXY evans_c_Page_029.jp2
01bd2fab94d07b5982f83401f6c95bd1
676901ae0409aa40ed3a63bdd1f0bd4d59eb3b6f
24271 F20110320_AACBBF evans_c_Page_050thm.jpg
cdb26669c7b06ecfba30c0ce9d5b60c7
246cc71e7b8943332c6c21701f0f70e7f72a21d5
37805 F20110320_AACCDT evans_c_Page_129thm.jpg
8280a5908fec4904c3e37ddcb057ee74
3a08da5e218e08d3631c60993a533107090f7cb7
106352 F20110320_AACBZB evans_c_Page_072.jp2
47572836204b71bf8c991ecb847c5846
7bf66f6d5c2cf3e84216e7a8fb155be9ea70f9fe
F20110320_AACBAR evans_c_Page_001.tif
2a2eb581d61ec913862af4d33019cf1b
8c78f2b8b2ec7f1327d67da2b66599b5c5c49adb
1665 F20110320_AACAVL evans_c_Page_041.txt
14284b9be46670b6338aaf6c21675a06
06941bac50ebe0a1abc3d8908c4acf8d5ea47211
94885 F20110320_AACBYN evans_c_Page_049.jp2
9826d6117951cfaf4a60c95a21a7b1c8
ef28b52a465c390b56069434da8e20dc65d98947
22069 F20110320_AACAUX evans_c_Page_130thm.jpg
422a89105cdb277d2a90232c34e044e5
cf20cc493901660f28c8b30def2b4ffcbb904081
1051976 F20110320_AACBXZ evans_c_Page_030.jp2
2a5f7585ec57e583ab28c47226fca6ea
575df7127162f9cac7d8b80a4f586a38f8b6e736
2198 F20110320_AACBBG evans_c_Page_105.txt
66b0dc57893ed80f1804748cc77ba707
38c8743920cd20dd81a02df6b6b33962882fe2a5
150999 F20110320_AACCDU UFE0009320_00001.mets FULL
3066884abd05a78d788ef3aa2b8e850d
749ebf27b0d609ac9e1caed2176c8e1189e858dc
21741 F20110320_AACBZC evans_c_Page_074.jp2
e69a4467e928bfba37a241054b91e280
b82540194573258333222dfd20dd5d025a76c0a3
46100 F20110320_AACBAS evans_c_Page_103.pro
04656b795037119b055899f8df0482a6
149af85eda8f1a43b5c6078cf88e71495156a384
24294 F20110320_AACAVM evans_c_Page_045.pro
1d089b22d1f587d502fc3f1be72d328c
5714b5d7f70b7d505200cf826864d4e0f1d88f95
102344 F20110320_AACBYO evans_c_Page_050.jp2
3bb9dd3ed129abd501d83c222c143b26
eb9b59ebf5964cb8d6ef9eea9aee43c20dc6e338
1366 F20110320_AACAUY evans_c_Page_011.txt
3c0026048485906c5f6144239192f604
fd3a69cda3ca927db1e6e19bd6f7a331ee851ee7
2055 F20110320_AACBBH evans_c_Page_125.txt
044632beff7f6aa09ad2924143a17a6f
28d1f4335169d09dcbc7b0a696f61d2e190eea62
347204 F20110320_AACAWA evans_c_Page_009.jpg
63bb6c16243a5a961c6e5fe9adadd910
e1742f3c972f4d765aab56d1f6f6f805fea340cf
85213 F20110320_AACBZD evans_c_Page_075.jp2
baed8d3af1ddb3a84141fdeba2730621
6679ae37af9f645acb038a2f91675f104ce1aced
100365 F20110320_AACBYP evans_c_Page_051.jp2
38276c5d0a7de52d557c635d2f7dbb56
a66ec5ce2975a37f2565cc89e2246c32f2c58429
394 F20110320_AACBAT evans_c_Page_074.txt
d0e5949e640414e06acb8243a6e46679
b0b9a771b56b522cc578a84489ca5d92eff4b17b
2037 F20110320_AACAVN evans_c_Page_073.txt
94426e2d10abeec80109ec32f0b78eea
aa460744fa69be21196e4fab0fad9b5847983b55
F20110320_AACAUZ evans_c_Page_020.tif
2c4fd4fcf8cbd51c5713a2b8861f09e6
cc43d25a41b416f725a78cf33684d598ce32bc3f
1830 F20110320_AACAWB evans_c_Page_021.txt
07ae0218251bbfac4920d0463efed8cc
61db32fa7ea73de32569358958d2a41bdbfa8ba9
105273 F20110320_AACBZE evans_c_Page_076.jp2
0c156c1e3d57cda52c36a2d780a54ecb
6eac7cad1928c9e78e8acad88657ef7e3877cf27
703316 F20110320_AACBYQ evans_c_Page_053.jp2
b6c454722bb63af1b8f60796d94b1625
c74891c6bb25c4f392bd14d41bc13b73d2cb85b0
F20110320_AACBAU evans_c_Page_117.tif
e9c834e6ff43b085efb99591e312d8dc
e136e3ce45ca982ed2d51aec23e007fea57641a3
1996 F20110320_AACAVO evans_c_Page_107.txt
5adb0cb049aaffc3207b26dac5090fe8
e262a1d5034332c4aee58c8eaf51b6d4269f13e0
102231 F20110320_AACBBI evans_c_Page_067.QC.jpg
dddaf3ed348e95a0bc5e645381485b80
a5924a120994e26aea7bf82e1d757a99fb2248d7
1401 F20110320_AACAWC evans_c_Page_054.txt
a9ad9f3553fd8aaafcb7d65a82edf6d7
8be12baf0be0298995845845df5aaac8203bc98b
107982 F20110320_AACBZF evans_c_Page_079.jp2
a1cbfd406c7c9f9475807048dbaf088b
7df25bd5301baaf8eb8b15da710a9d3c3b19966e
103019 F20110320_AACBYR evans_c_Page_055.jp2
065107a155483c0471670d4e925c5186
4f296797a67ea6327815dbdf5b0cad1a29fb5303
37694 F20110320_AACBAV evans_c_Page_084thm.jpg
80205cb61bd4c302f865eb7c5cba91a6
256d5ed919d4a2fd9d50761f7f44b69e111330de
219 F20110320_AACAVP evans_c_Page_003.txt
927fdea2c32b2ada78dc0718fac369f4
07cd00f7c24766b0717c946768cb2d9043d03ef9
44664 F20110320_AACBBJ evans_c_Page_036thm.jpg
89a9913a922525caf6f6f61ad60cbfed
795c3b68625ca920c0c9b096c898c8b7ba941297
49141 F20110320_AACAWD evans_c_Page_012thm.jpg
4187aad2332b94ad669e943cae8a1443
7c5559bf4c4e1797b0f014ca7111a07aa8857353
113750 F20110320_AACBZG evans_c_Page_080.jp2
078f9c48fc7bc99a5ac1513348814d33
3fac0cc0339680f23e6f5b48fc38ec8ba28c4f54
81482 F20110320_AACBYS evans_c_Page_058.jp2
0f049478e7a21ecbd417a4291338e66c
45305fcf86a0971a65238b494a5c2422037b0f6b
F20110320_AACBAW evans_c_Page_127.tif
a7598d0e6fa70acfd3a012e3f2b0f40c
12389784d0eb49bd18d9397f5d8d1ca2dfea4a02
34542 F20110320_AACAVQ evans_c_Page_011.pro
9251722b147e8c837646dff02d13453b
e4949e5f64fa045bcea92bfb866a0c425e1bde46
26213 F20110320_AACBBK evans_c_Page_040.jp2
42347eb3344c61dfde193b5f7373bc9f
6c62d8c422bb361d75a060000fe0e96b3b018c3b
48557 F20110320_AACAWE evans_c_Page_037thm.jpg
cbad1e6872850415acdae6ba5feef017
bc3b3967f87133992392d1296ea6694ecc5cef8a
115649 F20110320_AACBZH evans_c_Page_081.jp2
b4e572a9ce28a114fa82dd62bd12ef4b
c1c3cf58c14bc5504efd91e9307920cd9e00f498
91040 F20110320_AACBYT evans_c_Page_060.jp2
c7d37ca5b06f26ca5165aceaf71a50f9
63e859e22d03f91deb31f13634de32c7ad7cff8c
F20110320_AACAVR evans_c_Page_087.tif
13f5a992821c05bb8aa35988442baebf
8250af44154dc84c88ce10a232ca395f41854efc
104966 F20110320_AACBCA evans_c_Page_056.jp2
cc9ae3e9645d87f406144d639473c1a1
4000bf19b0b056b5a368a93250cffe63455f12aa
21313 F20110320_AACBBL evans_c_Page_057thm.jpg
d19eca1cf0c8e78e418a4216d9250756
66b058bee29d1c0f4e0c2248b9d8b6911654d729
43786 F20110320_AACAWF evans_c_Page_130.pro
f223282e6aee4f73e8b654083d76e387
21bdf7536932126b38618a9b94079f53a8459036
46219 F20110320_AACBAX evans_c_Page_065thm.jpg
732aa894e67ce19e7cb2447da8ecb7fa
6e3443aba2ca20e745c8a95e9c3f4cc76b5a2af2
718014 F20110320_AACBZI evans_c_Page_082.jp2
1d67a91b56694ebb2d6fbfed3f84c685
d6ebe38776dbed4329b51d4631529bb388935988
99953 F20110320_AACBYU evans_c_Page_061.jp2
b88c27980a56af5436d05df15ec45ca2
198a1e45ed1b60a755ca70cee973283c6fb59971
202774 F20110320_AACAVS evans_c_Page_026.jpg
ff2ec7a212083e4117bfc77360767120
91dcbff8b47122660b2b7b585bae3a69cc156650
180490 F20110320_AACBCB evans_c_Page_065.jpg
b77470b6d46a38d92217dd0da4d6d9fb
ac4e43680a7b13189f51ddfc1aa1d261ad579406
61580 F20110320_AACBBM evans_c_Page_110.QC.jpg
3994a17d8304067edff5328648c36039
256342bd2241c71bf4f6d895fda46f3d8b4443c0
51826 F20110320_AACAWG evans_c_Page_128thm.jpg
c6c4cb50164d131624ec742dee625fda
a61e0bca37ad6183bcc53a08db3025ff29389fbb
68788 F20110320_AACBAY evans_c_Page_089.QC.jpg
ae6bb5dec9aba0d02d1b08cd492d7f84
298829f1f6d61ff13464fe1da6b4a586d5070ab4
413394 F20110320_AACBZJ evans_c_Page_084.jp2
c513487576f19c8ef16e0aad344842e7
75c23754e3c3234ad12ca528980f3c8b64581e0c
103748 F20110320_AACBYV evans_c_Page_064.jp2
ac40d73ff97d29de40abdba21004f33a
40f3da45c034aa6833080489bad2697b726a8573
25271604 F20110320_AACAVT evans_c_Page_068.tif
86d0580691b1073b7df9c272ddcdb331
2e30aa77560c5f1f42f85f5cf57ba52bf6e885d5
23792 F20110320_AACBCC evans_c_Page_107thm.jpg
ebd747c4808190d3f472e0f77e9c1182
01cd592c541827fa1286e196a3137f9b43af917d
110748 F20110320_AACBBN evans_c_Page_010.QC.jpg
abcf8fe561dcec42f84a52faa9b732a0
75c3513c28253ebcea7599237309ced64e0ad756
181472 F20110320_AACAWH evans_c_Page_060.jpg
2ca3c6dd9bece8d233b8b6dde365b38b
dc715955a8c70b0dea06b7a3dee8e70d24ebfedd
44449 F20110320_AACBAZ evans_c_Page_051.pro
6d5c8c21ba965086bc4f232a72e2e761
99995675453cd74110224397366b9d4b058d43b7
103546 F20110320_AACBZK evans_c_Page_086.jp2
4108a30b7bb192e1afb1518c7b664288
21508a0910cd06bbc05f79961d7b298ec5ab3f47
787187 F20110320_AACBYW evans_c_Page_065.jp2
1ca3e324905da0f1e9aaf0ea4d1c9253
174c2176a4c1ae6ca427df64362e483ce60433cd
91439 F20110320_AACAVU evans_c_Page_057.jp2
a13a4be36d5f211a2ef69c735305fb3f
319543fc07e60446d85ad6bc033af514b5b22ac6
F20110320_AACBCD evans_c_Page_026.tif
17fa2218d86c3d0ab51f8bf182e56446
52f0edad8abbcd4e3fc955498ec570ad0b677599
F20110320_AACBBO evans_c_Page_076.tif
2cda0c432c3bb6355d606c4fc2267fb1
0cf21e00f895bb6b462537b3507f3d775b369cb9
22680 F20110320_AACAWI evans_c_Page_061thm.jpg
85a122dd7a9d7380595840a0cb3c5d0d
afbce044005fc8296fd133c002743f4cb8626ab1
107601 F20110320_AACBZL evans_c_Page_087.jp2
0b9be4a40257610dc20aaa2d1c53c503
b57415473d27c483ab962cdc8a28d8fbb8470f61
90883 F20110320_AACBYX evans_c_Page_066.jp2
3c498f24900afa6a213ae8100f72a4f3
f4656a106f8978f894d63d64c4f28cf2d481e54e
81695 F20110320_AACAVV evans_c_Page_081.QC.jpg
2cf0df62abd6e97a73403995778d46ef
24a5f391f2b02e6421c1eee8d2939cca7f644138
1051902 F20110320_AACBCE evans_c_Page_008.jp2
3b1f42ef7d49da2bdcc29d8fd2095071
93324005b0ae050585066fd53326557519100cb6
F20110320_AACBBP evans_c_Page_024.tif
fdc139b6772c5f25c7aeeb101382f3aa
801b7774e184b9dd6c44caca90dc983cb88d5713
F20110320_AACAWJ evans_c_Page_102.tif
6886a6c380cbea54ef3a0cd1adaeb5fe
8369a7824e739c0fcbf12861346a5d74b9239186
76609 F20110320_AACBZM evans_c_Page_088.jp2
0928c9ef02069a4ab06e1c60cc4e9118
d3e6f7b5f0e7f544e462e622ebd5eb91f9f582f5
1051963 F20110320_AACBYY evans_c_Page_067.jp2
73eab1f46b7a671f3d845344b5e85729
7fe2e6dc27693c70041e1c3c633e8f71f2f675d0
400 F20110320_AACAVW evans_c_Page_004.txt
9455b5a9e7f61b52b124dc3fd84130d4
2be32d3f3708766fe65acf9166d4b3d346f45966
94908 F20110320_AACBCF evans_c_Page_092.jp2
034098a0eab11a5733d1fb1b0851c73d
4a5b845db6c8e8f8c1ace6f746ac64a31885bc23
173762 F20110320_AACBBQ evans_c_Page_066.jpg
3dcd68d7abfce276a647bee8dafec1d1
51dceea52dd758a603ad67f8536e9dfab4598cd3
95345 F20110320_AACAWK evans_c_Page_070.jp2
027edede6fb7afb7e07c04f8d5e9022e
0d4eeba3a7b70abe9d20cc937899076d989aa3cd
97065 F20110320_AACBZN evans_c_Page_089.jp2
47721330fc2bb2e1c169551564a80029
e69a119a799a4496e1922f520ac23b877555cdaf
730263 F20110320_AACBYZ evans_c_Page_069.jp2
ddc0b7cd6a06ad7f5432bad9f6f7f8ab
87f2323bea3917ad383624448a87dd9b8362c924
41829 F20110320_AACAVX evans_c_Page_016.pro
300bb8ff46869e7c486b6ac4b3605866
cc536802d7b79ab0e1d1fa4c2f9a63945b9d4ef1
209837 F20110320_AACBCG evans_c_Page_023.jpg
38ece454b0f98668e9d1a4ff96221bb1
b56460adcb73205252e427cbc60dee9fa65c5e55
50071 F20110320_AACBBR evans_c_Page_062thm.jpg
d7394053bbcad215f7c5362e0e607f5c
a1ce6e1975f3c5da34ad4efd168eb1ecb1ce11c5
107854 F20110320_AACAWL evans_c_Page_059.jp2
22edfde2f0aeb3d1342e27668a2e768e
c9f4871e341d9e58f59915b839f258f4c3d59527
94497 F20110320_AACBZO evans_c_Page_090.jp2
362dd26619acceb829de399a808c4edd
3a0c65e7a3b5c85546f8e673f77e659eab10386e
39265 F20110320_AACAVY evans_c_Page_118.pro
ecdbbbc57402584287124bd4170e092f
1ee50e4d54a2984a1e7a7d37bf89a70fccbe25b1
54177 F20110320_AACBCH evans_c_Page_008thm.jpg
b962aa09a1a4b94d2cbdf61d5275e0a8
5f97d1adb9dd8ce977cc8546f2c3a866aba34163
41082 F20110320_AACAXA evans_c_Page_004.jpg
d797cc1a04c2175d5d7c50e68de84340
45dc8bd18a614418579e0f5797ef02ba1d6f6554
F20110320_AACBBS evans_c_Page_056.tif
63ead1cbf5ab86df6185de7bc1956ae5
5e81ddff76b828d068390ce132094e37216f4ed5
97799 F20110320_AACAWM evans_c_Page_083.jp2
af3550984005b2fc3acf5a792ef6ab63
19cfdcf7d2cbd74a6d58cde48005d7b5079e761d
57109 F20110320_AACBZP evans_c_Page_091.jp2
67cd29517ff61f61f6c68c0a813ee912
3af1248d808970213fc811817cc1fe6fc4829586
F20110320_AACAVZ evans_c_Page_115.tif
159230e986da204776f2d0412d650a13
5acbb1ce390cde36f54308200736a3023d8e427c
F20110320_AACBCI evans_c_Page_051.tif
a137f5ef40fe74cb8342c64f25a46445
95eb091b9961a893e7f036df4a8ad229636aa5d5
F20110320_AACAXB evans_c_Page_043.tif
c05fae144735eba853a42fba7aec5f9b
913921e5c27204e6d90cdb00f74c8eb1c399b160
45852 F20110320_AACBBT evans_c_Page_085.pro
b51781b9d35868ae10f8f9a52a93671c
6e59b3d365038491174648b6eb2d297093a793ab
20452 F20110320_AACAWN evans_c_Page_111thm.jpg
7b1f5a16e6625f44e77b7de6bb1da685
7dce3a07b9cf48903b5c1b97bb3445e7c7ca6fd8
87525 F20110320_AACBZQ evans_c_Page_094.jp2
2d4c8e3ecc2bf9f78f4cf871b54179c1
d8c319379cd42d5f48dac8e1013d08bb1b8e9f48
2090 F20110320_AACAXC evans_c_Page_047.txt
61c3a99a16664d2292c7c90494747366
3aec75f92c45b89666b180c023c19c49e6ea2737
F20110320_AACBBU evans_c_Page_008.tif
441281ece61f334fb36c0ce5b253a6bb
939f12512a0fae76f285d74b8abcb9c0add8635c
23258 F20110320_AACAWO evans_c_Page_106thm.jpg
4aeb27d55ef457510c4422d9182aa7fb
69198437caea18eac3384ab1c906deec7071b02b
81918 F20110320_AACBZR evans_c_Page_096.jp2
1f9fa24ab518e21714bd8204730eaddb
03e078d2a7b15d8b0aa34647a3ccf8adbf65a317
82455 F20110320_AACBCJ evans_c_Page_080.QC.jpg
48aa325916d00ca0697f21a68c020394
7b58c29c3e6795491968a57ed7114bad9c59c673
76061 F20110320_AACAXD evans_c_Page_078.QC.jpg
a7f798c671b0742a9a0d785cc814fb41
181a887656b47e78f43d154712fec303bcdeb3ca
628 F20110320_AACBBV evans_c_Page_013.txt
7bce6c6ff94ea6d450ebf1f57cf2c100
34b88402b2179647380ef7248221bf29d3ad644a
32814 F20110320_AACAWP evans_c_Page_102.pro
45a86bc2a1ef540a537293f65b33678f
335cb6b7b23cabe6e82b222ede4d9288c6916aae
104537 F20110320_AACBZS evans_c_Page_098.jp2
386eb8dfc3e00c3cea0d0a719e71306d
1af8554c369b2e9c875eafb9295648516cfd9ae6
65865 F20110320_AACBCK evans_c_Page_097.QC.jpg
ad08247fb55da747fd16014afd12a031
068515fb2bef097c463d8e877aa9b1e15ee57d1f
224140 F20110320_AACAXE evans_c_Page_081.jpg
e5e5dc292b8b47002de6ba5684c5d217
019e7396167bb57666590830df070742e11ce4e0
F20110320_AACBBW evans_c_Page_112.tif
8330bbe68bc3955706c9934706d22c28
83b51a97879cfbd35a94e067b75367ad4da105e0
192771 F20110320_AACAWQ evans_c_Page_006.jpg
9d24c1f6c83a65a468a30ebd7afb4214
3e920997aa0d365f8b5b5c3f0ed6215a80c6243a
111440 F20110320_AACBZT evans_c_Page_099.jp2
d3d438acfa74ea1da5e239d7099d30bd
55fd588a01c16d647fc0b784b7ceda87d1c805cc
94077 F20110320_AACBCL evans_c_Page_103.jp2
b28d8007b0bf6ad15cf47385172094dc
70e5cf0bf70383307c8bdbb585f8860f60b08a23
25424 F20110320_AACAXF evans_c_Page_043thm.jpg
c9c3f41f2850178e7fa9d9b66b61190b
1bb3c5378f848481bfc0ba51f795a6463dad0cde
24398 F20110320_AACBBX evans_c_Page_079thm.jpg
f970dd62224e7bee75f2a6dccde1cac7
bae37d264d7eac34193d6cfbf64cf59498b813aa
23974 F20110320_AACAWR evans_c_Page_034thm.jpg
b7cc0fea8089f7711ee587a48463377e
c5ab497bfffa417b40d43d1408d8c8d040c393b9
83680 F20110320_AACBDA evans_c_Page_054.QC.jpg
a4bf61f3327bef81dc96a28b79b05345
7802c424708728dbb2aa2c46030c62e3167ee38c
100096 F20110320_AACBZU evans_c_Page_100.jp2
44749b4f26cf9a11650c3162891bbfda
7352d18fafd79078ce5979c0f15fae1b0c05809a
958 F20110320_AACBCM evans_c_Page_045.txt
a6fb0dd6c350904146dbba9ef39e2c1a
d21bb39e9dcc1060590d9059c1e58f2139ecb9e7
1544 F20110320_AACAXG evans_c_Page_031.txt
4df93d501e8c0077651481d6aa9126c3
518e0e8cdf8c2558c5010c57fe00c0ac4d915e68
1880 F20110320_AACBBY evans_c_Page_039.txt
981e5a9affd3384a02c3e9d9bdb8543d
4d3c85118cf3ade9d475704a5111db3751bcc03c
196021 F20110320_AACAWS evans_c_Page_025.jpg
6deded98cd6d54ebf96df6225da87f7f
c46890e620e201b855165b840eed38acb9078f76
183142 F20110320_AACBDB evans_c_Page_070.jpg
d1bdfbd7143042e8a49dad7f3dd62e20
e3fdba60e6718114e114cbaa172d8b7761eaf707
74543 F20110320_AACBZV evans_c_Page_101.jp2
9f18aef4d61b6f515b444c3e9661ac97
97d4a6850375a9491b1af71f369b91f9f5931978
1942 F20110320_AACBCN evans_c_Page_056.txt
e5a0c887e614c683f2f6fedf822a453b
f764fb71324c606305aaebc6be63b746289e75d1
1919 F20110320_AACAXH evans_c_Page_070.txt
58b78853ed998e4c9db712456cb4bb46
551f42204a826898b2a7a47e9c1677950bb8d0a0
1051904 F20110320_AACBBZ evans_c_Page_062.jp2
81fed2e273e8c09ce5d4006901e2014c
2de910633d22b184dc82cc85fb7a0bc289ee58fa
F20110320_AACAWT evans_c_Page_036.tif
81e7fd55a01f29284683b67ec4aee62a
dc4a5c7e9b4e219128434a78161ddd25d7bd792e
742 F20110320_AACBDC evans_c_Page_084.txt
964a70f3f660dc40a097d4bc715d67b9
e04d77e7a979f049cde9125f048b36ed53f5eed5
71970 F20110320_AACBZW evans_c_Page_102.jp2
31eb94b5bbf4a7f58aed69c35feb0f5f
6f947388ef043fec790b435c264a6c00286cba95
23791 F20110320_AACBCO evans_c_Page_099thm.jpg
68a196c2e89695feb3f0694d841bfe64
887ffcb9706b4d04302ca10cd0c61f7f38e3f5b5
20338 F20110320_AACAXI evans_c_Page_067.pro
74c76f20bb47c584e60b23b43c8bef60
78d0fe72c5159028ac75c7ec8f70eac9ce92d34f
6670 F20110320_AACAWU evans_c_Page_074thm.jpg
20a71b1e88e6dc9e686868d2b9904e1e
3653bfed66d659446781b13d38c3f330d49812f0
659 F20110320_AACBDD evans_c_Page_123.txt
e7cbd4bf265a9c27a907b099987df712
4939e469c8c56ae40eacb167a5c07d659be96fb2
100325 F20110320_AACBZX evans_c_Page_105.jp2
5670e299a8a56fca733a82554db3e293
1596588e0cb5cba8b3269c951f873dfe98677504
68311 F20110320_AACBCP evans_c_Page_093.jp2
2f05400504dd08936b0ed018166adecc
208eb4d1b0c7fdc977bdafcb2d8b49af347962b5
22464 F20110320_AACAXJ evans_c_Page_063.pro
0aaec487b90d25b5a35bcaee34831c58
7c2835d2d0fb36c55c96df9db6a58b7672a210a5
731655 F20110320_AACAWV evans_c_Page_045.jp2
ce58197b5c86a7f35337ad78e62b11ef
d32ff7f2c6175e6fdc2ecac9a95e2ff958976129
114124 F20110320_AACBDE evans_c_Page_117.jp2
e9186d91bb78cf254d5f11a54802d087
2f51151d517e72fa31ebd5c6d133b45891db36c0
101948 F20110320_AACBZY evans_c_Page_106.jp2
8c769811b02748e07a638fbb2519df5e
5505fcab2b1807fb4866add864a7fad6eb7aadec
185476 F20110320_AACBCQ evans_c_Page_052.jpg
ef5903520cc840075478760643640c84
f45a11fd880171ce51bca5b04a764bc35dea58ba
47488 F20110320_AACAXK evans_c_Page_026.pro
37586b9297c6a121b7323690c484294e
1e1ab860e9a8324e7d42246cbe3898ed43398b81
171212 F20110320_AACAWW evans_c_Page_069.jpg
3128c6187fb6c1b1b00646395c00e791
4a33ed00030dfc38e1a91d9d2e6efa04e5842d1d
49998 F20110320_AACBDF evans_c_Page_027.pro
5d342de6191a2b0ef2374f0a2bd2b89d
29c19bfc3b2f991f3e26bfc5c84528e2815e343f
110151 F20110320_AACBZZ evans_c_Page_107.jp2
8b9405e7bc71e5550b9135880d59c051
353f6f6f0c007bfb7ba0caed332beb3653f1b4de
48731 F20110320_AACBCR evans_c_Page_013.QC.jpg
cf654a995684ce390af8023f2f8f3711
8f0946b51b235cad1da0ee4b3a8d3c48c0f8fa83
60570 F20110320_AACAXL evans_c_Page_018.pro
804220c75c604d96cb7e11c06297828c
d3be9d2f4c6f211fba53e0d36a865d978ad9034b
21707 F20110320_AACAWX evans_c_Page_058thm.jpg
9f54b6a4c2a8dfe25b42bbee0edbc576
7f70014cfa7026f297ab945d2bf9478d97853648
F20110320_AACBDG evans_c_Page_124.tif
414e3b7617efd02cac9354bea217a3a3
ebdea535737794fea9423a982bd518e4c6fc29b7
1742 F20110320_AACAYA evans_c_Page_016.txt
2116d06f388c8e9b53a830d64957abcb
d99fca85caeaf35c4c3bcd5b4763916cff3ca7bd
19276 F20110320_AACBCS evans_c_Page_075thm.jpg
8a9b2b4b3e5888211f36d99ff0ddede0
ca7cf8433f5507062b097df1d6ac837d3ee76046
101037 F20110320_AACAXM evans_c_Page_025.jp2
ae93f1b150afdc4706bc3fb25361a454
270bd3e9f4fbc8f0efd77a3cf74dd2c32af93604
151694 F20110320_AACAWY evans_c_Page_129.jpg
c97d5fc9ce5e4d5f6f9ef3d6bcfb9950
b494a4b04ca32d26ed13e35360f7805d523f611c
63600 F20110320_AACBDH evans_c_Page_066.QC.jpg
8c1efc1e2cfa642e498742c76ca2cefc
4874b17d28a77f673621e2ed7559a6c85dc306ec
F20110320_AACAYB evans_c_Page_073.tif
fc729838f0b74524a3933319e6724622
c966544dca815bd0ce611a9fb0f8f66e308bfe3b
F20110320_AACBCT evans_c_Page_085.tif
ea4ed38f7b07e41a655bd80abc5a1627
d47f024df045443b9382ff819e65b43b3522dcd9
48210 F20110320_AACAXN evans_c_Page_076.pro
93c73a7be0c00153b13eb685c442ee65
0f547327b83ff5871b7f1f3307b9b052737fed1b
22192 F20110320_AACAWZ evans_c_Page_083thm.jpg
48cda82964d13966612d15e0bcf17bf0
192239a560925df8a62b4b0559e317a08476f897
133505 F20110320_AACBDI evans_c_Page_008.QC.jpg
0f7ed569f7a5457cddba1ea239068856
c5007a3178ee5efcefe0c88fbaf19807f0d80853
31460 F20110320_AACAYC evans_c_Page_109.pro
a41646022aa8e2e070f6909ea666e2ce
f71d657594d5ef01223c6562cc7fa873e1f4b2de
51515 F20110320_AACBCU evans_c_Page_043.pro
c3b0734562ef618fac2ab185e9eb11ac
14eda8cab2df29e1e1631c6c2c3eada99d6e2c09
98431 F20110320_AACAXO evans_c_Page_021.jp2
93b10e00d354576708edf34107545525
69b85ba7066429e7333909fca7c87083733b61a6
182879 F20110320_AACBDJ evans_c_Page_014.jpg
e0a66d784b95deb108144ca478731a56
fd8619f0ed298105fb2e94b48ad59d528ae12aa4
F20110320_AACAYD evans_c_Page_131.tif
f2ab60e97ead1dbf576c4a0d474822b9
46e4096f70bf1d7e3769fdf633df4055e05ec713
F20110320_AACBCV evans_c_Page_070.tif
945626614e61a54ed3c262405c879c49
4fe59e93954a9da34d30a035ff6edf61e3231417
197933 F20110320_AACAXP evans_c_Page_105.jpg
79ed79c3282e474daf0a4313546f2f01
e7647c360e24860a1a7b3515e3657fe3d36944d2
F20110320_AACAYE evans_c_Page_082.tif
634f9b8ab43b46b7bd47fb0a1bc3de0e
06802f6b88b9fcdab9a0aba7d16dd37d5dfa62f2
101867 F20110320_AACBCW evans_c_Page_095.jp2
48d7a86f581868e32e83a5c8b608619d
db85271dc1169dd094cf5cc4b8c7bd38d6792403
101934 F20110320_AACAXQ evans_c_Page_085.jp2
b2f6e80049ded93e4310946c81dbcaad
4970c9eaf177f7a15ec7e0f350bf3de1421392b8
105439 F20110320_AACBDK evans_c_Page_012.QC.jpg
e5e3586c214d45b917054b660f84ed6f
7004afe045a3cb77193146aea6c1f46157ca6463
108236 F20110320_AACAYF evans_c_Page_073.jp2
bc373ad3f91f090ee5c3d2cddd3fc6e9
10800739cee29c2932aaad83f1d5215ebe4b65f0
9680 F20110320_AACBCX evans_c_Page_015.pro
8a1b00665a9ffcac8a7a05ef61b43bae
4a4678a6bb7301f44d07f5b3d4a925f1ca6fd17e
75289 F20110320_AACAXR evans_c_Page_076.QC.jpg
76ca187f10cbd9fff08565af4f4b838f
08c1e553df896fde76cfcf0c4ae9335b1b77db8a
81372 F20110320_AACBEA evans_c_Page_009.pro
383b923efd489a9abce6f5b9c75bedb8
4a2d5c37f5284093e4febbffc4addeade3c808ae
173768 F20110320_AACBDL evans_c_Page_096.jpg
fe02061ea696f32a4d3c3ad1344c0ab3
a659f1055a184403939b0f4d4b529956bf400327
25642 F20110320_AACAYG evans_c_Page_020.QC.jpg
385667370db7352bf550f44c74e42124
64c71a325e8e82ab151237a836b28fe91f90876d
157747 F20110320_AACBCY evans_c_Page_102.jpg
438fac720936597c0c6bbe4c7e3073c7
2b7507e439956726a11e3875b26985a6513db6e9
92433 F20110320_AACAXS evans_c_Page_097.jp2
38fcb6304827e543dc21531c24de0157
cf3578284a67763cf73c40bbc6b6982d1ca70093
104422 F20110320_AACBEB evans_c_Page_078.jp2
d8a0c3b302466c62901c86c153a160aa
826550a775f191fd5774206c0f225c07ab33ca07
712404 F20110320_AACBDM evans_c_Page_063.jp2
f41ca2c0cc99182d879e60ae4d52cea1
e95cb048dab7bee624af4bfafbada4df1abff8ea
24384 F20110320_AACAYH evans_c_Page_073thm.jpg
0849aafeba6c988d3490433e0728ac6a
286aaac1ec72947a8c39da333a80d23e4414c299
F20110320_AACBCZ evans_c_Page_072.tif
8beb527208b1c885c00ad0ae4ab1587e
b8ed1ddaad13e110dc1c5e2a3b03b4170f6f7bc0
148719 F20110320_AACAXT evans_c_Page_093.jpg
e2ef7836bfa2eef4298893ecd4a6affc
06323b0ccf72826835489ae6935d6acde5426ae0
39544 F20110320_AACBEC evans_c_Page_014.pro
990caf9438594e0dfd55d13b29c1cb09
2f88e21c8d8f20b27ee710fa62eed445c5bda50f
53095 F20110320_AACBDN evans_c_Page_047.pro
89104515c6b7be981d9b72f512631a7c
66d59eb4a00600e1721265fecffc6556e32977cf
82838 F20110320_AACAYI evans_c_Page_047.QC.jpg
f798e2581ea8ab06dd300f84eda02687
d00ed30af63e15c963cb132b04ad8a8f012c406b
2080 F20110320_AACAXU evans_c_Page_019.txt
6323fe88341cf6d8bb1f2f419ae5fb96
720e6b34db503c4490b1bf10f6ece16f380599d4
2431 F20110320_AACBED evans_c_Page_108.txt
2f897c5e6b2eb2d906ee95e27d2236d8
dd3cabd79bb7ea681d177d4fad45b269d8d47fd6
27030 F20110320_AACBDO evans_c_Page_091.pro
62efa7504ec90886d8c0c45300b1247e
c26a0b9a1d60c6d0de30e9fcb94b4e16f0c87a84
13381 F20110320_AACAYJ evans_c_Page_004.QC.jpg
c3b75f5f4f1d1f3f81fcb93c88efecb8
84f83a2f22f97edac4c1f28e915f129ad8549ffd
F20110320_AACAXV evans_c_Page_074.tif
783617dc5d5620d3055e6c51757f54cd
79c355155562e13a4c28a30ab5e0a0c0a835cac5
66712 F20110320_AACBEE evans_c_Page_016.QC.jpg
36eebbc3aa741c49571d6405f401c6a4
f4550d465bb5965a45a949c075d9c127c17a5525
139711 F20110320_AACBDP evans_c_Page_119.jpg
ff7bd07bf51343e637132705007a2d53
1f39d30944faf9a1e6c34be9667704df9f1572af
2429 F20110320_AACAYK evans_c_Page_093.txt
f30a726284a71c4391f60e36196cfdbb
fd3da17a47eaa6742e6ca3263e6887c9a6c4e731
52401 F20110320_AACAXW evans_c_Page_080.pro
750f31cbac5d280a83a1064921364d99
d546e1ff32175d90b483ed81730ec5db6e0ee256
43708 F20110320_AACBEF evans_c_Page_115.pro
00b963e98392373d8e1256f065a0431b
cd1aa32b3f6b83fe8f4c424d85f22be9f89616ec
235714 F20110320_AACBDQ evans_c_Page_042.jpg
6d45f6cfb089a95221eaeb64bc2e93af
782217af0b682ee120d151e8abff7ffa3ca3a140
212259 F20110320_AACAYL evans_c_Page_028.jpg
3382a46d917977d73feba904eafc9970
c378e726445a1f9c178b507a817235edcf025e4d
101004 F20110320_AACAXX evans_c_Page_122.jp2
cba9b349036f04a6b531908759f9f8bb
b9e678c25d139f4810ff5bd6f4154c148aaf173f
45291 F20110320_AACBEG evans_c_Page_052.pro
70019512d94c0d8a721a2e763fd869de
fab2c188439a1001fc3dfe681f93c7f33a31efbb
97778 F20110320_AACAZA evans_c_Page_052.jp2
1cd119d203765946920a2c9b46412414
bf55a29739b38fbdecb57b4220e4a9df46a1fc08
79070 F20110320_AACBDR evans_c_Page_059.QC.jpg
4bf99c39df6a7959dcf0d1ccc770c7a0
e296194a824c72d3d73ae5a2da848ff126a741b3
43234 F20110320_AACAYM evans_c_Page_092.pro
99e415ae2cefc1ee3b7264566e465036
42522b2aca634cf51739e4af2c80e65fd18b346b
45276 F20110320_AACAXY evans_c_Page_106.pro
9739dca93d1032c6099732f493775ccd
34480bf5306339db909354e62ae995439537fadc
73986 F20110320_AACBEH evans_c_Page_051.QC.jpg
1517bfef30144fd9a39231b9b29acedb
6a79ed3e358e71051d6d8972a52b487f8a018125
43460 F20110320_AACBDS evans_c_Page_053thm.jpg
9592376b8071e00108e95b9c1e2ad6cd
052d4e043a6cc9a705a9365341cebdd8c09940c7
7448 F20110320_AACAYN evans_c_Page_001thm.jpg
c5af8357be82770d3758ce2d88349251
b2b2ffbff245e1c1b743a3fae67262602012b99c
F20110320_AACAXZ evans_c_Page_018.tif
928da8ff8271118458b536313fea60fa
7155c825a6b12274b7a3d765a7b488ea5aedda87
F20110320_AACBEI evans_c_Page_080.tif
1d98b4a3fb0423392aa70a7db90ef2f8
015cde32284fc9af95622fca58cc5f7518f31cd8
46651 F20110320_AACAZB evans_c_Page_078.pro
6803090b4c0839765aaaf4f96e6407a0
0c11b985fc40f93830a73c16e01fe85ff0f9bb49
45969 F20110320_AACBOP evans_c_Page_025.pro
22517451973abaf26a8f759eee12d195
d1414ff378aa956a834f174394c28d54ddd5c199
45056 F20110320_AACBPE evans_c_Page_049.pro
3a40ae6b5c539ead18e45d073d2639ac
6893a223dc9f437605c7651cd0f9c0d268c0c1e7
49710 F20110320_AACBOQ evans_c_Page_029.pro
cb4906401e4a2a4c350da70cef47d0ce
093b72d0c70df3149f5446ab6e2db4071a64903f
27201 F20110320_AACBPF evans_c_Page_053.pro
1e34803f929e909a1ff2ab2e0a7624ce
9bed70550d59ccb6a4676989e194aefc59b1490e
30555 F20110320_AACBOR evans_c_Page_030.pro
af5ca4e90b590b3bb8710ff42a877759
fad2dc79757b2c4da19c82c085659835b3caa9a0
28910 F20110320_AACBPG evans_c_Page_054.pro
e3127540cadad4063e1e747795d382c0
f5cef49aaa14ebc552b22ad81ac3ce2672edc7a0
37507 F20110320_AACBOS evans_c_Page_031.pro
437ec0ed5e8349e5e80bc00e7faf9cff
7019eb179273b84e67282948bb42cf22fb2cc840
49739 F20110320_AACBPH evans_c_Page_055.pro
7a051bdb158c7ea1cf7aabfa4da16e3b
df9e63686df0d62cc8275ed5494fc80a35f689e6
41158 F20110320_AACBOT evans_c_Page_032.pro
77caeda46e0dd8b8f29f2fac407ecc7c
796c1e8d1d3e6fa1da5ae964d5072dfe38ad1208
43897 F20110320_AACBPI evans_c_Page_057.pro
f84f45e8177d0fad16649b15e65ca787
2cf491800d1a1da33fb52708aa305d434aa7ea12
52347 F20110320_AACBOU evans_c_Page_033.pro
949aa1d7b915ff78ba363d10d1005083
2169a5d50665e2ee2b7152d164414938907cdfcd
40740 F20110320_AACBPJ evans_c_Page_060.pro
175ffb0df01fb92d911cf946fdd45d2b
ef97bc58236cc6a21eefd5ba0bf634aa9adbfd62
46742 F20110320_AACBPK evans_c_Page_061.pro
bbefeff4b631bdbdfe8b3ec5c6845c97
55739d6974cd501dc416cf58c31dc4a4ad5886fb
49019 F20110320_AACBOV evans_c_Page_034.pro
83f2d71dafd8cd7662e14653059adc22
0e3335f3dac0867a1caf36722ab3f80761c7f6b4
24521 F20110320_AACBPL evans_c_Page_062.pro
587761c97e942a73979d7733d927e763
e6654cd55537eeabb1e3d5215f46ec13c4c6b1d7
47683 F20110320_AACBOW evans_c_Page_035.pro
6f8bee4e0ed63e4ff6c1c3e253904ec1
3f78cfef4c8f76f80bee980faed1168b5c1a21cd
51359 F20110320_AACBPM evans_c_Page_064.pro
be7b81c71dabd3a0dc8763c4d13a3fae
4da8765374773b10d763da5c160931bc0202ca10
47465 F20110320_AACBOX evans_c_Page_039.pro
5997c53b7adaf044281ab1e0a33b3040
b4fd932753da0175c78b8dbcb9ed491a89b0e939
46285 F20110320_AACBQA evans_c_Page_086.pro
6c735818e6cd567c848d0311b8bed476
1cee0638a052d0a9e49ba602225c3f0c3ccf3930
30443 F20110320_AACBPN evans_c_Page_065.pro
a6e150ddc64ff81ba4d019580d7e9feb
f9efe4dd7ed23f9683d8818790e993e935560c94
10602 F20110320_AACBOY evans_c_Page_040.pro
7a2b8389769aec1e556216aec25cdf88
1e2e5c617f6690ab48db1ffbc8ba4250575cd8a1
50946 F20110320_AACBQB evans_c_Page_087.pro
76ccbe5f818a2f847053a6863dece37b
facaec5d2a412e0255b69c1e2e1781298228d280
20489 F20110320_AACBPO evans_c_Page_068.pro
59721bcc2c232973c889c2054d72f561
fea5f2cfff11dc912ffe34a8791f29422727c198
39402 F20110320_AACBOZ evans_c_Page_041.pro
9173388aa08e142184b8a045217696af
530b156fd1c8f151c4675e257bfa3a1911a63a25
41567 F20110320_AACBQC evans_c_Page_088.pro
d805de6797ac4f470d11eecdc35fc000
439304982141194bd242ba090c9c9690052fdc90
43790 F20110320_AACBQD evans_c_Page_089.pro
b4dad5fd34535d96e2dec2a7d30ac4fa
9a9ed5e5780473ca794bbbef19c0ee491db36a5b
28674 F20110320_AACBPP evans_c_Page_069.pro
766171d8608444454fb68f1ad9225415
f38b43ecb7a72e5eb23b0d3f8195a9bb40af7e82
46221 F20110320_AACBQE evans_c_Page_090.pro
7c609274e8fa1c4decca27e465e270bc
a0996d940c3da475353fc11c4a31c044521e43d8
46013 F20110320_AACBPQ evans_c_Page_071.pro
39372159abe53007895f5d52101155aa
ce02e32cca035810471da3fc4e5dd86eaea2d631
33718 F20110320_AACBQF evans_c_Page_093.pro
a3ee7f82f6c52d556297457bafab77d1
0b62a4c6f63ae98665f592b8988105650b7788fc
50265 F20110320_AACBPR evans_c_Page_072.pro
a3bd1eb5147ab44342ae7648d49f99df
d7b9f4444ac818ec554d57d1fa81be0cdaf4962f
38540 F20110320_AACBQG evans_c_Page_094.pro
aa4daec1d9528cc040ed55d6fda7302f
b68432ce65f922e55cfc07998d8d66d83bb4a759
50210 F20110320_AACBPS evans_c_Page_073.pro
39ded8cafef5cf3ad915ed4731b0f0e3
378f81cb021eeab0d211116f46ce6197db4de9a4
47350 F20110320_AACBQH evans_c_Page_095.pro
452c42495aa6ea34925e7a43678d27f0
f170d07d03f02494041fd932f8a5c589db27dd13
8841 F20110320_AACBPT evans_c_Page_074.pro
f09a2518a5eeac0e42a803d50fd58f70
bc4478944f8465b22d9efc533e51b7d22951c922
39722 F20110320_AACBQI evans_c_Page_096.pro
fe9616647f86e224dca9dacfa9a0b084
3ba34de46d8b4ec63de3b06b08e2fc202c95a27f
47716 F20110320_AACBPU evans_c_Page_077.pro
a6bfdb7115e67b0276297eca2417e3f6
f02e1a0d4c0770cddee655f6ab3349da64f67127
46175 F20110320_AACBQJ evans_c_Page_098.pro
5937a2be4b8b68c5b8f55e40cf984b09
684b6547eac557c6eaca77d22f46a0cdea8d22c2
49742 F20110320_AACBPV evans_c_Page_079.pro
4951b57698737a54dd01c6a8a8a74f4c
e7cbf4cd9b56904110009cd2ef71a5e577e5164d
44894 F20110320_AACBQK evans_c_Page_100.pro
00c6f993d10790939debc7623fadfb58
dd5fb2a00257c5bb58b7be8d2c2f5efc42f538ea
38198 F20110320_AACBQL evans_c_Page_101.pro
37dc7959c70f355f77bfd58cafdfabe8
13685b92dd2cff6903200a53e8c2ec2ecc322155
53949 F20110320_AACBPW evans_c_Page_081.pro
b4da6c7996e03553a84c2be314214b59
9af5edcc6d5697387ed8582158c46ec07db093de
26607 F20110320_AACBRA evans_c_Page_129.pro
2b89b6e44b6250f37c064ea247855c32
ef6e0126e9b643c05f3970c1d1c659c30e2d1072
32674 F20110320_AACBQM evans_c_Page_104.pro
742ea4cbc891f46527885dabce54d77e
1104f0ab4b337ee0cec650abb24df8783c7b5f15
32841 F20110320_AACBPX evans_c_Page_082.pro
e6eb4303a3d7eb38f5fb334505ab7eb2
88e0780121620ddeac3b123ecb276f104783500f
41067 F20110320_AACBRB evans_c_Page_131.pro
cad2b6b119de225e72759e5b091b3833
7905726d5437fab3d7edb5cf6de68efdc77459ba
34319 F20110320_AACBQN evans_c_Page_108.pro
e7ca417a289d4ab2699c4c8e8880d705
b3c41ed0dc2a29545ef464c5eb4a8f800adb0d78
44140 F20110320_AACBPY evans_c_Page_083.pro
d30e3e8907686fe188c56b03f34a179a
4a84e18d83a42c842bcfba73f84db08f1dc3dd2b
63632 F20110320_AACBRC evans_c_Page_001.jpg
fb5d1333ad8ad080b26cdc980dfaa109
d1afac02ad33517b83624f159cccc67c43f2d2e9
41233 F20110320_AACBQO evans_c_Page_110.pro
32fef1b997d15b5233842edd9ec51864
5ac395f1dd64f50f0d6ef8d38b10d76b274db2df
16494 F20110320_AACBPZ evans_c_Page_084.pro
faf201c238575f054998133f784acdeb
7381abdaef74cec0de0a14b09ba178fa7cc740bc
21367 F20110320_AACBRD evans_c_Page_001.QC.jpg
920055735c1ea3507d2898619cc44ca4
e66b8f18144e5b8d27ed5a68a72e478fc718cbb6
39303 F20110320_AACBQP evans_c_Page_111.pro
5fe560547f3229a214bd50e69865f1ff
682d22295f6eb55704513b1ada4685d63abdda55
15366 F20110320_AACBRE evans_c_Page_002.jpg
0a49a0c7cf84366915ef247845e8c1ed
78963c3ab2801aee35c5e072d394f6581deb788b
43575 F20110320_AACBQQ evans_c_Page_113.pro
3ad3b2cbab6ab33b7c293a5007c687ed
be074b12eee7804cb419c3ec19018ba53c03bd32
6137 F20110320_AACBRF evans_c_Page_002.QC.jpg
7f054595bd710bb78f472a9d9f6b0505
468cede7642a4119814503b3d8453bc5bcc74a20
38899 F20110320_AACBQR evans_c_Page_114.pro
a48be7257c37d089951cea9e2c09237a
13dfc67674e8010e9915669f28f1dd1a3af31c55
24101 F20110320_AACBRG evans_c_Page_003.jpg
43326821fffcc360af0f51cef1d1210c
964b299cc750ae90af7c6ef1ad2bbdded104d25d
43706 F20110320_AACBQS evans_c_Page_116.pro
0026e6a000ff308e398acec83dc03f17
671820c115b658cccfabb33e29b265658fe6eecb
8502 F20110320_AACBRH evans_c_Page_003.QC.jpg
aad9a03d50585f5f03c7e1ea2e84a4cf
e5d31a21fcdd4d762c2c064d6b87bff419f63a50
49762 F20110320_AACBQT evans_c_Page_117.pro
2c9b39074375f97c38120937a943704e
20a4a0a9b96daac2fd150d5d76cc4a80f4b44537
172195 F20110320_AACBRI evans_c_Page_005.jpg
9c52b949b47cfcdf07d19d724d61b8af
b24e2c9654dd60aa76d4dd968ca5b1395e2c62ba
34477 F20110320_AACBQU evans_c_Page_119.pro
942fe9e361cd8434d4ab22cf01279477
75c1bab676b39337a60f8b1e82db154b705556e5
71025 F20110320_AACBRJ evans_c_Page_006.QC.jpg
3ea24b9c7132bac902a0f204dedc3b09
5631cef2b1678ede4c4b80fbabebeb1ff7b0579f
32437 F20110320_AACBQV evans_c_Page_120.pro
5691fe2cce974063d6a959520a10eb98
6ab2349dc55ee7a15cf9fdca398411de8c368043
261012 F20110320_AACBRK evans_c_Page_007.jpg
d9cec7e95889bf3c5f72756fd987e797
bcc982d89ffd41e71b13efdd394591de3073f72a
39887 F20110320_AACBQW evans_c_Page_121.pro
ca3da7051e62453c93837a3eeee15d68
09c0fd891e83e2f0f5939a29194fa59241ea32ad
96552 F20110320_AACBRL evans_c_Page_007.QC.jpg
852dca7f4fd3fdc968bfad0812c99c03
784d01efdcd7bf7284776482d38d9742d7a0540f
419058 F20110320_AACBRM evans_c_Page_008.jpg
18f222ab700ca38b80528b7561f2efaf
5bfc53b003b2b22bdc394f659fa529ca01832f6c
51478 F20110320_AACBQX evans_c_Page_125.pro
4183f73dc28bc25efe016fac83edd44d
ff894f1b434f34429bbab2c2abb464ae4ef8ee6e
210225 F20110320_AACBSA evans_c_Page_022.jpg
d71e8c9251bb78e7689453ae37f5a7ff
8d7ee748c858596ebc584ebe630b97e38c526b84
271493 F20110320_AACBRN evans_c_Page_010.jpg
93efd2cbc226edc69e11445a24a1a3b6
1f56fa4a1d0f80dc73f576f540b09470523f92f2
52811 F20110320_AACBQY evans_c_Page_127.pro
7908684c3b5b0c0a2a450d8287b9ce19
d7db5e5553ac0fc90874293c9adf2e1993a171cc
76782 F20110320_AACBSB evans_c_Page_022.QC.jpg
528f2d7e8bc20077c174b0a54e333168
cafca7061b32efa8f93688a8545128532d5d96b9
197488 F20110320_AACBRO evans_c_Page_011.jpg
3a1a0a907a5a38b310a3193f16600e1e
33201d40497ccce59c64817dc7b91e4e8b19046a
65427 F20110320_AACBQZ evans_c_Page_128.pro
de73212f42dc173825147039f40e70b3
28103f90555fecc5e09cbf1adf1dc325ae48e270
78474 F20110320_AACBSC evans_c_Page_023.QC.jpg
1480280d57a27274148f9a91edcf7ff9
7e9dd8ada06bf7fed1be219318fed36d88236789
83756 F20110320_AACBRP evans_c_Page_011.QC.jpg
fcb5de5f8f4639248cbb2b3ffdc52fef
219d85be92b8d738b6ffb3e963f0ea842b064709
195545 F20110320_AACBSD evans_c_Page_024.jpg
2f7909a69996fc082015becd219c0644
8454b851aa4caf45b783999ba612d5afc0ad1915
258462 F20110320_AACBRQ evans_c_Page_012.jpg
6b8619954afae99b0898bca28c42b1a3
fe490fb41ffb0b6047eca8bf540a78812cd4edb1
74782 F20110320_AACBSE evans_c_Page_025.QC.jpg
a4a87f4f26bb371164b325c0fff5506b
406b3e2ad2fea47ff7c4afb1ff8ee8bdc05af7c4
98074 F20110320_AACBRR evans_c_Page_013.jpg
443e9150057b66f5c6eb93cff2df49e2
527a0f52680eaba724b2c58e8a40e8678adefee0
207465 F20110320_AACBSF evans_c_Page_027.jpg
fd479f30a97dbdb07e4fe04355b6456b
07da586318b5915b0f3ca8b8e207aba4d0fd916c
63390 F20110320_AACBRS evans_c_Page_014.QC.jpg
2f00d7bdbd055dbeae25f3bb5d700e6d
25ca940793fab58d6cd20d2426dd64e26daff436
78937 F20110320_AACBSG evans_c_Page_027.QC.jpg
2f4a66b697ebfe8a83e4a3bed9c809b2
a0a2d27e1271fe1a8c75b7669a8afecbb5f7fb11
48409 F20110320_AACBRT evans_c_Page_015.jpg
5ba6c0741c9aa96a126e7d4d7be87283
8df40420bd7fd8d62fa95733767b6bdb487d5428
81840 F20110320_AACBSH evans_c_Page_028.QC.jpg
0f4c2ae4ed2bb07da5daf46ea8347f90
d55844c64c21472ce3cf3a61b798c2fbc8c7497e
18858 F20110320_AACBRU evans_c_Page_015.QC.jpg
b898506bd4cf66ea2c8daad6ceee221a
c613436e81fd757f13460ea451885a4ef74f4902
211610 F20110320_AACBSI evans_c_Page_029.jpg
b49be72d6cc122ad84ed4c72c215d8b6
76ee6f80e00ddf58b9232ce187ef249e3536adf7
201367 F20110320_AACBRV evans_c_Page_017.jpg
d73e3076b941aea7ba42503bec9a5572
a2569b3a78b6197f030d20d3b17f20571092e8b9
79248 F20110320_AACBSJ evans_c_Page_029.QC.jpg
678a2954c09455f4b45f79576fe1689f
03cf0a9a387a3149cf583621879929fb05221980
75020 F20110320_AACBRW evans_c_Page_017.QC.jpg
f4272021904a5fbf27d79ec9afbde122
d7479005783342bc392720e3487322d300f38e1b
207420 F20110320_AACBSK evans_c_Page_030.jpg
327316d40886163aed26873a7af19d5e
1b2b41c1ae74a820f9ebe91c1742a6c54674c0ed




DEVELOPMENT OF A GEOSPATIAL DATA-SHARING METHOD FOR UNMANNED VEHICLES BASED ON THE JOINT ARCHITECTURE FOR UNMANNED SYSTEMS (JAUS)

By

CARL PRESTON EVANS, III

A THESIS PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE

UNIVERSITY OF FLORIDA

2005


Copyright 2005 by Carl Preston Evans, III


It is difficult to say what is impossible, for the dream of yesterday is the hope of today and the reality of tomorrow.

Dr. Robert H. Goddard


To my grandparents, Lottie and James Patterson, for their never-ending, unconditional love and support. Because of them, I have been able to see and do things that they were unable to. In the words of the great Albert Einstein, I have seen farther because they have allowed me to stand on their shoulders.


ACKNOWLEDGMENTS

First and foremost, I would like to thank my committee chair (Dr. Carl D. Crane III) for his patience with me during the process of finishing this work, and for giving me the great opportunity to study and do my master's research at the University of Florida's Center for Intelligent Machines and Robotics. Dr. Crane and I have had a professional relationship since 2000, when we met at a Joint Architecture for Unmanned Systems (JAUS) working group meeting. I look forward to a continued professional relationship as we continue to research the vast field of autonomous systems. Thanks also go to my other committee members (Drs. John Schueller and Christopher Neizrecki) for their valuable contributions to this study.

Thanks also go out to Mr. Dan Deguire, Project Manager in Foster-Miller's Design and Systems Integration (DSI) Group. I had the honor of working for Dan for two-and-a-half years, starting as a co-op student and ending as a Design Engineer. While at Foster-Miller, Dan funded my first foray into obstacle detection and autonomous navigation (my undergraduate senior design project). I appreciate the friendship that we continue to share.

Special thanks go out to Team CIMAR of the Center for Intelligent Machines and Robotics (CIMAR), competitors in the inaugural Defense Advanced Research Projects Agency (DARPA) Grand Challenge held in March 2004. I particularly thank those who were on (or otherwise contributed to the work of) the perception team: Mel Torrie, Sarah Gray, Kristopher Klingler, Charles Smith, Sanjay Solanki, Danny Kent, Erica Zawodny-McArthur, and Donald McArthur.

Thanks also go out to the members of the JAUS World Model Subcommittee for making each of my meetings feel like a thesis defense. Their valuable suggestions helped to shape this document, particularly Chapter 4.

I thank Chad Tobler (a friend and biker buddy late in my years at CIMAR) for keeping me sane and for providing valuable insight during the writing of this study.

Thanks also go to my great friend Matthew Bokach from the School of Natural Resources and Environment at the University of Florida. I thank him for being there for me in many ways, both personally and professionally. I especially appreciate his help with this study. From the day that he told me that two points of equal latitude, longitude, and elevation are not always coincident, I have learned much from him.

Of course I must thank my new coworkers (Todd, Parag, Patrick, and Mark) at Applied Perception, Inc. (API), for giving me a hard time for taking so long to finish this study, but more importantly for their kind words of support. I look forward to many successful years with them at API.

Last but certainly not least, thanks go to the Air Force Research Laboratory at Tyndall Air Force Base, FL, for funding this work. For over a decade, they have been supporters of the work done by the talented group of roboticists at the University of Florida's Center for Intelligent Machines and Robotics. Without their support, this study could not have been completed. I am thankful for their support, and I look forward to continued work with them in the future.


TABLE OF CONTENTS

ACKNOWLEDGMENTS
LIST OF TABLES
LIST OF FIGURES
ABSTRACT

CHAPTER

1 INTRODUCTION AND RESEARCH PROBLEM
  1.1 Introduction
  1.2 Research Problem

2 REVIEW OF RELEVANT LITERATURE
  2.1 Joint Architecture for Unmanned Systems (JAUS)
    2.1.1 Tenets of JAUS
    2.1.2 System Structure of JAUS
    2.1.3 World Model Subcommittee for JAUS
  2.2 Real-Time World Modeling Methods
    2.2.1 Raster Occupancy Grid
    2.2.2 Real Time Terrain Mapping
    2.2.3 Raster Traversability Grid
  2.3 A Priori World Modeling Methods
  2.4 Geographic Modeling Methods
    2.4.1 Global Coordinate Systems
    2.4.2 Projected Coordinate Systems
    2.4.3 Universal Transverse Mercator Projection
  2.5 Georeferenced World Model Data
    2.5.1 Raster Data Stores
    2.5.2 Vector Data Stores
  2.6 Distributed World Modeling Methods
    2.6.1 Spatial Data Transfer Standard (SDTS)
    2.6.2 Geography Markup Language

3 SMART SENSORS
  3.1 Smart Sensor Architecture
  3.2 Smart Sensor Architecture Components
    3.2.1 Smart Sensor Component
    3.2.2 Smart Sensor Arbiter Component
    3.2.3 Reactive Planner Component
  3.3 Smart Sensor Messaging Architecture
    3.3.1 Smart Sensor Architecture Message Header
    3.3.2 Smart Sensor Architecture Message Set
      3.3.2.1 Report vehicle state message
      3.3.2.2 Report traversability grid update message
      3.3.2.3 Report region clutter index message
    3.3.3 Smart Sensor Architecture Network Communications
  3.4 Smart Sensor Implementation
    3.4.1 Abstraction of Smart Sensor Core Functionality
    3.4.2 Base Smart Sensor
  3.5 Smart Stereo Vision Sensor Implementation
    3.5.1 Stereo Vision Hardware
    3.5.2 Stereo Vision Software
    3.5.3 Smart Stereo Vision Sensor
  3.6 Use of Obstacle Detection and Free Space Sensors
  3.7 Smart Sensor Arbiter Implementation

4 JAUS WORLD MODEL KNOWLEDGE STORES
  4.1 Observations and Recommendations
    4.1.1 Raster and Vector Object Representation
  4.2 World Model Knowledge Store Message Set
    4.2.1 JAUS Core Input and Output Message Sets
    4.2.2 Raster Knowledge Store Input Message Set
      4.2.2.1 Code F000h: Create raster knowledge store object
      4.2.2.2 Code F001h: Set raster knowledge store feature class metadata
      4.2.2.3 Code F002h: Modify raster knowledge store object (cell update)
      4.2.2.4 Code F003h: Modify raster knowledge store object (grid update)
      4.2.2.5 Code F004h: Delete raster knowledge store objects
      4.2.2.6 Code F200h: Query raster knowledge store objects
      4.2.2.7 Code F201h: Query raster knowledge store feature class metadata
      4.2.2.8 Code F202h: Query raster knowledge store bounds
      4.2.2.9 Code F600h: Raster knowledge store event notification request
      4.2.2.10 Code F601h: Raster knowledge store bounds change event notification request
      4.2.2.11 Code F005h: Terminate raster knowledge store data transfer
    4.2.3 Raster Knowledge Store Output Message Set
      4.2.3.1 Code F400h: Report raster knowledge store object creation
      4.2.3.2 Code F401h: Report raster knowledge store feature class metadata
      4.2.3.3 Code F402h: Report raster knowledge store objects (cell update)
      4.2.3.4 Code F403h: Report raster knowledge store objects (grid update)
      4.2.3.5 Code F404h: Report raster knowledge store bounds
      4.2.3.6 Code F800h: Raster knowledge store event notification (cell update)
      4.2.3.7 Code F801h: Raster knowledge store event notification (grid update)
      4.2.3.8 Code F802h: Raster knowledge store bounds change event notification
      4.2.3.9 Code F405h: Report raster knowledge store data transfer termination
    4.2.4 Vector Knowledge Store Input Message Set
      4.2.4.1 Code F020h: Create vector knowledge store objects
      4.2.4.2 Code F021h: Set vector knowledge store feature class metadata
      4.2.4.3 Code F022h: Delete vector knowledge store objects
      4.2.4.4 Code F220h: Query vector knowledge store objects
      4.2.4.5 Code F221h: Query vector knowledge store feature class metadata
      4.2.4.6 Code F222h: Query vector knowledge store bounds
      4.2.4.7 Code F620h: Vector knowledge store event notification request
      4.2.4.8 Code F621h: Vector knowledge store bounds change event notification request
      4.2.4.9 Code F023h: Terminate vector knowledge store data transfer
    4.2.5 Vector Knowledge Store Output Message Set
      4.2.5.1 Code F420h: Report vector knowledge store object(s) creation
      4.2.5.2 Code F421h: Report vector knowledge store feature class metadata
      4.2.5.3 Code F422h: Report vector knowledge store objects
      4.2.5.4 Code F423h: Report vector knowledge store bounds
      4.2.5.5 Code F820h: Vector knowledge store event notification
      4.2.5.6 Code F821h: Vector knowledge store bounds change event notification
      4.2.5.7 Code F424h: Report vector knowledge store data transfer termination

5 CONCLUSIONS AND FUTURE WORK
  5.1 Conclusions
  5.2 Future Work

REFERENCES

BIOGRAPHICAL SKETCH


LIST OF TABLES

3-1 Standard JAUS sixteen-byte message header
3-2 Smart sensor architecture message header
3-3 Smart sensor architecture's report vehicle state message
3-4 Smart sensor architecture's report traversability grid updates message
3-5 Smart sensor architecture's report region clutter index message
3-6 Smart sensor components and their component identification numbers
4-1 Create raster knowledge store objects message format
4-2 Presence vector for create raster knowledge store objects message
4-3 Set raster knowledge store feature class metadata message format
4-4 Modify raster knowledge store object (cell update) message format
4-5 Modify raster knowledge store object (grid update) message format
4-6 Delete raster knowledge store objects message format
4-7 Query raster knowledge store objects message format
4-8 Presence vector for query raster knowledge store objects message
4-9 Query raster knowledge store feature class metadata message format
4-10 Query raster knowledge store bounds message format
4-11 Report raster knowledge store object creation message format
4-12 Report raster knowledge store feature class metadata message format
4-13 Report raster knowledge store objects (cell update) message format
4-14 Report raster knowledge store objects (grid update) message format
4-15 Report raster knowledge store bounds message format
4-16 Create vector knowledge store objects message format
4-17 Presence vector for create vector knowledge store objects message
4-18 Set vector knowledge store feature class metadata message format
4-19 Delete vector knowledge store objects message format
4-20 Presence vector for delete vector knowledge store objects message
4-21 Query vector knowledge store objects message format
4-22 Presence vector for query vector knowledge store objects message
4-23 Query vector knowledge store feature class metadata message format
4-24 Query vector knowledge store bounds message format
4-25 Report vector knowledge store object(s) creation message format
4-26 Report vector knowledge store feature class metadata message format
4-27 Report vector knowledge store objects message format
4-28 Report vector knowledge store bounds message format


LIST OF FIGURES

Figure    page

1-1  Example of the processes leading up to higher-level planning    4
1-2  How our study fits into the higher-level planning process    4
2-1  System structure of JAUS    9
2-2  Path planning in a bounded radiation environment    15
2-3  Ellipsoidal model of Earth    16
2-4  Earth-centered global coordinate system    17
2-5  Example format of digital elevation model    21
2-6  An example of USGS source data    22
3-1  Team CIMAR NaviGATOR DARPA Grand Challenge entry vehicle    27
3-2  Organization of the smart sensor-based perception system    30
3-3  Minimally complete smart sensor-based perception system    38
3-4  Single sensor implementation of smart sensor-based perception system    38
3-5  Unmanned system coordinate system defined by JAUS    39
3-6  Center for Intelligent Machines and Robotics Navigation Test Vehicle 2    47
3-7  Smart sensor implementation    48
3-8  Call graph for all functions within the base smart sensor API    48
3-9  Call graph for smart sensor communications receive thread    48
3-10  Call graph for function that determines number of rows and columns to shift    50
3-11  Call graph for sensor specific interface thread    50
3-12  Call graph for function used to detect changes in the traversability grid    50


3-13  Videre Design STH-MD1-C stereo camera head    52
3-14  Source data and results from stereo correlation    53
3-15  Graph of range determined from stereo vision system vs. actual measured range    53
3-16  Plot of range resolution vs. range    54
4-1  Definition of raster grid parameters and coordinate system    67
4-2  Definition of vector objects and parameters    69


Abstract of Thesis Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Master of Science

DEVELOPMENT OF A GEOSPATIAL DATA-SHARING METHOD FOR UNMANNED VEHICLES BASED ON THE JOINT ARCHITECTURE FOR UNMANNED SYSTEMS (JAUS)

By Carl Preston Evans, III

August 2005

Chair: Carl D. Crane, III
Major Department: Mechanical and Aerospace Engineering

A task performed almost effortlessly by humans, perception is perhaps one of the most difficult tasks for autonomous vehicles. While substantial research has been done to develop these technologies, few studies have examined ways for multiple heterogeneous unmanned systems to cooperate in their perception tasks. Our study examined ways to model both perceived and a priori geospatial information and to format these data so that they can be used by the growing unmanned systems community.

We introduce a perception system model consisting of distributed smart sensors. This system of sensors was developed for the Team CIMAR entry into the inaugural DARPA Grand Challenge autonomous vehicle competition held in March 2004. The Smart Sensor Architecture proved to be a powerful method of distributing the processing of sensor data to systems developed by the engineers who best knew a particular sensor modality. By standardizing the logical, transport, and electrical interfaces, the Smart Sensor Architecture developed into a powerful world modeling method.


We also investigated current geospatial data-modeling methods used in the unmanned systems and geographic information systems (GIS) communities. Our study determined the commonalities among current methods and resulted in a first-generation geospatial data-sharing standard for unmanned systems compliant with the Joint Architecture for Unmanned Systems (JAUS).


CHAPTER 1
INTRODUCTION AND RESEARCH PROBLEM

1.1 Introduction

Imagine a world without languages, with no standard methods of communicating with other people. Imagine a world in which an individual or a small group of individuals had a language completely different from that of other individuals or groups. Imagine also that even the most primitive methods of communicating required an interpreter. It would be unreasonable to expect two people to be able to come together and (with minimal effort) understand each other. This is the state of the world in the unmanned systems community.

Developing a common language for unmanned systems is not trivial. However, as unmanned systems become more commonplace and gain the ability to interoperate and ultimately collaborate, a standard communications method or language must be developed. As it has with a number of technological innovations throughout recent history, the United States Department of Defense (DoD) is helping to revolutionize the unmanned systems community by pushing the development of a standard communications method for all future DoD unmanned systems. Recognizing the increased acquisition and maintenance costs for a growing fleet of unmanned systems with proprietary interfaces, the Office of the Secretary of Defense chartered the Joint Architecture for Unmanned Ground Systems (JAUGS) Working Group to address these concerns. The JAUGS Working Group was tasked with developing an initial standard for interoperable unmanned ground systems. In 2002, the charter of the JAUGS Working Group was


modified such that their efforts would extend to all unmanned systems, not only ground systems. The standard was therefore renamed the Joint Architecture for Unmanned Systems (JAUS).

Unmanned systems are becoming increasingly popular. In fact, large U.S. government acquisition programs such as Future Combat Systems (FCS) and Man Transportable Robotic Systems (MTRS) show that unmanned systems are here to stay. The Future Combat Systems program is an ambitious multi-billion-dollar program with a goal of integrating autonomous, semi-autonomous, and tele-operated systems into the battlefield of tomorrow. Man Transportable Robotic Systems is a large multi-million-dollar program that requires a large number of tele-operated unmanned systems for use in the task of explosive ordnance disposal (EOD). Both the FCS and MTRS programs require systems that can communicate with one another (operator control units to vehicle, or inter-vehicle) using a shared language. This language (JAUS) is the subject of our study.

Currently, JAUS supports tele-operation and, to an extent, primitive levels of semi-autonomy. Technological innovations in the areas of sensors, sensor processing and fusion, perception, and intelligence have advanced robotics so much that demands that not long ago seemed far-fetched are fast becoming a reality. To this end, JAUS must adapt to meet the growing requirements of the semi-autonomy and autonomy camps of the unmanned systems community. Types of autonomous behaviors are as numerous as human behaviors. However, for unmanned systems, the next step in the natural progression beyond tele-operation is the development of assisted tele-operation and autonomous navigation and obstacle avoidance.


It is as difficult to define a new all-encompassing language for unmanned systems as it would be for humans. In the context of a particular mission, however, it is possible to develop a syntax that can be used to communicate relevant information. By initially limiting the scope of JAUS and incrementally adding functionality, a robust language is being built.

The focus of our study was on allowing JAUS-based unmanned systems to share geospatial data. These geospatial data are needed to support the tasks of obstacle detection, obstacle avoidance, and path planning among multiple JAUS subsystems. The concept of the world model helps to put this work into perspective. Meystel and Albus [18] defined the world model as the intelligent system's best estimate of the state of the world. The world model includes a database of knowledge about the world, plus a database management system that stores and retrieves information. The world model also contains a simulation capability that generates expectations and predictions. The world model provides answers to requests about the present, past, and probable future states of the world. It provides this information service to the behavior generation system element in order to make intelligent plans and behavioral choices. It provides information to the sensory processing system element to perform correlation, model matching, and model-based recognition of states, objects, and events. It provides information to the value judgment system element to compute values such as cost, benefit, risk, uncertainty, importance, and attractiveness. The world model is kept up to date by the sensory processing system element.

A world model presents unending directions to investigate. No doubt, many such investigations have begun. Our study focuses on the database of knowledge. We examined how the data are stored inside the database, and how databases can share data using a common language. In the context of JAUS, our study presents a first-generation standard for sharing a database of knowledge. Because unmanned systems used in the JAUS community are outdoor vehicles (and because of the desired tasks), these world model databases store geospatial data. A review of the relevant literature formed a solid


basis for creating this standard. We also introduced the implementation of a perception system.

Figure 1-1. Example of the processes leading up to higher-level planning

Geospatial data generated by an unmanned system are only as good as the system's sensors and its sensor fusion and registration methods. Also important is what is done with these geospatial data after they have been fused and registered (high-level planning and intelligent behaviors). These important issues fall outside the scope of our study. Figure 1-1 shows some of the processes leading up to high-level planning. Figure 1-2 shows how our study fits in. The message set generated by this study will allow different databases to share knowledge among themselves or with higher-level planning processes.

Figure 1-2. How our study fits into the higher-level planning process

1.2 Research Problem

Our study takes its direction from the following research problem. Given the experience and knowledge gained from examining current methods of modeling geospatial data within the unmanned systems and geographic information systems (GIS) communities and from implementing a perception system for an unmanned ground system, create a first-generation geospatial data-sharing method for unmanned systems. Present this in a format consistent with the Joint Architecture for Unmanned Systems (JAUS) messaging framework.

This is a broad and open-ended topic. However, it must be addressed. As the capabilities of JAUS are extended, being without a method for communicating even the most basic forms of obstacle data would be a severe limitation. The primary purpose of


the standard presented in our study is to support mission planning. However, other applications (such as data visualization) also benefit. The contribution of our study is a first-generation method recommended for sharing data needed by state-of-the-art, real-world unmanned systems. The recommendations may be seen as guidelines for a first attempt (at least within the JAUS community) to allow multiple disparate unmanned systems (from different organizations, with completely different perception implementations) to share data.


CHAPTER 2
REVIEW OF RELEVANT LITERATURE

The most difficult behaviors for unmanned systems are perception and reasoning. Reasoning for an unmanned system is highly dependent on the quality of the estimation of the environment in which the unmanned system operates. This estimation is often used to support higher-level behaviors performed by either the unmanned system or a human operator through tele-operation. Each system typically has its own method for modeling and sharing data. As we move towards increased interoperability among unmanned systems from different vendors, work must be done to bridge the gap between different methods of representing sensed data and providing those data to disparate unmanned systems. Again, this is the focus of our study: to provide a first-generation standardized method for modeling the environments that unmanned systems operate in and then providing those data to other concerned manned or unmanned systems.

Much work has been done in recent years to move toward true interoperability between unmanned systems. One of the major efforts towards reaching this goal is the Joint Architecture for Unmanned Systems (JAUS). JAUS is a standard that defines the format of messages that travel between unmanned systems. Since it is fast becoming the standard for military unmanned systems, JAUS provides a suitable base upon which to build a first-generation world modeling standard for unmanned systems.

2.1 Joint Architecture for Unmanned Systems (JAUS)

The Joint Architecture for Unmanned Systems (JAUS) is a messaging standard being developed with the overall goals of reducing life cycle costs, enabling fast integration


of new technologies, and facilitating interoperability amongst heterogeneous unmanned systems. In 1998, the Office of the Secretary of Defense (OSD) chartered the Joint Architecture for Unmanned Ground Systems (JAUGS) Working Group and tasked this working group with developing a common model for messages used for controlling and monitoring processes within unmanned ground systems. Now the Joint Architecture for Unmanned Systems (JAUS) Working Group, it is tasked with expanding the standard to the entire domain of unmanned systems. The working group currently comprises a diverse membership from government, industry, and academic institutions. By having a wide range of input in developing the standard, JAUS is better prepared for wide acceptance by the unmanned systems community.

2.1.1 Tenets of JAUS

To ensure the flexibility, extensibility, and ultimately the longevity of the emerging JAUS standard, it was developed with four main tenets. These are: technology independence, hardware independence, platform independence, and mission independence [16].

The technology independence of JAUS assures that the messages that compose the JAUS standard, as well as the methods for transporting those messages, are not dependent on any past, present, or developing standard. For example, many JAUS implementations use the user datagram protocol (UDP) and the internet protocol (IP) for data transmission. Other implementations may, however, use asynchronous serial communications links such as EIA/TIA 232. There may be cases where one communications method is preferred over another. By restricting the dependence on a communications technology, JAUS leaves this decision to the system developer and thus remains very flexible. By defining only the messages to be communicated, JAUS will remain relevant over time.
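The technology-independence tenet can be illustrated with a small sketch: the serialized message bytes are built once, and the transport (UDP here; a serial link would serve equally well) is a separate, swappable layer. The framing, command code, and payload below are purely hypothetical placeholders, not the actual JAUS header format.

```python
import socket

def frame_message(command_code: int, payload: bytes) -> bytes:
    """Build transport-independent message bytes.

    This 2-byte command code plus payload framing is purely illustrative;
    the real JAUS header is defined by the Reference Architecture.
    """
    return command_code.to_bytes(2, "big") + payload

def send_udp(message: bytes, host: str, port: int) -> None:
    """One possible transport, used by many JAUS implementations: UDP/IP.

    The same bytes could instead be written to an EIA/TIA 232 serial link;
    nothing in frame_message depends on which transport is chosen.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message, (host, port))

# Hypothetical command code and payload, framed once, transport chosen later.
msg = frame_message(0x4202, b"\x01\x02")
```

Because the message bytes never change, a developer can switch transports without touching any message-handling code, which is exactly the flexibility the tenet aims for.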


Hardware independence is similar to the technology independence requirement. JAUS does not rely on knowledge of the structure of an unmanned system. There are no assumptions about the type of platform or the contents of the platform. So long as a system has adequate hardware to create, receive, process, and respond to the standardized JAUS messages, it is considered to be compliant with the specification.

Platform independence is the third tenet of JAUS. There are no assumptions about the type of systems that will use JAUS. The JAUS standard is just as useful for large tanks as it is for miniature microcontroller-based unattended sensors. Naturally, as systems become more embedded, the read-only memory (ROM), random-access memory (RAM), and computing resources available decrease. An embedded system is therefore less likely to be able to support large, complex JAUS messages. This is acceptable because JAUS is very flexible with respect to the messages that each system must support. With the exception of a small number of core input and output messages, JAUS allows systems to use only the messages (as well as the fields within those messages) that they need to perform their function.

JAUS also does not presuppose that the unmanned systems based on the specification are designed for any particular mission. This is the mission independence tenet of JAUS. By defining a comprehensive message set, it is hoped that JAUS developers can assemble systems that can complete any mission. Surely this is intractable, but with the guidance of the diverse membership of the JAUS working group, JAUS has a firm foundation on which to build.

2.1.2 System Structure of JAUS

The Joint Architecture for Unmanned Systems consists of a number of hierarchical elements that work together to form a complete JAUS-compliant unmanned system. The


lowest level of abstraction within JAUS is the component. Going up the chain of complexity, a JAUS node consists of multiple components, a subsystem consists of one or more nodes, and a JAUS system consists of one or more subsystems. Figure 2-1 shows the structure of a JAUS system. Not shown in this figure is the concept of multiple instances of a component. This feature is included in JAUS to support component redundancy.

Figure 2-1. System structure of JAUS

The component encapsulates a specific function and the input and output messages necessary to command, control, and monitor the component. For example, the JAUS Primitive Driver component is responsible for the low-level command and control of an unmanned system. It controls and reports the current status of the lowest-level devices on the platform and reports platform-specific data such as platform name and dimensions. Another component, the JAUS Global Pose component, interfaces to a device or a number of devices that are capable of providing the platform with its current global position, orientation, and orientation rate information. These are just two examples of


JAUS components. The JAUS Reference Architecture currently defines 26 components, each with its own specific function. The Reference Architecture allows up to 254 components to operate within a JAUS node.

A node is a single computing entity that consists of one or more JAUS components running in a tightly coupled manner. In this context, tightly coupled implies that the computing entities are not linked by any external connections. Instead, they are connected internally, for example by function calls or shared memory. If two or more components are to be linked by an external communications medium, they should be considered separate nodes. The JAUS standard currently allows up to 254 nodes within a subsystem.

A subsystem is a device that performs a function through the synergy of the component-containing nodes within it. There must be at least one node within a subsystem. This node may contain all the components necessary for the subsystem to perform its function. The subsystem may also contain a number of nodes that each provide components necessary for the subsystem to perform its function. The JAUS standard currently allows up to 254 subsystems to operate within a JAUS system.

A system consists of one or more subsystems working together for some useful purpose. This is the highest level within the JAUS hierarchy. JAUS currently does not permit communications between different JAUS systems. Within a system, however, any component, node, or subsystem may communicate with any other component, node, or subsystem.

The hope is that the JAUS standard is generalized enough that it will not inhibit the creativity of the engineers and scientists developing these systems. Of course it is not


possible to account for all possible unmanned system scenarios. Because of this, the JAUS standard has been developed to allow for the development of user-defined components. The idea is that as these user-defined components mature and their usefulness is recognized by the JAUS community, they would be incorporated into the JAUS Reference Architecture. What is most important overall about JAUS is that it standardizes the interface between these components. As unmanned systems become more and more commonplace, without JAUS or some industry-wide JAUS-like standard, the interoperability issues will only be compounded.

2.1.3 World Model Subcommittee for JAUS

In October 2002, the Joint Architecture for Unmanned Systems Reference Architecture Committee's World Model Subcommittee was established to address the growing need within the unmanned systems community for a messaging architecture that allows multiple heterogeneous unmanned systems to share geospatial data. The task of this subcommittee was to develop the methods to allow modeling and sharing of geospatial data within the JAUS framework. For JAUS, the primary purpose for modeling and sharing these data is to support the tasks of mission planning and distributed mapping for autonomous systems. JAUS takes a practical approach to unmanned systems, and a JAUS standard for geospatial data modeling should do the same.

2.2 Real-Time World Modeling Methods

The field of mobile robotics is generally interested in real-time world modeling methods. Typically these world modeling methods support the task of reflexive obstacle avoidance, whereby an unmanned system uses an instantaneous view of the environment to effect change in its current mission. For example, an unmanned system may be tasked


to autonomously navigate to a given waypoint without colliding with anything along the way. Similar to a human reflexively reacting to a sudden undesired condition, an unmanned system given this task may reflexively respond to obstacles that appear within the field of view of its sensors. Often these methods require very little modeling or processing of the sensor data. Of paramount concern is the safety of the systems.

It is often desired or necessary to have unmanned systems accumulate a model of the environment in which they operate. This may simply be for the sake of building an accumulative map of the environment, or it may be to allow the unmanned system to make a more informed decision should it decide that it needs to modify its current behavior in order to successfully complete its given task. For example, if an unmanned system can perceive the environment at a distance that extends far beyond the range at which the system must act reflexively to maintain its safety, those additional data could be used to reactively re-plan a path that avoids the hazard completely. At the very least, an accumulative model of the environment would provide the system with the ability, should it have to act reflexively, to choose the best long-term plan.

Typically both reflexive and reactive obstacle avoidance systems use a tessellated, raster-grid-based data structure to represent the environment. These raster grids are most commonly used for real-time world modeling because it is simple to project the sensor view into a two-dimensional Cartesian grid.

2.2.1 Raster Occupancy Grid

Sensors are all prone to errors that affect the quality of their data. Some sensors, such as radar and sonar, have wide fields of view but very low resolution within those fields of view. To handle the issues of uncertainty and errors in the sensor data, the


concept of the occupancy grid was introduced by Elfes [13]. The raster occupancy grid is a tessellated grid used to accumulate real-time sensor data. A probabilistic model of the data from the sensor is generated and is used to update occupancy probabilities within the raster grid. The grid cell values for the occupancy grid represent the probability that an object exists or does not exist in the area covered by the cell. Updates are made to the patches of the grid that represent the field of view of the sensor. Even though this idea was pioneered by Elfes in 1989, it is still the most common implementation for real-time sensor data accumulation. It is especially useful for supporting the task of obstacle avoidance.

Over the years there have been several extensions of the pioneering work done by Elfes. One extension to Elfes's approach was introduced by Borenstein [6]. Rather than updating a large patch of the occupancy grid within the field of view of the sensor, this method updates a single cell along the major axis of the sensor. Borenstein shows that as the unmanned system traverses an area, this method is computationally cheaper and achieves similar results, because it focuses on the data along the sensor's axis, which provide the best information. Novick [22] extended the raster occupancy grid update method by applying a nonhomogeneous Markov chain based method to update grid cells. Using this approach, Novick shows a significant advance in sensor fusion for outdoor vehicles. Both Borenstein's and Novick's methods use raster grids to represent their data.

2.2.2 Real-Time Terrain Mapping

An extension of the occupancy grid methods is the real-time terrain mapping method. This method attempts to generate a model of the Earth's surface in a tessellated data structure. This two-and-a-half-dimensional representation assigns a height to each


grid cell as opposed to an occupancy probability. Crosetto and Crippa [10] presented a method for fusing stereo and radar data to form real-time elevation maps.

2.2.3 Raster Traversability Grid

The traversability grid concept is an extension of both the raster occupancy grid and terrain mapping methods. In this implementation, the value in a grid cell represents the degree to which the area covered by the grid cell is considered drivable by the vehicle. Unlike the previous two methods, occupancy grids and terrain mapping, the traversability method depends on vehicle parameters. This is because the concept of traversability is inherently platform dependent. For example, an area occupied by a small rock may be deemed untraversable by a small unmanned system. However, a larger unmanned system confronting the same rock may consider the region less than desirable, but still traversable. Vehicle parameters often used in traversability determination include the maximum allowable rotation angles of the platform about its three axes. Using a model of the terrain in which an unmanned system operates, it is possible to calculate the pose of the unmanned system along a path given the vehicle's physical parameters.

2.3 A Priori World Modeling Methods

An a priori world model data store is one that contains data that were accumulated prior to use by an unmanned system. For example, if an unmanned system builds a map, this map could be stored for future use by that unmanned system or transferred to another unmanned system to allow it to make mission decisions. This is an example of the a priori use of raster data. This is not the typical use of a priori data within unmanned systems because of possible errors in the map-making process. Instead, vector methods are used more frequently for initial data. An example of the use of this modeling method is


presented by Pasha [23]. The model of the world used is based on a polygonal representation, as shown in Figure 2-2. In this work, Pasha models an environment in which an unmanned system must operate. The locations of static obstacles are known and can be used during the path planning process. Compounding the problem, however, is the presence of numerous radiation sources. Given the obstacles and the location and strength of the radiation sources, a path plan is computed that most efficiently gets the unmanned system to its desired destination while minimizing its exposure to radiation.

Figure 2-2. Path planning in a bounded radiation environment (Source: A. Pasha, "Path Planning for Nonholonomic Vehicles and Its Application to Radiation Environments," Master of Science Thesis. Department of Mechanical Engineering: University of Florida, 2003, p. 59, Figure 6-9)

2.4 Geographic Modeling Methods

The areas in which unmanned systems operate are typically assumed to be simple planar surfaces. As unmanned systems begin to be introduced into real-world outdoor applications, this assumption cannot hold.
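Before moving on, the polygonal (vector) a priori representation discussed above can be made concrete: the basic query a planner makes against a polygonal obstacle map is whether a location falls inside a known static obstacle. A minimal ray-casting sketch follows; the rectangular obstacle is an assumed example, not taken from Pasha's work.

```python
def point_in_polygon(x, y, vertices):
    """Ray-casting test: does (x, y) fall inside the polygon given by its
    vertex list?  Static obstacles in a vector a priori model can be stored
    as vertex lists and queried this way during path planning."""
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        # Count edge crossings of a ray cast to the left of the point.
        if (y1 > y) != (y2 > y):
            if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

# An assumed 4 m x 3 m rectangular obstacle in local planar coordinates.
obstacle = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)]
```

A planner would run this test (or a faster spatially indexed variant) for each candidate waypoint against every known obstacle polygon.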


2.4.1 Global Coordinate Systems

When moving from the laboratory to real-world outdoor applications that cover large distances, the methods presented in Section 2.3 must be modified. Those methods assumed that the unmanned system was operating in a perfectly planar environment where, in the case of raster data, the cells were square and the coordinate system Cartesian. The Earth is not flat, and therefore, when unmanned systems operate over large distances, they must take the Earth's true shape into consideration.

There are three commonly used models of the Earth's shape: the actual shape of the Earth's surface, the ellipsoid, and the geoid [7]. Because of the large variations in the Earth's surface, it is difficult to develop a true mathematical model for it. Therefore, the other two models, the ellipsoid and the geoid, are typically used.

The ellipsoid is a mathematical model of the shape of the Earth. The ellipsoid (Figure 2-3) is defined by its semi-major and semi-minor axes. Over the years, different ellipsoidal models of the Earth have been established based on the best known shape of the Earth. Currently the most commonly used model is the World Geodetic System as defined in 1984 (WGS84). This model defines the semi-major axis r1 as 6,378,137.0 meters and the semi-minor axis r2 as 6,356,752.3 meters [7].

Figure 2-3. Ellipsoidal model of Earth
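From the two axes just given, the ellipsoid's flattening and squared eccentricity, the quantities most map-projection formulas actually consume, follow directly. A quick sketch using the WGS84 values stated above:

```python
# WGS84 ellipsoid axes as given in the text (meters).
SEMI_MAJOR = 6378137.0   # r1, the equatorial radius
SEMI_MINOR = 6356752.3   # r2, the polar radius

# Flattening measures how much the ellipsoid departs from a sphere;
# eccentricity squared appears throughout projection and geodesy formulas.
flattening = 1.0 - SEMI_MINOR / SEMI_MAJOR              # about 1/298.257
eccentricity_sq = 1.0 - (SEMI_MINOR / SEMI_MAJOR) ** 2  # about 0.0066944
```

The small flattening (roughly 21.4 km difference between the axes over a 6,378 km radius) is why a spherical approximation works locally but accumulates error over the large distances outdoor unmanned systems may cover.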


Once the Earth ellipsoidal model is established, a geographic coordinate system must also be established. Because of the spherical shape of the Earth, a spherical coordinate system is used to define points on the ellipsoid. A point on the ellipsoidal surface is described in spherical coordinates by a latitude value in degrees, a longitude value in degrees, and a height or elevation value in feet or meters. As shown in Figure 2-4, latitude values increase going north and range from -90 degrees at the South Pole, through 0 degrees at the Equator, to 90 degrees at the North Pole. Longitude values start at 0 degrees at the Prime Meridian and range between plus and minus 180 degrees; the values go negative going west and positive going east.

Figure 2-4. Earth-centered global coordinate system

Bolstad [7] describes the geoid as a three-dimensional surface that has a constant pull of gravity at each point. This equipotential surface is important for the establishment of a vertical datum. In fact, this surface typically defines what is referred to as mean sea level [36]. If the Earth were covered by only water and no land, gravity would pull the water such that the geoid and the sea level would be the same [36]. This is how mean sea level is defined for land areas that are not near the sea. As with the ellipsoid model,


locations are referenced by latitude, longitude, and elevation. The difference is that the ellipsoid model uses the surface of the ellipsoid to establish elevation, whereas this method measures the elevation of the geoid with respect to an ellipsoid.

2.4.2 Projected Coordinate Systems

While true global coordinates are expressed as points of latitude, longitude, and elevation, it is more intuitive to model the world in Cartesian coordinates. This is particularly true when extending the methods in Section 2.3 to outdoor applications. In order to use a Cartesian coordinate system, methods have been established to mathematically project global, spherical coordinates onto a rectangular grid. Because it is not possible to exactly represent this three-dimensional surface in two dimensions, there are different mathematical projections that preserve different features of the three-dimensional surface. Typical features that are preserved are local shape, area, distance, and true direction. Conformal projections preserve local shape, equal-area projections preserve area, equidistant projections preserve distance to some points, and true-direction projections preserve the true course between certain points [14].

Projections are classified not only by the types of features they preserve, but also by the type of method used to create them. The main classifications are cylindrical, conic, and planar or azimuthal [14]. Cylindrical projections convert from the Earth's three-dimensional spherical coordinate system to a cylindrical coordinate system. After the projection, the cylindrical representation is sliced so that it forms a two-dimensional rectangular representation of the Earth's surface. Conic projections convert from the Earth's three-dimensional spherical coordinate system to a conic coordinate system. After the projection, the conic representation is sliced so that it forms a two-dimensional representation of the Earth's


surface. Planar or azimuthal projections convert from the Earth's three-dimensional spherical coordinate system directly to a planar coordinate system. There are numerous types and variations of each type of projection. Bolstad [7] and ESRI [14] provide an in-depth discussion of these projections and many of their variations.

2.4.3 Universal Transverse Mercator Projection

The Universal Transverse Mercator (UTM) projection is a modification of the cylindrical Mercator projection. It is a conformal projection, which preserves the local shape of objects [14], and it creates minimal distortion of areas, local angles, and distances [14]. Unlike the cylindrical projection shown in Figure 2-5, the UTM projection divides the cylinder into 60 vertical zones. Each UTM zone is exactly 6 degrees of longitude wide and is further divided into north and south parts [7]. Each UTM zone has its own coordinate system, completely different from the coordinate systems of the other zones. Because of this, it is difficult to use the UTM projection when traveling between UTM zones. This is rarely a problem for unmanned ground systems, because the six-degree UTM zones are much larger than what such a system would reasonably be expected to traverse. It may be an issue for unmanned aerial vehicles, but this is something that can be taken into account by the system developers.

What is most attractive about the UTM projection is that it is defined globally. In each zone, it is able to maintain shape, area, direction, and distance. These are all features that are important for unmanned vehicles during navigation and world modeling tasks. The reader is referred to [7] for a more in-depth discussion of the Universal Transverse Mercator projection, its applications, and limitations.
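Because each zone spans exactly 6 degrees of longitude, the zone containing a point is a simple function of its longitude. A sketch (ignoring the standard zone exceptions around Norway and Svalbard); the example point, roughly Gainesville, Florida, falls in zone 17 north:

```python
def utm_zone(lon_deg: float, lat_deg: float):
    """Return the UTM zone number (1-60) and hemisphere letter for a point.

    Zone 1 starts at 180 degrees west, and each zone spans 6 degrees of
    longitude; the Norway/Svalbard exceptions are deliberately ignored here.
    """
    zone = int((lon_deg + 180.0) // 6) + 1
    zone = min(zone, 60)               # fold lon == +180.0 into zone 60
    hemisphere = "N" if lat_deg >= 0.0 else "S"
    return zone, hemisphere

# Gainesville, Florida (about 82.3 degrees west, 29.6 degrees north).
zone, hemi = utm_zone(-82.3, 29.6)
```

A vehicle crossing a zone boundary would see a discontinuity in its easting coordinate, which is exactly the inter-zone difficulty noted above.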


2.5 Georeferenced World Model Data

The modeling methods presented in Sections 2.3 and 2.4 are dependent on a planar assumption for the environment in which the unmanned system operates. In these applications, the coordinate systems are Cartesian, with the origin based on an arbitrarily chosen local coordinate system. As discussed in Section 2.4, these data can be stored and used as a priori data from other unmanned systems. What is more common, however, is to use data from third-party sources. The most important of these sources are described below.

2.5.1 Raster Data Stores

Raster data stores are those that provide tessellated, grid-based geospatial data. Examples of raster data stores include Digital Elevation Model (DEM), Digital Terrain Elevation Data (DTED), Digital Raster Graphics (DRG), and Digital Orthophoto Quadrangle (DOQ) data. This list is by no means exhaustive; there are many more types of raster data stores. Each type of data store provides different types of data at different resolutions, possibly using different projections. Figure 2-5 shows the high-level format of Digital Elevation Model (DEM) data. DEM data use the UTM projection to create a Cartesian coordinate system and represent a 2.5D surface. The resolution of DEM data is 30 meters.

2.5.2 Vector Data Stores

Vector data stores are those that provide geospatial data referenced by points, lines, or vertices of polygons. Types of vector data stores include Digital Line Graphs (DLG), State Soil Geographic (STATSGO), and Topologically Integrated Geographic Encoding and Referencing (TIGER). This list is by no means exhaustive; there are many more types of vector data stores available from third parties. Since there is no globally accepted geospatial data standard, each store may use a different format.
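Because DEM data form a regular UTM-based grid at a fixed post spacing (30 meters here), looking up the cell for a UTM coordinate reduces to integer index arithmetic. A minimal sketch, assuming the grid origin marks the northwest corner so that rows increase southward (the parameter names are illustrative):

```python
def dem_cell(easting: float, northing: float,
             easting0: float, northing0: float,
             resolution: float = 30.0) -> tuple[int, int]:
    """Map a UTM coordinate to a (row, col) index in a DEM-style grid.

    (easting0, northing0) is assumed to be the grid's northwest corner,
    so rows count downward (southward) from the origin.
    """
    col = int((easting - easting0) // resolution)
    row = int((northing0 - northing) // resolution)
    if row < 0 or col < 0:
        raise ValueError("coordinate lies outside the grid")
    return row, col

# A point 95 m east and 40 m south of the origin falls in row 1, column 3.
```

Real DEM files also carry projection metadata (zone, datum) that must match before this arithmetic is meaningful.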


Figure 2-5. Example format of digital elevation model

Figure 2-6 shows an example of Digital Line Graph (DLG) data being extracted from a Digital Orthophoto Quadrangle. The benefit of this extraction is that the resulting DLG vector data size is smaller than the DOQ's data size.

2.6 Distributed World Modeling Methods

There are very few major efforts attempting to tackle the difficult task of distributed world modeling. Two of the current efforts are the Spatial Data Transfer Standard (SDTS) and the Geography Markup Language (GML).

2.6.1 Spatial Data Transfer Standard (SDTS)

SDTS is an open standard being developed by the United States government for use in geographic information systems. One of the reasons for developing this standard is that the various types of available geospatial data are based on different Earth models and projections, each with different associated errors. SDTS seeks to provide a method that allows a complete data transfer with all necessary information


associated with those data needed to incorporate them into other data systems. SDTS specifies the entire process of storing and sharing geospatial data, ranging from the methods for modeling raster and vector geospatial data down to the way the data are stored in digital files. SDTS is also a very broad standard that is able to support different models of the Earth, different map projections, and different methods of modeling the data.

Figure 2-6. An example of USGS source data. A) USGS orthoimage. B) Extracted digital line graph.

The SDTS is divided into six parts that completely define the standard. The first three parts define the logical specification, spatial features, and data encoding, respectively. The other parts are called profiles. Each profile provides instructions for applying the base SDTS rules, parts one through three, to different types of geospatial data [32]. Part four of the SDTS standard is the Topological Vector Profile (TVP). This profile allows transfer of geospatial vector data described by vector geometry and


topology. This profile allows data to be geometrically described using points, lines, and polygons, as well as combinations of these. The Topological Vector Profile is useful for transferring digital line graph (DLG) data such as those presented in Figure 2-9 [3]. Part five of the SDTS standard is the Raster Profile and Extensions (RPE). This profile supports various types of raster-formatted geospatial data, including georeferenced orthoimages, grid-formatted terrain data such as DTED and DEM, and any other type of tessellated geospatial data. RPE does not support data of a dimension higher than two and a half, the dimension of terrain data [5]. The last part of the current version of SDTS is the Point Profile. This profile provides support for high-precision point data only. While the Topological Vector Profile does support point data, it does not do so at high enough precision for some applications. The Point Profile supports up to 64 bits of precision, whereas the TVP only supports up to 32 bits of precision [4]. All six parts of the SDTS standard combine to form a powerful and comprehensive method for modeling and distributing geospatial data.

2.6.2 Geography Markup Language

The Geography Markup Language (GML) is a broad standard that supports raster and vector data in 2, 2.5, and 3 dimensions. It also supports more types of complex shapes and surfaces than are needed for unmanned system world modeling. It is able to support data based on different projections as well as different Earth models [11]. GML is an extension of the Extensible Markup Language (XML). XML, like the Hypertext Markup Language (HTML) commonly used for the transfer of web pages, supports tags that specify the types of data included in the document. For XML, the tags are defined by the document creator for the type of data included; HTML specifies all of its tags a priori. Also unlike HTML, XML, and subsequently GML, does not mix the


data content with the formatting of the content. For GML, the descriptors (or tags) are related to geospatial data. While XML provides a very loose structure for the types of data described, GML places restrictions on XML by specifying the methods for geometrically modeling the data. If GML-based system developers associate different attributes with the geospatial data types, they will at the very least be able to understand each other's data at a geometric level [17].

SDTS and GML are both adequate methods for modeling and sharing geospatial data, but neither is exactly appropriate for JAUS-based unmanned systems. Of the two, GML is more appropriate since it is based on the powerful XML standard, which is designed for real-time transfer. By defining additional XML tags, it is possible to make data store modifications in real time rather than on a per-XML-document basis. The downside of GML is that it is entirely ASCII-text based and requires extra characters to support its extensibility. Because some of the tags are many characters long, this translates to additional bandwidth being used for the support characters. SDTS is not appropriate for use with unmanned systems where bandwidth utilization should be minimized. Because SDTS transfers are self-contained with all necessary data included, the standard is not suitable for real-time data transfer. A real-time world modeling message set should support the ability to make individual changes to the data store in real time rather than requiring changes to be transmitted via an updated version of all the data in the data store.

This work is interested in using the power of the JAUS infrastructure to support distributed world modeling. Since JAUS defines the structure of its messages a priori,


beyond its 16-byte header, JAUS does not require any other bytes to support its infrastructure. All of the data after the JAUS header are values for the fields described in the JAUS message definition. Rather than incorporating a completely different, non-optimal standard into JAUS for world modeling, the world modeling standard builds on the framework developed by the JAUS Working Group.
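The bandwidth argument for a binary, fixed-structure encoding over a tagged ASCII one is easy to quantify with a rough sketch. The 16-byte header figure comes from the text; the tag names and the assumption that each update is a 3-byte binary record are purely illustrative:

```python
def binary_size(n_updates: int, header: int = 16, record: int = 3) -> int:
    """Bytes for n updates as packed binary records after one fixed header."""
    return header + n_updates * record

def tagged_size(n_updates: int) -> int:
    """Bytes for the same updates as ASCII markup (illustrative tag names)."""
    one_record = "<cell><row>000</row><col>000</col><val>000</val></cell>"
    return n_updates * len(one_record)

# For 1000 updates, the tagged encoding is roughly 18 times larger,
# because every value pays for its surrounding tag characters.
ratio = tagged_size(1000) / binary_size(1000)
```

The exact ratio depends on tag lengths and value widths, but the direction of the comparison is the point: support characters dominate a tagged encoding when the payload per value is small.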


CHAPTER 3
SMART SENSORS

While setting out to develop a standard for modeling the various types of geospatial data presented in the preceding chapter, a distributed set of world models was developed. These world models were tightly coupled to their associated sensors and were therefore initially considered to be smart sensors.

3.1 Smart Sensor Architecture

The smart sensor architecture was originally developed for the perception system in the Team CIMAR NaviGATOR, which is represented in Figure 3-1. The NaviGATOR was developed as an entry to the 2004 Defense Advanced Research Projects Agency (DARPA) Grand Challenge. Held in March of 2004, the DARPA Grand Challenge was a first-of-its-kind unmanned ground vehicle (UGV) competition. The thrust of this challenge was to develop a UGV that could autonomously navigate and avoid obstacles over the approximately 140 miles from Barstow, California to Primm, Nevada, crossing the Mojave Desert. Team CIMAR consisted of graduate students and engineering staff from the University of Florida's Center for Intelligent Machines and Robotics (CIMAR) and Logan, Utah based Autonomous Solutions, Inc. Recognizing the power and flexibility afforded by the use of JAUS, Team CIMAR used it throughout the NaviGATOR and thereby developed, at the time, one of the only completely autonomous systems based on the Joint Architecture for Unmanned Systems (JAUS). The exception to this was the NaviGATOR's perception system, where the


messages that defined the smart sensor messaging architecture were only loosely modeled after the JAUS standard and the JAUS World Modeling Subcommittee's forthcoming draft message set, which is presented in Chapter 4 of this document.

The smart sensor architecture is a networked system of distributed, modular, heterogeneous sensor units that all use a common messaging and network interface to share data. Each smart sensor processes data specific to its associated sensor modality and determines region traversability using a suitable traversability metric chosen by the sensor system developer. These geospatial traversability data are shared within the perception system and provided to higher-level planning components to allow them to make intelligent decisions such as obstacle avoidance.

Figure 3-1. Team CIMAR NaviGATOR DARPA Grand Challenge entry vehicle


The smart sensor units are considered smart because they not only process their sensor data but also provide a logically redundant interface to other components within the system. The impetus behind the creation of this smart sensor architecture was to allow sensing system implementers to develop their sensing technologies independently of one another and then, with minimal effort, seamlessly integrate their work to form a robust perception system. The JAUS-like messaging infrastructure and logical redundancy of the smart sensors afforded this flexibility. Even though their implementations and sensor modalities are different, these sensor units are logically redundant in that their messaging interfaces are identical [19]. The idea was that each sensor implementer best knew how to process and register their own sensor data. Rather than relying on a probabilistic model of the sensor to homogenize the sensor data on one system, this implementation expects the sensor data to be homogenized before they are fused. Once their data were available, the smart sensors would publish the data to a central component, the smart sensor arbiter, whose responsibility would be to fuse the data from all of the smart sensors.

The output of the smart sensors is a measure of region traversability cost. This cost is based on a sensor-specific traversability metric applied to the data from a physical obstacle detection sensor. Behaviors of this type, associating a cost with an attribute based on a metric, are called value judgment [18]. Smart sensor developers were permitted to use any sensor modality that produced data that could be processed to provide sufficient traversability value judgment. For this implementation, these included stereo vision, stationary laser measurement system, and monocular vision based smart sensors, all developed by researchers at the University of Florida. A continuously rolling


laser measurement system based smart sensor was developed at Autonomous Solutions, Inc. While there are duplicate sensor types, the implementation of the associated smart sensor makes the data from each sensor quite distinct. For example, the stereo camera and monocular cameras use the same sensing modality; the difference is in the implementation of the smart sensors. The stereo camera data are processed so that, through the use of image rectification and correlation, they provide a sparse three-dimensional representation of the environment within the field of view of the cameras. Traversability is determined by treating the stereo data as real-time terrain data and applying value judgment. The implementation of the monocular camera based smart sensor utilized color and cluster affinity in RGB space to classify image pixels that belonged to traversable surfaces. Once the individually developed smart sensors were completed, a predefined messaging architecture was used to transmit the traversability data within the perception system. In order to support true interoperability, however, electrical and transport layer issues also had to be addressed. These issues will be addressed later in this chapter.

3.2 Smart Sensor Architecture Components

There are three major types of components that make up the perception system's smart sensor architecture. These are the smart sensor, smart sensor arbiter, and reactive planner components. Figure 3-2 shows the perception system components as well as the component interconnects.

3.2.1 Smart Sensor Component

The smart sensor is a modular perception system component that provides an interface between a physical sensor and the smart sensor network. It encapsulates a


physical sensor, the hardware necessary to process the sensor data, a method for determining region traversability from the processed sensor data, a standardized messaging interface, and a communications link.

Figure 3-2. Organization of the smart sensor-based perception system

A smart sensor is modular in that it shares the same logical interface with all other smart sensors. With the exception of a single field in the message header, the source component identification number, the output format of each smart sensor is identical to that of all other smart sensors. This allows any smart sensor to seamlessly replace any other. Internally, the smart sensor maintains a tessellated traversability grid of a size specified by the predefined range and resolution of the grid. As with the occupancy and traversability grids introduced in Chapter 2, this grid maintains a fixed orientation and


remains vehicle centered. In this implementation, the grid maintains a north-east orientation. As the vehicle moves, the grid is translated in discrete steps to compensate for the vehicle's movement. The translation of the vehicle is determined from the previous and current positions of the vehicle as provided by a global positioning system (GPS) receiver providing coordinates in the WGS84 coordinate system. The coordinates are projected from global to Cartesian coordinates using the Universal Transverse Mercator (UTM) projection. The difference in position, in meters east and north of the origin, is converted to a translation of grid rows and columns. To ensure that the vehicle always occupies the center cell of the traversability grid, the grid dimensions, rows and columns, are required to be odd. The geospatial traversability data are registered by using the vehicle's orientation to project the sensor data into the two-dimensional traversability grid. As the vehicle translates and rotates, changes to the traversability grid are monitored. As the values of cells change, the updated values are transferred to other systems to provide grid synchronization.

3.2.2 Smart Sensor Arbiter Component

The smart sensor arbiter has the responsibility of fusing data from the smart sensors and, through the synergy of the different sensor modalities, providing a better model of the world to the reactive planner component. In a complete smart sensor system, the arbiter component is the hub of all data traffic from the smart sensors. As it receives traversability updates from the smart sensors, it immediately fuses the updated data with that from previous sensor updates. Generally, the method used to fuse the traversability data from the sensors is not


specified and is left to the implementer. What is important is that the interface to the arbiter is consistent with the smart sensor message set and that the arbiter's grid resolution is the same as the smart sensors'. Maintaining a grid of the same size as the smart sensors' is not required, as it may be desirable to have a grid that extends well beyond the bounds of all of the smart sensor grids. This allows the arbiter to maintain a larger local memory of the area perceived by all the smart sensors. In a system with multiple subsystems, this functionality could be used for collaborative mapping of large areas.

The smart sensor arbiter also includes a virtual component, the Region Clutter Sensor. This component provides a very fast indication of the saturation of non-traversable areas within the unmanned system's immediate vicinity. This feature gives the higher-level planning components information that allows them to modify the vehicle's speed as it encounters cluttered areas. By modifying the system's travel speed, there may be adequate time to generate a plan to negotiate the non-traversable regions.

The smart sensor arbiter also shares the same logical interface as the smart sensors. This allows smart sensor based perception systems to use a single smart sensor without the smart sensor arbiter or multiple smart sensors with the arbiter. This flexibility is an asset, especially in the development and debugging processes.

3.2.3 Reactive Planner Component

Within the smart sensor based perception system, higher-level obstacle avoidance and vehicle travel speed control are the responsibility of the reactive planner component. The reactive planner component, using the JAUS communications network, receives the position, orientation, and orientation rates of the platform from the JAUS Global Pose and Velocity State components. The reactive planner component then uses the smart sensor architecture messaging interface and the smart sensor network to transmit the


position and orientation updates to the smart sensors. The same network is used to receive the smart sensors' traversability data. As it receives traversability grid updates from either the arbiter or the smart sensors, the reactive planner continuously searches for the optimal, lowest cost path through the accumulated traversability data. The output of the reactive planner is a modified path plan for the unmanned system to execute.

3.3 Smart Sensor Messaging Architecture

In order to support the development of the components of the perception system, a standardized messaging interface was defined. Its use was mandated for all components participating in the smart sensor based perception system. This messaging interface was to a large degree based on the methodologies and messages established by JAUS.

3.3.1 Smart Sensor Architecture Message Header

To support message identification, routing, and transfer, a modified version of the standard 16-byte JAUS message header was created. The JAUS header supports more functionality than is needed by the smart sensor architecture; therefore, the majority of the bytes within the header would not be needed. Because of the volume of data transferred within the smart sensor system, any savings would be beneficial. Therefore, the JAUS header was reduced so that all unnecessary header fields were removed. Table 3-1 shows the format of the official JAUS message header. Since the smart sensors communicated on their own network, this optimization had no effect on the JAUS-based NaviGATOR network. The smart sensor development team made several assumptions about the data transfer process in order to justify the reduction in header size. They are as follows:

- Smart sensors are all contained within the same subsystem


- Smart sensors are single component nodes
- Smart sensors have distinct component identification numbers
- Smart sensors have only one instance
- Smart sensor message types are unidirectional
- Smart sensors use the same version of the interface control document
- Smart sensors do not use service connections
- Smart sensors do not require message acknowledgment
- Smart sensors transmit messages of the same priority

Table 3-1. Standard JAUS sixteen byte message header
Field #  Field Description         Size (Bytes)
1        Message Properties        2
2        Command Code              2
3        Destination Instance ID   1
4        Destination Component ID  1
5        Destination Node ID       1
6        Destination Subsystem ID  1
7        Source Instance ID        1
8        Source Component ID       1
9        Source Node ID            1
10       Source Subsystem ID       1
11       Data Control (bytes)      2
12       Sequence Number           2
         Total Bytes               16

The smart sensors are all contained within the same subsystem, and the subsystem does not communicate spatial data to any other subsystem; therefore, the fields for the destination and source subsystem identification numbers (fields 6 and 10, respectively) would be equal and would always remain static. Removing these fields reduces the required header size by two bytes. The message header fields 5 and 9, the node identifiers, may be removed due to the assumption that the smart sensors are single component nodes and that each smart sensor has its own component identification number. Because of these assumptions, each component identification number must be coupled to one and only one node identification number. Therefore, specifying both the component and node identifiers


would be redundant. The removal of the destination and source node identifier fields saves an additional two bytes. The smart sensor components are therefore addressed only by component identification numbers. Within the smart sensor system there is no redundancy of smart sensor implementations, so the header's instance identification fields, 3 and 7, would never be used. Removal of these fields results in a savings of two bytes. It is important to note that this no-redundancy assumption is specific to the smart sensor implementation, where each individual smart sensor has its own component identification number. In a true JAUS system, this would not necessarily be the case; the smart sensors would be treated as redundant components, each being just another instance of the same component. Messages traveling downstream from the reactive planner component to the arbiter and smart sensors are position and orientation update messages. Messages traveling upstream from the smart sensors to the arbiter and reactive planner are cell update messages. The exception to this rule is the arbiter's clutter sensor component, which will be discussed in greater detail later in this chapter. Because message types are unidirectional and there is a priori knowledge of the system configuration, this assumption removes the need for the destination component identification number and the message command code, a combined savings of three bytes.

The message properties field in the JAUS header provides important information to the receiving JAUS component. This includes the version of the JAUS Reference Architecture message set used to create the attached message as well as message type, acknowledgement, and priority information. The last four assumptions have a direct


impact on this message field by making it useless. The assumption that the smart sensors are not using service connections also removes the need for the Sequence Number field of the JAUS header. Combined, these four assumptions result in a savings of four bytes.

The cumulative savings produced by the nine assumptions presented above is 13 bytes. In a system with a relatively large amount of bandwidth or less frequent raster geospatial data transfers, the thirteen-byte savings may not seem significant. Since the message header must be attached to each message, however, when there is a large volume of data, as can be expected within the smart sensor architecture, the aggregate savings can be substantial. The final three-byte header is shown in Table 3-2.

Table 3-2. Smart sensor architecture message header
Field #  Field Description     Size (Bytes)
1        Source Component ID   1
2        Data Control (bytes)  2
         Total Bytes           3

One of the strengths of JAUS is that the messages are developed completely separately from, and are not at all dependent on, the message header. Because of this, the smart sensor messages that will be introduced may be transmitted using any message header that can properly route the messages to their intended destination. Again, the header size reduction presented in this section was made primarily for the purpose of saving bandwidth and computing resources.

3.3.2 Smart Sensor Architecture Message Set

The NaviGATOR's perception system, consisting of the components of the smart sensor architecture, provides a unique method for transferring and synchronizing raster-formatted geospatial traversability data. The structure of the messages within this


architecture is based on the JAUS Reference Architecture message set. They were designed to support the interoperability, extensibility, and logical redundancy required of the smart sensor architecture. Figure 3-3 shows the minimum number of component types needed for a complete smart sensor system. It is considered complete because all of the core components are present: the reactive planner, arbiter, and smart sensor. It is minimal because only one smart sensor is present. In fact, this system is not particularly useful because the arbiter's and the smart sensor's internal traversability grid representations would be exactly the same. Therefore, the arbiter should only be used when there is more than one smart sensor present. Using the logical redundancy provided by the smart sensor architecture, a more efficient implementation of a single sensor based system is shown in Figure 3-4. This showcases the power of the logically redundant interface, as a smart sensor may replace the arbiter or any other smart sensor.

Three types of messages are used in the smart sensor based perception system:

- Report Vehicle State
- Report Traversability Grid Updates
- Report Region Clutter Index

The Report Vehicle State message communicates vehicle position, orientation, and orientation rate information. Updates to the smart sensor or smart sensor arbiter traversability grid are transmitted through the use of the Report Traversability Grid Updates message. The Report Region Clutter Index message transmits an indication of the saturation of non-traversable areas in the immediate vicinity of the vehicle. This is primarily used to limit the speed of the vehicle while in cluttered areas.
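All of these messages ride on the reduced three-byte header of Section 3.3.1: a one-byte source component identification number followed by a two-byte data control (payload byte count) field. A minimal sketch of packing and unpacking it, assuming the little-endian byte order used by JAUS:

```python
import struct

# '<' selects little-endian with no padding (the JAUS convention);
# B = 1-byte source component ID, H = 2-byte data control field.
HEADER = struct.Struct("<BH")

def pack_header(component_id: int, data_bytes: int) -> bytes:
    """Build the three-byte smart sensor header of Table 3-2."""
    return HEADER.pack(component_id, data_bytes)

def unpack_header(raw: bytes) -> tuple[int, int]:
    """Recover (component_id, data_bytes) from a received message."""
    return HEADER.unpack(raw[: HEADER.size])

# A 20-byte Report Vehicle State payload from a hypothetical component 33:
hdr = pack_header(33, 20)
```

Because every message begins with the same three bytes, a receiver can dispatch on the component identification number alone, which is what makes the unidirectional-message-type assumption workable.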


3.3.2.1 Report vehicle state message

The Report Vehicle State message, consisting of vehicle position, orientation, and orientation rate updates, is a combination of the JAUS Code 4402h: Report Global Pose and Code 4404h: Report Velocity State messages. The position of the platform is given in latitude and longitude in accordance with the WGS84 standard. The orientation and orientation rates are with respect to the vehicle's coordinate system as defined by JAUS (Figure 3-5).

Figure 3-3. Minimally complete smart sensor-based perception system consisting of one instance of each core component.

Figure 3-4. Single sensor implementation of the smart sensor-based perception system, consisting of a single smart sensor synchronizing data with the reactive planner.


Figure 3-5. Unmanned system coordinate system defined by JAUS

To allow message types of variable size where only the desired data are transmitted, JAUS provides a presence vector. This presence vector is an n-byte bit field with flags indicating which optional fields are present in a JAUS message. For the smart sensor implementation, only the latitude, longitude, roll, pitch, yaw, roll rate, pitch rate, and yaw rate fields are needed from the JAUS Report Global Pose and Report Velocity State messages. The remaining fields are not included in the transmitted message. Since these unneeded fields have been removed from this message contraction and there is a priori knowledge of the structure of this message, the presence vector was also removed. The benefit of this approach is that the Report Global Pose and Report Velocity State messages do not have to be sent separately with the 16-byte JAUS header attached to each. An additional benefit of this approach is that the position, orientation, and


orientation rate fields are synchronized; i.e., the message includes an instantaneous reading of both the position and orientation data. This Report Vehicle State message, shown in Table 3-3, is 20 bytes in length, or 23 bytes including the message header. Fields 1 and 2 contain the latitude and longitude, respectively, as scaled integers. Fields 3 through 5 contain the vehicle orientation and fields 6 through 8 contain the orientation rates.

Table 3-3. Smart sensor architecture's report vehicle state message
Field #  Name                Type           Units               Interpretation
1        Latitude (WGS 84)   Integer        Degrees             Scaled Integer, Lower Limit = -90, Upper Limit = 90
2        Longitude (WGS 84)  Integer        Degrees             Scaled Integer, Lower Limit = -180, Upper Limit = 180
3        Roll                Short Integer  Radians             Scaled Integer, Lower Limit = -pi, Upper Limit = pi
4        Pitch               Short Integer  Radians             Scaled Integer, Lower Limit = -pi, Upper Limit = pi
5        Yaw                 Short Integer  Radians             Scaled Integer, Lower Limit = -pi, Upper Limit = pi
6        Roll Rate           Short Integer  Radians per Second  Scaled Integer, Lower Limit = -32.767, Upper Limit = 32.767
7        Pitch Rate          Short Integer  Radians per Second  Scaled Integer, Lower Limit = -32.767, Upper Limit = 32.767
8        Yaw Rate            Short Integer  Radians per Second  Scaled Integer, Lower Limit = -32.767, Upper Limit = 32.767

3.3.2.2 Report traversability grid update message

The Report Traversability Grid Update message provides a synchronization mechanism between the multiple distributed traversability grids. This functionality is event driven and based on updates to the smart sensors' traversability grids. When a change is made to a traversability grid, the change is transmitted to the destination component to synchronize the two grids. By making the process event driven, bandwidth utilization is reduced relative to transmitting the entire traversability grid, especially when there are only a small number of changes to the grid. The Report Traversability Grid Updates message is shown in Table 3-4. The first two fields of the message are a latitude and longitude position stamp. This position


stamp represents the point to which the cell update values are referenced: the location of the vehicle at the time the sensor data were processed. Following the position stamp is a series of cell update three-tuples. Each three-tuple represents a traversability grid update as an updated cell row, column, and traversability value. The traversability grid cell update values use the entire numeric range of a byte, 0 to 255, to represent the traversability of the region represented by the cell. A value of 127 corresponds to an unknown traversability. As the value approaches zero, exclusive of zero, the cell classification becomes less and less traversable. Conversely, as the value approaches 255, the classification becomes more traversable. The grid cell value zero is reserved exclusively for the world model corridor data, which are used to constrain the search for the lowest cost path through the traversability grid.

This message allows all changes to be transmitted in one message, provided that the total message data size is less than the 65527 bytes that the smart sensor architecture header permits. This limit is determined by the UDP/IP transport layer's limit on the maximum number of payload bytes that may be transmitted in a single transaction [24]. Should the Report Traversability Grid Update message exceed 65527 bytes, it should be broken into separate messages. These separate messages should have the same latitude and longitude position stamp values as the first cell update message. This latitude and longitude position stamp is very important, as it defines the origin of the cell changes. The total number of three-tuple cell updates being transmitted may be inferred from the header's data bytes field by subtracting the eight bytes required by the position stamp and dividing the remainder by three.
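The counting rule and the 65527-byte split described above can be sketched as follows. The limit, the 8-byte position stamp, and the 3-byte tuples come from the text; the helper names are illustrative:

```python
MAX_DATA = 65527      # UDP/IP payload limit cited in the text
STAMP_BYTES = 8       # latitude + longitude position stamp
TUPLE_BYTES = 3       # cell row, column, traversability value

def tuple_count(data_bytes: int) -> int:
    """Infer the number of cell-update three-tuples from the header's
    data byte count: subtract the stamp, divide by three."""
    payload = data_bytes - STAMP_BYTES
    if payload < 0 or payload % TUPLE_BYTES:
        raise ValueError("malformed grid update message")
    return payload // TUPLE_BYTES

def split_updates(updates: list[tuple[int, int, int]]):
    """Yield chunks of cell updates that each fit in one message.
    Every chunk is sent with the same position stamp as the first."""
    per_msg = (MAX_DATA - STAMP_BYTES) // TUPLE_BYTES
    for i in range(0, len(updates), per_msg):
        yield updates[i : i + per_msg]

# A message whose data control field reads 38 carries (38 - 8) / 3 = 10 updates.
```

The event-driven design means a nearly static scene costs almost no bandwidth: only cells whose values changed generate tuples, and the split path is exercised only when a sensor sweep touches tens of thousands of cells at once.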


3.3.2.3 Report region clutter index message
Within the perception system, there is a need to allow higher level components, particularly the Reactive Planner, to know the degree of saturation of non-traversable areas local to the vehicle. The purpose of this is to allow the UGV to reduce its speed so that it can successfully negotiate the traversable regions. To support this, another pseudo-component was developed. This component, the Region Clutter Sensor, is embedded in the arbiter. It simply provides a fast assessment of the percentage of cells in a specified area that are classified as non-traversable. This message is sent to the reactive planner, which has the responsibility for determining how to react to this notification. Ideally, the reactive planner converts the clutter percentage to a recommended vehicle speed and transmits this to the Global Path Segment Driver component using the JAUS Code 040Ah: Set Travel Speed message.

Table 3-4. Smart sensor architecture's report traversability grid updates message.
Field # | Name | Type | Units | Interpretation
1 | Latitude (WGS 84) | Integer | Degrees | Scaled Integer, Lower Limit = -90, Upper Limit = 90
2 | Longitude (WGS 84) | Integer | Degrees | Scaled Integer, Lower Limit = -180, Upper Limit = 180
3 | Cell Update 1 Row | Byte | N/A |
4 | Cell Update 1 Column | Byte | N/A |
5 | Cell Update 1 Value | Byte | N/A | 0 Reserved for World Model; 1-126 Non-Traversable (1 = completely non-traversable); 127 Unknown (initial value for cells); 128-255 Traversable (255 = completely traversable)


Table 3-4. Continued
Field # | Name | Type | Units | Interpretation
3n | Cell Update n Row | Byte | N/A |
3n + 1 | Cell Update n Column | Byte | N/A |
3n + 2 | Cell Update n Value | Byte | N/A | Same as field 5

The area covered by the Region Clutter Sensor is not specified in the Smart Sensor Architecture Interface Control Document (ICD). This is a system-specific parameter and is therefore left to the system implementer. A system traveling at high speed may need to monitor a large area whereas a slower or smaller system may need to monitor a smaller area.

Table 3-5. Smart sensor architecture's report region clutter index message.
Field # | Name | Type | Units | Interpretation
1 | Clutter Index | Byte | Percent | Scaled Byte, Lower Limit = 0, Upper Limit = 100; percentage clutter in specified area

3.3.3 Smart Sensor Architecture Network Communications
The smart sensor data are transferred within the perception system via the user datagram protocol (UDP) running on top of the Internet protocol (IP). This combination of the user datagram protocol and the Internet protocol will be referred to as UDP/IP. UDP/IP provides a connectionless, unreliable communications link between systems. The term unreliable is in some respects a misnomer because UDP/IP can provide a quality connection. Unlike the Transmission Control Protocol, UDP does not have any


checks to assure receipt of data. It relies on the host application to do the checking. For example, the JAUS header provides a message acknowledgement flag that requests that the receiving component notify the sending component of receipt of a message. If the receiving component does not respond within a set period of time, then, as per the JAUS RA, the sending component retries up to three times and then terminates transmission. If a JAUS implementation used UDP/IP, this functionality would help assure reliable communications. The smart sensor system is set up a priori under the assumption that the minimum number of system components are present and that they are online and in the ready state. It was developed such that each component commences transfer of the supported messages to the appropriate component directly after initialization. This may be considered transmission of unsolicited responses to repeated data queries (sans the queries) or as an unsolicited JAUS service connection. The UDP/IP transport layer supports this functionality. UDP/IP is connectionless and therefore does not require that the destination component be present or a link established in order for data to be sent within the system. The popular alternative to UDP/IP, TCP/IP, generally requires that a socket connection be established between two or more systems before data can be sent. To route data to the smart sensors, Internet protocol (IP) addresses had to be defined. To allow the IP addresses to be determined dynamically based on the destination of the message, an IP addressing convention was established. Since each smart sensor has a unique component ID, the component ID was used as the last octet of the IP address. The first three octets of the IP address were established a priori. For example, 192.168.1.component_id is an example configuration where the first three


octets are predefined and the component ID is used as the last octet. Table 3-6 presents a list of smart sensor components and their associated component identification numbers. A standard UDP/IP port was also designated. The network interface between all components within the perception system was wired Ethernet capable of providing data transfer at rates of up to 100 Megabits per second.

Table 3-6. Smart sensor components and their component identification numbers
Smart Sensor Component | ID
Reactive Planner | 10
Smart Sensor Arbiter | 11
Smart 3D Laser Sensor | 21
Smart Stereo Vision Sensor | 22
Smart Terrain Finder Sensor | 23
Smart Road Finder Sensor | 24
Smart World Model Sensor | 25
Region Clutter Sensor | 127

3.4 Smart Sensor Implementation
While the smart sensor architecture was originally developed for the Team CIMAR NaviGATOR, final testing and verification took place on the Center for Intelligent Machines and Robotics Navigation Test Vehicle 2 (NTV2) shown in Figure 3-6. The implementation of the smart sensor units at CIMAR exploited the commonality between implementations.

3.4.1 Abstraction of Smart Sensor Core Functionality
Because of the considerable amount of implementation overlap, the CIMAR smart sensor system was designed so that all developers build their smart sensors on top of a common base implementation that contains the core smart sensor functionality. This approach saved a considerable amount of time because testing and debugging of the main


base sensor implementation occurred independently of the development of the sensors. The interface to this system was made into a clean application programmer's interface (API). This API handles communications, grid synchronization, and all other low-level smart sensor tasks. The system designer has the responsibility of processing the sensor-specific data to determine traversability, placing that data in a grid of the proper range and resolution, and using the smart sensor API to publish the new data to concerned components within the system. Figure 3-7 shows the high-level conceptual separation between the two functions. The power of this approach is that it allows new implementations of sensors to come online in very short order.

3.4.2 Base Smart Sensor
The base smart sensor encapsulates all low-level functionality common to all smart sensors. This functionality includes:
- Allocating memory for a local traversability grid
- Receiving position and orientation updates via UDP/IP
- Transforming data from sensor coordinates to grid coordinates
- Shifting the traversability grid to keep it vehicle centered
- Monitoring traversability grid updates
- Synchronizing traversability grid updates via UDP/IP
This leaves the implementer solely the task of providing an instantaneous local traversability grid from their sensor data. Implementers initialize their smart sensors using the API. The base smart sensor has two threads that run concurrently with the sensor interface specific thread. Figure 3-8 shows a call graph for all functions within the base smart sensor. The traversability data are registered in the grid by utilizing the platform orientation data. Upon startup, the base smart sensor spawns a thread (Figure 3-9) to handle


asynchronous position updates from either the smart sensor arbiter or directly from the reactive planner component.

Figure 3-6. Center for Intelligent Machines and Robotics Navigation Test Vehicle 2.

The roll, pitch, and yaw rotations shown in Figure 3-5, as well as the sensor's offset from the vehicle's coordinate system, are used in the homogeneous transformation of the data from the sensor's coordinate system to the grid coordinate system. Equation 3-1 shows the compound transformation necessary for this. The x_offset, y_offset, and z_offset values represent the offset of the sensor coordinate system from the vehicle's coordinate system. It is assumed that the sensor is aligned such that there is no rotational difference between the two coordinate systems, only translation. The x_sensor, y_sensor, and z_sensor values represent the coordinates of a point as read from the sensor in the sensor's coordinate


system; x_vehicle, y_vehicle, and z_vehicle are the coordinates of the point after transformation to the vehicle coordinate system.

Figure 3-7. Smart sensor implementation abstraction of low-level smart sensor functionality

Figure 3-8. Call graph for all functions within the base smart sensor API.

Figure 3-9. Call graph for smart sensor communications receive thread
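The compound sensor-to-vehicle transformation of Equation 3-1 can be sketched in code as follows. This is a minimal illustration only: the rotation order (roll about x, then pitch about y, then yaw about z, then translation) is an assumption of this sketch, and the function name is not the CIMAR API.

```python
import math

def sensor_to_vehicle(p_sensor, offset, roll, pitch, yaw):
    """Transform a point from sensor coordinates to vehicle
    coordinates by composing roll, pitch, and yaw rotations with
    the translation of the sensor origin from the vehicle origin
    (cf. Equation 3-1)."""
    x, y, z = p_sensor
    # Rotation about x (roll)
    y, z = (y * math.cos(roll) - z * math.sin(roll),
            y * math.sin(roll) + z * math.cos(roll))
    # Rotation about y (pitch)
    x, z = (x * math.cos(pitch) + z * math.sin(pitch),
            -x * math.sin(pitch) + z * math.cos(pitch))
    # Rotation about z (yaw)
    x, y = (x * math.cos(yaw) - y * math.sin(yaw),
            x * math.sin(yaw) + y * math.cos(yaw))
    # Translation by the sensor offset from the vehicle origin
    return (x + offset[0], y + offset[1], z + offset[2])
```

With zero rotations the transform reduces to a pure translation by the sensor offsets, matching the stated assumption that the sensor and vehicle frames differ only by translation.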


\begin{bmatrix} x_{vehicle} \\ y_{vehicle} \\ z_{vehicle} \\ 1 \end{bmatrix} =
\begin{bmatrix} 1 & 0 & 0 & x_{offset} \\ 0 & 1 & 0 & y_{offset} \\ 0 & 0 & 1 & z_{offset} \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} \cos\psi & -\sin\psi & 0 & 0 \\ \sin\psi & \cos\psi & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} \cos\theta & 0 & \sin\theta & 0 \\ 0 & 1 & 0 & 0 \\ -\sin\theta & 0 & \cos\theta & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & \cos\phi & -\sin\phi & 0 \\ 0 & \sin\phi & \cos\phi & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} x_{sensor} \\ y_{sensor} \\ z_{sensor} \\ 1 \end{bmatrix}    (3-1)

where \phi, \theta, and \psi denote the roll, pitch, and yaw rotations. To synchronize the position of the grid map, the existing cell data are shifted such that the vehicle is always located in the center of the raster grid. The benefit of shifting the cell data is that it provides a limited short-term memory of the area directly local to the vehicle. It is assumed that the position and orientation data are fairly accurate and precise. If they are not, proper data registration will not be attained. Research is currently taking place to find ways of handling this problem, but it is outside the scope of this work. Registration is not an issue within this system because all smart sensors use the same position and orientation updates. Therefore any errors introduced due to loss of position system precision or accuracy will be present in all of the smart sensors' data. Once the grid has been shifted, the sensor-specific data may be entered into the traversability grid as if the sensor were a local traversability sensor, i.e., one with no global position or orientation data. When the smartSensorTransformPoint() function is called, it handles converting the data from sensor coordinates to vehicle coordinates and finally to world coordinates. This transformation is shown in Equation 3-1. Once updates have been made to the base smart sensor's traversability grid, the main thread causes the grid to be checked for changes. These changes are transmitted via the smartSensorSendCellUpdates() function, as shown in Figure 3-12, and its access to the UDP/IP transport layer. The next section details the implementation of a stereo vision based smart sensor. While the method described in this section is specific to the stereo vision system, the


power of the smart sensor approach is that this sensor-specific interface is abstracted out. This means that as long as a sensor implementer uses the same grid parameters, interfacing to the base smart sensor will be trivial.

Figure 3-10. Call graph for the function that determines the number of rows and columns to shift the traversability grid based on the current and previous positions

Figure 3-11. Call graph for the sensor-specific interface thread.

Figure 3-12. Call graph for the function used to detect changes in the traversability grid and transmit these changes to the smart sensor arbiter

3.5 Smart Stereo Vision Sensor Implementation
Like all CIMAR smart sensors, the smart stereo vision sensor builds on the base smart sensor module. It is based on the Videre Design STH-MD1-C stereo vision camera system and the SRI Small Vision System.
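The grid-shifting step described above (the function diagrammed in Figure 3-10) can be sketched as follows; the function name and the row/column sign conventions are illustrative assumptions, not the CIMAR implementation:

```python
def grid_shift(prev_easting, prev_northing, easting, northing, resolution_m):
    """Number of rows and columns to shift the vehicle-centered
    traversability grid, given the previous and current vehicle
    positions as planar (e.g. UTM) coordinates in meters.

    Assumes rows increase northward and columns increase eastward,
    consistent with a north-east oriented grid.
    """
    d_rows = round((northing - prev_northing) / resolution_m)
    d_cols = round((easting - prev_easting) / resolution_m)
    return d_rows, d_cols
```

For example, with the 0.5 m cell resolution used by the stereo vision sensor, moving 2 m north and 1 m east shifts the grid by four rows and two columns.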


3.5.1 Stereo Vision Hardware
The Videre Design STH-MD1-C, shown in Figure 3-13, is a high resolution, wide baseline stereo vision camera system. It consists of two CMOS imagers and an IEEE 1394 (FireWire) interface for transferring the digital images to the computer doing the stereo processing.

3.5.2 Stereo Vision Software
To handle the tasks of camera calibration, image rectification, and stereo correlation, the SRI Small Vision System (SVS) is used. SVS provides an application programmer's interface to its internal implementation of the functions necessary for stereo processing [35]. This system is available for both Linux and Windows based systems. Figure 3-14 shows a rectified stereo image pair from the Videre system. The output of SVS's processing is shown below the stereo pair. In this image brighter pixels correspond to smaller distances. Conversely, darker pixels correspond to larger distances as calculated by stereo correlation.

3.5.3 Smart Stereo Vision Sensor
The base smart sensor handles all of the low-level functionality of the smart sensor. Because of this, the smart stereo vision sensor only has to provide an instantaneous indication of the region traversability within the area local to the vehicle. A check of the range resolution was done at the range specified by the Team CIMAR perception team. The following equation relates the range resolution to the camera parameters:


\Delta r = \frac{r^2}{b f} \Delta d    (3-2)

where \Delta r is the range resolution at range r, b is the baseline of the stereo vision camera system, f is the focal length of the camera lenses, and \Delta d is the smallest disparity perceivable by the stereo vision system. For this sensor system's STH-MD1, the baseline was 200 millimeters, the focal length was 12.5 millimeters, and the smallest disparity perceivable was 0.46875e-3 millimeters. A graph of range versus range resolution is shown in Figure 3-16 [35]. As can be seen in the figure, at a range of 30 meters the range resolution is approximately 17 mm, which is not a problem considering that the grid resolution is constant at 0.5 meters per cell.

Figure 3-13: Videre Design STH-MD1-C stereo camera head (left)
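Equation 3-2 can be evaluated directly; a sketch with the camera parameters quoted above as defaults (the function name is illustrative):

```python
def range_resolution(r_mm, baseline_mm=200.0, focal_mm=12.5,
                     min_disparity_mm=0.46875e-3):
    """Stereo range resolution dr = r^2 * dd / (b * f), Equation 3-2.

    All quantities are in millimeters; the defaults are the STH-MD1
    parameters quoted in the text. Resolution degrades with the
    square of the range.
    """
    return (r_mm ** 2) * min_disparity_mm / (baseline_mm * focal_mm)
```

Because of the quadratic dependence on range, doubling the range quadruples the range resolution value (i.e., coarsens it by a factor of four).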


Figure 3-14: Source data and results from stereo correlation. A) Left image. B) Right image. C) Disparity image.

Figure 3-15. Graph of range determined from stereo vision system vs. actual measured range (axes: range to object as measured with rule [m] vs. range to object as measured with stereo vision system [m])

Region traversability value judgment is based on an assessment of the three dimensional data provided by the stereo vision system. A method for fast obstacle


classification based on allowable slope is presented in [15]. This is shown in Equation 3-3.

(z_k - z_g)^2 \geq \sin^2\tau \left[ (x_k - x_g)^2 + (y_k - y_g)^2 + (z_k - z_g)^2 \right]    (3-3)

In this equation (x_g, y_g, z_g) represent the coordinates of a known ground point and (x_k, y_k, z_k) represent the coordinates of a sensed point in space. The maximum allowable slope angle is represented by \tau. This method analyzes each point within the sensor data to determine whether or not it represents a traversable region.

Figure 3-16. Plot of range resolution vs. range for the Videre Design STH-MD1-C with 12.5mm focal length lenses (axes: object range in left camera coordinate system [m] vs. range resolution [mm])

In this work Hong et al. [15] also show that it is possible for an object to pass this test but still be an obstacle because of the object's height. The following test, Equation 3-4, checks for this condition by considering the height of the object above the ground


plane. If an object is too tall for the vehicle to drive over, then it is classified as an obstacle. The constant H in Equation 3-4 sets this threshold.

z_k - z_g > H    (3-4)

Because the smart stereo vision system is based on accumulated instantaneous sensor readings, the ground point used in Equation 3-3 is the origin of the vehicle projected onto the plane defined by the intersection of the vehicle's tires and the ground plane. By establishing this point as the origin of the vehicle's coordinate system, the terms x_g, y_g, and z_g drop out of the equation. Therefore, for the instantaneous sensor reading, the obstacle check is based on Equation 3-5.

z_k^2 \geq \sin^2\tau \left( x_k^2 + y_k^2 + z_k^2 \right)    (3-5)

The base smart sensor API is used to perform the conversion from three-dimensional world coordinates to two-dimensional grid coordinates. Because the base smart sensor has access to the current position and orientation of the vehicle, the offsets of the vehicle and sensor coordinate systems, and the range and resolution of the traversability grid, it is able to provide the smart stereo vision system a transformation from world coordinates directly to grid coordinates. To update data within the local traversability grid, a method was established based on the work by Murray and Little [20]. Their implementation of a local occupancy grid uses a simpler approach for grid updating. They based their update method on the observation that stereo errors are systemic and are not easily modeled probabilistically. This is because stereo vision systems are dependent on the visual properties of the environment. For example, as lighting and texture conditions


change, the performance of the stereo matching process may improve or degrade. Because of this, a probabilistic model of the stereo vision system may not be the same as under ideal conditions. Similar to the grid cell properties established in Section 3.3.2.2, Murray and Little's [20] method uses a one byte per cell representation with an unknown state represented by the value 127.

IF i \in TRAVERS(r) THEN G(i) = G(i) + K_t ELSE G(i) = G(i) - K_nt    (3-6)

An extension of their work was developed for use in the smart stereo vision system traversability grid. This method updates cells based on Equation 3-6. As in Murray and Little's implementation, i represents a location within the grid, in this case the traversability grid; r is a reading from the stereo vision sensor; TRAVERS() is the operator that determines if the sensor reading represents a traversable point; G(i) is the traversability grid value at location i; and K_t and K_nt are constants used to, respectively, increment and decrement the traversability cell value. The use of separate K_t and K_nt constants is a departure from Murray and Little's approach, where a single constant is used. By having separate constants, emphasis can be placed on either maintaining clean data, with slower response times for detecting obstacles, or vice versa. To bias one behavior over the other, the associated constant is made larger than the other; otherwise the constants should be equal.

3.6 Use of Obstacle Detection and Free Space Sensors
As mentioned previously, the purpose of the Smart Sensor Architecture is to generate a model of the world local to the vehicle to support the task of obstacle avoidance. Since not all sensor modalities provide high-resolution data within their


fields of view, an important distinction is made between different types of sensors. They are classified as free space detectors, obstacle detectors, or a combination of both. While the radar unit is highly accurate with respect to detecting presence within a zone, using it for obstacle detection would require classifying an entire zone as an obstacle because of the lack of granularity in the sensor field. Rather than considering the radar unit an obstacle detector, it is viewed as a free space detector. If the radar unit indicates that there is no object in a particular zone, it can be assumed that the entire zone is clear. It follows that if the radar unit indicates that all zones are clear, then there is a fast, computationally inexpensive method of classifying the entire sensor field of view as clear and traversable. The associated cells are updated to correspond to this classification. Sensors with high resolution, such as the stereo vision system or the LADAR sensor presented in Chapter 2, can be used to detect free space as well as objects.

3.7 Smart Sensor Arbiter Implementation
The smart sensor arbiter also builds on the base smart sensor module. The initial implementation of the arbiter is minimalist. Because position and orientation updates from the JAUS network are sent through the smart sensor arbiter down to all of the smart sensor components, the arbiter knows its current position. As grid cell updates come in from the smart sensors, each message has a latitude and longitude position stamp indicating the origin of the cell updates. Using the Universal Transverse Mercator projection, the latitude and longitude based coordinate values are converted to Cartesian coordinates within a UTM zone. The arbiter then does the same conversion using the coordinates of the vehicle's current location. The difference in the vehicle position and


the origin of the grid cell updates is converted to an offset in grid coordinates (rows and columns). This offset is simply applied to each grid cell update. This approach is acceptable in this situation because, while they are distributed systems, all smart sensors use position data from a single position system. If each smart sensor were on a different subsystem with independent position systems, then this approach would have to be modified because of accuracy and precision issues. As the data within the smart sensors' grids change, they transmit corresponding traversability grid updates. To fuse the data from the smart sensors, the arbiter uses the method shown in Equation 3-7.

G(i) = \sum_{n=1}^{num\_smart\_sensors} w_n \cdot cell\_update_n(row + row\_offset, col + col\_offset)    (3-7)

where G(i) is the value of the fused traversability grid cell at the grid position i. The constant w_n represents a weight associated with data from smart sensor n. Consider the case of a smart sensor system consisting of a stereo vision based smart sensor and a RADAR based smart sensor. If the RADAR is considered a free space detector and is limited to the traversable range for cell updates, then when the RADAR unit classifies a cell as free, it is highly probable that the RADAR's data would be more accurate than the stereo vision system's, which is subject to the systemic errors discussed by [20]. Therefore, weighting the RADAR data more than the stereo data would put more emphasis on the high quality RADAR data. This is only true when the RADAR is used as a free space detector. If the RADAR were used as an obstacle detector, then it would adversely affect the quality of the fused data from the stereo vision system. The Region Clutter Sensor is a quasi-component embedded in the smart sensor arbiter. This component simply applies a non-traversable region threshold to the values


within a region of the traversability grid. As the saturation of non-traversable regions increases, the vehicle makes the appropriate changes in velocity necessary to allow successful negotiation of the area. This component is only present in the arbiter, so when a smart sensor replaces the arbiter, this functionality is lost.
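The Region Clutter Sensor's thresholding can be sketched as follows, assuming the cell value conventions of Table 3-4; the function name and the region representation (a list of rows) are illustrative, not the CIMAR implementation:

```python
def clutter_index(grid, rows, cols):
    """Percent of cells in the monitored region classified as
    non-traversable, reported as a scaled byte (0-100) per the
    Report Region Clutter Index message (Table 3-5).

    Cell values follow the Table 3-4 convention: 0 reserved for the
    world model, 1-126 non-traversable, 127 unknown, and 128-255
    traversable.
    """
    total = 0
    cluttered = 0
    for r in range(rows):
        for c in range(cols):
            total += 1
            if 1 <= grid[r][c] <= 126:
                cluttered += 1
    return round(100 * cluttered / total)
```

The reactive planner would then map this percentage to a recommended travel speed, a policy the text leaves to the system implementer.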


CHAPTER 4
JAUS WORLD MODEL KNOWLEDGE STORES

The previous chapter presented a detailed description of the smart sensor architecture and the implementation of a stereo vision based smart sensor unit. The smart sensor architecture message set defines a standard logical interface that allows data to be shared between the components of a perception system. While this interface is acceptable to meet the synchronization requirements of the smart sensor architecture's traversability grids, it is not general enough to support the sharing of data based on the raster and vector modeling methods presented in Chapter 2. Building on the reviewed literature as well as on lessons learned from developing the smart sensor architecture, this chapter introduces standard modeling and input/output methods for world model knowledge stores. A world model knowledge store is to be the central geospatial data store for a JAUS component, node, subsystem, or system. The knowledge store provides only geospatial data storage and access methods. Therefore, no processing or higher level functionality should be provided by the knowledge store. It is the most primitive world modeling component and forms the foundation for all future world model components. These future components will extend the world modeling capabilities of JAUS by providing functions such as value judgment, simulation, prediction, etc., as described by Mystel [18].


Similar to the smart sensors in Chapter 3, the world model knowledge stores are envisioned as location independent, modular JAUS components. Because of this, it is possible to have multiple subsystems accumulating data in a global world model knowledge store, or to have individual subsystems accumulate data in their own world model knowledge stores and then synchronize those stores. The data within these stores may be either persistent or volatile; this is not specified, as it is an implementation issue.

4.1 Observations and Recommendations
Chapter 2 showed that there is a considerable amount of commonality between the numerous types of data that are available in a priori data stores and the types of data that may be accumulated in real time. The two main classes of data types are raster and vector data. Raster data may consist of elevation, geo-referenced orthoimages, density maps, occupancy grids, traversability grids, etc. Vector data may consist of digital road maps, polygon maps, etc. By enforcing some constraints, it is possible to distill these data into a common format that may be used by the unmanned systems community. A number of key observations were made about the current methods for accessing and sharing real-time and a priori geospatial data:
- The level of complexity of modeling methods designed for a priori data-sharing (such as SDTS and GML) is well beyond what is necessary for JAUS based unmanned systems.
- There are a number of different projections that may be used to transform data from geodetic coordinates to a two or two and a half dimensional surface.
- Current world modeling methods are, at their core, based on either raster or vector primitives.
- For real-time world modeling on unmanned vehicles, raster methods are used most often.


- Both vector and raster modeling methods are commonly used in a priori data stores.
- Most a priori data stores include metadata, which provide extra information about the stored data.

Therefore it is recommended that, at a minimum, the initial JAUS World Model standard should:
- Provide the ability for JAUS based subsystems, nodes, and/or components to share geospatial data with minimal complexity.
- Allow developers some degree of flexibility within the constraints of the standard.
- Specify a map projection and horizontal and vertical datums to be used within the knowledge stores.
- Allow for use and transfer of a priori and real-time raster and vector data.
- Provide a mechanism to allow distinguishing between different types of geospatial data.
- Provide a means for saving and sharing information about the geospatial data within the knowledge store.
- Meet the standard JAUS requirements for definition of new components.

A JAUS World Model Knowledge Store standard should not be concerned with the method of modeling data internal to the system, but with how the data are formatted and presented to other JAUS components that use or store geospatial data. The work done on the smart sensor architecture as well as past experiments with JAUS interoperability have shown that as component interfaces become more complex, it becomes increasingly difficult to achieve true interoperability. The approach with the message set presented herein is to develop a method of sharing the data at the most primitive levels. Complexity has been limited so as to provide to the many organizations that make up the JAUS Working Group a more acceptable and undemanding initial standard. As the geospatial data-sharing requirements of the group change, so too will the standard.


Standards inherently impose limitations and this must be accepted. However, standards that are too restrictive run the risk of losing support. Therefore the JAUS World Model Knowledge Store standard is developed to be as flexible as reasonably possible. The messages are also designed to be as extensible as possible within the JAUS framework. This standard and all future world modeling component standards should be considered living documents that are able to quickly change to meet the needs of system developers. The JAUS working group's review process will assure that the changes that are made are only those that are applicable to the group as a whole. Geospatial data transferred between different systems must use the same map projection, ellipsoidal Earth model, and horizontal and vertical datums. For global coordinates, JAUS specifies that all systems use the World Geodetic System 1984 (WGS84). The map projection will be the Universal Transverse Mercator Projection. Vertical measurements will be based on the vertical datum as established by the ellipsoidal model of the Earth. Since most of the systems will be operating in the United States, the horizontal datum will be the North American Datum as established in 1983 (NAD83). Chapter 2 showed that real-time world modeling methods typically use tessellated raster data structures and a priori world modeling methods use either raster or vector data structures. Therefore a message set has been developed to support two types of knowledge stores: the World Model Raster Knowledge Store and the World Model Vector Knowledge Store. The World Model Raster Knowledge Store provides a method for storage and sharing of raster formatted geospatial data within a JAUS system. Many unmanned


systems with perception systems utilize a form of the local occupancy grid as introduced by Elfes [13]. The local occupancy grid is implemented as a tessellated geo-referenced grid. The World Model Raster Knowledge Store is a generalization of such a local occupancy grid. It is desired to have this knowledge store support most types of raster data. These include binary images, grey scale images, RGB images, digital elevation model (DEM) data, traversability, occupancy, etc. Typically an occupancy grid stores a value corresponding to a truth metric in each cell. When raster data are stored such that each cell represents a height at that location (such as DEM data), this is referred to as two and a half (2.5) dimensions [21]. Storage and sharing of spatial data such as points, lines, polylines, or polygons is supported by the World Model Vector Knowledge Store. These vector formatted spatial data provide a number of benefits. The primary benefit of such a system in the context of JAUS is that it requires significantly less bandwidth to transmit data as compared to the raster store. This method can therefore reduce the storage requirements within the system. A feature class represents a categorization of types of spatial data. For example, occupancy, free space, objects, roads, terrain, buildings, etc. all represent distinct feature classes. A geo-referenced orthoimage may also represent a feature class. It may be more intuitive to consider these feature classes as different layers of geospatial data within the knowledge store. This is important because it allows different types of spatial data to be handled separately. Predefined feature classes will eventually be defined by the JAUS World Model Subcommittee in the interest of true world model interoperability. Since it is not possible to define all types of feature classes a priori, a


sizeable amount of space has been set aside for user defined feature classes. While this does have an adverse effect on interoperability, this is mitigated by having system developers provide each other with a data dictionary when they wish to interoperate. The data dictionary is simply a description of which types of data correspond to a feature class identifier. Even with the predefined feature classes, when testing interoperability system developers must establish the data types that they are using within the knowledge store. It is possible that this exchange could be handled during the discovery process provided in the forthcoming JAUS dynamic configuration and registration extensions. To allow dissemination of information about a feature class, the world model framework provides for storage and transfer of feature class metadata. In this context, metadata are simply text that provides general information about the data within a particular feature class. Initially the metadata are developed to be human readable text in a format specified by the user. Bolstad [7] gives an introduction to metadata and discusses the Content Standard for Digital Geospatial Metadata. Just as this is considered only a guideline for the GIS community, it is considered only a guideline for the JAUS community. Initially these metadata are designed to be human readable and are not used in any distributed computations that may be performed on the data. The local request identifier (LRID) is a single-byte numerical identifier attached to certain classes of messages originating outside of the world model knowledge store. This feature allows synchronization of messages and their associated responses. This is important because even though requests to the knowledge store may be synchronized, there is no guarantee that the responses will be synchronized. By attaching the LRID, the requesting component will be able to internally synchronize any asynchronous responses.
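The LRID mechanism can be sketched as a requester that tags outgoing queries and pairs asynchronous responses back to them; the class and method names are illustrative assumptions, not part of the proposed standard:

```python
class LridMatcher:
    """Pairs world model knowledge store responses with their
    originating requests using the single-byte local request
    identifier (LRID)."""

    def __init__(self):
        self._next = 0
        self._pending = {}

    def tag_request(self, request):
        """Assign the next LRID (wrapping at one byte, since the
        LRID is a single byte) and remember the request until its
        response arrives."""
        lrid = self._next
        self._next = (self._next + 1) % 256
        self._pending[lrid] = request
        return lrid

    def match_response(self, lrid):
        """Return the request associated with a response's LRID,
        or None if no request is pending under that identifier."""
        return self._pending.pop(lrid, None)
```

Because responses may arrive in any order, the requester looks up each response by its LRID rather than by arrival order.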


4.1.1 Raster and Vector Object Representation

This section describes the raster and vector objects as they should be formatted in the JAUS messages that define the inputs and outputs of the knowledge stores. Special attention must be paid to ensure that these conventions are followed by all components sending data to or receiving data from the world model knowledge stores.

The data within a raster knowledge store should always maintain a north-east orientation. Raster data in the knowledge store should be geo-referenced by defining their origin as a single point described by the intersection of a line of latitude and a line of longitude (WGS84). The grid parameters also include the number of rows and columns and the grid resolution. While a grid cell is specified as a point, that point covers an area equal to the grid resolution squared. A Cartesian coordinate system is established at the geo-referenced point. The Cartesian coordinates of the grid cells are derived using the Universal Transverse Mercator projection. The grid cells may also be referenced by their row and column offset from the origin point. Figure 4-1 shows the format of a layer of raster data. While cells may have negative row and column values with respect to the grid origin, when transmitting rectangular grid data (e.g., images, DTED), the origin of the raster data must be the point that defines the cell whose column coordinate is equal to the column coordinate of the westernmost cells and whose row coordinate is equal to the row coordinate of the southernmost cells. Therefore, when transmitting a rectangular array of raster data, there will be no cell values with coordinates less than zero.

For the vector knowledge store, objects are represented as points, lines, polylines, and polygons. The coordinates of these points are defined by a point of latitude and longitude (WGS84). Polylines and polygons may consist of up to 65,535 vertices.
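The vector object conventions just described can be sketched as follows. The class and field names are illustrative assumptions, not types defined by the JAUS message set.

```python
# Sketch of the vector object kinds: each vertex is a WGS84
# latitude/longitude pair, and polylines/polygons carry at most
# 65,535 vertices. Class names are illustrative, not JAUS-defined.
from dataclasses import dataclass
from typing import List, Tuple

LatLon = Tuple[float, float]  # (latitude, longitude), degrees, WGS84
MAX_VERTICES = 65535          # polyline/polygon vertex limit

@dataclass
class VectorObject:
    feature_class: int        # 0-65,534, per the feature class table
    vertices: List[LatLon]
    kind: str                 # "point", "polyline", or "polygon"

    def __post_init__(self):
        if self.kind == "point" and len(self.vertices) != 1:
            raise ValueError("a point object has exactly one vertex")
        if len(self.vertices) > MAX_VERTICES:
            raise ValueError("at most 65,535 vertices are allowed")

road = VectorObject(feature_class=7,
                    vertices=[(29.64, -82.35), (29.65, -82.34)],
                    kind="polyline")
assert len(road.vertices) == 2
```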
Figure 4-2 shows the format of these vector objects. Rather than assigning


these points Cartesian coordinates with respect to an arbitrarily chosen datum, each vertex is expressed as a point of latitude and longitude.

Figure 4-1. Definition of raster grid parameters and coordinate system

The vector objects on the right of Figure 4-2 have a buffer parameter. The buffer parameter establishes a radial region around each vector object vertex and connects adjacent radial regions by lines drawn at their tangents. The area within these radial regions and tangent lines is considered to be within the vector object's buffer zone. This feature allows a region to be established in proximity to a vector object. For example, United States Geological Survey (USGS) road data are presented in vector form representing the center-lines of roads. It may be useful to do a search within a perimeter along a particular route defined in the USGS digital line


graph data. For simple cases, it may be possible to generate a polygonal representation of the area around the road. Establishing this polygon requires transmitting the coordinates of each of its vertices. As the problem scales up, this method becomes very inefficient. A better solution is to determine the route using the USGS digital line graph data and assign a region buffer to each line segment. The region buffer is defined as an offset distance in meters. The spatial buffer is established by defining a radius from each point on the vector object. In most cases, this buffer is a simple offset, except at point objects and along non-smooth contours. Figure 4-2 shows these cases. If the system designer requires finer control over this region, the buffer may instead be defined using the aforementioned polygonal representation.

4.2 World Model Knowledge Store Message Set

The following sections present the initial draft message set for the first two JAUS World Modeling components. This message set is based on a review of the current methods of modeling spatial and geospatial data as presented in Chapter 2. These methods are distilled into their most basic form and codified into a standard consistent with the JAUS framework.

4.2.1 JAUS Core Input and Output Message Sets

Support for the JAUS core message set is required by the current version of the JAUS Reference Architecture (RA). The JAUS Core Message Set consists of the following messages:

Code 0001h: Set component authority
Code 0002h: Shutdown
Code 0003h: Standby
Code 0004h: Resume
Code 0005h: Reset
Code 0006h: Set emergency


Code 0007h: Clear emergency
Code 0008h: Create service connection
Code 0009h: Confirm service connection
Code 000Ah: Activate service connection
Code 000Bh: Suspend service connection
Code 000Ch: Terminate service connection

Figure 4-2. Definition of vector objects and parameters
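Membership in the buffer zone shown in Figure 4-2 reduces to a point-to-segment distance check. The sketch below is illustrative and not part of the JAUS message set; it assumes coordinates have already been projected into a local Cartesian frame in meters.

```python
# Sketch: a point lies inside a polyline's buffer zone when its distance
# to any segment (or to a lone point object) is at most the buffer offset.
# Illustrative only; assumes coordinates already projected to local meters.
import math

def point_segment_distance(p, a, b):
    """Distance from point p to segment ab, all 2-D tuples in meters."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:                      # degenerate: point object
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    cx, cy = ax + t * dx, ay + t * dy          # closest point on the segment
    return math.hypot(px - cx, py - cy)

def in_buffer(p, polyline, buffer_m):
    """True if p falls within buffer_m meters of any polyline segment."""
    if len(polyline) == 1:                     # point object: radial region only
        return point_segment_distance(p, polyline[0], polyline[0]) <= buffer_m
    return any(point_segment_distance(p, a, b) <= buffer_m
               for a, b in zip(polyline, polyline[1:]))

road = [(0.0, 0.0), (100.0, 0.0)]              # hypothetical road center-line
assert in_buffer((50.0, 4.0), road, buffer_m=5.0)
assert not in_buffer((50.0, 6.0), road, buffer_m=5.0)
```

Clamping the projection parameter `t` to [0, 1] is what produces the radial end-caps at the vertices, matching the radial regions and tangent lines described above.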


While the JAUS RA does require that these messages be accepted by all components, there is no requirement that components have an action associated with each input message. Because the expected behavior of components in each state is somewhat ambiguous, these behaviors are defined here for the world model knowledge stores, as are the messages that are required to have a response. The world model knowledge stores should have an appropriate response to the following messages:

Code 0002h: Shutdown
Code 0003h: Standby
Code 0004h: Resume
Code 0005h: Reset
Code 0009h: Confirm service connection
Code 000Ah: Activate service connection
Code 000Bh: Suspend service connection
Code 000Ch: Terminate service connection
Code 2002h: Query component status
Code 4002h: Report component status

The Code 0002h: Shutdown message should cause the receiving knowledge store to immediately terminate all data transfer. If the knowledge store is responding to a query, it should immediately terminate the flow of data and transmit the Code F405h: Report Raster Knowledge Store Data Transfer Termination or the Code F424h: Report Vector Knowledge Store Data Transfer Termination message to the component whose query response was interrupted and to any components with outstanding requests. Upon termination of all data transfer, the world model should execute its specific shutdown routine and then halt. It should no longer respond to any data requests and should require a hard reset in order to resume operation.

The Code 0003h: Standby message should cause the receiving knowledge store to respond as if it had received the Code 0002h: Shutdown message. The exception is that


the knowledge store should not halt. It should respond only to the Code 0004h: Resume and Code 0005h: Reset messages. Upon returning to the ready state, the knowledge store should resume normal operations. It should not resume any suspended query responses.

The Code 0005h: Reset message should cause the receiving knowledge store to immediately terminate the transfer and processing of any data. The knowledge store should transmit to all components with outstanding requests or data transfers the Code F405h: Report Raster Knowledge Store Data Transfer Termination or the Code F424h: Report Vector Knowledge Store Data Transfer Termination message. The knowledge store should then immediately restart and return to the ready state. Terminated data transfers should not resume.

The Codes 0009h: Confirm Service Connection, 000Ah: Activate Service Connection, 000Bh: Suspend Service Connection, 000Ch: Terminate Service Connection, 2002h: Query Component Status, and 4002h: Report Component Status messages should all invoke the typical JAUS response associated with their receipt.

4.2.2 Raster Knowledge Store Input Message Set

The following subsections present the messages that define the input to the raster version of the world model knowledge store. These command, query, and event setup class messages are transmitted in order to initiate an appropriate inform or event notification class message output. These output messages are defined in Section 4.2.3. The inputs to the raster knowledge store are:

The JAUS core input message set
Code F000h: Create raster knowledge store object
Code F001h: Set raster knowledge store feature class metadata
Code F002h: Modify raster knowledge store object (cell update)


Code F003h: Modify raster knowledge store object (grid update)
Code F004h: Delete raster knowledge store objects
Code F200h: Query raster knowledge store objects
Code F201h: Query raster knowledge store feature class metadata
Code F202h: Query raster knowledge store bounds
Code F600h: Raster knowledge store event notification request
Code F601h: Raster knowledge store bounds change event notification request
Code F005h: Terminate raster knowledge store data transfer

4.2.2.1 Code F000h: Create raster knowledge store object

The Code F000h: Create Raster Knowledge Store Object message (Table 4-1) is used to create and initialize a layer of feature class data within the raster knowledge store. Before data can be added to a feature class, the feature class layer must first be created. The origin of the raster grid must be geo-referenced by specifying its origin in fields 4 and 5 as a point of latitude and longitude. The extents of the layer must also be specified as a number of rows and columns in fields 7 and 8. The data types that describe the number of rows and columns and the cell attribute type are variable and must also be specified, in fields 6 and 11, respectively. The grid cell resolution is specified in field 9. Because this message is used to create a feature class layer, the feature class must be specified using field 10. This message has a single optional field (field 12). Inclusion of this optional field is determined from the state of bit zero in the message presence vector (Table 4-2). If bit zero is set, then the value in field 12 shall be used to initialize all cells within the feature class. When the feature class layer is initialized using this message, the grid is filled row by row, starting at the southernmost row and moving north; each row is filled from its westernmost cell moving east, beginning at the southwesternmost point.
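The fill order just described (south to north, west to east within each row) can be sketched as follows; this illustrates the convention only, and the helper names are hypothetical.

```python
# Sketch of the initialization fill order: row-major traversal starting
# at the southwesternmost cell, moving east within a row, then north row
# by row. Illustrative only; helper names are hypothetical.

def fill_order(rows, cols):
    """Yield (row, col) indices in knowledge store fill order.
    Row 0 is the southernmost row; column 0 is the westernmost column."""
    for row in range(rows):          # south to north
        for col in range(cols):      # west to east
            yield row, col

def initialize_layer(rows, cols, initial_value):
    """Build a layer as {(row, col): value} using the fill order."""
    return {cell: initial_value for cell in fill_order(rows, cols)}

assert list(fill_order(2, 3))[0] == (0, 0)   # southwesternmost cell first
assert list(fill_order(2, 3))[-1] == (1, 2)  # northeasternmost cell last
```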


Table 4-1. Create raster knowledge store objects message format
Field 1: Presence Vector (Byte, N/A). See the mapping table below (Table 4-2).
Field 2: Message Properties (Byte, N/A). Bit field. Bit 0: request confirmation of object creation; bits 1-7: reserved.
Field 3: Local Request ID (Byte, N/A). Request identifier to be used when returning confirmation to the requesting component.
Field 4: Origin Latitude (WGS84) (Integer, degrees). Scaled integer; lower limit = -90, upper limit = 90.
Field 5: Origin Longitude (WGS84) (Integer, degrees). Scaled integer; lower limit = -180, upper limit = 180.
Field 6: Raster Data Row and Column Data Type (Byte, N/A). Enumeration. 0: Byte; 1-3: reserved; 4: Unsigned Short Integer; 5: Unsigned Integer; 6: Unsigned Long Integer; 7-255: reserved.
Field 7: Raster Grid Update Rows (varies, see field 6; grid cells).
Field 8: Raster Grid Update Columns (varies, see field 6; grid cells).
Field 9: Cell Resolution (Float, meters).
Field 10: Feature Class (Unsigned Short Integer, N/A). Enumeration. 0-65,534: see Feature Class Table; 65,535: reserved.
Field 11: Raster Cell Data Type (Byte, N/A). Enumeration; same format as field 6.
Field 12: Initial Value for Raster Grid Cells (varies, see field 11; N/A). Optional.
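The scaled-integer latitude and longitude fields in Table 4-1 map a real-valued range onto the full range of a fixed-size integer. The sketch below assumes the usual linear mapping onto a signed 32-bit range; treat the exact rounding convention as an assumption rather than a quotation of the JAUS Reference Architecture.

```python
# Sketch of scaled-integer encoding for the WGS84 origin fields: a real
# range [lower, upper] is mapped linearly onto the signed 32-bit integer
# range. The rounding convention here is an assumption.
INT32_MIN, INT32_MAX = -2**31, 2**31 - 1

def encode_scaled(value, lower, upper):
    """Map a real value in [lower, upper] to a signed 32-bit integer."""
    fraction = (value - lower) / (upper - lower)
    return INT32_MIN + round(fraction * (INT32_MAX - INT32_MIN))

def decode_scaled(scaled, lower, upper):
    """Inverse mapping back to the real range."""
    fraction = (scaled - INT32_MIN) / (INT32_MAX - INT32_MIN)
    return lower + fraction * (upper - lower)

lat = 29.6516  # example latitude, degrees
encoded = encode_scaled(lat, -90.0, 90.0)
assert abs(decode_scaled(encoded, -90.0, 90.0) - lat) < 1e-6
```

With 32 bits spread over 180 degrees of latitude, the quantization step is roughly 4e-8 degrees, i.e., a few millimeters on the ground, which is far finer than any grid resolution used here.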


Table 4-2. Presence vector for create raster knowledge store objects message
Vector bit:  7  6  5  4  3  2  1  0
Data field:  R  R  R  R  R  R  R  12

4.2.2.2 Code F001h: Set raster knowledge store feature class metadata

As described in Section 4.1, metadata are data about data. The Code F001h: Set Raster Knowledge Store Feature Class Metadata message (Table 4-3) allows a user to create, modify, and erase feature class metadata. At the present time the format of these metadata is not specified; it is left to the system designer to develop a convention. Initially these data are intended for use by human operators. In the future, a schema may be defined to provide a standard metadata format that can be parsed and used by unmanned systems without human intervention.

Table 4-3. Set raster knowledge store feature class metadata message format
Field 1: Metadata Options (Byte, N/A). Enumeration. 0: Append; 1: Prepend; 2: Overwrite; 3-254: reserved; 255: Erase All.
Field 2: Feature Class (Short Integer, N/A). Enumeration. 0-65,534: see Feature Class Table; 65,535: reserved.
Field 3: Number of String Characters (Unsigned Short Integer, N/A). 0-65,535. This field should be equal to zero only when field 1 is equal to 255 (Erase All).
Field 4: Metadata String (N/A). Variable-length string.

4.2.2.3 Code F002h: Modify raster knowledge store object (cell update)

The Code F002h: Modify Raster Knowledge Store Object (Cell Update) message (Table 4-4) is used to change data within a raster knowledge store feature class layer.


This message can only be used on a layer that has already been created within the raster knowledge store. This message is specified as the cell update version because it allows modification of the raster grid on a cell-by-cell basis. This message has no optional fields. The origin of the raster grid cell updates must be geo-referenced by specifying it in fields 2 and 3 as a point of latitude and longitude. The data types that describe the update row and column and the cell attribute are variable and must also be specified, in fields 4 and 7, respectively. The grid cell update resolution is specified in field 5. Because this message is used to modify a feature class layer, the feature class must be specified using field 6. The data type for the field that specifies the number of cell updates included in the message (field 9) is also variable and is defined in field 8. Each cell update is a three-tuple representing the cell update's row, column, and update attribute value.

Table 4-4. Modify raster knowledge store object (cell update) message format
Field 1: Local Request ID (Byte, N/A). Request identifier to be used when returning confirmation to the requesting component.
Field 2: Origin Latitude (WGS84) (Integer, degrees). Scaled integer; lower limit = -90, upper limit = 90.
Field 3: Origin Longitude (WGS84) (Integer, degrees). Scaled integer; lower limit = -180, upper limit = 180.
Field 4: Raster Data Row and Column Data Type (Byte, N/A). Enumeration. 0: Byte; 1: Short Integer; 2: Integer; 3: Long Integer; 4: Unsigned Short Integer; 5: Unsigned Integer; 6: Unsigned Long Integer; 7-255: reserved.


Field 5: Cell Resolution (Float, meters).
Field 6: Feature Class (Unsigned Short Integer, N/A). Enumeration. 0-65,534: see Feature Class Table; 65,535: reserved.
Field 7: Raster Cell Data Type (Byte, N/A). Enumeration. 0: Byte; 1: Short Integer; 2: Integer; 3: Long Integer; 4: Unsigned Short Integer; 5: Unsigned Integer; 6: Unsigned Long Integer; 7: Float; 8: Long Float; 9: RGB (3 bytes); 10-255: reserved.
Field 8: Data Type for Number of Cell Updates (Byte, N/A). Enumeration. 0: Byte; 1: Short Integer; 2: Integer; 3: Long Integer; 4: Unsigned Short Integer; 5: Unsigned Integer; 6: Unsigned Long Integer; 7-255: reserved.
Field 9: Number of Cell Updates (varies, see field 8; N/A).
Field 10: Raster Cell Update 1 Row (varies, see field 4; N/A).
Field 11: Raster Cell Update 1 Col (varies, see field 4; N/A).
Field 12: Raster Cell Update 1 Data (varies, see field 7; varies with feature class).


Field 3n+7: Raster Cell Update n Row (varies, see field 4; N/A).
Field 3n+8: Raster Cell Update n Col (varies, see field 4; N/A).
Field 3n+9: Raster Cell Update n Data (varies, see field 7; varies with feature class).

4.2.2.4 Code F003h: Modify raster knowledge store object (grid update)

The Code F003h: Modify Raster Knowledge Store Object (Grid Update) message (Table 4-5) is similar to the Code F002h: Modify Raster Knowledge Store Object (Cell Update) message in that it permits changing grid cell values. It differs in that, rather than transmitting individual cell updates, an entire rectangular patch of cells is updated. As the number of cells to be modified increases, this method becomes more efficient than the cell update method. The origin of the raster grid update must be geo-referenced by specifying it in fields 2 and 3 as a point of latitude and longitude. The data types that describe the update row and column and the cell attribute are variable and must also be specified, in fields 4 and 9, respectively. Fields 5 and 6 specify the number of rows and columns of raster grid updates being transmitted. The grid cell update resolution is specified in field 7. Because this message is used to modify a feature class layer, the feature class must be specified in field 8.

Table 4-5. Modify raster knowledge store object (grid update) message format
Field 1: Local Request ID (Byte, N/A). Request identifier to be used when returning confirmation to the requesting component.


Field 2: Origin Latitude (WGS84) (Integer, degrees). Scaled integer; lower limit = -90, upper limit = 90.
Field 3: Origin Longitude (WGS84) (Integer, degrees). Scaled integer; lower limit = -180, upper limit = 180.
Field 4: Raster Data Row and Column Data Type (Byte, N/A). Enumeration. 0: Byte; 1-3: reserved; 4: Unsigned Short Integer; 5: Unsigned Integer; 6: Unsigned Long Integer; 7-255: reserved.
Field 5: Raster Grid Update Rows (varies, see field 4; grid cells).
Field 6: Raster Grid Update Columns (varies, see field 4; grid cells).
Field 7: Cell Resolution (Float, meters).
Field 8: Feature Class (Unsigned Short Integer, N/A). Enumeration. 0-65,534: see Feature Class Table; 65,535: reserved.
Field 9: Raster Cell Data Type (Byte, N/A). Enumeration. 0: Byte; 1: Short Integer; 2: Integer; 3: Long Integer; 4: Unsigned Short Integer; 5: Unsigned Integer; 6: Unsigned Long Integer; 7: Float; 8: Long Float; 9: RGB (3 bytes); 10-255: reserved.
Field 10: Raster Cell Update 1 (varies, see field 9; N/A).
Field 11: Raster Cell Update 2 (varies, see field 9; N/A).


Field 9+n: Raster Cell Update n (varies, see field 9; N/A).
Field 10+n: Raster Cell Update n+1 (varies, see field 9; N/A).

4.2.2.5 Code F004h: Delete raster knowledge store objects

The Code F004h: Delete Raster Knowledge Store Objects message (Table 4-6) is used to free all resources allocated to a feature class layer within the raster knowledge store. In order to resume accumulation of data within a deleted feature class, the feature class layer must be recreated using the Create Raster Knowledge Store Object message. A single feature class or all feature classes may be deleted with one message.

Table 4-6. Delete raster knowledge store objects message format
Field 1: Presence Vector (Byte, N/A). See mapping table below.
Field 2: Local Request ID (Byte, N/A). Request identifier to be used when returning confirmation to the requesting component.
Field 3: Number of Feature Classes (Byte, N/A).
Field 4: Feature Class 1 (Short Integer, N/A). Enumeration. 0-65,534: see Feature Class Table; 65,535: ALL.
Field 3+n: Feature Class n (Short Integer, N/A). Enumeration. 0-65,534: see Feature Class Table; 65,535: reserved.

4.2.2.6 Code F200h: Query raster knowledge store objects

The Code F200h: Query Raster Knowledge Store Objects message (Table 4-7) provides access to data within the raster knowledge store. Field 1 of this message is the


message presence vector (Table 4-8). The optional fields in this message are fields 4, 5, and 6. Field 2 is the Query Response Properties bit field. When bit zero is set, the response to the query should include only the number of records that would be returned. When bit one is set, the query response shall be the Code F402h: Report Raster Knowledge Store Objects (Cell Update) message; otherwise, the Code F403h: Report Raster Knowledge Store Objects (Grid Update) message shall be sent. Field 3 is the message Local Request Identifier. This field allows synchronization of message responses. Field 4 is the Raster Query Resolution. This field allows the querying component to specify the cell resolution to be used in the response to the query. If this resolution does not match the native resolution of the queried knowledge store, then the knowledge store should either sub-sample or interpolate the data to obtain the desired resolution. This field is optional. Field 5 specifies a specific feature class to be queried. This field is also optional; if a feature class is not specified, then the query applies to all feature classes within the knowledge store. Fields 6 through 9 specify two points of latitude and longitude that limit the range of the query. These fields are optional. If presence vector bit two is set, then fields 6 through 9 shall all be included; otherwise, they shall all be omitted.

Table 4-7. Query raster knowledge store objects message format
Field 1: Presence Vector (Unsigned Short Integer, N/A). See the mapping table below (Table 4-8).
Field 2: Query Response Properties (Byte, N/A). Bit field. Bit 0: only return the number of responses that would be transmitted; bit 1: return cell update three-tuples or raster scan (active low); bits 2-7: reserved.


Field 3: Local Request ID (Byte, N/A). Request identifier to be used when returning data to the requesting component.
Field 4: Raster Query Resolution (Float, meters).
Field 5: Feature Class (Unsigned Short Integer, N/A). Enumeration. 0-65,534: see Feature Class Table; 65,535: All Feature Classes.
Field 6: Query Region Point 1 Latitude (WGS84) (Integer, degrees). Scaled integer; lower limit = -90, upper limit = 90.
Field 7: Query Region Point 1 Longitude (WGS84) (Integer, degrees). Scaled integer; lower limit = -180, upper limit = 180.
Field 8: Query Region Point 2 Latitude (WGS84) (Integer, degrees). Scaled integer; lower limit = -90, upper limit = 90.
Field 9: Query Region Point 2 Longitude (WGS84) (Integer, degrees). Scaled integer; lower limit = -180, upper limit = 180.

Table 4-8. Presence vector for query raster knowledge store objects message
Vector bit:  7  6  5  4  3  2  1  0
Data field:  R  R  R  R  R  6  5  4

4.2.2.7 Code F201h: Query raster knowledge store feature class metadata

The Code F201h: Query Raster Knowledge Store Feature Class Metadata message (Table 4-9) should cause the raster knowledge store to reply to the requestor with the Code F401h: Report Raster Knowledge Store Feature Class Metadata message. There is a single


field associated with this message. This field specifies the feature class whose metadata are to be returned in the reply. There is also an option to return metadata for all feature classes present in the queried raster knowledge store.

Table 4-9. Query raster knowledge store feature class metadata message format
Field 1: Feature Class (Unsigned Short Integer, N/A). Enumeration. 0-65,534: see Feature Class Table; 65,535: All.

4.2.2.8 Code F202h: Query raster knowledge store bounds

The Code F202h: Query Raster Knowledge Store Bounds message (Table 4-10) is used to request the spatial extents of a single feature class or of all feature classes within a raster knowledge store. The knowledge store should respond with the Code F404h: Report Raster Knowledge Store Bounds message. The bounds are represented by two points that define the rectangular region that just covers all of the data within the feature class layer or layers.

Table 4-10. Query raster knowledge store bounds message format
Field 1: Local Request ID (Byte, N/A). Request identifier to be used when returning data to the requesting component.
Field 2: Feature Class (Unsigned Short Integer, N/A). Enumeration. 0-65,534: see Feature Class Table; 65,535: All Feature Classes.

4.2.2.9 Code F600h: Raster knowledge store event notification request

The Code F600h: Raster Knowledge Store Event Notification Request message is used to establish an event-triggered query within the knowledge store. Therefore, this


message is formatted exactly the same as the Code F200h: Query Raster Knowledge Store Objects message, which should be referenced for the format of this message. Whenever the criteria established in this message are met, depending on the query response field of the event notification request, the raster knowledge store should transmit either the Code F800h: Raster Knowledge Store Event Notification (Cell Update) message or the Code F801h: Raster Knowledge Store Event Notification (Grid Update) message.

4.2.2.10 Code F601h: Raster knowledge store bounds change event notification request

The Code F601h: Raster Knowledge Store Bounds Change Event Notification Request message is used to establish an event-triggered response that notifies the requesting component when the data in a feature class extend past the bounds they occupied when the initial request was sent. When the extents of the data change, the raster knowledge store will transmit the Code F802h: Raster Knowledge Store Bounds Change Event Notification message.

4.2.2.11 Code F005h: Terminate raster knowledge store data transfer

The Code F005h: Terminate Raster Knowledge Store Data Transfer message is a command class message that should cause the raster knowledge store to immediately terminate the transfer of all current and outstanding data destined for the requesting component. Upon termination, the raster knowledge store should send the requestor the Code F405h: Report Raster Knowledge Store Data Transfer Termination message.

4.2.3 Raster Knowledge Store Output Message Set

The following subsections present the messages that define the output of the raster version of the world model knowledge store. These inform and event notification class


messages are transmitted in response to the command, query, and event setup class input messages presented in Section 4.2.2. The outputs of the raster knowledge store are:

The JAUS core output message set
Code F400h: Report raster knowledge store object creation
Code F401h: Report raster knowledge store feature class metadata
Code F402h: Report raster knowledge store objects (cell update)
Code F403h: Report raster knowledge store objects (grid update)
Code F404h: Report raster knowledge store bounds
Code F800h: Raster knowledge store event notification (cell update)
Code F801h: Raster knowledge store event notification (grid update)
Code F802h: Raster knowledge store bounds change event notification
Code F405h: Report raster knowledge store data transfer termination

4.2.3.1 Code F400h: Report raster knowledge store object creation

The Code F400h: Report Raster Knowledge Store Object Creation message (Table 4-11) is used to confirm creation of raster objects in the raster knowledge store. This message is sent only when confirmation is requested by setting bit zero of the message properties field in the Code F000h: Create Raster Knowledge Store Object message. If that bit is set, this message is transmitted with the local request identifier (field 1) set to the value sent with the Code F000h: Create Raster Knowledge Store Object message.

Table 4-11. Report raster knowledge store object creation message format
Field 1: Local Request ID (Byte, N/A). Local request identifier sent by the creating component.

4.2.3.2 Code F401h: Report raster knowledge store feature class metadata

The Code F401h: Report Raster Knowledge Store Feature Class Metadata message (Table 4-12) allows access to feature class metadata stored within the raster knowledge store. It is transmitted in response to the Code F201h: Query Raster Knowledge Store Feature


Class Metadata message. If the query message requests all feature classes, a separate message should be sent for each feature class. These metadata are entered using the Code F001h: Set Raster Knowledge Store Feature Class Metadata message.

Table 4-12. Report raster knowledge store feature class metadata message format
Field 1: Feature Class (Short Integer, N/A). Enumeration. 0-65,535: see Feature Class Table.
Field 2: Number of String Characters (Unsigned Short Integer, N/A). 0-65,535.
Field 3: Metadata String (N/A). Variable-length string.

4.2.3.3 Code F402h: Report raster knowledge store objects (cell update)

The Code F402h: Report Raster Knowledge Store Objects (Cell Update) message (Table 4-13) is sent in direct response to a Code F200h: Query Raster Knowledge Store Objects message if and only if bit one of the bit field in message field two is set; otherwise, the Code F403h: Report Raster Knowledge Store Objects (Grid Update) message is transmitted. If bit zero of field two of the Code F200h: Query Raster Knowledge Store Objects message is set, then only the first two fields of this message shall be transmitted. Field 1 of this message is the Local Request Identifier sent with the query that initiated this report message. Field 2 notifies the receiving component of the number of records included in the report message. Fields 3 and 4 establish the geodetic origin (latitude and longitude) of the cell updates included in the message. The data types that describe the update row and column and the cell attribute are variable and are specified in fields 5 and 8, respectively. Field 6 is the resolution of the raster grid


updates reported in the message. Field 7 is the feature class to which the raster data are assigned.

Table 4-13. Report raster knowledge store objects (cell update) message format
Field 1: Local Request ID (Byte, N/A). Request identifier sent with the initial request.
Field 2: Number of Responses (Unsigned Short Integer, N/A). 0-65,535. Number of responses included in this report message.
Field 3: Origin Latitude (WGS84) (Integer, degrees). Scaled integer; lower limit = -90, upper limit = 90.
Field 4: Origin Longitude (WGS84) (Integer, degrees). Scaled integer; lower limit = -180, upper limit = 180.
Field 5: Raster Data Row and Column Data Type (Byte, N/A). Enumeration. 0: Byte; 1: Short Integer; 2: Integer; 3: Long Integer; 4: Unsigned Short Integer; 5: Unsigned Integer; 6: Unsigned Long Integer; 7-255: reserved.
Field 6: Cell Resolution (Float, meters).
Field 7: Feature Class (Unsigned Short Integer, N/A). Enumeration. 0-65,534: see Feature Class Table; 65,535: reserved.
Field 8: Raster Cell Data Type (Byte, N/A). Enumeration. 0: Byte; 1: Short Integer; 2: Integer; 3: Long Integer; 4: Unsigned Short Integer; 5: Unsigned Integer; 6: Unsigned Long Integer; 7: Float; 8: Long Float; 9: RGB (3 bytes); 10-255: reserved.


Field 9: Data Type for Number of Cell Updates (Byte, N/A). Enumeration. 0: Byte; 1: Short Integer; 2: Integer; 3: Long Integer; 4: Unsigned Short Integer; 5: Unsigned Integer; 6: Unsigned Long Integer; 7-255: reserved.
Field 10: Number of Cell Updates (varies, see field 9; N/A).
Field 11: Raster Cell Update 1 Row (varies, see field 5; N/A).
Field 12: Raster Cell Update 1 Col (varies, see field 5; N/A).
Field 13: Raster Cell Update 1 Data (varies, see field 8; varies with feature class).
Field 3n+8: Raster Cell Update n Row (varies, see field 5; N/A).
Field 3n+9: Raster Cell Update n Col (varies, see field 5; N/A).
Field 3n+10: Raster Cell Update n Data (varies, see field 8; varies with feature class).

4.2.3.4 Code F403h: Report raster knowledge store objects (grid update)

The Code F403h: Report Raster Knowledge Store Objects (Grid Update) message (Table 4-14) is sent in direct response to a Code F200h: Query Raster Knowledge Store Objects message if and only if bit one of the bit field in message field two is clear; otherwise, the Code F402h: Report Raster Knowledge Store Objects (Cell Update)


message is transmitted. If bit zero of field two of the Code F200h: Query Raster Knowledge Store Objects message is set, then only the first two fields of this message shall be transmitted. Field 1 of this message is the Local Request Identifier sent with the query that initiated this report message. Field 2 notifies the receiving component of the number of records included in the report message. Fields 3 and 4 establish the geodetic origin (latitude and longitude) of the cell updates included in the message. The data types that describe the update row and column and the cell attribute are variable and are specified in fields 5 and 10, respectively. Fields 6 and 7 represent the number of rows and columns of grid update cells. Field 8 is the resolution of the raster grid updates reported in the message. Field 9 is the feature class to which the raster data are assigned.

Table 4-14. Report raster knowledge store objects (grid update) message format
Field 1: Local Request ID (Byte, N/A). Request identifier sent with the initial request.
Field 2: Number of Responses (Unsigned Short Integer, N/A). 0-65,535. Number of responses (objects) included in this report message.
Field 3: Origin Latitude (WGS84) (Integer, degrees). Scaled integer; lower limit = -90, upper limit = 90.
Field 4: Origin Longitude (WGS84) (Integer, degrees). Scaled integer; lower limit = -180, upper limit = 180.
Field 5: Raster Data Row and Column Data Type (Byte, N/A). Enumeration. 0: Byte; 1-3: reserved; 4: Unsigned Short Integer; 5: Unsigned Integer; 6: Unsigned Long Integer; 7-255: reserved.

Table 4-14. Continued
Field # | Name | Type | Units | Interpretation
6 | Raster Grid Update Rows | Varies (see field 5) | Grid Cells
7 | Raster Grid Update Columns | Varies (see field 5) | Grid Cells
8 | Cell Resolution | Float | Meters
9 | Feature Class | Unsigned Short Integer | N/A | Enumeration: 0-65,534: See Feature Class Table; 65,535: Reserved
10 | Raster Cell Data Type | Byte | N/A | Enumeration: 0: Byte; 1: Short Integer; 2: Integer; 3: Long Integer; 4: Unsigned Short Integer; 5: Unsigned Integer; 6: Unsigned Long Integer; 7: Float; 8: Long Float; 9: RGB (3 Bytes); 10-255: Reserved
11 | Raster Cell Update 1 | Varies (see field 10) | N/A
12 | Raster Cell Update 2 | Varies (see field 10) | N/A
10 + n | Raster Cell Update n | Varies (see field 10) | N/A
11 + n | Raster Cell Update n+1 | Varies (see field 10) | N/A

4.2.3.5 Code F404h: Report raster knowledge store bounds
The Code F404h: Report Raster Knowledge Store Bounds message format is shown in Table 4-15. This message reports the Raster Knowledge Store bounds as a response to the Query Knowledge Store Bounds message. In this message, the raster knowledge

store returns the two geographic points that represent the extents of the data within a feature class layer or all feature class layers.

Table 4-15. Report raster knowledge store bounds message format
Field # | Name | Type | Units | Interpretation
1 | Southwest Point Latitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -90, Upper Limit = 90
2 | Southwest Point Longitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -180, Upper Limit = 180
3 | Northeast Point Latitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -90, Upper Limit = 90
4 | Northeast Point Longitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -180, Upper Limit = 180

4.2.3.6 Code F800h: Raster knowledge store event notification (cell update)
The Code F800h: Raster Knowledge Store Event Notification (Cell Update) message is an event-triggered message that is sent in response to the Code F600h: Raster Knowledge Store Event Notification Request message. When bit two of the bit field in field two of that message is set, this message is transmitted when the conditions specified in the event notification request are met. The format of this message is identical to that of the Code F402h: Report Raster Knowledge Store Objects (Cell Update) message.

4.2.3.7 Code F801h: Raster knowledge store event notification (grid update)
The Code F801h: Raster Knowledge Store Event Notification (Grid Update) message is an event-triggered message that is sent in response to the Code F600h: Raster Knowledge Store Event Notification Request message. When bit two of the bit field in field two of that message is clear, this message is transmitted when the conditions specified

in the event notification request are met. The format of this message is identical to that of the Code F403h: Report Raster Knowledge Store Objects (Grid Update) message.

4.2.3.8 Code F802h: Raster knowledge store bounds change event notification
The Code F802h: Raster Knowledge Store Bounds Change Event Notification message is an event-triggered message that is sent in response to the Code F601h: Raster Knowledge Store Bounds Change Event Notification Request message. It is transmitted to the requesting component each time the spatial extents of a feature class or feature classes (as specified in the event notification request message) change. The format of this message is identical to that of the Code F404h: Report Raster Knowledge Store Bounds message.

4.2.3.9 Code F405h: Report raster knowledge store data transfer termination
The Code F405h: Report Raster Knowledge Store Data Transfer Termination message notifies other JAUS components that the transfer of data to them, whether in progress or pending, has been stopped. This message is sent in response to the Code F005h: Terminate Raster Knowledge Store Data Transfer message. It is also sent whenever data transfer is interrupted due to a change in the component state as discussed in Section 4.2.1.

4.2.4 Vector Knowledge Store Input Message Set
Below are the messages that define the input methods to the vector version of the knowledge store.
Inputs:
The JAUS core input message set
Code F020h: Create vector knowledge store objects
Code F021h: Set vector knowledge store feature class metadata
Code F022h: Delete vector knowledge store objects
Code F220h: Query vector knowledge store objects

Code F221h: Query vector knowledge store feature class metadata
Code F222h: Query vector knowledge store bounds
Code F620h: Vector knowledge store event notification request
Code F621h: Vector knowledge store bounds change event notification request
Code F023h: Terminate vector knowledge store data transfer

4.2.4.1 Code F020h: Create vector knowledge store objects
The Code F020h: Create Vector Knowledge Store Objects message (Table 4-16) is used to add objects to the Vector Knowledge Store. It allows multiple vector objects to be created with a single message. Field 1 of this message is the presence vector (Table 4-17); because there is a single presence vector associated with this message, it shall apply to every object created by the message. Field 2 contains the creation message properties. If bit zero is set, then the knowledge store shall return the Code F420h: Report Vector Knowledge Store Object(s) Creation message with the local request identifier specified in field 3. The data type that describes the vector object attributes is variable and is specified in field 4. Field 5 indicates the number of vector objects included in the message. Field 6 begins the definition of a single vector object. Each vector object is defined by its type (point, line, or polygon), the number of feature classes it is assigned to, an attribute for each feature class, and the global coordinates of the vertices of the object. These fields are repeated for each object created by this message; the presence vector applies to each vector object.

Table 4-16. Create vector knowledge store objects message format
Field # | Name | Type | Units | Interpretation
1 | Presence Vector | Byte | N/A | See mapping table below

Table 4-16. Continued
Field # | Name | Type | Units | Interpretation
2 | Message Properties | Byte | N/A | Bit Field: 0: Request confirmation of object creation; 1-7: Reserved
3 | Local Request ID | Byte | N/A | Request identifier to be used when returning confirmation to the requesting component
4 | Feature Class Attribute Data Type for Vector Objects | Byte | N/A | Enumeration: 0: Byte; 1: Short Integer; 2: Integer; 3: Long Integer; 4: Unsigned Short Integer; 5: Unsigned Integer; 6: Unsigned Long Integer; 7: Float; 8: Long Float; 9: RGB (3 Bytes); 10-255: Reserved
5 | Number of Objects | Unsigned Short Integer | N/A | 0: Reserved; 1-65,535
6 | Object 1 Type | Byte | N/A | Enumeration: 0: Point; 1: Line; 2: Polygon; 3-255: Reserved
7 | Object 1 Buffer | Float | Meters
8 | Object 1 Number of Feature Classes | Byte | N/A
9 | Object 1 Feature Class 1 | Unsigned Short Integer | N/A | Enumeration: 0-65,534: See Feature Class Table; 65,535: Reserved
 | Object 1 Feature Class m | Unsigned Short Integer | N/A | Enumeration: 0-65,534: See Feature Class Table; 65,535: Reserved

Table 4-16. Continued
Field # | Name | Type | Units | Interpretation
 | Object 1 Feature Class Attribute 1 | Varies (see field 4) | Varies with Feature Class
 | Object 1 Feature Class Attribute m | Varies (see field 4) | Varies with Feature Class
 | Object 1 Point 1 Latitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -90, Upper Limit = 90
 | Object 1 Point 1 Longitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -180, Upper Limit = 180
 | Object 1 Point n Latitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -90, Upper Limit = 90
 | Object 1 Point n Longitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -180, Upper Limit = 180
 | Object p Type | Byte | N/A | Enumeration: 0: Point; 1: Line; 2: Polygon; 3-255: Reserved
 | Object p Buffer | Float | Meters
 | Object p Number of Feature Classes | Byte | N/A
 | Object p Feature Class 1 | Unsigned Short Integer | N/A | Enumeration: 0-65,534: See Feature Class Table; 65,535: Reserved

Table 4-16. Continued
Field # | Name | Type | Units | Interpretation
 | Object p Feature Class m | Unsigned Short Integer | N/A | Enumeration: 0-65,534: See Feature Class Table; 65,535: Reserved
 | Object p Feature Class Attribute 1 | Varies (see field 4) | Varies with Feature Class
 | Object p Feature Class Attribute m | Varies (see field 4) | Varies with Feature Class
 | Object p Point r Latitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -90, Upper Limit = 90
 | Object p Point r Longitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -180, Upper Limit = 180

Table 4-17. Presence vector for create vector knowledge store objects message
Vector Bit: 7 6 5 4 3 2 1 0
Data Field: R R R R R R R 7

4.2.4.2 Code F021h: Set vector knowledge store feature class metadata
As described in Section 4.1, metadata are data about data. The Code F021h: Set Vector Knowledge Store Feature Class Metadata message (Table 4-18) allows a user to create, modify, and delete feature class metadata. At present, the format of these metadata is not specified; it is left to the system designer to develop a convention. Initially these data are intended for use by human operators. In the future, a schema may be defined to provide a standard metadata format that can be parsed and used by unmanned systems without human intervention.
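Although the metadata format itself is left to the system designer, the message framing is concrete. The following is a minimal sketch, not part of the standard, of serializing the body of a Set Feature Class Metadata message in Python; the little-endian wire order and ASCII encoding are assumptions made here for illustration.

```python
import struct

# Sketch: serialize a Set Feature Class Metadata style message body:
# an options byte, an unsigned-short feature class, an unsigned-short
# character count, and then the metadata string itself.
# Little-endian wire order and ASCII encoding are assumptions.
OVERWRITE = 2  # metadata option enumeration value from the message table

def pack_set_metadata(option, feature_class, text):
    encoded = text.encode('ascii')
    return struct.pack('<BHH', option, feature_class, len(encoded)) + encoded

body = pack_set_metadata(OVERWRITE, 12, "Obstacle layer for route A")
# The 1-byte option and two 2-byte fields precede the string payload.
```

The character-count field lets a receiver consume the variable-length string without a terminator; an Erase All message (option 255) would carry a count of zero and no string.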

Table 4-18. Set vector knowledge store feature class metadata message format
Field # | Name | Type | Units | Interpretation
1 | Metadata Options | Byte | N/A | Enumeration: 0: Append; 1: Prepend; 2: Overwrite; 3-254: Reserved; 255: Erase all metadata for the given feature class
2 | Feature Class | Unsigned Short Integer | N/A | Enumeration: 0-65,534: See Feature Class Table; 65,535: Reserved
3 | Number of String Characters | Unsigned Short Integer | N/A | 0-65,535; this field should be equal to zero only when field 1 is equal to 255 (Erase All)
4 | Feature Class Metadata | String | N/A | Variable-length string

4.2.4.3 Code F022h: Delete vector knowledge store objects
The Code F022h: Delete Vector Knowledge Store Objects message (Table 4-19) allows the deletion of objects from the vector knowledge store. It allows multiple vector objects to be deleted with a single message. Field 1 of this message is the presence vector (Table 4-20); fields 5 and 6 are the only optional fields in this message, and when they are included, they further limit the scope of the deletion. Field 2 of this message is the Local Request Identifier. Field 3 identifies the type of region that will be used to select the objects to delete. The number of vertices for this region is specified in field 4. Field 5 indicates the size of the region buffer to use with this message. Field 7 begins the definition of the vertices of the object deletion region.
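The region-plus-buffer selection used by this message can be pictured with a small geometric sketch. The illustration below is not from the standard: it treats coordinates as local meters rather than WGS84 degrees, and tests whether a stored point falls within the buffer distance of a polyline region.

```python
import math

# Sketch: membership test for a deletion region defined as a line with a
# buffer, i.e. a point is selected if it lies within `buffer_m` meters of
# any segment. A flat local-meters frame is assumed (no geodesy).
def dist_point_segment(p, a, b):
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Clamp the projection parameter so the closest point stays on the segment.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def in_buffered_line(p, vertices, buffer_m):
    return any(dist_point_segment(p, vertices[i], vertices[i + 1]) <= buffer_m
               for i in range(len(vertices) - 1))

route = [(0.0, 0.0), (100.0, 0.0)]
assert in_buffered_line((50.0, 3.0), route, 5.0)       # 3 m off the line
assert not in_buffered_line((50.0, 30.0), route, 5.0)  # outside the buffer
```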

Table 4-19. Delete vector knowledge store objects message format
Field # | Name | Type | Units | Interpretation
1 | Presence Vector | Byte | N/A | See mapping table below
2 | Local Request ID | Byte | N/A | Request identifier to be used when returning confirmation to the requesting component
3 | Region Type | Byte | N/A | Enumeration: 0: Point; 1: Line; 2: Polygon; 3-255: Reserved
4 | Number of Region Points | Unsigned Short Integer | N/A | 0: Reserved; 1-65,535
5 | Region Buffer | Float | Meters
6 | Feature Class | Unsigned Short Integer | N/A | Enumeration: 0-65,534: See Feature Class Table; 65,535: ALL
7 | Deletion Region Point 1 Latitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -90, Upper Limit = 90
8 | Deletion Region Point 1 Longitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -180, Upper Limit = 180
2n + 5 | Deletion Region Point n Latitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -90, Upper Limit = 90
2n + 6 | Deletion Region Point n Longitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -180, Upper Limit = 180
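The optional fields of this message are governed by its presence vector, whose bit-to-field mapping is given in Table 4-20 (bit 0 marks field 5, bit 1 marks field 6). A minimal sketch of testing those flags, with helper names that are illustrative rather than part of the standard:

```python
# Sketch: interpret the presence vector of the delete-objects message.
# Per its presence-vector mapping table, bit 0 marks optional field 5
# (Region Buffer) and bit 1 marks optional field 6 (Feature Class).
FIELD_BITS = {5: 0, 6: 1}  # field number -> presence-vector bit

def field_present(presence_vector, field_number):
    return bool(presence_vector & (1 << FIELD_BITS[field_number]))

pv = 0b00000010          # only the Feature Class field is included
assert not field_present(pv, 5)
assert field_present(pv, 6)
```

A receiver would consult these flags before parsing, since omitted optional fields shift the byte offsets of everything that follows.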

Table 4-20. Presence vector for delete vector knowledge store objects message
Vector Bit: 7 6 5 4 3 2 1 0
Data Field: R R R R R R 6 5

4.2.4.4 Code F220h: Query vector knowledge store objects
The Code F220h: Query Vector Knowledge Store Objects message (Table 4-21) allows access to objects within the vector knowledge store. Field 1 of this message is the presence vector (Table 4-22); fields 6, 7, and 8 are the only optional fields in this message, and when they are included, they further limit the scope of the query. Field 2 of this message is the Local Request Identifier. Field 3 is a bit field that sets the query response properties: if bit zero is set, then the response shall include only the first three fields of the Code F422h: Report Vector Knowledge Store Objects message. Field 4 identifies the type of region that will be used to limit the query. The number of vertices for this region is specified in field 5. Field 6 indicates the size of the region buffer to use with this message. Field 8 begins the definition of the vertices of the object query region; if this field is not present, the query scope shall be the entire knowledge store.

Table 4-21. Query vector knowledge store objects message format
Field # | Name | Type | Units | Interpretation
1 | Presence Vector | Unsigned Short Integer | N/A | See mapping table below
2 | Local Request ID | Byte | N/A | Request identifier to be used when returning data to the requesting component
3 | Query Properties | Byte | N/A | Bit Field: 0: Only return the number of responses that would be transmitted; 1-7: Reserved

Table 4-21. Continued
Field # | Name | Type | Units | Interpretation
4 | Region Type | Byte | N/A | Enumeration: 0: Point; 1: Line; 2: Polygon; 3-255: Reserved
5 | Number of Region Points | Unsigned Short Integer | N/A | 0: Reserved; 1-65,535
6 | Region Buffer | Float | Meters
7 | Feature Class | Unsigned Short Integer | N/A | Enumeration: 0-65,534: See Feature Class Table; 65,535: All Feature Classes
8 | Query Region Point 1 Latitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -90, Upper Limit = 90
9 | Query Region Point 1 Longitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -180, Upper Limit = 180
2n + 6 | Query Region Point n Latitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -90, Upper Limit = 90
2n + 7 | Query Region Point n Longitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -180, Upper Limit = 180

Table 4-22. Presence vector for query vector knowledge store objects message
Vector Bit: 7 6 5 4 3 2 1 0
Data Field: R R R R R 8 7 6

4.2.4.5 Code F221h: Query vector knowledge store feature class metadata
The Code F221h: Query Vector Knowledge Store Feature Class Metadata message (Table 4-23) should cause the Vector Knowledge Store to reply to the requestor with the

Code F421h: Report Vector Knowledge Store Feature Class Metadata message. There is a single field associated with this message, which specifies the feature class whose metadata are to be returned in the reply. There is also an option to return metadata for all feature classes present in the queried vector knowledge store.

Table 4-23. Query vector knowledge store feature class metadata message format
Field # | Name | Type | Units | Interpretation
1 | Feature Class | Unsigned Short Integer | N/A | Enumeration: 0-65,534: See Feature Class Table; 65,535: All

4.2.4.6 Code F222h: Query vector knowledge store bounds
The Code F222h: Query Vector Knowledge Store Bounds message (Table 4-24) is used to request the spatial extents of a single feature class or of all feature classes within a vector knowledge store. The knowledge store should respond with the Code F423h: Report Vector Knowledge Store Bounds message. The bounds are represented by two points that define the rectangular region that just covers all of the data within the feature class layer or layers.

Table 4-24. Query vector knowledge store bounds message format
Field # | Name | Type | Units | Interpretation
1 | Local Request ID | Byte | N/A | Request identifier to be used when returning data to the requesting component
2 | Feature Class | Unsigned Short Integer | N/A | Enumeration: 0-65,534: See Feature Class Table; 65,535: All Feature Classes

4.2.4.7 Code F620h: Vector knowledge store event notification request
The Code F620h: Vector Knowledge Store Event Notification Request message is used to establish an event-triggered query within the knowledge store. Therefore, this

message is formatted exactly the same as the Code F220h: Query Vector Knowledge Store Objects message, which should be referenced for the format of this message. Whenever the criteria established in this message are met, the vector knowledge store should transmit the Code F820h: Vector Knowledge Store Event Notification message with the appropriate data attached.

4.2.4.8 Code F621h: Vector knowledge store bounds change event notification request
The Code F621h: Vector Knowledge Store Bounds Change Event Notification Request message is used to establish an event-triggered response that notifies the requesting component when the data in a feature class extend past the bounds they had when the initial request was sent. When the extents of the data change, the vector knowledge store will transmit the Code F821h: Vector Knowledge Store Bounds Change Event Notification message.

4.2.4.9 Code F023h: Terminate vector knowledge store data transfer
The Code F023h: Terminate Vector Knowledge Store Data Transfer message is a command class message that should cause the vector knowledge store to immediately terminate the transfer of all current and outstanding data destined for the requesting component. Upon termination, the vector knowledge store should send the requestor the Code F424h: Report Vector Knowledge Store Data Transfer Termination message.

4.2.5 Vector Knowledge Store Output Message Set
Below are the messages that define the output methods for the vector version of the world model knowledge store.
Outputs:
The JAUS core output message set
Code F420h: Report vector knowledge store object(s) creation

Code F421h: Report vector knowledge store feature class metadata
Code F422h: Report vector knowledge store objects
Code F423h: Report vector knowledge store bounds
Code F820h: Vector knowledge store event notification
Code F821h: Vector knowledge store bounds change event notification
Code F424h: Report vector knowledge store data transfer termination

4.2.5.1 Code F420h: Report vector knowledge store object(s) creation
The Code F420h: Report Vector Knowledge Store Object(s) Creation message (Table 4-25) is used to confirm the creation of objects in the vector knowledge store. This message is sent only when confirmation is requested by setting bit zero in the Code F020h: Create Vector Knowledge Store Objects message. If that bit is set, this message is transmitted and the local request identifier (field 1) is set to the value sent with the Code F020h: Create Vector Knowledge Store Objects message.

Table 4-25. Report vector knowledge store object(s) creation message format
Field # | Name | Type | Units | Interpretation
1 | Local Request ID | Byte | N/A | Local request identifier sent by the creating component

4.2.5.2 Code F421h: Report vector knowledge store feature class metadata
The Code F421h: Report Vector Knowledge Store Feature Class Metadata message (Table 4-26) allows access to feature class metadata stored within the vector knowledge store. It is transferred in response to the Code F221h: Query Vector Knowledge Store Feature Class Metadata message. If the query message requests all feature classes, a separate message should be sent for each feature class. These metadata are entered using the Code F021h: Set Vector Knowledge Store Feature Class Metadata message.

Table 4-26. Report vector knowledge store feature class metadata message format
Field # | Name | Type | Units | Interpretation
1 | Feature Class | Unsigned Short Integer | N/A | Enumeration: 0-65,534: See Feature Class Table

Table 4-26. Continued
Field # | Name | Type | Units | Interpretation
2 | Number of String Characters | Unsigned Short Integer | N/A | 0-65,535
3 | Feature Class Metadata | String | N/A | Variable-length string

4.2.5.3 Code F422h: Report vector knowledge store objects
The Code F422h: Report Vector Knowledge Store Objects message (Table 4-27) is sent in direct response to a Code F220h: Query Vector Knowledge Store Objects message. Field 1 is a presence vector that informs the receiving component as to whether or not data are included with the message: if bit zero is set, then data should be expected after message field 3. Field 2 of this message is the Local Request Identifier sent with the query that initiated this report message. Field 3 indicates the number of vector objects included in the message. Field 4 begins the definition of a single vector object. Each vector object is defined by its type (point, line, or polygon), its buffer, its feature class, the attribute data type and attribute value for that feature class, and the global coordinates of the vertices of the object. These fields are repeated for each object reported in this message.

Table 4-27. Report vector knowledge store objects message format
Field # | Name | Type | Units | Interpretation
1 | Presence Vector | Byte | N/A | Bit Field: Bit 0: if data are present after field 3, this bit should be set (based on the parameters in the received Query Vector Knowledge Store Objects message); Bits 1-7: Reserved

Table 4-27. Continued
Field # | Name | Type | Units | Interpretation
2 | Local Request ID | Byte | N/A | Request identifier to be used when returning confirmation to the requesting component
3 | Number of Objects | Unsigned Short Integer | N/A | 0: Reserved; 1-65,535
4 | Object 1 Type | Byte | N/A | Enumeration: 0: Point; 1: Line; 2: Polygon; 3-255: Reserved
5 | Object 1 Buffer | Float | Meters
6 | Object 1 Feature Class | Unsigned Short Integer | N/A | Enumeration: 0-65,534: See Feature Class Table; 65,535: Reserved
7 | Object 1 Feature Class Attribute Data Type | Byte | N/A | Enumeration: 0: Byte; 1: Short Integer; 2: Integer; 3: Long Integer; 4: Unsigned Short Integer; 5: Unsigned Integer; 6: Unsigned Long Integer; 7: Float; 8: Long Float; 9: RGB (3 Bytes); 10-255: Reserved
8 | Object 1 Feature Class Attribute | Varies (see field 7) | Varies with Feature Class
9 | Number of Points for Object 1 | Unsigned Short Integer | N/A | 0: Reserved; 1-65,535
10 | Object 1 Point 1 Latitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -90, Upper Limit = 90

Table 4-27. Continued
Field # | Name | Type | Units | Interpretation
11 | Object 1 Point 1 Longitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -180, Upper Limit = 180
2m + 8 | Object 1 Point m Latitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -90, Upper Limit = 90
2m + 9 | Object 1 Point m Longitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -180, Upper Limit = 180
2m + 10 | Object n Type | Byte | N/A | Enumeration: 0: Point; 1: Line; 2: Polygon; 3-255: Reserved
2m + 11 | Object n Buffer | Float | Meters
2m + 12 | Object n Feature Class | Unsigned Short Integer | N/A | Enumeration: 0-65,534: See Feature Class Table; 65,535: Reserved
2m + 13 | Object n Feature Class Attribute Data Type | Byte | N/A | Enumeration: 0: Byte; 1: Short Integer; 2: Integer; 3: Long Integer; 4: Unsigned Short Integer; 5: Unsigned Integer; 6: Unsigned Long Integer; 7: Float; 8: Long Float; 9: RGB (3 Bytes); 10-255: Reserved
2m + 14 | Object n Feature Class Attribute | Varies (see field 2m + 13) | Varies with Feature Class

Table 4-27. Continued
Field # | Name | Type | Units | Interpretation
2m + 15 | Number of Points for Object n | Unsigned Short Integer | N/A | 0: Reserved; 1-65,535
2m + 16 | Object n Point 1 Latitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -90, Upper Limit = 90
2m + 17 | Object n Point 1 Longitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -180, Upper Limit = 180
 | Object n Point k Latitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -90, Upper Limit = 90
 | Object n Point k Longitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -180, Upper Limit = 180

4.2.5.4 Code F423h: Report vector knowledge store bounds
The Code F423h: Report Vector Knowledge Store Bounds message format is shown in Table 4-28. This message reports the bounds as a response to the Query Vector Knowledge Store Bounds message. In this message, the knowledge store returns the two geographic points that represent the extents of the data within a feature class layer or all feature class layers.

Table 4-28. Report vector knowledge store bounds message format
Field # | Name | Type | Units | Interpretation
1 | Southwest Point Latitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -90, Upper Limit = 90
2 | Southwest Point Longitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -180, Upper Limit = 180

Table 4-28. Continued
Field # | Name | Type | Units | Interpretation
3 | Northeast Point Latitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -90, Upper Limit = 90
4 | Northeast Point Longitude (WGS84) | Integer | Degrees | Scaled Integer; Lower Limit = -180, Upper Limit = 180

4.2.5.5 Code F820h: Vector knowledge store event notification
The Code F820h: Vector Knowledge Store Event Notification message is an event-triggered message that is sent in response to the Code F620h: Vector Knowledge Store Event Notification Request message. The format of this message is identical to that of the Code F422h: Report Vector Knowledge Store Objects message.

4.2.5.6 Code F821h: Vector knowledge store bounds change event notification
The Code F821h: Vector Knowledge Store Bounds Change Event Notification message is an event-triggered message that is sent in response to the Code F621h: Vector Knowledge Store Bounds Change Event Notification Request message. It is transmitted to the requesting component each time the spatial extents of a feature class or feature classes (as specified in the event notification request message) change. The format of this message is identical to that of the Code F423h: Report Vector Knowledge Store Bounds message.

4.2.5.7 Code F424h: Report vector knowledge store data transfer termination
The Code F424h: Report Vector Knowledge Store Data Transfer Termination message notifies other JAUS components that the transfer of data to them, whether in progress or pending, has been stopped. This message is sent in response to the Code F023h: Terminate Vector Knowledge Store Data Transfer message. It is also sent

whenever data transfer is interrupted due to a change in the component state, as discussed in Section 4.2.1.
The messages presented in the preceding sections offer a solution to world modeling within the context of the Joint Architecture for Unmanned Systems (JAUS). The defined messages allow the raster and vector versions of the knowledge store to receive and transmit formatted geospatial data. Because the underlying geometry of most geospatial data is based on raster or vector objects, the JAUS World Model components are able to support most types of geospatial data, including those presented in Chapter 2.
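All geographic coordinates in these messages are carried as scaled integers with stated lower and upper limits. As a closing illustration, the sketch below shows one common JAUS-style convention, mapping a bounded real range linearly onto the full signed 32-bit integer range; the exact rounding behavior is an assumption here, not a quote from the specification.

```python
# Sketch: encode/decode a WGS84 coordinate as a scaled 32-bit integer.
# Assumption: the real range [lo, hi] maps linearly onto the full signed
# 32-bit range, as in common JAUS scaled-integer conventions.
INT_MIN, INT_MAX = -2**31, 2**31 - 1

def to_scaled_int(value, lo, hi):
    frac = (value - lo) / (hi - lo)          # 0.0 .. 1.0 across the range
    return round(frac * (INT_MAX - INT_MIN)) + INT_MIN

def from_scaled_int(scaled, lo, hi):
    frac = (scaled - INT_MIN) / (INT_MAX - INT_MIN)
    return lo + frac * (hi - lo)

lat = 29.6516  # example latitude in degrees
decoded = from_scaled_int(to_scaled_int(lat, -90.0, 90.0), -90.0, 90.0)
assert abs(decoded - lat) < 1e-6             # round trip within scaling error
```

With this mapping, a latitude field resolves 180 degrees over roughly 4.3 billion steps, i.e. a quantization on the order of tens of nanodegrees, which is far finer than any field sensor requires.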

CHAPTER 5
CONCLUSIONS AND FUTURE WORK
5.1 Conclusions
Our study focused on standardization of an interface between unmanned systems. Specifically, it focused on standardizing the interface between different types of geospatial data stores within the JAUS framework. In responding to the research problem stated in Chapter 1, a review of relevant literature was conducted. Next, a system of distributed modular sensor processing units, called the Smart Sensor Architecture, was developed. Its development laid the foundation for the world model message set presented in Chapter 4. This generic JAUS message set was developed to allow the transfer of basic forms of both raster and vector formatted geospatial data.
The interfaces to the world model knowledge stores introduced in Chapter 4 present a standardized method for communicating geospatial data. The application possibilities of these messages are extensive. While the message set is useful to a single unmanned system for mapping its environment, it is particularly useful for collaborative robotics tasks. For example, an unmanned system with a powerful sensor suite could be used to map an area for obstacles. That map, or world model, could then be shared with other unmanned systems, allowing them to traverse the region with minimal or no sensors of their own, in essence sharing the resources of another unmanned system.
It should be clear that our study is not a final solution to the question of how to share geospatial data between multiple unmanned systems. It is a first step in a long

process of defining a standard for the growing JAUS community. The consequences of the work presented herein could be far reaching. Imagine a class of unmanned systems with a shared language: a standard method of communicating with other unmanned systems. Unmanned systems that are able, despite having been developed by different vendors, to interoperate with minimal effort. With JAUS, this is becoming a reality. Our study is a significant contribution to the JAUS Working Group's effort to develop the next generation of intelligent JAUS systems.
5.2 Future Work
Since the problem addressed by our study is open-ended, this work must, and most certainly will, continue. There is no single best way to model or share geospatial data; what is important is that all concerned parties reach consensus on how to do it. The results of our study therefore provide a base upon which the JAUS community can build. As with any new component added to the JAUS Reference Architecture, the component messages presented herein must be vetted by all interested parties within the JAUS community. Only after approval by the JAUS Reference Architecture Committee and the JAUS Working Group as a whole will they be adopted.
Chapter 4 presents two separate methods for modeling and sharing both raster and vector geospatial data. What was not addressed is how to bridge the two modeling methods. Converting vector data to raster format is trivial: it simply requires the projection of the points along the edges of the vector object into a grid of high enough resolution to accurately represent the vector objects. The more difficult side of this bridge is the conversion from raster to vector. Because transfer and storage of raster data are very expensive, this conversion is of particular importance when it comes to sharing data between unmanned systems. Raster data requires a large amount of

bandwidth during transmission. For example, even when large areas of raster formatted geospatial data share similar values, each cell must still be transmitted. A possible approach to the raster to vector conversion problem is the use of the Level Set Method developed by Sethian [27]. This approach will be explored as a possible solution to this problem.
Another future extension of the standard presented in Chapter 4 is the addition of support for more projected coordinate systems. For the sake of simplicity, this message set requires all global coordinates to be based on the World Geodetic System 1984 (WGS84) and the projected coordinate system to be based on the Universal Transverse Mercator (UTM) projection. As discussed in Chapter 2, there are benefits to the use of different types of projected coordinate systems. UTM is a good general-purpose transformation, but system developers may want or need to use another projection that preserves the features most important to them. The standard should grow not only to support data from different projections, but to allow systems to distinguish and convert between them.
One of the most often discussed issues with JAUS is that it is not a very flexible or extensible architecture. As this document attempts a first step at bridging the GIS and unmanned systems communities, it is expected that the World Model subcommittee of JAUS will make an effort to bring in members of the GIS community. There is a wealth of knowledge and contributions to be gained from both the unmanned systems and GIS communities. This is perhaps the most important continuation plan for this work.
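The bandwidth concern raised at the start of this section can be made concrete with a small illustration. Run-length encoding is not part of the message set defined in Chapter 4; the sketch below simply shows how much redundancy per-cell transmission carries when large raster areas share one value.

```python
# Illustration (not part of the message set): run-length encode one raster
# row to show the redundancy of transmitting every cell of a mostly
# uniform grid individually.
def rle_encode(cells):
    runs = []
    for value in cells:
        if runs and runs[-1][0] == value:
            runs[-1][1] += 1            # extend the current run
        else:
            runs.append([value, 1])     # start a new (value, count) run
    return runs

row = [0] * 500 + [7] * 3 + [0] * 497   # mostly free space, one small obstacle
assert len(row) == 1000                 # 1,000 cell values on the wire as-is
assert rle_encode(row) == [[0, 500], [7, 3], [0, 497]]  # versus 3 runs
```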


REFERENCES

[1] American National Standards Institute, Spatial Data Transfer Standard (SDTS) Part 1: Logical Specification. Washington, D.C.: American National Standards Institute, 1997.

[2] American National Standards Institute, Spatial Data Transfer Standard (SDTS) Part 2: Spatial Features. Washington, D.C.: American National Standards Institute, 1997.

[3] American National Standards Institute, Spatial Data Transfer Standard (SDTS) Part 3: ISO 8211 Encoding. Washington, D.C.: American National Standards Institute, 1997.

[4] American National Standards Institute, Spatial Data Transfer Standard (SDTS) Part 4: Topological Vector Profile. Washington, D.C.: American National Standards Institute, 1997.

[5] American National Standards Institute, Spatial Data Transfer Standard (SDTS) Part 6: Point Profile. Washington, D.C.: American National Standards Institute, 1998.

[6] American National Standards Institute, Spatial Data Transfer Standard (SDTS) Part 5: Raster Profile. Washington, D.C.: American National Standards Institute, 1999.

[7] P. Bolstad, GIS Fundamentals: A First Text on Geographic Information Systems. White Bear Lake, MN: Eider Press, 2002.

[8] J. Borenstein, "The Vector Field Histogram - Fast Obstacle Avoidance for Mobile Robots," IEEE Journal of Robotics and Automation, vol. 7, no. 3, pp. 278-288, 1991.

[9] K. Brown, Datums and Projections: A Brief Guide, United States Geological Survey Center for Biological Informatics. Reston, VA: United States Geological Survey, 1999.

[10] M. Crosetto, B. Crippa, "Optical and Radar Data Fusion for DEM Generation," IAPRS GIS Between Vision and Applications, vol. 32, no. 4, pp. 128-134, Stuttgart, Germany, 1998.

[11] S. Cox, P. Daisey, R. Lake, C. Portele, A. Whiteside, OpenGIS Geography Markup Language Implementation Specification, OpenGIS Consortium, Inc., 2004.

[12] G. Dudek, M. Jenkin, Computational Principles of Mobile Robotics. New York, NY: Cambridge University Press, 2002.


[13] A. Elfes, "Using Occupancy Grids for Mobile Robot Perception and Navigation," Computer, vol. 22, no. 6, pp. 46-57, 1989.

[14] ESRI, Understanding Map Projections: GIS by ESRI, ESRI. Redlands, CA, 2004.

[15] T. Hong, M. Abrams, T. Chang, M. Shneier, "An Intelligent World Model for Autonomous Off-Road Driving." Gaithersburg, MD: NIST Intelligent Systems Division, 2001.

[16] JAUS Working Group, Joint Architecture for Unmanned Systems (JAUS) Version 3.1, Volume 2, The Joint Architecture for Unmanned Systems. http://www.jauswg.org April 2004.

[17] R. Lake, Enabling the Geo-spatial Web, Galdos Systems Inc. Vancouver, CA, 2001.

[18] A. Meystel, J. Albus, Intelligent Systems: Architecture, Design, and Control. New York, NY: John Wiley & Sons, 2002.

[19] R. Murphy, Introduction to AI Robotics. Cambridge, MA: The MIT Press, 2000.

[20] D. Murray, J. Little, "Using Real-Time Stereo Vision for Mobile Robot Navigation," Autonomous Robots, vol. 8, no. 2, pp. 151-171, 2000.

[21] O. R. Musin, "Towards 3/4-D GIS." Moscow, Russia: Moscow State University, Department of Cartography and Geoinformatics, 1998.

[22] D. Novick, "Implementation of a Sensor Fusion-Based Object Detection Component for an Autonomous Outdoor Vehicle," Doctor of Philosophy Dissertation. Department of Mechanical Engineering: University of Florida, 2002.

[23] A. Pasha, "Path Planning for Nonholonomic Vehicles and Its Application to Radiation Environments," Master of Science Thesis. Department of Mechanical Engineering: University of Florida, 2003.

[24] J. Postel, User Datagram Protocol, Request for Comments 768, USC Information Sciences Institute. Marina del Rey, CA, August 1980.

[25] P. Rigaux, M. Scholl, A. Voisard, Spatial Databases with Application to GIS. San Francisco, CA: Morgan Kaufmann, 2002.

[26] J. Rosenblatt, "DAMN: A Distributed Architecture for Mobile Navigation," Doctor of Philosophy Dissertation. Robotics Institute: Carnegie Mellon University, 1997.

[27] J. A. Sethian, Level Set Methods and Fast Marching Methods: Evolving Interfaces in Computational Geometry, Fluid Mechanics, Computer Vision, and Materials Science. Cambridge, UK: Cambridge University Press, 1999.

[28] Sick, Inc., LMS200/LMS211/LMS220/LMS221/LMS291 Laser Measurement Systems Technical Description, Sick, Inc. Minneapolis, MN, 2003.

[29] United States Geological Survey, U.S. GeoData Digital Line Graphs Fact Sheet. Reston, VA: United States Geological Survey, 1996.


[30] United States Geological Survey, National Mapping Division, National Mapping Program Technical Instructions: Part 1 General Standards for Digital Elevation Models. Reston, VA: United States Geological Survey, 1997.

[31] United States Geological Survey, Digital Raster Graphics Fact Sheet. Reston, VA: United States Geological Survey, 1999.

[32] United States Geological Survey, Spatial Data Transfer Standard (SDTS) Fact Sheet. Reston, VA: United States Geological Survey, 1999.

[33] United States Geological Survey, U.S. GeoData Digital Elevation Models Fact Sheet. Reston, VA: United States Geological Survey, 2000.

[34] United States Geological Survey, U.S. GeoData Digital Orthophoto Quadrangles Fact Sheet. Reston, VA: United States Geological Survey, 2001.

[35] Videre Design, STH-MD1/-C Stereo Head User's Manual, Videre Design. Menlo Park, CA, 2001.

[36] Wikipedia Contributors, "Sea Level," http://en.wikipedia.org/wiki/Sea_level Wikipedia: The Free Encyclopedia. (December 15, 2004).


BIOGRAPHICAL SKETCH

Carl P. Evans III was born on May 15, 1978, in Salisbury, Maryland. As a child and teenager, he was very inquisitive and had an appreciation for science and technology (particularly robotics, albeit toy robotics). He was first exposed to programming when his grandmother purchased a Commodore 64 computer for him. The computer came with very little software so, out of necessity, he wrote his own. Building on the vast experience he gained writing Commodore 64 code, Mr. Evans took two years of programming in high school. This work culminated in Carl's Operating System (COS), a Pascal-based graphical user interface that ran on top of Microsoft DOS.

Motivated by his desire to develop his own version of Short Circuit's Johnny Five, Mr. Evans decided to pursue a degree in engineering. Not knowing which field of engineering (electrical or mechanical) he wanted to study, in 1996 Mr. Evans enrolled in the unique Electromechanical Engineering (BELM) program at Wentworth Institute of Technology in Boston, MA (the only ABET-accredited interdisciplinary Electromechanical Engineering program in the United States). A campus leader, Mr. Evans served as chapter president of ASME and IEEE, Institute President's Host, and Resident Assistant. In 2001, Mr. Evans graduated, cum laude, with his B.S. degree and gave the student commencement address to his fellow graduates.

As part of the WIT Electromechanical Engineering program's graduation requirements, Mr. Evans worked two co-op semesters to supplement his studies with practical experience. He earned the honor of being an engineer assistant at Foster-Miller,


Inc., in Waltham, MA. At FMI, the range of Mr. Evans' work was as broad as his education. As a student, he earned the level of respect and responsibility granted to Staff Engineers. Upon graduation, Mr. Evans was granted a double promotion to the position of Design Engineer, bypassing the position of Staff Engineer. To this day, his contributions live on at Foster-Miller and are helping in the current war on terrorism.

In January of 2002, Mr. Evans joined Dr. Carl Crane at the University of Florida's Center for Intelligent Machines and Robotics. While at the University of Florida, he was a member of the Unmanned Systems research team, funded by the Air Force Research Laboratory (Panama City, FL). He was a member of the CIMAR DARPA Grand Challenge team and led the perception team. In August of 2005, he graduated with his Master of Science degree.

A Commonwealth of Massachusetts Engineer Intern since 2001, Mr. Evans plans to sit for the Professional Engineer (PE) exam with a concentration in electrical engineering. Mr. Evans is a first-generation college graduate. He owes all that he has achieved to his loving grandparents, Lottie and James Patterson. His future educational plans include attending law school, with an ultimate goal of one day becoming a politician.

In June 2004, Mr. Evans joined Applied Perception, Inc. (API) in Wexford, PA, as a Senior Engineer. At API, he will continue his involvement with JAUS and the development of perception systems, world modeling methods, and collaborative technologies for unmanned systems.


Permanent Link: http://ufdc.ufl.edu/UFE0009320/00001

Material Information

Title: Development of a Geospatial Data-Sharing Method for Unmanned Vehicles Based on the Joint Architecture for Unmanned Systems (JAUS)
Physical Description: Mixed Material
Copyright Date: 2008

Record Information

Source Institution: University of Florida
Holding Location: University of Florida
Rights Management: All rights reserved by the source institution and holding location.
System ID: UFE0009320:00001















DEVELOPMENT OF A GEOSPATIAL DATA-SHARING METHOD FOR
UNMANNED VEHICLES BASED ON THE JOINT ARCHITECTURE FOR
UNMANNED SYSTEMS (JAUS)
















By

CARL PRESTON EVANS, III


A THESIS PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
MASTER OF SCIENCE

UNIVERSITY OF FLORIDA


2005

































Copyright 2005

by

Carl Preston Evans, III

































"It is difficult to say what is impossible, for the dream of yesterday is the hope of today
and the reality of tomorrow."

-Dr. Robert H. Goddard

































To my grandparents, Lottie and James Patterson, for their never-ending, unconditional
love and support. Because of them, I have been able to see and do things that they
were unable to. In the words of the great Albert Einstein, I have seen farther because
they have allowed me to stand on their shoulders.















ACKNOWLEDGMENTS

First and foremost, I would like to thank my committee chair (Dr. Carl D. Crane

III) for his patience with me during the process of finishing this work, and for giving me

the great opportunity to study and do my master's research at the University of Florida's

Center for Intelligent Machines and Robotics. Dr. Crane and I have had a professional

relationship since 2000, when we met at a Joint Architecture for Unmanned Systems

(JAUS) working group meeting. I look forward to a continued professional relationship

as we continue to research the vast field of autonomous systems. Thanks also go to my

other committee members (Drs. John Schueller and Christopher Niezrecki) for their

valuable contributions to this study.

Thanks also go out to Mr. Dan Deguire, Project Manager in Foster-Miller's Design

and Systems Integration (DSI) Group. I had the honor of working for Dan for

two-and-a-half years, starting as a co-op student and ending as a Design Engineer. While

at Foster-Miller, Dan funded my first foray into obstacle detection and autonomous

navigation (my undergraduate senior design project). I appreciate the friendship that we

continue to share.

Special thanks go out to the Center for Intelligent Machines and Robotics

(CIMAR) Team CIMAR, competitors in the inaugural Defense Advanced Research

Projects Agency (DARPA) Grand Challenge held in March 2004. I particularly thank

those who were on (or otherwise contributed to the work of) the perception team: Mel Torrie,









Sarah Gray, Kristopher Klingler, Charles Smith, Sanjay Solanki, Danny Kent, Erica

Zawodny-McArthur, and Donald McArthur.

Thanks also go out to the members of the JAUS World Model Subcommittee for

making each of my meetings feel like a thesis defense. Their valuable suggestions helped

to shape this document, particularly Chapter 4.

I thank Chad Tobler (a friend and biker buddy, late in my years at CIMAR) for

keeping me sane and for providing valuable insight during the writing of this study.

Thanks also go to my great friend Matthew Bokach from the School of Natural

Resources and Environment at the University of Florida. I thank him for being there for

me in many ways both personally and professionally. I especially appreciate his help

with this study. From the day that he told me that two points of equal latitude, longitude,

and elevation are not always coincident, I have learned much from him.

Of course I must thank my new coworkers (Todd, Parag, Patrick, and Mark) at

Applied Perception, Inc. (API), for giving me a hard time for taking so long to finish this

study, but more importantly for their kind words of support. I look forward to many

successful years with them at API.

Last but certainly not least, thanks go to the Air Force Research Laboratory at

Tyndall Air Force Base, FL, for funding this work. For over a decade, they have been

supporters of the work done by the talented group of roboticists at the University of

Florida's Center for Intelligent Machines and Robotics. Without their support, this study

could not have been completed. I am thankful for their support, and I look forward to

continued work with them in the future.
















TABLE OF CONTENTS

ACKNOWLEDGMENTS

LIST OF TABLES

LIST OF FIGURES

ABSTRACT

CHAPTER

1 INTRODUCTION AND RESEARCH PROBLEM

    1.1 Introduction
    1.2 Research Problem

2 REVIEW OF RELEVANT LITERATURE

    2.1 Joint Architecture for Unmanned Systems (JAUS)
        2.1.1 Tenets of JAUS
        2.1.2 System Structure of JAUS
        2.1.3 World Model Subcommittee for JAUS
    2.2 Real-Time World Modeling Methods
        2.2.1 Raster Occupancy Grid
        2.2.2 Real Time Terrain Mapping
        2.2.3 Raster Traversability Grid
    2.3 A Priori World Modeling Methods
    2.4 Geographic Modeling Methods
        2.4.1 Global Coordinate Systems
        2.4.2 Projected Coordinate Systems
        2.4.3 Universal Transverse Mercator projection
    2.5 Georeferenced World Model Data
        2.5.1 Raster Data Stores
        2.5.2 Vector Data Stores
    2.6 Distributed World Modeling Methods
        2.6.1 Spatial Data Transfer Standard (SDTS)
        2.6.2 Geography Markup Language

3 SMART SENSORS

    3.1 Smart Sensor Architecture
    3.2 Smart Sensor Architecture Components
        3.2.1 Smart Sensor Component
        3.2.2 Smart Sensor Arbiter Component
        3.2.3 Reactive Planner Component
    3.3 Smart Sensor Messaging Architecture
        3.3.1 Smart Sensor Architecture Message Header
        3.3.2 Smart Sensor Architecture Message Set
            3.3.2.1 Report vehicle state message
            3.3.2.2 Report traversability grid update message
            3.3.2.3 Report region clutter index message
        3.3.3 Smart Sensor Architecture Network Communications
    3.4 Smart Sensor Implementation
        3.4.1 Abstraction of Smart Sensor Core Functionality
        3.4.2 Base Smart Sensor
    3.5 Smart Stereo Vision Sensor Implementation
        3.5.1 Stereo Vision Hardware
        3.5.2 Stereo Vision Software
        3.5.3 Smart Stereo Vision Sensor
    3.6 Use of Obstacle Detection and Free Space Sensors
    3.7 Smart Sensor Arbiter Implementation

4 JAUS WORLD MODEL KNOWLEDGE STORES

    4.1 Observations and Recommendations
        4.1.1 Raster and Vector Object Representation
    4.2 World Model Knowledge Store Message Set
        4.2.1 JAUS Core Input and Output Message Sets
        4.2.2 Raster Knowledge Store Input Message Set
            4.2.2.1 Code F000h: Create raster knowledge store object
            4.2.2.2 Code F001h: Set raster knowledge store feature class metadata
            4.2.2.3 Code F002h: Modify raster knowledge store object (cell update)
            4.2.2.4 Code F003h: Modify raster knowledge store object (grid update)
            4.2.2.5 Code F004h: Delete raster knowledge store objects
            4.2.2.6 Code F200h: Query raster knowledge store objects
            4.2.2.7 Code F201h: Query raster knowledge store feature class metadata
            4.2.2.8 Code F202h: Query raster knowledge store bounds
            4.2.2.9 Code F600h: Raster knowledge store event notification request
            4.2.2.10 Code F601h: Raster knowledge store bounds change event notification request
            4.2.2.11 Code F005h: Terminate raster knowledge store data transfer
        4.2.3 Raster Knowledge Store Output Message Set
            4.2.3.1 Code F400h: Report raster knowledge store object creation
            4.2.3.2 Code F401h: Report raster knowledge store feature class metadata
            4.2.3.3 Code F402h: Report raster knowledge store objects (cell update)
            4.2.3.4 Code F403h: Report raster knowledge store objects (grid update)
            4.2.3.5 Code F404h: Report raster knowledge store bounds
            4.2.3.6 Code F800h: Raster knowledge store event notification (cell update)
            4.2.3.7 Code F801h: Raster knowledge store event notification (grid update)
            4.2.3.8 Code F802h: Raster knowledge store bounds change event notification
            4.2.3.9 Code F405h: Report raster knowledge store data transfer termination
        4.2.4 Vector Knowledge Store Input Message Set
            4.2.4.1 Code F020h: Create vector knowledge store objects
            4.2.4.2 Code F021h: Set vector knowledge store feature class metadata
            4.2.4.3 Code F022h: Delete vector knowledge store objects
            4.2.4.4 Code F220h: Query vector knowledge store objects
            4.2.4.5 Code F221h: Query vector knowledge store feature class metadata
            4.2.4.6 Code F222h: Query vector knowledge store bounds
            4.2.4.7 Code F620h: Vector knowledge store event notification request
            4.2.4.8 Code F621h: Vector knowledge store bounds change event notification request
            4.2.4.9 Code F023h: Terminate vector knowledge store data transfer
        4.2.5 Vector Knowledge Store Output Message Set
            4.2.5.1 Code F420h: Report vector knowledge store object(s) creation
            4.2.5.2 Code F421h: Report vector knowledge store feature class metadata
            4.2.5.3 Code F422h: Report vector knowledge store objects
            4.2.5.4 Code F423h: Report vector knowledge store bounds
            4.2.5.5 Code F820h: Vector knowledge store event notification
            4.2.5.6 Code F821h: Vector knowledge store bounds change event notification
            4.2.5.7 Code F424h: Report vector knowledge store data transfer termination

5 CONCLUSIONS AND FUTURE WORK

    5.1 Conclusions
    5.2 Future Work

REFERENCES

BIOGRAPHICAL SKETCH
















LIST OF TABLES

Table

3-1  Standard JAUS sixteen-byte message header
3-2  Smart sensor architecture message header
3-3  Smart sensor architecture's report vehicle state message
3-4  Smart sensor architecture's report traversability grid updates message
3-5  Smart sensor architecture's report region clutter index message
3-6  Smart sensor components and their component identification numbers
4-1  Create raster knowledge store objects message format
4-2  Presence vector for create raster knowledge store objects message
4-3  Set raster knowledge store feature class metadata message format
4-4  Modify raster knowledge store object (cell update) message format
4-5  Modify raster knowledge store object (grid update) message format
4-6  Delete raster knowledge store objects message format
4-7  Query raster knowledge store objects message format
4-8  Presence vector for query raster knowledge store objects message
4-9  Query raster knowledge store feature class metadata message format
4-10 Query raster knowledge store bounds message format
4-11 Report raster knowledge store object creation message format
4-12 Report raster knowledge store feature class metadata message format
4-13 Report raster knowledge store objects (cell update) message format
4-14 Report raster knowledge store objects (grid update) message format
4-15 Report raster knowledge store bounds message format
4-16 Create vector knowledge store objects message format
4-17 Presence vector for create vector knowledge store objects message
4-18 Set vector knowledge store feature class metadata message format
4-19 Delete vector knowledge store objects message format
4-20 Presence vector for delete vector knowledge store objects message
4-21 Query vector knowledge store objects message format
4-22 Presence vector for query vector knowledge store objects message
4-23 Query vector knowledge store feature class metadata message format
4-24 Query vector knowledge store bounds message format
4-25 Report vector knowledge store object(s) creation message format
4-26 Report vector knowledge store feature class metadata message format
4-27 Report vector knowledge store objects message format
4-28 Report vector knowledge store bounds message format
















LIST OF FIGURES

Figure

1-1  Example of the processes leading up to higher-level planning
1-2  How our study fits into the higher-level planning process
2-1  System structure of JAUS
2-2  Path planning in a bounded radiation environment
2-3  Ellipsoidal model of Earth
2-4  Earth-centered global coordinate system
2-5  Example format of digital elevation model
2-6  An example of USGS source data
3-1  Team CIMAR NaviGATOR DARPA Grand Challenge entry vehicle
3-2  Organization of the smart sensor-based perception system
3-3  Minimally complete smart sensor-based perception system
3-4  Single sensor implementation of smart sensor-based perception system
3-5  Unmanned system coordinate system defined by JAUS
3-6  Center for Intelligent Machines and Robotics Navigation Test Vehicle 2
3-7  Smart sensor implementation
3-8  Call graph for all functions within the base smart sensor API
3-9  Call graph for smart sensor communications receive thread
3-10 Call graph for function that determines number of rows and columns to shift
3-11 Call graph for sensor-specific interface thread
3-12 Call graph for function used to detect changes in the traversability grid
3-13 Videre Design STH-MD1-C stereo camera head
3-14 Source data and results from stereo correlation
3-15 Graph of range determined from stereo vision system vs. actual measured range
3-16 Plot of range resolution vs. range
4-1  Definition of raster grid parameters and coordinate system
4-2  Definition of vector objects and parameters















Abstract of Thesis Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Master of Science

DEVELOPMENT OF A GEOSPATIAL DATA-SHARING METHOD FOR
UNMANNED VEHICLES BASED ON THE JOINT ARCHITECTURE FOR
UNMANNED SYSTEMS (JAUS)

By

Carl Preston Evans, III

August 2005

Chair: Carl D. Crane, III
Major Department: Mechanical and Aerospace Engineering

A task performed almost effortlessly by humans, perception is perhaps one of the

most difficult tasks for autonomous vehicles. While substantial research has been done to

develop these technologies, few studies have examined ways for multiple heterogeneous

unmanned systems to cooperate in their perception tasks. Our study examined ways to

model both perceived and a priori geospatial information, and to format these data so

that they can be used by the growing unmanned systems community.

We introduce a perception system model, consisting of distributed "smart" sensors.

This system of sensors was developed for the Team CIMAR entry into the inaugural

DARPA Grand Challenge autonomous vehicle competition held in March 2004. The

Smart Sensor Architecture proved to be a powerful method of distributing the processing of

sensor data to systems developed by engineers who best knew a particular sensor

modality. By standardizing the logical, transport, and electrical interfaces, the smart

sensor architecture developed into a powerful world modeling method.









We also investigated current geospatial data-modeling methods used in the

unmanned systems and geographic information systems (GIS) communities. Our study

determined the commonalities among current methods and resulted in a first-generation

geospatial data-sharing standard for unmanned systems compliant with the Joint

Architecture for Unmanned Systems (JAUS).














CHAPTER 1
INTRODUCTION AND RESEARCH PROBLEM

1.1 Introduction

Imagine a world without languages, with no standard methods of communicating

with other people. Imagine a world in which an individual or a small group of

individuals had a language completely different from that of other individuals or groups.

Imagine also that even the most primitive methods of communicating required an

interpreter. It would be unreasonable to expect two people to be able to come together

and (with minimal effort) understand each other. This is the state of the world in the

unmanned systems community. Developing a common language for unmanned systems

is not trivial. However, as unmanned systems become more commonplace and gain the

ability to interoperate and ultimately collaborate, a standard communications method or

language must be developed.

As it has with a number of technological innovations throughout recent history, the

United States Department of Defense (DoD) is helping to revolutionize the unmanned

systems community by pushing the development of a standard communications method

for all future DoD unmanned systems. Recognizing the increased acquisition and

maintenance costs for a growing fleet of unmanned systems with proprietary interfaces,

the Office of the Secretary of Defense chartered the Joint Architecture for Unmanned

Ground Systems (JAUGS) Working Group to address these concerns. The JAUGS

Working Group was tasked with developing an initial standard for interoperable

unmanned ground systems. In 2002, the charter of the JAUGS Working Group was









modified such that their efforts would extend to all unmanned systems, not only ground

systems. The standard was therefore renamed Joint Architecture for Unmanned Systems

(JAUS).

Unmanned systems are becoming increasingly popular. In fact, large U.S.

government acquisition programs such as Future Combat Systems (FCS) and Man

Transportable Robotic Systems (MTRS) show that unmanned systems are here to stay.

The Future Combat Systems (FCS) program is an ambitious multi-billion-dollar program

with a goal of integrating autonomous, semi-autonomous, and tele-operated systems into

the battlefield of tomorrow. Man Transportable Robotic Systems (MTRS) is a large

multi-million-dollar program that requires a large number of tele-operated unmanned

systems for use in the task of explosive ordnance disposal (EOD). Both the FCS and

MTRS programs require systems that can communicate with one another (operator

control units to vehicle or inter-vehicle) using a shared language. This language (JAUS)

is the subject of our study.

Currently, JAUS supports tele-operation and, to an extent, primitive levels of

semi-autonomy. Technological innovations in the areas of sensors, sensor processing and

fusion, perception, and intelligence have advanced robotics so much that demands that

were not long ago far-fetched are fast becoming a reality. To this end, JAUS must adapt

to meet the growing requirements of the semi-autonomy and autonomy camps of the

unmanned systems community. Types of autonomous behaviors are as numerous as

human behaviors. However, for unmanned systems, the next step in the natural

progression beyond tele-operation is the development of assisted tele-operation and

autonomous navigation and obstacle avoidance.









It is as difficult to define a new all-encompassing language for unmanned systems

as it would be for humans. In the context of a particular mission, however, it is possible

to develop a syntax that can be used to communicate relevant information. By initially

limiting the scope of JAUS and incrementally adding functionality, a robust language is

being built.

The focus of our study was on allowing JAUS-based unmanned systems to share

geospatial data. These geospatial data are needed to support the tasks of obstacle

detection, obstacle avoidance, and path planning among multiple JAUS subsystems. The

concept of the world model helps to put this work into perspective. Meystel and Albus

[18] defined the world model as

the intelligent system's best estimate of the state of the world. The world model
includes a database of knowledge about the world, plus a database management
system that stores and retrieves information. The world model also contains a
simulation capability that generates expectations and predictions. The world model
provides answers to requests about the present, past, and probable future states of
the world. The world model provides this information service to the behavior
generation system element in order to make intelligent plans and behavioral
choices. It provides information to the sensory processing system element to
perform correlation, model matching, and model-based recognition of states,
objects, and events. It provides information to the value judgment system element
to compute values such as cost, benefit, risk, uncertainty, importance, and
attractiveness. The world model is kept up to date by the sensory processing
system element.

A world model presents unending directions to investigate. No doubt, many such

investigations have begun. Our study focuses on the database of knowledge. We

examined how the data are stored inside the database, and how databases can share data

using a common language. In the context of JAUS, our study presents a first-generation

standard for sharing a database of knowledge. Because unmanned systems used in the

JAUS community are outdoor vehicles (and because of the desired tasks) these world

model databases store geospatial data. A review of the relevant literature formed a solid









basis for creating this standard. We also introduced the implementation of a perception

system.


[Figure: Sensor Data Acquisition → Data Modeling and Fusion → High-Level Planning]

Figure 1-1. Example of the processes leading up to higher-level planning

Geospatial data generated by an unmanned system are only as good as the system's

sensors and its sensor fusion and registration methods. Also important is what is done

with these geospatial data after they have been fused and registered (high-level planning

and intelligent behaviors). These important issues fall outside the scope of our study.

Figure 1-1 shows some of the processes leading up to high-level planning. Figure 1-2

shows how our study fits in. The message-set generated by this study will allow different

databases to share knowledge among themselves or with higher-level planning processes.


[Figure: Sensor Data Acquisition → Data Modeling and Fusion → Interface Standard → High-Level Planning]

Figure 1-2. How our study fits into the higher-level planning process

1.2 Research Problem

Our study takes its direction from the following research problem.

Given the experience and knowledge gained from examining current methods of
modeling geospatial data within the unmanned systems and geographic information
systems (GIS) communities and from implementing a perception system for an
unmanned ground system, create a first-generation geospatial data-sharing method
for unmanned systems. Present this in a format consistent with the Joint
Architecture for Unmanned Systems (JAUS) messaging framework.

This is a broad and open-ended topic. However, it must be addressed. As the

capabilities of JAUS are extended, being without a method for communicating even the

most basic forms of obstacle data would be a severe limitation. The primary purpose of






5


the standard presented in our study is to support mission planning. However, other

applications (such as data visualization) also benefit.

The contribution of our study is a first-generation method recommended for sharing

data needed by state-of-the-art real-world, unmanned systems. The recommendations

may be seen as guidelines for a first attempt (at least within the JAUS community) to

allow multiple disparate unmanned systems (from different organizations, with

completely different perception implementations) to share data.














CHAPTER 2
REVIEW OF RELEVANT LITERATURE

The most difficult behaviors for unmanned systems are perception and reasoning.

Reasoning for an unmanned system is highly dependent on the quality of the estimation

of the environment in which the unmanned system operates. This estimation is often

used to support higher level behaviors performed by either the unmanned system or a

human operator through tele-operation. Each system typically has its own method for

modeling and sharing data. As we move towards increased interoperability among

unmanned systems from different vendors, work must be done to bridge the gap between

different methods of representing sensed data and providing those data to disparate

unmanned systems. Again, this is the focus of our study: to provide a first-generation

standardized method for modeling the environments in which unmanned systems operate

and then providing those data to other concerned manned or unmanned systems.

Much work has been done in recent years to move toward true interoperability

between unmanned systems. One of the major efforts towards reaching this goal is the

Joint Architecture for Unmanned Systems (JAUS). JAUS is a standard that defines the

format of messages that travel between unmanned systems. Since it is fast becoming the

standard for military unmanned systems, JAUS provides a suitable base upon which to

build a first-generation world modeling standard for unmanned systems.

2.1 Joint Architecture for Unmanned Systems (JAUS)

The Joint Architecture for Unmanned Systems (JAUS) is a messaging standard

being developed with overall goals of reducing life cycle costs, enabling fast integration









of new technologies, and facilitating interoperability amongst heterogeneous unmanned

systems. In 1998, the Office of the Secretary of Defense (OSD) chartered the Joint

Architecture for Unmanned Ground Systems (JAUGS) Working Group and tasked this

working group with developing a common model for messages used for controlling and

monitoring processes within unmanned ground systems. Now the Joint Architecture for

Unmanned Systems (JAUS), the working group is tasked with expanding the standard to

the entire domain of unmanned systems. The working group currently draws a diverse

membership from government, industry, and academic institutions. By having a

wide range of input in developing the standards, JAUS is better prepared for wide

acceptance by the unmanned systems community.

2.1.1 Tenets of JAUS

To ensure the flexibility, extensibility, and ultimately the longevity of the emerging

JAUS standard, it was developed with four main tenets. These are: technology

independence, hardware independence, platform independence, and mission

independence [16].

The technology independence of JAUS assures that the messages that compose the

JAUS standard as well as the methods for transporting the messages are not dependent on

any past, present, or developing standard. For example, many JAUS implementations use

the user datagram protocol (UDP) and the internet protocol (IP) for data transmission.

Other implementations may, however, use asynchronous serial communications links

such as EIA/TIA 232. There may be cases where one communications method is

preferred over another. By restricting the dependence on a communications technology,

JAUS leaves this decision to the system developer and thus remains very flexible. By

defining only the messages to be communicated, JAUS will remain relevant over time.
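This technology independence can be illustrated with a sketch: a message is serialized once, and the identical bytes can then be handed to any transport. The message code, field layout, and transport classes below are hypothetical illustrations, not the actual JAUS wire format.

```python
import struct

def pack_message(message_code: int, payload: bytes) -> bytes:
    """Serialize a message as a little-endian 16-bit code plus payload.
    (Hypothetical layout, for illustration only.)"""
    return struct.pack("<H", message_code) + payload

def send(transport, message: bytes) -> None:
    """Transport is anything with a write(bytes) method: a UDP socket
    wrapper, a serial port, shared memory, and so on."""
    transport.write(message)

class LoopbackTransport:
    """Stand-in transport that simply records what was written."""
    def __init__(self):
        self.sent = []
    def write(self, data: bytes) -> None:
        self.sent.append(data)

# Hypothetical message code and payload
msg = pack_message(0x4202, struct.pack("<f", 1.5))
udp_like = LoopbackTransport()
serial_like = LoopbackTransport()
send(udp_like, msg)
send(serial_like, msg)
assert udp_like.sent == serial_like.sent  # identical bytes on either transport
```

The point of the sketch is that the message definition never mentions the transport; swapping UDP for a serial link changes only the object passed to `send`.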









The hardware independence tenet is similar to the technology independence

requirement. JAUS does not rely on knowledge of the structure of an unmanned system.

There are no assumptions about the type of platform or the contents of the platform. So

long as a system has adequate hardware to create, receive, process, and respond to the

standardized JAUS messages, it is considered to be compliant with the specification.

Platform independence is the third tenet of JAUS. There are no assumptions about

the type of systems that will use JAUS. The JAUS standard is just as useful for large

tanks as it is for miniature microcontroller-based unattended sensors. Naturally, as systems

become more embedded, the read-only memory (ROM), random access memory (RAM),

and computing resources available decrease. Therefore, an embedded system is less

likely to be able to support large complex JAUS messages. This is acceptable as JAUS is

very flexible with respect to the messages that each system must support. With the

exception of a small number of core input and output messages, JAUS allows systems to

use only the messages (as well as fields within those messages) that they need to perform

their function.

JAUS also does not presuppose that the unmanned systems based on the

specification are designed for any particular mission. This is the mission independence

tenet of JAUS. By defining a comprehensive message set, it is hoped that JAUS

developers can assemble systems that can complete any mission. Admittedly, this goal is

intractable in full, but with the guidance of the diverse membership of the JAUS working group,

JAUS has a firm foundation on which to build.

2.1.2 System Structure of JAUS

The Joint Architecture for Unmanned Systems consists of a number of hierarchical

elements that work together to form a complete JAUS compliant unmanned system. The









lowest level of abstraction within JAUS is the component. Going up the chain of

complexity, a JAUS node consists of multiple components, a subsystem consists of one

or more nodes, and a JAUS system consists of one or more subsystems. Figure 2-1

shows the structure of a JAUS system. Not shown in this figure is the concept of multiple

instances of a component. This feature is included in JAUS to support component

redundancy.

[Figure: a JAUS system composed of multiple subsystems; each subsystem contains one or more nodes, and each node contains one or more components]

Figure 2-1. System structure of JAUS

The component encapsulates a specific function and the input and output messages

necessary to command, control, and monitor the component. For example, the JAUS

Primitive Driver component is responsible for the low-level command and control of an

unmanned system. It controls and reports current status of the lowest level devices on the

platform and reports platform specific data such as platform name and dimensions.

Another component, the JAUS Global Pose component, interfaces to a device or a

number of devices that are capable of providing the platform with its current global

position, orientation, and orientation rate information. These are just two examples of









JAUS components. The JAUS Reference Architecture currently defines 26 components,

each with its own specific function. The Reference Architecture allows up to 254

components to operate within a JAUS node.

A node is a single computing entity that consists of one or more JAUS components

running in a tightly coupled manner. In this context, tightly coupled implies that the

computing entities are not linked by any external connections. Instead, they are

connected internally. This could be by function calls or shared memory, for example. If

two or more components are to be linked by an external communications medium, they

should be considered separate nodes. The JAUS standard currently allows up to 254

nodes within a subsystem.

A subsystem is a device that performs a function through the synergy of the

component-containing nodes within it. There must be at least one node within a

subsystem. This node may contain all the components necessary for the subsystem to

perform its function. The subsystem may also contain a number of nodes that each

provide components necessary for the subsystem to perform its function. The JAUS

standard currently allows up to 254 subsystems to operate within a JAUS system.

A system consists of one or more subsystems working together for some useful

purpose. This is the highest level within the JAUS hierarchy. JAUS currently does not

permit communications between different JAUS systems. Within a system, however, any

component, node, or subsystem may communicate with any other component, node, or

subsystem.
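The four-level hierarchy described above can be pictured as a dotted address, subsystem.node.component.instance. The sketch below is illustrative only: the value ranges follow the text's limit of 254 entities per level, but the component ID used in the example is an assumption, not a value taken from the JAUS Reference Architecture.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Address:
    """Illustrative JAUS-style four-level address."""
    subsystem: int
    node: int
    component: int
    instance: int

    def __post_init__(self):
        # Each level holds up to 254 entities, numbered from 1.
        for level in (self.subsystem, self.node, self.component, self.instance):
            if not 1 <= level <= 254:
                raise ValueError("each address level must be in 1..254")

    def __str__(self):
        return f"{self.subsystem}.{self.node}.{self.component}.{self.instance}"

# Hypothetical component ID for a driver component on subsystem 1, node 1
primitive_driver = Address(subsystem=1, node=1, component=33, instance=1)
assert str(primitive_driver) == "1.1.33.1"
```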

The hope is that the JAUS standard is generalized enough that it will not inhibit the

creativity of the engineers and scientists developing these systems. Of course it is not









possible to account for all possible unmanned system scenarios. Because of this, the

JAUS standard has been developed to allow for the development of user-defined

components. The idea is that as these user-defined components mature and their

usefulness is recognized by the JAUS community, they would be incorporated into the

JAUS Reference Architecture. What is most important overall about JAUS is that it

standardizes the interface between these components. As unmanned systems become

more and more commonplace, without JAUS or some industry-wide JAUS-like standard,

the interoperability issues will only be compounded.

2.1.3 World Model Subcommittee for JAUS

In October 2002, the Joint Architecture for Unmanned Systems Reference

Architecture Committee's World Model Subcommittee was established to address the

growing need within the unmanned systems community for a messaging architecture that

allows multiple heterogeneous unmanned systems to share geospatial data. The task of

this subcommittee was to develop the methods to allow modeling and sharing of

geospatial data within the JAUS framework. For JAUS, the primary purpose for

modeling and sharing of these data is to support the tasks of mission planning and

distributed mapping for autonomous systems. JAUS takes a practical approach

to unmanned systems, and a JAUS standard for geospatial data

modeling should do the same.

2.2 Real-Time World Modeling Methods

The field of mobile robotics is generally interested in real-time world modeling

methods. Typically these world modeling methods support the task of reflexive obstacle

avoidance whereby an unmanned system uses an instantaneous view of the environment

to effect change in its current mission. For example, an unmanned system may be tasked









to autonomously navigate to a given waypoint without colliding with anything along the

way. Similar to a human reflexively reacting to a sudden undesired condition, an

unmanned system given this task may reflexively respond to obstacles that appear within

the field of view of its sensors. Often these methods require very little modeling or

processing of the sensor data. Of paramount concern is the safety of the systems.

It is often desired or necessary to have unmanned systems accumulate a model of

the environment in which they operate. This may simply be for the sake of building an

accumulative map of the environment or it may be to allow the unmanned system to

make a more informed decision should it decide that it needs to modify its current

behavior in order to successfully complete its given task. For example, if an unmanned

system can perceive the environment at a distance that extends far beyond a range at

which the system must act reflexively to maintain the safety of the system, those

additional data could be used to reactively re-plan a path that avoids the hazard

completely. At the very least, an accumulative model of the environment would provide

the system with the ability to, should it have to act reflexively, choose the best long term

plan.

Typically, both reflexive and reactive obstacle avoidance systems use a tessellated,

raster-grid-based data structure to represent the environment. These raster grids are most

commonly used for real-time world modeling because it is simple to project the sensor

view into a two-dimensional Cartesian grid.

2.2.1 Raster Occupancy Grid

Sensors are all prone to errors that affect the quality of their data. Some of the

sensors, such as radar and sonar, have wide fields of view, but very low resolution within

their fields of view. To handle the issues of uncertainty and errors in the sensor data, the









concept of the occupancy grid was introduced by Elfes [13]. The raster occupancy grid is

a tessellated grid used to accumulate real-time sensor data. A probabilistic model of the

data from the sensor is generated and is used to update occupancy probabilities within the

raster grid. The grid cell values for the occupancy grid represent the probability that an

object exists or does not exist in the area covered by the cell. Updates are made to the

patches of the grid that represent the field of view of the sensor. Even though this idea

was pioneered by Elfes in 1989, it is still the most common implementation for real-time

sensor data accumulation. It is especially useful for supporting the task of obstacle

avoidance.
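A minimal sketch of the occupancy-grid idea, using the common log-odds formulation: each cell accumulates evidence from sensor hits and misses. The sensor-model constants below are invented for illustration and are not taken from Elfes's work.

```python
import math

class OccupancyGrid:
    """Tessellated grid of occupancy log-odds (0.0 means p(occupied) = 0.5)."""
    def __init__(self, width: int, height: int):
        self.logodds = [[0.0] * width for _ in range(height)]

    def update(self, row: int, col: int, hit: bool,
               l_hit: float = 0.85, l_miss: float = -0.4):
        """Add the sensor model's log-odds evidence to one cell."""
        self.logodds[row][col] += l_hit if hit else l_miss

    def probability(self, row: int, col: int) -> float:
        """Convert a cell's log-odds back to an occupancy probability."""
        return 1.0 / (1.0 + math.exp(-self.logodds[row][col]))

grid = OccupancyGrid(10, 10)
for _ in range(3):              # three consecutive "hit" readings on one cell
    grid.update(5, 5, hit=True)
assert grid.probability(5, 5) > 0.9   # evidence accumulates toward occupied
assert grid.probability(0, 0) == 0.5  # untouched cell stays at the prior
```

In a full implementation the update would be applied over the whole patch of cells in the sensor's field of view (Elfes) or along the sensor's major axis (Borenstein), as discussed above.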

Over the years there have been several extensions on the pioneering work done by

Elfes. One extension to Elfes's approach was introduced by Borenstein [6]. Rather than

updating a large patch of the occupancy grid within the field of view of the sensor, this

method updates a single cell along the major axis of the sensor. Borenstein shows that as

the unmanned system traverses an area, this method is cheaper computationally and

achieves similar results because the method focuses on using data along the sensor's axis

that provides the best data. Novick [22] extended the concept of the raster occupancy

grid update method. His approach was to apply a nonhomogenous Markov chain based

method to update grid cells. Using this approach, Novick shows that this method is a

significant advance in sensor fusion for outdoor vehicles. Both Borenstein's and Novick's

methods use raster grids to represent their data.

2.2.2 Real-Time Terrain Mapping

An extension of the occupancy grid methods is the real-time terrain mapping

method. This method attempts to generate a model of the Earth's surface in a tessellated

data structure. This two-and-a-half-dimensional (2.5D) representation assigns a height to each









grid cell as opposed to an occupancy probability. Crosetto and Crippa [10] presented a

method for fusing stereo and radar data to form real-time elevation maps.
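A minimal 2.5D sketch of this idea, storing one elevation per cell and folding each new height measurement into a running mean. The fusion rule here is an assumption chosen for illustration, not Crosetto and Crippa's method.

```python
class TerrainMap:
    """Tessellated 2.5D terrain map: one elevation value per grid cell."""
    def __init__(self, width: int, height: int):
        self.elevation = [[0.0] * width for _ in range(height)]
        self.count = [[0] * width for _ in range(height)]

    def add_measurement(self, row: int, col: int, z: float):
        """Fold a new height measurement into the cell's running mean."""
        self.count[row][col] += 1
        n = self.count[row][col]
        self.elevation[row][col] += (z - self.elevation[row][col]) / n

tmap = TerrainMap(4, 4)
for z in (1.0, 2.0, 3.0):
    tmap.add_measurement(2, 2, z)
assert abs(tmap.elevation[2][2] - 2.0) < 1e-9  # mean of 1, 2, 3
```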

2.2.3 Raster Traversability Grid

The traversability grid concept is an extension of both the raster occupancy grid

and terrain mapping methods. In this implementation, the value in a grid cell represents

the degree to which the area covered by the grid cell is considered drivable by the

vehicle. Unlike the previous two methods, occupancy grids and terrain mapping, the

traversability method is dependent on vehicle parameters. This is because the concept of

traversability is inherently platform dependent. For example, an area occupied by a small

rock may be deemed untraversable by a small unmanned system. However, a larger

unmanned system confronting the same rock may consider the region less than desirable,

but still traversable. Vehicle parameters that are often used in traversability determination

include the maximum allowable rotation angles of the platform about its three axes.

Using a model of the terrain in which an unmanned system operates, it is possible to

calculate the pose of the unmanned system along a path given the vehicle's physical

parameters.
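The platform dependence described above can be sketched by mapping a cell's terrain slope and the vehicle's maximum allowable tilt to a traversability value. The 0-to-15 scale and the linear mapping are arbitrary choices for illustration.

```python
def traversability(slope_deg: float, max_tilt_deg: float,
                   scale: int = 15) -> int:
    """Return 0 (impassable) .. scale (fully drivable) for one grid cell,
    based on the cell's slope and the platform's tilt limit."""
    if slope_deg >= max_tilt_deg:
        return 0
    return round(scale * (1.0 - slope_deg / max_tilt_deg))

# The same 20-degree slope scores differently for different platforms.
small_robot = traversability(slope_deg=20.0, max_tilt_deg=25.0)
large_truck = traversability(slope_deg=20.0, max_tilt_deg=45.0)
assert small_robot < large_truck        # traversability is platform dependent
assert traversability(30.0, 25.0) == 0  # beyond the tilt limit: impassable
```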

2.3 A Priori World Modeling Methods

An a priori world model data store is one that contains data that were accumulated

prior to use by an unmanned system. For example, if an unmanned system maps an area, this

map could be stored for future use by the unmanned system or transferred to another

unmanned system to allow it to make mission decisions. This is an example of the use of

raster data a priori. This is not the typical use of a priori data within unmanned systems

because of possible errors in the map making process. Instead, vector methods are used

more frequently for initial data. An example of the use of this modeling method is









presented by Pasha [23]. The model of the world used is based on a polygonal

representation as shown in Figure 2-2. In this work, Pasha models an environment in

which an unmanned system must operate. The locations of static obstacles are known

and can be used during the path planning process. Compounding the problem however,

is the presence of numerous radiation sources. Given the obstacles and location and

strength of radiation sources, a path plan is computed that most efficiently gets the

unmanned system to its desired destination while minimizing its exposure to radiation.




















Figure 2-2. Path planning in a bounded radiation environment (Source: A. Pasha, "Path
Planning for Nonholonomic Vehicles and Its Application to Radiation
Environments," Master of Science Thesis. Department of Mechanical
Engineering: University of Florida, 2003, p. 59, Figure 6-9)

2.4 Geographic Modeling Methods

The areas in which unmanned systems operate are typically assumed to be simple

planar surfaces. As unmanned systems begin to be introduced into real-world outdoor

applications, this assumption cannot hold.









2.4.1 Global Coordinate Systems

When moving from the laboratory to real-world, outdoor applications that cover

large distances, the methods presented in Section 2.3 must be modified. Those methods

assumed that the unmanned system was operating in a perfectly planar environment;

where, in the case of raster data, the cells were square and the coordinate system

Cartesian. The Earth is not flat and, therefore, when unmanned systems operate over

large distances, they must take the Earth's true shape into consideration.

There are three commonly used models of the Earth's shape. They are the actual shape

of the Earth's surface, the ellipsoid, and the geoid [7]. Because of the large variations in

the Earth's surface, it is difficult to develop a true mathematical model for it. Therefore,

the other two methods of modeling, the ellipsoid and the geoid, are typically used.

The ellipsoid is a mathematical model of the shape of the Earth. The ellipsoid

(Figure 2-3) is defined by its semi-major and semi-minor axes. Over the years, different

ellipsoidal models of the Earth have been established based on the best-known shape of

the Earth. Currently the most commonly used model is the World Geodetic System as

defined in 1984 (WGS84). This model defines the semi-major axis r1 as 6,378,137.0

meters and the semi-minor axis r2 as 6,356,752.3 meters [7].

[Figure: ellipse with the semi-major axis in the equatorial plane and the semi-minor axis running from the North Pole to the South Pole]

Figure 2-3. Ellipsoidal model of Earth










Once the Earth ellipsoidal model is established, a geographic coordinate system

must also be established. Because of the spherical shape of the Earth, a spherical

coordinate system is used to define points on the ellipsoid. A point on the ellipsoidal

surface is described in spherical coordinates by a latitude value in degrees, a longitude

value in degrees, and a height or elevation value in feet or meters. As shown in Figure

2-4, latitude values increase going north and range from -90° at the South Pole, to 0° at

the Equator, to 90° at the North Pole. Longitude values start at 0° at the Prime Meridian

and range between plus and minus 180°. The values go negative going west and positive

going east.

[Figure: globe showing the North Pole (90° latitude), South Pole (-90° latitude), Equator (0° latitude), Prime Meridian (0° longitude), meridians of longitude, and parallels of latitude]

Figure 2-4. Earth-centered global coordinate system
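Given the WGS84 semi-axes quoted above, geodetic coordinates (latitude, longitude, ellipsoidal height) can be converted to Earth-centered Cartesian (ECEF) coordinates with the standard closed-form expressions. A sketch (the function name is ours):

```python
import math

A = 6378137.0    # WGS84 semi-major axis r1, meters
B = 6356752.3    # WGS84 semi-minor axis r2, meters
E2 = 1.0 - (B * B) / (A * A)   # first eccentricity squared

def geodetic_to_ecef(lat_deg: float, lon_deg: float, h: float = 0.0):
    """Convert geodetic coordinates to Earth-centered Cartesian (x, y, z)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    # Prime vertical radius of curvature at this latitude
    n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - E2) + h) * math.sin(lat)
    return x, y, z

# A point on the Equator at the Prime Meridian lies one semi-major axis out.
x, y, z = geodetic_to_ecef(0.0, 0.0)
assert abs(x - A) < 1e-6 and abs(y) < 1e-6 and abs(z) < 1e-6
```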

Bolstad [7] describes the geoid as a three-dimensional surface that has a constant

pull of gravity at each point. This equipotential surface is important for establishment of

a vertical datum. In fact, this surface typically defines what is referred to as mean sea

level [36]. If the Earth were covered by only water and no land, gravity would pull the

water such that the geoid and the sea level would be the same [36]. This is how mean sea

level is defined for land areas that are not near the sea. As with the ellipsoid model,









locations are referenced by latitude, longitude, and elevation. The difference is that the

ellipsoid model uses the surface of the ellipsoid to establish elevation whereas this

method measures the elevation of the geoid with respect to an ellipsoid.

2.4.2 Projected Coordinate Systems

While true global coordinates are expressed as points of latitude, longitude, and

elevation, it is more intuitive to model the world in Cartesian coordinates. This is

particularly true when extending the methods in Section 2.3 to outdoor applications. In

order to use a Cartesian coordinate system, methods have been established to

mathematically project global, spherical coordinates onto a rectangular grid.

Because it is not possible to exactly represent this three-dimensional surface in

two-dimensions, there are different mathematical projections that preserve different

features of the three-dimensional surface. Typical features that are preserved are local

shape, area, distance, and true direction. Conformal projections preserve local shape,

equal area projections preserve area, equidistant projections preserve distance to some

points, and true-direction projections preserve true-course between certain points [14].

Projections are not only classified by the types of features they preserve, they are

also classified by the type of method used to create them. The main classifications are:

cylindrical, conic, and planar or azimuthal [14].

Cylindrical projections convert from the Earth's three-dimensional spherical

coordinate system to a cylindrical coordinate system. After the projection, the cylindrical

representation is sliced so that it forms a two-dimensional rectangular representation of

the Earth's surface. Conic projections convert from the Earth's three-dimensional

spherical coordinate system to a conic coordinate system. After the projection, the conic

representation is sliced so that it forms a two-dimensional representation of the Earth's









surface. Planar or azimuthal projections convert from the Earth's three-dimensional

spherical coordinate system directly to a planar coordinate system. There are numerous

types and variations of each projection. Bolstad [7] and ESRI [14] provide an in-

depth discussion of these projections and many of their variations.

2.4.3 Universal Transverse Mercator Projection

The Universal Transverse Mercator (UTM) projection is a modification of the

cylindrical Mercator projection. This projection is a conformal projection which

preserves local shape of objects [14]. It creates minimal distortion of areas, local angles,

and distance [14]. Unlike a simple cylindrical projection, the UTM

projection divides the cylinder into 60 vertical zones. Each UTM zone is exactly 6

degrees of longitude wide and is further divided into north and south parts [7]. The UTM

zones each have their own coordinate system, which is completely different from the

coordinate system of other zones. Because of this, it is difficult to use the UTM

projection when traveling between UTM zones. This is rarely a problem with unmanned

ground systems because the six-degree UTM zones are much larger than what would be

reasonably expected for a system of this type to traverse. It may be an issue with

unmanned aerial vehicles, but this is something that can be taken into account by the

system developers.

What is most attractive about the UTM projection is that it is

defined globally. In each zone, it is able to maintain shape, area, direction, as well as

distance. These are all features that are important for unmanned vehicles during

navigation and world modeling tasks. The reader is referred to [7] for a more in-depth

discussion of the Universal Transverse Mercator projection, its applications, and

limitations.
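The zone arithmetic described above can be sketched directly: a point's longitudinal zone follows from its longitude alone, since the 60 zones are each exactly 6 degrees wide, numbered eastward from 180 degrees West (helper names are ours):

```python
import math

def utm_zone(lon_deg: float) -> int:
    """Return the UTM longitudinal zone (1..60) for a longitude in degrees."""
    return int(math.floor((lon_deg + 180.0) / 6.0)) % 60 + 1

def zone_central_meridian(zone: int) -> float:
    """Central meridian of a UTM zone, in degrees of longitude."""
    return -183.0 + 6.0 * zone

assert utm_zone(-82.3) == 17          # Gainesville, Florida falls in zone 17
assert zone_central_meridian(17) == -81.0
assert utm_zone(179.9999) == 60
```

The per-zone coordinate systems mentioned above are what make cross-zone travel awkward: a system that crosses a zone boundary must re-project its coordinates against the new zone's central meridian.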









2.5 Georeferenced World Model Data

The modeling methods presented in Sections 2.2 and 2.3 are dependent on a planar

assumption for the environment in which the unmanned system operates. In these

applications, the coordinate systems are Cartesian, with the origin based on an

arbitrarily chosen local coordinate system. As discussed in Section 2.3, these data can be

stored and used as a priori data from other unmanned systems. What is more common,

however, is to use data from third-party sources.

2.5.1 Raster Data Stores

Raster Data Stores are those that provide tessellated grid based geospatial data.

Examples of the raster data stores include Digital Elevation Model (DEM), Digital

Terrain Elevation Data (DTED), Digital Raster Graphics (DRG), Digital Orthophoto

Quadrangles (DOQs). This list is by no means exhaustive. There are many more types of

raster data stores. Each type of data store provides different types of data at different

resolutions possibly using different projections.

Figure 2-5 shows the high-level format of Digital Elevation Model (DEM) data.

DEM data use the UTM projection to create a Cartesian coordinate system. These DEM

data represent a 2.5D surface. The resolution of DEM data is 30 meters.

2.5.2 Vector Data Stores

Vector Data Stores are those that provide geospatial data that are referenced by

points, lines, or vertices of polygons. Types of vector data stores include Digital Line

Graphs (DLG), State Soil Geographic (STATSGO), and Topologically Integrated

Geographic Encoding and Referencing (TIGER). This list is by no means exhaustive.

There are many more types of vector data stores available from third parties. Since there

is no globally accepted geospatial data standard, each store may use a different format.












[Figure 2-5 diagram: a DEM quadrangle grid of elevation points with Δx = 30 meters (Easting) and Δy = 30 meters (Northing); the corners of the DEM polygon are the 7.5-minute quadrangle corners, and the example quadrangle lies west of the central meridian of its UTM zone.]





Figure 2-5. Example format of digital elevation model

Figure 2-6 shows an example of Digital Line Graph (DLG) data being extracted

from a Digital Orthophoto Quadrangle. The benefit of this extraction is that the resulting

DLG vector data size is smaller than that of the DOQ.

2.6 Distributed World Modeling Methods

There are very few major efforts attempting to tackle the difficult task of

distributed world modeling. Two of the current efforts are the Spatial Data Transfer

Standard (SDTS) and the Geography Markup Language (GML).

2.6.1 Spatial Data Transfer Standard (SDTS)

SDTS is an open standard being developed by the United States government for use

in geographic information systems. One of the reasons for developing this standard is

that there are various types of geospatial data available based on different Earth models

and projections with each having different errors associated with them. SDTS seeks to

provide a method that allows a complete data transfer with all necessary information









associated with those data needed to incorporate them into other data systems. SDTS

specifies the entire process of storing and sharing geospatial data. This ranges from the

methods for modeling raster and vector geospatial data down to the way that data are

stored in digital files. SDTS is also a very broad standard that is able to support different

models of the Earth, different map projections, and different methods of modeling the

data.





















Figure 2-6. An example of USGS source data. A) USGS Orthoimage B) Extracted
digital line graph

The SDTS is divided into six parts that completely define the standard. The first

three parts define the logical specification, spatial features, and data encoding,

respectively. The other parts are called profiles. Each profile provides instructions for

applying the base SDTS rules, parts one through three, to different types of geospatial data.



Part four of the SDTS standard is the Topological Vector Profile (TVP). This

profile allows transfer of geospatial vector data described by vector geometry and









topology. This profile allows data to be geometrically described using points, lines,

polygons, as well as combinations of these. The Topological Vector Profile is useful for

transferring digital line graph (DLG) data such as those presented in Figure 2-6 [3].

Part five of the SDTS standard is the Raster Profile and Extensions (RPE). This

profile supports various types of raster formatted geospatial data. This includes

Georeferenced orthoimages, grid formatted terrain data such as DTED and DEM, as well

as any type of tessellated geospatial data. RPE does not support data of a higher

dimension than two and a half (such as terrain data) [5].

The last part of the current version of SDTS is the Point Profile. This profile

provides support for high precision point data only. While the Topological Vector Profile

does support point data, it does not at high enough precision for some applications. The

Point Profile supports up to 64 bits of precision whereas the TVP only supports up to 32

bits of precision [4]. All six parts of the SDTS standard combine to form a powerful and

comprehensive method for modeling and distributing geospatial data.

2.6.2 Geography Markup Language

The Geography Markup Language (GML) is a broad standard that supports raster

and vector data in 2, 2.5, and 3 dimensions. It also supports more types of complex

shapes and surfaces than are needed for unmanned system world modeling. It is able to

support data based on different projections as well as different Earth models [11].

GML is an extension of the Extensible Markup Language (XML). XML, like the

Hypertext Markup Language (HTML) commonly used for transfer of web pages,

supports tags that specify the types of data included in the document. For XML the tags

are defined by the document creator for the type of data included. HTML specifies all of

its tags a priori. Also unlike HTML, XML (and subsequently GML) does not mix the









data content with the formatting of the content. For GML, the descriptors (or tags) are

geospatial data related. While XML provides a very loose structure for the types of data

described, GML places restrictions on XML by specifying the methods for geometrically

modeling the data. If GML based system developers associate different attributes with

the geospatial data types, they will at the very least be able to understand each other's

data at a geometric level [17].
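As a rough illustration of this layering, the sketch below uses Python's standard xml.etree module to wrap a GML-style geometry inside an application-defined feature tag. The gml:Point and gml:pos element names follow GML 3 usage; the enclosing obstacle tag and the coordinate values are invented for this example.

```python
import xml.etree.ElementTree as ET

# Illustrative only: a GML-style point feature. gml:Point/gml:pos follow
# GML 3 usage; the "obstacle" wrapper element is made up here to show how
# application schemas can wrap the standard geometry elements.
GML_NS = "http://www.opengis.net/gml"
ET.register_namespace("gml", GML_NS)

obstacle = ET.Element("obstacle")
point = ET.SubElement(obstacle, f"{{{GML_NS}}}Point")
pos = ET.SubElement(point, f"{{{GML_NS}}}pos")
pos.text = "29.6520 -82.3250"  # latitude longitude, space separated

xml_text = ET.tostring(obstacle, encoding="unicode")
```

Any GML-aware consumer can at least recover the point geometry from such a document, even if it does not understand the obstacle tag, which is the geometric-level interoperability the text describes.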

SDTS and GML are both adequate methods for modeling geospatial data and

sharing those data, but neither is entirely appropriate for JAUS based unmanned

systems. Of the two, GML is more appropriate since it is based on the powerful XML

standard which is designed for real-time transfer. By defining additional XML tags, it is

possible to make data store modifications in real-time rather than on a per XML

document basis. The downside of GML is that it is all ASCII text based and requires

extra characters to support its extensibility. Because some of the tags are many

characters long, this translates to additional bandwidth being used for the support

characters.

SDTS is not appropriate for use with unmanned systems where bandwidth

utilization should be minimized. Because SDTS transfers are to be all self-contained

with all necessary data included, this is not suitable for real-time data transfer. A real-

time world modeling message set should support the ability to make individual changes

to the data store in real-time rather than requiring changes to be transmitted via an

updated version of all the data in the data store.

This work is interested in using the power of the JAUS infrastructure to support

distributed world modeling. Since JAUS defines the structure of its messages a priori,








beyond its 16-byte header, JAUS does not require any other bytes to support its

infrastructure. All of the data after the JAUS header are values for the field described in

the JAUS message definition. Rather than incorporating a completely different, non-

optimal standard into JAUS for world modeling, the world modeling standard builds on

the framework developed by the JAUS Working Group.














CHAPTER 3
SMART SENSORS

While setting out to develop a standard for modeling the various types of geospatial

data presented in the preceding chapter, a distributed set of world models was developed.

These world models were tightly coupled to their associated sensors and therefore were

initially considered to be smart sensors.

3.1 Smart Sensor Architecture

The smart sensor architecture was originally developed for the perception system in

the Team CIMAR NaviGATOR, which is shown in Figure 3-1. The NaviGATOR was

developed as an entry to the 2004 Defense Advanced Research Projects Agency

(DARPA) Grand Challenge. Held in March of 2004, the DARPA Grand Challenge was a

first of its kind unmanned ground vehicle (UGV) competition. The thrust of this

challenge was to develop a UGV that could autonomously navigate and avoid obstacles

over the approximately 140 miles from Barstow, California to Primm, Nevada crossing

the Mojave Desert.

Team CIMAR consisted of graduate students and engineering staff from the

University of Florida's Center for Intelligent Machines and Robotics (CIMAR) and

Logan, Utah based Autonomous Solutions, Inc.

Recognizing the power and flexibility afforded by the use of JAUS, Team CIMAR

used it throughout the NaviGATOR and therefore developed, at the time, one of the only

completely autonomous systems based on the Joint Architecture for Unmanned Systems

(JAUS). The exception to this was the Navigator's perception system where the









messages that defined the smart sensor messaging architecture were only loosely

modeled after the JAUS standard and the JAUS World Modeling Subcommittee's

forthcoming draft message set which is presented in Chapter 4 of this document.

The smart sensor architecture is a networked system of distributed, modular,

heterogeneous sensor units that all use a common messaging and network interface to

share data. Each smart sensor processes data specific to its associated sensor modality

and determines region traversability using a suitable traversability metric as determined

by the sensor system developer. These geospatial traversability data are shared within the

perception system and provided to higher level planning components to allow them to

make intelligent decisions such as obstacle avoidance.


Figure 3-1. Team CIMAR NaviGATOR DARPA Grand Challenge entry vehicle









The smart sensor units are considered "smart" because they not only process their

sensor data, they also provide a logically redundant interface to other components within

the system. The impetus behind the creation of this smart sensor architecture was to

allow sensing system implementers to develop their sensing technologies independent of

one another and then have them, with minimal effort, seamlessly integrate their work to

form a robust perception system. The JAUS-like messaging infrastructure and logical

redundancy of the smart sensors afforded this flexibility. Even though their

implementations and sensor modalities are different, these sensor units are logically

redundant in that their messaging interfaces are identical [19]. The idea was that each

sensor implementer best knew how to process and register their own sensor data. Rather

than relying on a probabilistic model of the sensor to homogenize the sensor data on one

system, this implementation expects the sensor data to be homogenized before they are

fused. Once their data were available, the smart sensors would publish the data to a

central component, the smart sensor arbiter, whose responsibility would be to fuse the

data from all of the smart sensors.

The output of the smart sensors is a measure of region traversability cost. This cost

is based on a sensor-specific traversability metric being applied to the data from a

physical obstacle detection sensor. Behaviors of this type, associating a cost to an

attribute based on a metric, are called value judgment [18]. Smart sensor developers

were permitted to use any sensor modality that presented data that could be processed to

provide sufficient traversability value judgment. For this implementation, these included

stereo vision, stationary laser measurement system, and monocular vision based smart

sensors all developed by researchers at the University of Florida. A continuously rolling









laser measurement system based smart sensor was developed at Autonomous Solutions,

Inc.

While there are duplicate sensor types, the implementation of the associated smart

sensor makes the data from the sensors quite unique. For example, the stereo camera and

monocular cameras use the same sensing modality; however, the difference is in the

implementation of the smart sensors. The stereo camera data are processed so that they,

through the use of image rectification and correlation, provide a sparse three-dimensional

representation of the environment within the field of view of the cameras. Traversability

is determined by considering the stereo data as real-time terrain data and applying value

judgment. The implementation of the monocular camera based smart sensor utilized

color and cluster affinity in RGB-space to classify image pixels that belonged to

traversable surfaces.

Once the individually developed smart sensors were completed, a predefined

messaging architecture was used to transmit the traversability data within the perception

system. In order to support true interoperability, however, electrical and transport layer

issues also had to be addressed. These issues will be addressed later in this chapter.

3.2 Smart Sensor Architecture Components

There are three major types of components that make up the perception system's

smart sensor architecture. These are the smart sensor, smart sensor arbiter, and reactive

planner components. Figure 3-2 shows the perception system components as well as the

component interconnects.

3.2.1 Smart Sensor Component

The smart sensor is a modular perception system component that provides an

interface between a physical sensor and the smart sensor network. It encapsulates a









physical sensor, the hardware necessary to process the sensor data, a method for

determining region traversability from the processed sensor data, a standardized

messaging interface, and a communications link.


Figure 3-2. Organization of the smart sensor-based perception system

A smart sensor is modular in that it shares the same logical interface with all other

smart sensors. With the exception of a single field in the message header, the source

component identification number, the output format of each smart sensor is identical to

that of all other smart sensors. This allows any smart sensor to seamlessly replace any

other.

Internally, the smart sensor maintains a tessellated traversability grid of a size

specified by the predefined range and resolution of the grid. As with the occupancy and

traversability grids introduced in Chapter 2, this grid maintains a fixed orientation and









remains vehicle centered. In this implementation, the grid maintains a north-east

orientation.

As the vehicle moves, the grid is translated in discrete steps to compensate for the

vehicle's movement. The translation of the vehicle is determined from the previous and

current positions of the vehicle as provided by a global positioning system (GPS)

providing coordinates in the WGS84 coordinate system. The coordinates are projected

from global to Cartesian coordinates using the Universal Transverse Mercator (UTM)

projection. The difference in position, in meters East and North of the origin, is

converted to a translation of grid rows and columns. To assure that the vehicle is always

centered in the center cell of the traversability grid, the grid dimensions, rows and

columns, are required to be odd.
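The translation step described above can be sketched as follows. This is not Team CIMAR's code; it is a minimal illustration assuming the previous and current UTM coordinates and the grid resolution in meters are available:

```python
def grid_shift(prev_e: float, prev_n: float,
               curr_e: float, curr_n: float,
               resolution_m: float) -> tuple:
    """Convert vehicle motion (UTM meters) into whole-cell grid shifts.

    Returns (d_col, d_row): columns shift with easting, rows with
    northing. Sub-cell motion is discarded so the grid translates in
    discrete steps, as described in the text.
    """
    d_col = int(round((curr_e - prev_e) / resolution_m))
    d_row = int(round((curr_n - prev_n) / resolution_m))
    return d_col, d_row
```

With a 1-meter resolution, moving 3.2 meters east and 1.6 meters south shifts the grid three columns east and two rows south; keeping the row and column counts odd then leaves the vehicle in the single center cell.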

The geospatial traversability data are registered by using the vehicle's orientation to

project the sensor data into the two-dimensional traversability grid. As the vehicle

translates and rotates, changes to the traversability grid are monitored. As the values of

cells change, the updated values are transferred to other systems to provide grid

synchronization.

3.2.2 Smart Sensor Arbiter Component

The smart sensor arbiter has the responsibility of fusing data from the smart sensors

and, through the synergy of the different sensor modalities, providing a better model of

the world to the reactive planner component.

In a complete smart sensor system, the arbiter component is the hub of all data

traffic from the smart sensors. As it receives traversability updates from the smart

sensors, it immediately fuses the updated data with that from previous sensor updates.

Generally, the method used to fuse the traversability data from the sensors is not









specified and is left to the implementer. What is important is that the interface to the

arbiter is consistent with the smart sensor message set and that the arbiter's grid

resolution is the same as the smart sensors'. Maintaining a grid of equal size as the smart

sensors is not required as it may be desirable to have a grid that extends well beyond the

bounds of all of the smart sensor grids. This allows the arbiter to maintain a larger local

memory of the area perceived by all the smart sensors. In a system with multiple

subsystems, this functionality could be used for collaborative mapping of large areas.

The smart sensor arbiter also includes a virtual component, the Region Clutter

Sensor. This component provides a very fast indication of the saturation of non-

traversable areas within the unmanned system's immediate vicinity. This feature gives

the higher level planning components information that allows them to modify the vehicle's

speed as it encounters cluttered areas. By modifying the system's travel speed, there may

be adequate time to generate a plan to negotiate the non-traversable regions.

The smart sensor arbiter also shares the same logical interface as the smart sensors.

This allows smart sensor based perception systems to use a single smart sensor without

the smart sensor arbiter or multiple smart sensors with the arbiter. This flexibility is an

asset especially in the development and debugging processes.

3.2.3 Reactive Planner Component

Within the smart sensor based perception system, higher level obstacle avoidance

and vehicle travel speed control is the responsibility of the reactive planner component.

The reactive planner component, using the JAUS communications network, receives the

position, orientation, and orientation rates of the platform from the JAUS Global Pose

and Velocity State components. The reactive planner component then uses the smart

sensor architecture messaging interface and the smart sensor network to transmit the









position and orientation updates to the smart sensors. The same network is used to

receive the smart sensors' traversability data.

As it is receiving traversability grid updates from either the arbiter or smart sensors,

the reactive planner continuously searches for the optimal, lowest cost path through the

accumulated traversability data. The output of the reactive planner is a modified path

plan for the unmanned system to execute.

3.3 Smart Sensor Messaging Architecture

In order to support the development of the components of the perception system, a

standardized messaging interface was defined. Its use was mandated for all components

participating in the smart sensor based perception system. This messaging interface was

to a large degree based on the methodologies and messages established by JAUS.

3.3.1 Smart Sensor Architecture Message Header

To support message identification, routing, and transfer, a modified version of the

standard 16 byte JAUS message header was created. The JAUS header supports more

functionality than needed by the smart sensor architecture. Therefore the majority of the

bytes within the header would not be needed. Because of the volume of data transferred

within the smart sensor system, any savings would be beneficial. Therefore the JAUS

header was reduced so that all unnecessary header fields were removed. Table 3-1 shows

the format of the official JAUS message header. Since the smart sensors communicated

on their own network, this optimization had no effect on the JAUS based NaviGator

network.

The smart sensor development team made several assumptions about the data

transfer process in order to justify the reduction in header size. They are as follows:

* Smart sensors are all contained within the same subsystem









* Smart sensors are single component nodes
* Smart sensors have distinct component identification numbers
* Smart sensors have only one instance
* Smart sensor message types are unidirectional
* Smart sensors use the same version of the interface control document
* Smart sensors do not use service connections
* Smart sensors do not require message acknowledgment
* Smart sensors transmit messages of the same priority

Table 3-1. Standard JAUS sixteen-byte message header

Field  Description                Bytes
1      Message Properties         2
2      Command Code               2
3      Destination Instance ID    1
4      Destination Component ID   1
5      Destination Node ID        1
6      Destination Subsystem ID   1
7      Source Instance ID         1
8      Source Component ID        1
9      Source Node ID             1
10     Source Subsystem ID        1
11     Data Control (bytes)       2
12     Sequence Number            2
       Total Bytes                16

The smart sensors are all contained within the same subsystem and the subsystem

does not communicate spatial data to any other subsystem, therefore the fields for the

destination and source subsystem identification numbers (fields 6 and 10, respectively)

would be equal and would always remain static. Removing these fields reduces the

required header size by 2 bytes.

The message header fields 5 and 9, node identifiers, may be removed due to the

assumption that the smart sensors are single component nodes and that each smart sensor

has its own component identification number. Because of these assumptions each

component identification number must be coupled to one and only one node

identification number. Therefore specifying both the component and node identifiers









would be redundant. The removal of the destination and source node identifier fields

saves an additional two bytes. Therefore, the smart sensor components are addressed

only by component identification numbers.

Within the smart sensor system there is no redundancy of smart sensor

implementations, therefore the header's instance identification fields, 3 and 7, would

never be used. Removal of these fields results in a savings of 2 bytes.

It is important to note that this redundancy assumption is specific to the smart

sensor implementation where each individual smart sensor has its own component

identification number. In a true JAUS system, this would not necessarily be the case.

They would be treated as redundant components since each smart sensor is just another

instance of the same.

Messages traveling down stream from the reactive planner component to the arbiter

and smart sensors are position and orientation update messages. Messages traveling

upstream from the smart sensors to the arbiter and reactive planner are cell update

messages. The exception to this rule is the arbiter clutter sensor component, which will

be discussed in greater detail later in this chapter. Because message types are

unidirectional and there is a priori knowledge of the system configuration, this

assumption removes the need for the source component identification number and the

message command code; a combined savings of three bytes.

The message properties field in the JAUS header provides important information to

the receiving JAUS component. This includes the version of JAUS Reference

Architecture message set used to create the attached message as well as message type,

acknowledgement, and priority information. The last four assumptions have a direct









impact on this message field by making it useless. The assumption that the smart sensors

are not using service connections also removes the need for the Sequence Number field

of the JAUS header. Combined, these four assumptions result in a savings of four bytes.

The cumulative savings produced by the nine assumptions presented above is 13

bytes. In a system with a relatively large amount of bandwidth or less frequent raster

geospatial data transfers, the thirteen-byte savings may not seem significant. Since the

message header must be attached to each message, however, when there is a large volume

of data, as can be expected within the smart sensor architecture, the aggregate savings can

be substantial. The final three-byte header is shown in Table 3-2.

Table 3-2. Smart sensor architecture message header

Field  Description            Bytes
1      Source Component ID    1
2      Data Control (bytes)   2
       Total Bytes            3


One of the strengths of JAUS is that the messages are developed completely separately

from, and are not at all dependent on, the message header. Because of this, the smart

sensor messages that will be introduced may be transmitted using any message header

that can be used to properly route the messages to their intended destination. Again, the

header size reduction presented in this section was made primarily for the purpose of

saving bandwidth and computing resources.
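A minimal sketch of packing and unpacking the reduced header from Table 3-2, assuming little-endian byte ordering (the function names are mine, not from the smart sensor implementation):

```python
import struct

# Reduced smart sensor header from Table 3-2: a one-byte source component
# ID followed by a two-byte data-byte count. Little-endian byte order is
# an assumption here.
HEADER = struct.Struct("<BH")

def pack_header(source_component_id: int, data_bytes: int) -> bytes:
    """Serialize the three-byte smart sensor header."""
    return HEADER.pack(source_component_id, data_bytes)

def unpack_header(raw: bytes) -> tuple:
    """Recover (source_component_id, data_bytes) from a received message."""
    return HEADER.unpack(raw[:HEADER.size])
```

Because the header carries a data-byte count, the receiver knows exactly how many payload bytes follow the three header bytes.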

3.3.2 Smart Sensor Architecture Message Set

The NaviGATOR's perception system, consisting of the components of the smart

sensor architecture, provides a unique method for transferring and synchronizing raster

formatted geospatial traversability data. The structure of the messages within this









architecture is based on the JAUS Reference Architecture message set. They were

designed to support the interoperability, extensibility, and logical redundancy required of

the smart sensor architecture.

Figure 3-3 shows the minimum number of component types needed for a complete

smart sensor system. It is considered complete because all of the core components are

present: the reactive planner, arbiter, and smart sensor. It is minimal because only one

smart sensor is present. In fact, this system is not particularly useful because the arbiter

and the smart sensor's internal traversability grid representations would be exactly the

same. Therefore, the arbiter should be used when there is more than one smart sensor

present.

Using the logical redundancy provided by the smart sensor architecture, a more

efficient implementation of a single sensor based system is shown in Figure 3-4. This

showcases the power of the logically redundant interface as a smart sensor may replace

the arbiter or any other smart sensor.

Three types of messages are used in the smart sensor based perception system:

* Report vehicle state
* Report traversability grid updates
* Report region clutter index

The Report Vehicle State message communicates vehicle position, orientation, and

orientation rate information. Updates to the smart sensor or smart sensor arbiter

traversability grid are transmitted through the use of the Report Traversability Grid

Updates Message. The Report Region Clutter Index message transmits an indication of

the saturation of non-traversable areas in the immediate vicinity of the vehicle. This is

primarily used to limit the speed of the vehicle while in cluttered areas.









3.3.2.1 Report vehicle state message

The Report Vehicle State message, consisting of vehicle position, orientation, and

orientation rate updates, is a combination of the JAUS Code 4402h: Report Global Pose

and Code 4404h: Report Velocity State messages. The position of the platform is given

in latitude, and longitude in accordance with the WGS84 standard. The orientation and

orientation rates are with respect to the vehicle's coordinate system as defined by JAUS

(Figure 3-5).



Figure 3-3. Minimally complete smart sensor-based perception system consisting of one
instance of the core component.



Figure 3-4. Single sensor implementation of smart sensor-based perception system
consisting of a single smart sensor synchronizing data with the reactive
planner.












Figure 3-5. Unmanned system coordinate system defined by JAUS

To allow message types of variable size where only the desired data are

transmitted, JAUS provides a presence vector. This presence vector is an n-byte bit field

with flags indicating which optional fields are present in a JAUS message. For the smart

sensor implementation, only the latitude, longitude, roll, pitch, yaw, roll rate, pitch rate,

and yaw rate fields are needed from the JAUS Report Global Pose and Report Velocity

State messages. The remaining fields are not included in the transmitted message. Since

these unneeded fields have been removed from this message contraction and there is a

priori knowledge of the structure of this message, the presence vector was also removed.
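To illustrate how a presence vector works in general, the sketch below flags the eight fields the smart sensor implementation kept. The bit assignments here are illustrative only and do not reproduce the official JAUS presence vector layout:

```python
# A presence vector is a bit field flagging which optional message fields
# follow. The field-to-bit mapping below is invented for illustration and
# is not the official JAUS assignment.
FIELDS = ["latitude", "longitude", "roll", "pitch", "yaw",
          "roll_rate", "pitch_rate", "yaw_rate"]

def build_presence_vector(present: set) -> int:
    """Set one bit per field that will appear in the message body."""
    pv = 0
    for bit, name in enumerate(FIELDS):
        if name in present:
            pv |= 1 << bit
    return pv

def fields_present(pv: int) -> list:
    """Decode a presence vector back into the list of included fields."""
    return [name for bit, name in enumerate(FIELDS) if pv & (1 << bit)]
```

Because the smart sensor message always carries all eight fields, every bit would always be set, which is exactly why the vector could be dropped.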

The benefit of this approach is that the Report Global Pose and Report Velocity

State messages do not have to be sent separately with the 16 byte JAUS header attached

to each. An additional benefit to this approach is that the position, orientation, and









orientation rate fields are synchronized; i.e. the message includes an instantaneous

reading of both the position and orientation data.

This Report Vehicle State message, Table 3-3, is 20 bytes in length, 23 bytes

including the message header. Fields 1 and 2 contain the latitude and longitude,

respectively, as scaled integers. Fields 3 through 5 contain the vehicle orientation and

fields 6 through 8 contain the orientation rates.

Table 3-3. Smart sensor architecture's report vehicle state message

Field  Name                 Type           Units               Range
1      Latitude (WGS 84)    Integer        Degrees             Scaled integer; lower limit = -90, upper limit = 90
2      Longitude (WGS 84)   Integer        Degrees             Scaled integer; lower limit = -180, upper limit = 180
3      φ (Roll)             Short Integer  Radians             Scaled integer; lower limit = -π, upper limit = π
4      θ (Pitch)            Short Integer  Radians             Scaled integer; lower limit = -π, upper limit = π
5      ψ (Yaw)              Short Integer  Radians             Scaled integer; lower limit = -π, upper limit = π
6      Roll Rate            Short Integer  Radians per Second  Scaled integer; lower limit = -32.767, upper limit = 32.767
7      Pitch Rate           Short Integer  Radians per Second  Scaled integer; lower limit = -32.767, upper limit = 32.767
8      Yaw Rate             Short Integer  Radians per Second  Scaled integer; lower limit = -32.767, upper limit = 32.767
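The scaled integers in Table 3-3 map a bounded real range onto an integer range. A generic sketch of that mapping, shown as an illustration of the technique rather than the exact JAUS formula:

```python
def to_scaled_int(value: float, lower: float, upper: float,
                  bits: int = 32) -> int:
    """Encode a real value as an unsigned scaled integer, mapping
    [lower, upper] onto [0, 2**bits - 1]. Values outside the range
    are clamped."""
    span = (1 << bits) - 1
    value = min(max(value, lower), upper)
    return round((value - lower) / (upper - lower) * span)

def from_scaled_int(scaled: int, lower: float, upper: float,
                    bits: int = 32) -> float:
    """Invert the scaling, recovering an approximation of the value."""
    span = (1 << bits) - 1
    return lower + (scaled / span) * (upper - lower)
```

With 32 bits over the latitude range of -90 to 90 degrees, the quantization step is roughly 4e-8 degrees, a few millimeters on the ground, so the encoding loss is negligible for navigation.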

3.3.2.2 Report traversability grid update message

The Report Traversability Grid Update message provides a synchronization

mechanism between the multiple distributed traversability grids. This functionality is

event driven and based on updates to the smart sensors' traversability grids. When a

change is made to a traversability grid, the change is transmitted to the destination

component to synchronize the two grids. By making the process event driven, bandwidth

utilization is reduced over transmitting the entire traversability grid, especially when

there are only a small number of changes to the traversability grid.

The Report Traversability Grid Updates message is shown in Table 3-4. The first

two fields of the message are a latitude and longitude position stamp. This position









stamp represents the point with which the cell update values are referenced; the current

location of the vehicle at the time the sensor data were processed. Following the position

stamp is a series of cell update three-tuples. Each three-tuple represents the traversability

grid update as an updated cell row, column, and traversability value.

The traversability grid cell update values use the entire numeric range of a byte, 0

to 255, to represent the traversability of the region represented by the cell. A value of

127 corresponds to an unknown traversability. As the value approaches zero, exclusive

of zero, the cell classification becomes more and more non-traversable. Conversely, as the

value approaches 255, the classification is more traversable. The grid cell value zero is

reserved exclusively for the world model corridor data which is used to constrain the

search for the lowest cost path through the traversability grid.

This message allows all changes to be transmitted in one message, provided that

the total message data size is less than the 65527 bytes that the smart sensor architecture

header permits. This limit is determined by the UDP/IP transport layer's limit on the

maximum number of payload data bytes that may be transmitted in a single transaction

[24]. Should the Report Traversability Grid Update message exceed 65527 bytes, it

should be broken into separate messages. These separate messages should have the same

latitude and longitude position stamp values as the first cell update message. This

latitude and longitude position stamp is very important as it defines the origin of the cell

changes.

The total number of three-tuple cell updates being transmitted may be inferred from

the header data bytes field by subtracting the eight bytes required by the position stamp

and dividing the remainder by three.
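The length bookkeeping described above can be sketched in a few lines; the constant and function names are assumptions for illustration, not identifiers from the ICD.

```python
POSITION_STAMP_BYTES = 8   # latitude + longitude scaled integers
TUPLE_BYTES = 3            # row, column, and value -- one byte each
MAX_PAYLOAD_BYTES = 65527  # UDP payload limit cited in the text

def cell_update_count(header_data_bytes: int) -> int:
    """Infer the number of (row, column, value) three-tuples in a
    Report Traversability Grid Update message from the header's data
    bytes field: subtract the position stamp, divide by three."""
    payload = header_data_bytes - POSITION_STAMP_BYTES
    if payload < 0 or payload % TUPLE_BYTES:
        raise ValueError("malformed message length")
    return payload // TUPLE_BYTES

def max_updates_per_message() -> int:
    """Largest number of cell updates that fits in one message."""
    return (MAX_PAYLOAD_BYTES - POSITION_STAMP_BYTES) // TUPLE_BYTES
```

A message exceeding the payload limit would be split, each fragment carrying the same position stamp as described above.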









3.3.2.3 Report region clutter index message

Within the perception system, there is a need to allow higher level components,

particularly the Reactive Planner, to know the degree of saturation of non-traversable

areas local to the vehicle. The purpose of this is to allow the UGV to reduce its speed to

allow it to successfully negotiate the traversable regions. To support this, another

pseudo-component was developed. This component, the Region Clutter Sensor, is

embedded in the arbiter. It simply provides a fast assessment of the percentage of cells in

a specified area that are classified as non-traversable. This message is sent to the reactive

planner, which has the responsibility for determining how to react to this notification.

Ideally, the reactive planner converts the clutter percentage to a recommended vehicle

speed and transmits this to the Global Path Segment Driver component using the JAUS

Code 040Ah: Set Travel Speed message.
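The clutter computation itself can be sketched as follows, assuming the grid uses the one-byte traversability encoding of Section 3.3.2.2; the function name is illustrative.

```python
def region_clutter_index(grid, rows, cols):
    """Percentage (0-100) of cells in the given region whose one-byte
    traversability value falls in the non-traversable band (1..126)."""
    region = [grid[r][c] for r in rows for c in cols]
    if not region:
        return 0
    cluttered = sum(1 for v in region if 1 <= v <= 126)
    return round(100 * cluttered / len(region))
```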

Table 3-4. Smart sensor architecture's report traversability grid updates message.

Field  Name                   Type            Units    Interpretation
1      Latitude (WGS 84)      Scaled Integer  Degrees  Lower Limit = -90
                                                       Upper Limit = 90
2      Longitude (WGS 84)     Scaled Integer  Degrees  Lower Limit = -180
                                                       Upper Limit = 180
3      Cell Update 1 Row      Byte            N/A
4      Cell Update 1 Column   Byte            N/A
5      Cell Update 1 Value    Byte            N/A      0 = Reserved for World Model
                                                       1 ... 126 = Non-traversable
                                                       (1 = completely non-traversable)
                                                       127 = Unknown (initial value for cells)
                                                       128 ... 255 = Traversable
                                                       (255 = completely traversable)









Table 3-4. Continued

Field  Name                   Type  Units  Interpretation
3n     Cell Update n Row      Byte  N/A
3n+1   Cell Update n Column   Byte  N/A
3n+2   Cell Update n Value    Byte  N/A    Same as field 5

The area covered by the Region Clutter Sensor is not specified in the Smart Sensor

Architecture Interface Control Document (ICD). This is a system specific parameter and

is therefore left to the system implementer. A system traveling at high speed may need to

monitor a large area whereas a slower or smaller system may need to monitor a smaller

area.

Table 3-5. Smart sensor architecture's report region clutter index message.


Field  Name           Type         Units    Interpretation
1      Clutter Index  Scaled Byte  Percent  Lower Limit = 0
                                            Upper Limit = 100
                                            Percentage clutter in specified area

3.3.3 Smart Sensor Architecture Network Communications

The smart sensor data are transferred within the perception system via the user

datagram protocol (UDP) running on top of the Internet protocol (IP). This combination

of user datagram protocol and the Internet protocol will be referred to as UDP/IP.

UDP/IP provides a connectionless, unreliable communications link between systems.

The term unreliable is in some respects a misnomer because UDP/IP can provide a

quality connection. Unlike the Transmission Control Protocol, UDP does not have any









checks to assure receipt of data. It relies on the host application to do the checking. For

example, the JAUS header provides a message acknowledgement flag that requests that

the receiving component notify the sending component of receipt of a message. If the

receiving component does not respond in a set period of time, then, as per the JAUS RA, the

sending component retries up to three times and then terminates transmission. If a JAUS

implementation used UDP/IP, then this functionality would help assure reliable

communications.

The smart sensor system is set up a priori under the assumption that the minimum

number of system components are present and that they are online and in the ready state.

It was developed such that each component commences transfer of the supported

messages to the appropriate component directly after initialization. This may be

considered transmission of unsolicited responses to repeated data queries (sans the

queries) or as an unsolicited JAUS service connection. The UDP/IP transport layer

supports this functionality. UDP/IP is connectionless and therefore does not require that

the destination component be present or a link established in order for data to be sent

within the system. The popular alternative to UDP/IP, TCP/IP, generally requires that a

socket connection be established between two or more systems before data can be sent.

To route data to the smart sensors, internet protocol (IP) addresses had to be

defined. To allow the IP addresses to be determined dynamically based on the

destination of the message, an IP addressing convention was established. Since each

smart sensor has a unique component ID, the component ID was used as the last octet of

the IP address. The first three octets of the IP address were established a priori. For

example: 192.168.1.componentid is an example configuration where the first three









octets are predefined and the component ID is used as the last octet. Table 3-6 presents a

list of smart sensor components and their associated component identification numbers.

A standard UDP/IP port was also designated.
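This addressing convention can be sketched as follows, using the example 192.168.1 prefix from the text; the function name is hypothetical.

```python
def smart_sensor_address(component_id: int, subnet: str = "192.168.1") -> str:
    """Derive a smart sensor's IP address from its JAUS component ID.

    The first three octets (subnet) are fixed a priori; the component
    ID becomes the last octet, per the convention described above."""
    if not 1 <= component_id <= 254:
        raise ValueError("component ID must be a valid host octet")
    return f"{subnet}.{component_id}"
```

With the IDs from Table 3-6, the Smart Sensor Arbiter (ID 11) would resolve to 192.168.1.11 and the Smart Stereo Vision Sensor (ID 22) to 192.168.1.22.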

The network interface between all components within the perception system was

wired Ethernet capable of providing data transfer at rates of up to 100 Megabits per

second.

Table 3-6. Smart sensor components and their component identification numbers

Reactive Planner 10
Smart Sensor Arbiter 11
Smart 3D Laser Sensor 21
Smart Stereo Vision Sensor 22
Smart Terrain Finder Sensor 23
Smart Road Finder Sensor 24
Smart World Model Sensor 25
Region Clutter Sensor 127

3.4 Smart Sensor Implementation

While the smart sensor architecture was originally developed for the Team CIMAR

NaviGATOR, final testing and verification took place on the Center for Intelligent

Machines and Robotics' Navigation Test Vehicle 2 (NTV2) shown in Figure 3-6. The

implementation of the smart sensor units at CIMAR exploited the commonality between

implementations.

3.4.1 Abstraction of Smart Sensor Core Functionality

Because of the considerable amount of implementation overlap, the CIMAR smart

sensor system was designed so that all developers build their smart sensors on top of a

common base implementation that contained the core smart sensor functionality. This

approach saved a considerable amount of time because testing and debugging of the main









base sensor implementation occurred independent of development of the sensors. The

interface to this system was made into a clean application programmer's interface (API).

This API handles communications, grid synchronization, and all other low level

smart sensor tasks. The system designer has the responsibility of processing the sensor

specific data to determine traversability, placing that data in a grid of the proper range

and resolution, and using the smart sensor API to publish the new data to concerned

components within the system. Figure 3-7 shows the high-level conceptual separation

between the two functions. The power of this approach is that it allows new

implementations of sensors to come online in very short order.

3.4.2 Base Smart Sensor

The base smart sensor encapsulates all low-level functionality common to all smart

sensors. This functionality includes:

* Allocating memory for a local traversability grid
* Receiving position and orientation updates via UDP/IP
* Transforming data from sensor coordinates to grid coordinates
* Shifting the traversability grid to keep it vehicle centered
* Monitoring traversability grid updates
* Synchronizing traversability grid updates via UDP/IP

This functionality leaves to the operator solely the task of providing an instantaneous

local traversability grid from their sensor data. They initialize their smart sensors using

the API. The base smart sensor has two threads that run concurrently with the sensor

interface specific thread. Figure 3-8 shows a call graph for all functions within the base

smart sensor.

The traversability data are registered in the grid by utilizing the platform orientation

data. Upon startup, the base smart sensor spawns a thread (Figure 3-9) to handle









asynchronous position updates from either the smart sensor arbiter or directly from the

reactive planner component.


Figure 3-6. Center for Intelligent Machines and Robotics Navigation Test Vehicle 2.

The rotations ψ, θ, and φ, as shown in Figure 3-5, as well as the sensor's offset

from the vehicle's coordinate system are used in the homogenous transformation of the

data from the sensor's coordinate system to the grid coordinate system. Equation 3-1

shows the compound transformations necessary for this. The x_offset, y_offset, and z_offset values

all represent the offset of the sensor coordinate system from the vehicle's coordinate

system. It is assumed that sensor is aligned such that there is no rotational difference

between the two coordinate systems, only translation. The x_sensor, y_sensor, and z_sensor values

represent the coordinates of a point as read from the sensor in the sensor's coordinate









system; x_vehicle, y_vehicle, and z_vehicle are the coordinates of the point after transformation to


the vehicle coordinate system.


Figure 3-7. Smart sensor implementation abstraction of low-level smart sensor
functionality


Figure 3-8. Call graph for all functions within the base smart sensor API
(smartSensorGridMapLockPosition, smartSensorGridMapGetValue,
smartSensorGridMapSetValue, smartSensorTransformPoint,
smartSensorGridMapUnlockPosition, smartSensorGetPoseInfo,
smartSensorMessageUpdatePoseInfo, readSocket).


Figure 3-9. Call graph for smart sensor communications receive thread
(readSocket, smartSensorMessageUpdatePoseInfo, scaledInt16ToReal,
scaledInt32ToReal, smartSensorLockPoseMutex, smartSensorUnlockPoseMutex).











\begin{bmatrix} x_{vehicle} \\ y_{vehicle} \\ z_{vehicle} \\ 1 \end{bmatrix} =
\begin{bmatrix} 1 & 0 & 0 & x_{offset} \\ 0 & 1 & 0 & y_{offset} \\ 0 & 0 & 1 & z_{offset} \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta & 0 \\ 0 & \sin\theta & \cos\theta & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} \cos\psi & -\sin\psi & 0 & 0 \\ \sin\psi & \cos\psi & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} \cos\phi & 0 & \sin\phi & 0 \\ 0 & 1 & 0 & 0 \\ -\sin\phi & 0 & \cos\phi & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} x_{sensor} \\ y_{sensor} \\ z_{sensor} \\ 1 \end{bmatrix}   (3-1)
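A plain-Python sketch of this compound transformation follows. The argument names and rotation order (translation, then rotations through θ, ψ, and φ) reflect the reconstruction of Equation 3-1 and are assumptions about the original.

```python
from math import cos, sin

def _matmul(a, b):
    """Multiply two 4x4 matrices stored as row-major nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def sensor_to_vehicle(p, offset, theta, psi, phi):
    """Transform a sensor-frame point into the vehicle frame by the
    homogeneous chain of Equation 3-1: translate by the sensor offset,
    then apply the three platform rotations."""
    x_off, y_off, z_off = offset
    T = [[1, 0, 0, x_off], [0, 1, 0, y_off], [0, 0, 1, z_off], [0, 0, 0, 1]]
    Rx = [[1, 0, 0, 0],
          [0, cos(theta), -sin(theta), 0],
          [0, sin(theta), cos(theta), 0],
          [0, 0, 0, 1]]
    Rz = [[cos(psi), -sin(psi), 0, 0],
          [sin(psi), cos(psi), 0, 0],
          [0, 0, 1, 0],
          [0, 0, 0, 1]]
    Ry = [[cos(phi), 0, sin(phi), 0],
          [0, 1, 0, 0],
          [-sin(phi), 0, cos(phi), 0],
          [0, 0, 0, 1]]
    M = _matmul(_matmul(_matmul(T, Rx), Rz), Ry)
    x, y, z = p
    return tuple(M[i][0] * x + M[i][1] * y + M[i][2] * z + M[i][3]
                 for i in range(3))
```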

To synchronize the position of the grid map, the existing cell data are shifted such

that the vehicle is always located in the center of the raster grid. The benefit of shifting

the cell data is that it provides a limited short-term memory of the area directly local to

the vehicle.

It is assumed that the position and orientation data are fairly accurate and precise.

If they are not, proper data registration will not be attained. Research is currently taking

place to find ways of handling this problem, but this is outside the scope of this work.

This is not an issue within this system because all smart sensors use the same position

and orientation updates. Therefore any errors introduced due to loss of position system

precision or accuracy will be present in all of the smart sensors' data.

Once the grid has been shifted, the sensor specific data may be entered into the

traversability grid as if it were a local traversability sensor, i.e. no global position or

orientation data. When the smartSensorTransformPoint() function is called, it handles

converting the data from sensor coordinates to vehicle coordinates and finally to world

coordinates. This transformation is shown in Equation 3-1.

Once updates have been made to the base smart sensor-based traversability grid,

the main thread causes the grid to be checked for changes. These changes are transmitted

via the smartSensorSendCellUpdates() function, as shown in Figure 3-12, and its

access to the UDP/IP transport layer.

The next section details the implementation of a stereo vision based smart sensor.

While the method described in this section is specific to the stereo vision system, the









power of the smart sensor approach is that this sensor specific interface is abstracted out.

This means that as long as a sensor implementer uses the same grid parameters,

interfacing to the base smart sensor will be trivial.


Figure 3-10. Call graph for function that determines number of rows and columns to shift
the traversability grid based on the current and previous positions
(convertGEOtoUTM, getCentralMeridian).

Figure 3-11. Call graph for sensor specific interface thread
(smartSensorGridMapGetValue, smartSensorGridMapLockPosition,
smartSensorGridMapSetValue, smartSensorGridMapUnlockPosition,
smartSensorTransformPoint).

Figure 3-12. Call graph for function used to detect changes in the traversability grid and
transmit these changes to the smart sensor arbiter
(smartSensorGridMapGetUpdateArray, getIPAddress, smartSensorSendData,
writeSocket).

3.5 Smart Stereo Vision Sensor Implementation

Like all CIMAR smart sensors, the smart stereo vision sensor builds on the base

smart sensor module. It is based on the Videre Design STH-MD1-C stereo vision camera

system and the SRI Small Vision System.











3.5.1 Stereo Vision Hardware

The Videre Design STH-MD1-C, shown in Figure 3-13, is a high resolution, wide

baseline stereo vision camera system. It consists of two CMOS imagers and an IEEE 1394

(Firewire) interface for transferring the digital images to the computer doing the stereo

processing.

3.5.2 Stereo Vision Software

To handle the tasks of camera calibration, image rectification, and stereo

correlation, the SRI Small Vision System (SVS) is used. SVS provides an application

programmer's interface to its internal implementation of the functions necessary for

stereo processing [35]. This system is available for both Linux and Windows based

systems. Figure 3-14 shows a rectified stereo image pair from the Videre system. The

output of SVS's processing is shown below the stereo pair. In this image brighter pixels

correspond to smaller distances. Conversely, darker pixels correspond to larger distances

as calculated by stereo correlation.

3.5.3 Smart Stereo Vision Sensor

The base smart sensor handles all of the low level functionality of the smart sensor.

Because of this, the smart stereo vision sensor only has to provide an instantaneous

indication of the region traversability within the area local to the vehicle. As described

in Section 3.5.2, the tasks of camera calibration, image rectification, and stereo

correlation are handled by the SRI Small Vision System (SVS).

A check of the range resolution was done at the range specified by the Team

CIMAR perception team. The following equation relates the range resolution to the

camera parameters as:









\Delta r = \frac{r^2}{b f} d   (3-2)

where Δr is the resolution at range r, b is the baseline of the stereo vision camera system,

f is the focal length of the camera lenses, and d is the smallest disparity perceivable by the

stereo vision system. For this sensor system's STH-MD1, the baseline was 200

millimeters, the focal length was 12.5 millimeters, and the smallest disparity perceivable

was 0.46875e-3 millimeters. A graph of range versus range resolution is shown in Figure

3-16 [35]. As can be seen in the figure, at a range of 30 meters, the range resolution

is approximately 0.17 meters, which is not a problem considering that the grid resolution is

constant at 0.5 meters per cell.
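Equation 3-2 can be checked numerically against the stated STH-MD1 parameters. This is a simple sketch with all quantities converted to meters.

```python
def range_resolution(r: float, b: float, f: float, d: float) -> float:
    """Stereo range resolution per Equation 3-2:
    delta_r = (r**2 / (b * f)) * d, all quantities in meters."""
    return (r ** 2) * d / (b * f)

# STH-MD1 parameters from the text: 200 mm baseline, 12.5 mm focal
# length, 0.46875e-3 mm smallest perceivable disparity.
dr = range_resolution(r=30.0, b=0.200, f=0.0125, d=0.46875e-6)
```

With these numbers the resolution at a 30 meter range comes out to roughly 0.17 meters, still well under the 0.5 meter grid cell size.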


Figure 3-13: Videre Design STH-MD1-C stereo camera head (left)


















Figure 3-14: Source data and results from stereo correlation. A) Left image. B) Right
image. C) Disparity image.








Figure 3-15. Graph of range determined from stereo vision system vs. actual measured
range (x-axis: Range to Object as Measured with Rule [m])


Region traversability value judgment is based on an assessment of the three


dimensional data provided by the stereo vision system. A method for fast obstacle










classification based on allowable slope is presented in [15]. This is shown in Equation

3-3.

\frac{(z_k - z_g)^2}{(x_k - x_g)^2 + (y_k - y_g)^2 + (z_k - z_g)^2} > \sin^2(\alpha)   (3-3)


In this equation (x_g, y_g, z_g) represent the coordinates of a known ground point and (x_k,


y_k, z_k) represent the coordinates of a sensed point in space. The maximum allowable

angle is represented by α. This method analyzes each point within the sensor data to

determine whether or not it represents a traversable region.










Figure 3-16. Plot of range resolution vs. range for the Videre Design STH-MD1-C with
12.5mm focal length lenses (x-axis: Object Range in Left Camera Coordinate
System [m])

In this work Hong et al. [15] also show that it is possible for an object to fail this

test, but still be an obstacle because of the object's height. The following test, Equation

3-4, checks for this condition, by considering the height of the object above the ground









plane. If an object is too tall for the vehicle to drive over, then it is classified as an

obstacle. The constant H in Equation 3-4 sets this threshold.

z_k - z_g < H   (3-4)

Because the smart stereo vision system is based on accumulated instantaneous

sensor readings, the ground point used in (3-3) is the origin of the vehicle projected onto

the plane defined by the intersection of the vehicle's tires and the ground plane. By

establishing this point as the origin of the vehicle's coordinate system, the terms x_g, y_g,

and z_g drop out of the equation. Therefore for the instantaneous sensor reading, the

obstacle check is based on Equation 3-5.

\frac{z_k^2}{x_k^2 + y_k^2 + z_k^2} > \sin^2(\alpha)   (3-5)
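The slope test of Equation 3-5 and the height test of Equation 3-4 can be combined into one per-point check along these lines; the function name and threshold values are illustrative.

```python
from math import sin

def is_traversable(point, alpha_rad, h_max):
    """Per-point obstacle check (sketch).

    A sensed point (x_k, y_k, z_k), expressed relative to the ground
    point at the vehicle origin, is non-traversable if it fails the
    slope test of Equation 3-5 or the height test of Equation 3-4.
    """
    x, y, z = point
    norm2 = x * x + y * y + z * z
    if norm2 == 0.0:
        return True
    slope_ok = (z * z) / norm2 <= sin(alpha_rad) ** 2
    height_ok = z < h_max
    return slope_ok and height_ok
```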

The base smart sensor API is used to perform the conversion from three-

dimensional world coordinates to two dimensional grid coordinates. Because the base

smart sensor has access to the current position and orientation of the vehicle, the offsets

of the vehicle and sensor coordinate systems, and the range and resolution of the

traversability grid, it is able to provide the smart stereo vision system a transformation

from world coordinates directly to grid coordinates.

To update data within the local traversability grid, a method for updating the

traversability grid was established based on the work by [20]. This implementation of a

local occupancy grid uses a simpler approach for grid updating. They based their

updated method on the observation that stereo errors are systemic and are not easily

modeled probabilistically. This is because stereo vision systems are dependent of the

visual properties of the environment. For example, as lighting and texture conditions









change, the performance of the stereo matching process may improve or degrade.

Because of this, a probabilistic model of the stereo vision system built under one set of

conditions may not hold under others. Similar to the grid cell properties established in Section 3.3.2.2,

Murray and Little's [20] method uses a one byte per cell representation with an unknown

state represented by the value 127.

IF i ∈ TRAVERS(r) THEN G(i) = G(i) + K_t ELSE G(i) = G(i) - K_nt   (3-6)

An extension of their work was developed for use in the smart stereo vision system

traversability grid. This method updates cells based on Equation 3-6. As in Murray

and Little's implementation, i represents a location within the grid, in this case the

traversability grid, r is a reading from the stereo vision sensor, TRAVERS() is the

operator that determines if the sensor reading represents a traversable point, G(i) is the

traversability grid value at location i, and K_t and K_nt are constants used to, respectively,

increment and decrement the traversability cell value. The use of separate K_t and K_nt is a

departure from Murray and Little's approach, where a single constant is used. By having

separate constants, emphasis can be placed on either maintaining clean data with slower

response times for detecting obstacles or vice versa. To bias one approach over the other,

the associated incrementing constant is made larger than the other. Otherwise the

constants should be equal.
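A sketch of this update rule follows, with clamping added so the result stays within the byte encoding; the default constants are illustrative, not values from the text.

```python
def update_cell(value: int, traversable: bool, k_t: int = 8, k_nt: int = 8) -> int:
    """Equation 3-6 style cell update: increment by K_t for a
    traversable reading, decrement by K_nt otherwise.  The result is
    clamped to 1..255 because cell value 0 is reserved for world
    model corridor data."""
    value = value + k_t if traversable else value - k_nt
    return max(1, min(255, value))
```

Making k_t larger than k_nt biases the grid toward reporting traversable space quickly, and vice versa.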

3.6 Use of Obstacle Detection and Free Space Sensors

As mentioned previously, the purpose of the Smart Sensor Architecture is to

generate a model of the world local to the vehicle to support the task of obstacle

avoidance. Since all sensor modalities do not provide high-resolution data within their









fields of view, an important distinction is made between different types of sensors. They

are classified as free space detectors, obstacle detectors, or a combination of both.

Consider, for example, a zone-based radar unit. While highly accurate with respect to

detecting presence within a zone, when such a unit is used for obstacle detection the entire

zone would have to be classified as an obstacle because of the lack of granularity in the sensor field. Rather than considering

the radar unit an obstacle detector, it is viewed as a free space detector. If the radar unit

indicates that there is no object in a particular zone, it can be assumed that the entire zone

is clear. It follows that if the radar unit indicates that all zones are clear then there is a

fast, computationally inexpensive method of classifying the entire sensor field of view as

clear and traversable. The associated cells are updated to correspond to this

classification. Sensors with high resolution such as the stereo vision system or the

LADAR sensor presented in Chapter 2 can be used to detect free space as well as

objects.

3.7 Smart Sensor Arbiter Implementation

The smart sensor arbiter also builds on the base smart sensor module. The initial

implementation of the arbiter is minimalist. Because position and orientation updates

from the JAUS network are sent through the smart sensor arbiter down to all of the smart

sensor components, the arbiter knows its current position. As grid cell updates come in

from the smart sensors, each message has a latitude and longitude position stamp

indicating the origin of the cell updates. Using the Universal Transverse Mercator

projection, the latitude and longitude based coordinate values are converted to Cartesian

coordinates within a UTM zone. The arbiter then does the same conversion using the

coordinates of the vehicle's current location. The difference between the vehicle position and









the origin of the grid cell updates is converted to an offset of grid coordinates (rows and

columns). This offset is simply applied to each grid cell update.
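This offset computation might be sketched as follows, assuming both positions are already projected into the same UTM zone and using the 0.5 m cell size mentioned earlier; the row/column axis conventions here are assumptions.

```python
def grid_offset(vehicle_utm, update_origin_utm, cell_size_m=0.5):
    """Convert the difference between the vehicle's current UTM
    position and the position stamp of a cell-update message into a
    (row, column) offset, with northing mapped to rows and easting
    to columns."""
    de = vehicle_utm[0] - update_origin_utm[0]  # easting difference
    dn = vehicle_utm[1] - update_origin_utm[1]  # northing difference
    return round(dn / cell_size_m), round(de / cell_size_m)
```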

This approach is acceptable in this situation because, while they are distributed

systems, all smart sensors use position data from a single position system. If each smart

sensor were on a different subsystem with independent position systems, then this

approach would have to be modified because of accuracy and precision issues.

As the data within the smart sensors' grids change, they transmit corresponding

traversability grid updates. To fuse the data from the smart sensors, the arbiter uses the

method shown in Equation 3-7.

G(i) = \sum_{n=1}^{\text{num smart sensors}} w_n \cdot \text{cell\_update}(row_n - row\_offset,\ col_n - col\_offset)   (3-7)

where G(i) is the value of the fused traversability grid cell at the grid position i. The

constant w_n represents a weight associated with data from smart sensor n.
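A per-cell sketch of this weighted fusion, assuming the weights are normalized to sum to one so the fused value stays within the byte encoding; the function and variable names are illustrative.

```python
def fuse_cell(updates, weights):
    """Weighted fusion of one grid cell per Equation 3-7: each smart
    sensor n contributes its offset-corrected cell update scaled by
    its weight w_n.  The result is clamped to the byte range."""
    fused = sum(w * u for w, u in zip(weights, updates))
    return max(0, min(255, round(fused)))
```

For example, weighting a free-space RADAR reading at 0.75 against a stereo reading at 0.25 biases the fused cell toward the RADAR classification, as discussed below.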

Consider the case of a smart sensor system consisting of a stereo vision based smart

sensor and a RADAR based smart sensor. If the RADAR is considered a free space

detector and is limited to that traversable range for cell updates, then when the RADAR

unit classifies a pixel as free, it is highly probable that the RADAR's data would always

be more accurate than the stereo vision system, which is subject to the systemic errors

discussed by [20]. Therefore weighting the RADAR data more than the stereo data would

put more emphasis on the high quality RADAR data. This is only true when the RADAR

is used as a free space detector. If the RADAR were used as an obstacle detector, then it

would adversely affect the quality of the fused data from the stereo vision system.

The Region Clutter Sensor is a quasi-component embedded in the smart sensor

arbiter. This component simply applies a non-traversable region threshold to the values








within a region of the traversability grid. As the saturation of non-traversable regions

increases, the vehicle makes the appropriate changes in velocity necessary to allow

successful negotiation of the area. This component is only present in the arbiter, so when a

smart sensor replaces the arbiter, this functionality is lost.















CHAPTER 4
JAUS WORLD MODEL KNOWLEDGE STORES

The previous chapter presented a detailed description of the smart sensor

architecture and the implementation of a stereo vision based smart sensor unit. The smart

sensor architecture message set defines a standard logical interface that allows data to be

shared between the components of a perception system. While this interface is

acceptable to meet the synchronization requirements of the smart sensor architecture's

traversability grids, it is not general enough to support the sharing of data based on the

raster and vector modeling methods presented in Chapter 2. Building on the reviewed

literature as well as on lessons learned from developing the smart sensor architecture, this

chapter introduces standard modeling and input/output methods for world model

knowledge stores.

A world model knowledge store is to be the central geospatial data store for a

JAUS component, node, subsystem, or system. The knowledge store provides only

geospatial data storage and access methods. Therefore, no processing or higher level

functionality should be provided by the knowledge store. It is the most primitive world

modeling component and forms the foundation for all future world model components.

These future components will extend the world modeling capabilities of JAUS by

providing functions such as value judgment, simulation, prediction, etc. as described by

Mystel [18].









Similar to the smart sensors in Chapter 3, the world model knowledge stores are

envisioned as location independent, modular JAUS components. Because of this, it is

possible to have multiple subsystems accumulating data in a global world model

knowledge store or to have individual subsystems accumulate data in their own world

model knowledge stores and then have synchronization of those stores. The data within

these stores may be either persistent or volatile. This will not be specified as it is an

implementation issue.

4.1 Observations and Recommendations

Chapter 2 showed that there is a considerable amount of commonality between the

numerous types of data that are available in a priori data stores and the types of data that

may be accumulated in real-time. The main two classes of data types are raster and

vector data. Raster data may consist of elevation, geo-referenced orthoimages, density

maps, occupancy grids, traversability grids, etc. Vector data may consist of digital road

maps, polygon maps, etc. By enforcing some constraints, it is possible to distill these

data into a common format that may be used by the unmanned systems community.

A number of key observations were made about the current methods for accessing

and sharing real-time and a priori geospatial data.

* The level of complexity of modeling methods designed for a priori data-sharing
(such as SDTS and GML) is well beyond what is necessary for JAUS based
unmanned systems.

* There are a number of different projections that may be used to transform data from
geodetic coordinates to a two or two and a half dimensional surface.

* Current world modeling methods are, at their core, based on either raster or vector
primitives.

* For real-time world modeling on unmanned vehicles, raster methods are used most
often.









* Both vector and raster modeling methods are commonly used in a priori data
stores.

* Most a priori data stores include metadata, which provide extra information about the
stored data.

Therefore it is recommended that at a minimum the initial JAUS World Model

standard should:

* Provide the ability for JAUS based subsystems, nodes, and/or components to share
geospatial data with minimal complexity.

* Allow developers some degree of flexibility within the constraints of the standard.

* Specify a map projection and horizontal and vertical datums to be used within the
knowledge stores.

* Allow for use and transfer of a priori and real-time raster and vector data.

* Provide a mechanism to allow distinguishing between different types of geospatial
data.

* Provide a means for saving and sharing information about the geospatial data
within the knowledge store.

* Meet the standard JAUS requirements for definition of new components.

A JAUS World Model Knowledge Store standard should not be concerned with the

method of modeling data internal to the system, but with how the data are formatted and

presented to other JAUS components that use or store geospatial data. The work done on

the smart sensor architecture as well as past experiments with JAUS interoperability have

shown that as component interfaces become more complex, it becomes increasingly

difficult to achieve true interoperability. The approach with the message set presented

herein is to develop a method of sharing the data at the most primitive levels.

Complexity has been limited so as to provide a more acceptable and undemanding initial

standard to the many organizations that make up the JAUS Working Group. As the

geospatial data-sharing requirements of the group change, so too will the standard.









Standards inherently impose limitations and this must be accepted. However,

standards that are too restrictive run the risk of losing support. Therefore the JAUS

World Model Knowledge Store standard is developed to be as flexible as reasonably

possible. The messages are also designed to be as extensible as is possible within the

JAUS framework. This standard and all future world modeling component standards

should be considered living documents that are able to quickly change to meet the needs

of system developers. The JAUS working group's review process will assure that the

changes that are made are only those that are applicable to the group as a whole.

Geospatial data transferred from different systems must use the same map

projections, ellipsoidal Earth model, and horizontal and vertical datum. For the global

coordinates, JAUS specifies that all systems use the World Geodetic System 1984

(WGS84). The map projection will be the Universal Transverse Mercator Projection.

Vertical measurements will be based on the vertical datum as established by the

ellipsoidal model of the Earth. Since most of the systems will be operating in the United

States, the horizontal datum will be the North American Datum as established in 1983

(NAD83).

Chapter 2 showed that real-time world modeling methods typically use tessellated

raster data structures and a priori world modeling methods use either raster or vector data

structures. Therefore a message set has been developed to support two types of

knowledge stores: the World Model Raster Knowledge Store and the World Model

Vector Knowledge Store.

The World Model Raster Knowledge Store provides a method for storage and

sharing of raster formatted geospatial data within a JAUS system. Many unmanned









systems with perception systems utilize a form of the local occupancy grid as introduced

by Elfes [13]. The local occupancy grid is implemented as a tessellated geo-referenced

grid. The World Model Raster Knowledge Store is a generalization of such a local

occupancy grid. It is desired to have this knowledge store support most types of raster

data. These include binary images, grey-scale images, RGB images, digital elevation

model (DEM) data, traversability, occupancy, etc. Typically an occupancy grid stores a

value corresponding to a truth metric in each cell. When raster data are stored such that

each cell represents a height at that location (such as DEM data), this is referred to as two

and a half (2.5) dimensions [21].
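The local occupancy grid described above can be sketched minimally as follows. The class name and the dict-backed storage are illustrative assumptions, not part of the JAUS message set.

```python
class OccupancyGrid:
    """Minimal geo-referenced grid in the spirit of the raster knowledge store."""

    def __init__(self, origin_lat, origin_lon, resolution_m):
        self.origin = (origin_lat, origin_lon)   # geo-reference point (WGS84)
        self.resolution = resolution_m           # cell edge length in meters
        self.cells = {}                          # (row, col) -> attribute value

    def set_cell(self, row, col, value):
        self.cells[(row, col)] = value

    def get_cell(self, row, col, default=None):
        return self.cells.get((row, col), default)

grid = OccupancyGrid(29.65, -82.32, 0.5)
grid.set_cell(0, 0, 0.9)   # e.g. an occupancy "truth metric" for one cell
```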

Storage and sharing of spatial data such as points, lines, polylines, or polygons is

supported by the World Model Vector Knowledge Store. These vector formatted spatial

data provide a number of benefits. The primary benefit of such a system in the context

of JAUS is that it requires significantly less bandwidth to transmit data as compared to

the raster store. This method therefore can reduce the storage requirements within the

system.

A feature class represents a categorization of types of spatial data. For example,

occupancy, free space, objects, roads, terrain, building, etc. all represent distinct feature

classes. A geo-referenced orthoimage may also represent a feature class. It may be

more intuitive to consider these feature classes as different layers of geospatial data

within the knowledge store. This is important because it allows different types of spatial

data to be handled separately. Predefined feature classes will eventually be defined by

the JAUS World Model Subcommittee in the interest of true world model

interoperability. Since it is not possible to define all types of feature classes a priori, a









sizeable amount of space has been set aside for user defined feature classes. While this

does have an adverse effect on interoperability, this is mitigated by having system

developers provide each other with a data dictionary when they wish to interoperate. The

data dictionary is simply a description of which types of data correspond to a feature class

identifier. Even with the predefined feature classes, when testing interoperability system

developers must establish the data types that they are using within the knowledge store.

It is possible that this exchange could be handled during the discovery process provided

in the forthcoming JAUS dynamic configuration and registration extensions.
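A data dictionary of the kind described above could be as simple as a mapping from feature class identifiers to descriptions of their contents. Every entry below is an illustrative assumption.

```python
# Hypothetical data dictionary exchanged between two system developers.
DATA_DICTIONARY = {
    0: "occupancy (byte, 0-255 truth metric)",
    1: "elevation (float, meters above WGS84 ellipsoid)",
    2: "traversability (byte, 0 = blocked .. 255 = clear)",
    # User-defined feature classes would be agreed upon between developers.
}

def describe(feature_class_id: int) -> str:
    """Look up a feature class identifier; flag undocumented classes."""
    return DATA_DICTIONARY.get(feature_class_id, "undocumented feature class")
```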

To allow dissemination of information about a feature class, the world model

framework provides for storage and transfer of feature class metadata. In this context,

metadata is simply text that provides general information about the data within a

particular feature class. Initially the metadata are developed to be human readable text in

a format specified by the user. Bolstad [7] gives an introduction to metadata and

discusses the Content Standard for Digital Geospatial Metadata. Just as this is considered

only a guideline for the GIS community, it is considered only a guideline for the JAUS

community. These metadata are not intended for use in

any distributed computations that may be performed on these data.

The local request identifier (LRID) is a single-byte numerical identifier attached to

certain classes of messages originating outside of the world model knowledge store. This

feature allows synchronization of messages and their associated response. This is

important because even though requests to the knowledge store may be synchronized,

there is no guarantee that the responses will be synchronized. By attaching the LRID, the

requesting component will be able to internally synchronize any asynchronous responses.
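A sketch of how a requesting component might use the single-byte LRID to pair asynchronous responses with their originating requests. The dispatcher class and its methods are illustrative assumptions, not a specified JAUS API.

```python
class LridDispatcher:
    """Pairs out-of-order responses with requests via a one-byte identifier."""

    def __init__(self):
        self._next = 0
        self._pending = {}        # lrid -> description of outstanding request

    def issue(self, request_desc):
        lrid = self._next
        self._next = (self._next + 1) % 256   # LRID is one byte, so it wraps
        self._pending[lrid] = request_desc
        return lrid

    def resolve(self, lrid):
        """Match an incoming response to its originating request."""
        return self._pending.pop(lrid, None)

d = LridDispatcher()
a = d.issue("query feature class 3")
b = d.issue("query bounds")
# Responses may arrive in any order; the LRID still pairs them correctly.
```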









4.1.1 Raster and Vector Object Representation

This section describes the raster and vector objects as they should be formatted in

the JAUS messages that define the inputs and outputs of the knowledge stores. Special

attention must be paid to ensure that these conventions are followed by all components

sending data to or receiving data from the world model knowledge stores.

The data within a raster knowledge store should always maintain a north-east

orientation. Raster data in the knowledge store should be geo-referenced by defining

their origin as a single point described by the intersection of a line of latitude and a line of

longitude (WGS84). The grid parameters also include the number of rows and columns

and the grid resolution. While a grid cell is specified as a point, that point covers an area

equal to the grid resolution squared. A Cartesian coordinate system is established at the

geo-referenced point. The Cartesian coordinates of the grid cells are derived from use of

the Universal Transverse Mercator projection. The grid cells may also be referenced by

their row and column offset from the origin point. Figure 4-1 shows the format of a layer

of raster data. While cells may have negative row and column values with respect to the

grid origin, when transmitting rectangular grid data (e.g. images, DTED), the origin of

the raster data must be the point that defines the cell whose column coordinate is equal to

the column coordinate of the western most cells and whose row coordinate is equal to the

row coordinate of the southern most cells. Therefore when transmitting a rectangular

array of raster data, there will be no cell values with coordinates less than zero.
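The origin convention above can be sketched as a normalization step performed before transmitting a rectangular patch: the origin is shifted to the southwestern-most cell so that no transmitted coordinate is negative. The function name is an assumption.

```python
def normalize_to_sw_origin(cells):
    """cells: list of (col, row) pairs; returns shifted cells and the offset.

    The offset identifies the new origin cell relative to the old one.
    """
    min_col = min(c for c, _ in cells)
    min_row = min(r for _, r in cells)
    shifted = [(c - min_col, r - min_row) for c, r in cells]
    return shifted, (min_col, min_row)

# Cells west or south of the grid origin may carry negative offsets.
shifted, offset = normalize_to_sw_origin([(-2, 1), (0, 0), (3, 0)])
```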

For the vector knowledge store, objects are represented as points, lines and

polylines, and polygons. The coordinates of these points are defined by a point of

latitude and longitude (WGS84). Polylines and polygons may consist of up to 65535

vertices. Figure 4-2 shows the format of these vector objects. Rather than assigning









these points Cartesian coordinates with respect to an arbitrarily chosen datum, each vertex

is expressed as a point of latitude and longitude.

[Figure 4-1: a north-east oriented raster grid in which cells are expressed as (column, row) offsets from the geo-referenced origin (Lat, Lon) at cell (0, 0); example cells include (-2, 1) and (3, 0), and the grid resolution defines each cell's extent.]

Figure 4-1. Definition of raster grid parameters and coordinate system

The vector objects on the right of Figure 4-2 have a buffer parameter. The buffer

parameter establishes a radial region around each vector object vertex and connects

two or more of these radial regions with lines drawn at their tangents. The area

within these radial regions and tangent lines is considered to be within the vector

object's buffer zone. This feature allows a region to be established in proximity to the

vector objects. For example, United States Geological Survey (USGS) road data is

presented in vector form representing the center-line of such roads. It may be useful to

do a search within the perimeter along a particular route defined in the USGS digital line









graph data. For simple cases, it may be possible to generate a polygonal representation of

the area around the road. Establishing this polygon will require transmitting the

coordinates of each of its vertices. As the problem scales up, this method becomes very

inefficient. A better solution to this problem would be to determine the route using the

USGS digital line graph data and assign a region buffer to each line segment. The region

buffer is defined as an offset distance in meters. The spatial buffer is established by

defining a radius from each point on the vector object. For many cases, this buffer will

be a simple offset with the exception of point objects and along non-smooth contours.

Figure 4-2 shows these cases. If the system designer requires finer control over this

region, they may define the buffer using the aforementioned polygonal representation.
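The buffer test described above can be sketched with a planar point-to-segment distance. This assumes the vertices have already been projected to a local Cartesian frame in meters (e.g. via UTM); all names are illustrative.

```python
def point_segment_distance(p, a, b):
    """Distance from point p to the segment a-b, all in planar meters."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:                 # degenerate segment: a point
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))               # clamp to the segment
    cx, cy = ax + t * dx, ay + t * dy       # closest point on the segment
    return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5

def in_buffer(point, polyline, radius_m):
    """True if the point lies within radius_m of any polyline segment."""
    return any(point_segment_distance(point, a, b) <= radius_m
               for a, b in zip(polyline, polyline[1:]))

road = [(0.0, 0.0), (100.0, 0.0)]           # a 100 m road centerline
```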

4.2 World Model Knowledge Store Message Set

The following sections present the initial draft message set for the first two JAUS

World Modeling components. This message set is based on a review of the current

methods of modeling spatial and geospatial data as presented in Chapter 2. These

methods are distilled into their most basic form and codified into a standard consistent

with the JAUS framework.

4.2.1 JAUS Core Input and Output Message Sets

Support for the JAUS core message set is required by the current version of the

JAUS Reference Architecture (RA). The JAUS Core Message Set consists of the

following messages:

* Code 0001h: Set component authority
* Code 0002h: Shutdown
* Code 0003h: Standby
* Code 0004h: Resume
* Code 0005h: Reset
* Code 0006h: Set emergency









* Code 0007h: Clear emergency
* Code 0008h: Create service connection
* Code 0009h: Confirm service connection
* Code 000Ah: Activate service connection
* Code 000Bh: Suspend service connection
* Code 000Ch: Terminate service connection







[Figure 4-2: vector object formats. Point: (Lat, Lon). Point with Buffer: (Lat, Lon), radius. Line: (Lat, Lon), (Lat, Lon). Line with Buffer: (Lat, Lon), (Lat, Lon), radius. Polygon: four (Lat, Lon) vertices. Polygon with Buffer: four (Lat, Lon) vertices, radius.]


Figure 4-2. Definition of vector objects and parameters









While the JAUS RA does require that these messages be accepted by all

components, there is no requirement that components have an action associated with each

input message. Because the expected behavior of components while in each state is

somewhat ambiguous, these behaviors will be defined for the world model knowledge stores. So too

will the messages that are required to have a response.

The world model knowledge stores should have an appropriate response to the

following messages:

* Code 0002h: Shutdown
* Code 0003h: Standby
* Code 0004h: Resume
* Code 0005h: Reset
* Code 0009h: Confirm service connection
* Code 000Ah: Activate service connection
* Code 000Bh: Suspend service connection
* Code 000Ch: Terminate service connection
* Code 2002h: Query component status
* Code 4002h: Report component status

The Code 0002h: Shutdown message should cause the receiving knowledge store to

immediately terminate all data transfer upon receipt. If the knowledge store is

responding to a query, it should immediately terminate the flow of data and transmit the

Code F405h: Report Raster Knowledge Store Data Transfer Termination or the Code

F424h: Report Vector Knowledge Store Data Transfer Termination message to the

component whose query response was interrupted and any components with outstanding

requests. Upon termination of all data transfer, the world model should execute its

specific shutdown routine and then halt. It should no longer respond to any data requests

and should require a hard reset in order to resume operation.

The Code 0003h: Standby message should cause the receiving knowledge store to

respond as if it had received the Code 0002h: Shutdown message. The exception is that









the knowledge store should not halt. It should respond only to the Code 0004h: Resume

and Code 0005h: Reset messages. Upon returning to the ready state, the knowledge

store should resume normal operations. It should not resume any suspended query

responses.

The Code 0005h: Reset message should cause the receiving knowledge store to

immediately terminate the transfer and processing of any data. The knowledge store

should transmit to all components with outstanding requests or data transfers the Code

F405h: Report Raster Knowledge Store Data Transfer Termination or the Code F424h:

Report Vector Knowledge Store Data Transfer Termination message. The knowledge

store should then immediately restart and return to the ready state. Terminated data

transfers should not resume.

The Codes 0009h: Confirm Service Connection, 000Ah: Activate Service

Connection, 000Bh: Suspend Service Connection, 000Ch: Terminate Service Connection,

2002h: Query Component Status and 4002h: Report Component Status messages should

all invoke the typical JAUS response associated with their receipt.

4.2.2 Raster Knowledge Store Input Message Set

In the following subsections are the messages that define the input to the raster

version of the world model knowledge store. These command, query, and event setup

class messages are transmitted in order to initiate an appropriate inform or event

notification class message output. These output messages are defined in Section 4.2.3.

The inputs to the raster knowledge store are:

* The JAUS core input message set
* Code F000h: Create raster knowledge store object
* Code F001h: Set raster knowledge store feature class metadata
* Code F002h: Modify raster knowledge store object (cell update)









* Code F003h: Modify raster knowledge store object (grid update)
* Code F004h: Delete raster knowledge store objects
* Code F200h: Query raster knowledge store objects
* Code F201h: Query raster knowledge store feature class metadata
* Code F202h: Query raster knowledge store bounds
* Code F600h: Raster knowledge store event notification request
* Code F601h: Raster knowledge store bounds change event notification request
* Code F005h: Terminate raster knowledge store data transfer

4.2.2.1 Code F000h: Create raster knowledge store object

The Code F000h: Create Raster Knowledge Store Object message (Table 4-1) is

used to create and initialize a layer of feature class data within the raster knowledge store.

In order for data to be added to the feature class, the feature class layer must first be

created.

The raster grid must be geo-referenced by specifying its origin in

fields 4 and 5 as a point of latitude and longitude. Extents of the layer must also be

specified as a number of rows and columns in fields 7 and 8. Both the data types that

describe the number of rows and columns and the cell attribute type are variable and must

also be specified in fields 6 and 11, respectively. The grid cell resolution is also specified

in field 9. Because this message is used to create a feature class layer, the feature class

must be specified using field 10.

This message has a single optional field (field 12). Inclusion of this optional field

is determined from the state of bit zero in the message presence vector (Table 4-2). If the

bit zero is set, then the value in field 12 shall be used to initialize all cells within the

feature class.

When the feature class layer is initialized using this message, the data are filled in

the grid on a row-by-row basis starting at the southernmost row and moving north. Each

row is filled beginning at its westernmost cell and moving east.
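A minimal sketch of this fill order, assuming rows increase northward and columns increase eastward; the function name is illustrative.

```python
def transmit_order(values_by_cell, n_rows, n_cols):
    """Serialize a full rectangular patch in initialization order.

    values_by_cell: dict mapping (row, col) -> cell value.
    """
    return [values_by_cell[(row, col)]
            for row in range(n_rows)          # south to north
            for col in range(n_cols)]         # west to east within each row

patch = {(0, 0): 'a', (0, 1): 'b', (1, 0): 'c', (1, 1): 'd'}
# Transmit order: a, b (southern row first), then c, d (northern row).
```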









Table 4-1. Create raster knowledge store objects message format

Field # Name Type Units Interpretation
1 Presence Vector Byte N/A See mapping table below
2 Message Properties Byte N/A Bit Field
    0: Request confirmation of object creation
    1 - 7: Reserved
3 Local Request ID Byte N/A Request identifier to be used when
    returning confirmation to requesting component
4 Origin Latitude (WGS84) Integer Degrees Scaled Integer
    Lower Limit = -90
    Upper Limit = 90
5 Origin Longitude (WGS84) Integer Degrees Scaled Integer
    Lower Limit = -180
    Upper Limit = 180
6 Raster Data Row and Column Data Type Byte N/A Enumeration
    0: Byte
    1 - 3: Reserved
    4: Unsigned Short Integer
    5: Unsigned Integer
    6: Unsigned Long Integer
    7 - 255: Reserved
7 Raster Grid Update Rows Varies (see field 6) Grid Cells
8 Raster Grid Update Columns Varies (see field 6) Grid Cells
9 Cell Resolution Float Meters
10 Feature Class Unsigned Short Integer N/A Enumeration
    0 ... 65,534: See Feature Class Table
    65,535: Reserved
11 Raster Cell Data Type Byte N/A Enumeration
    Same format as field 6
12 Initial Value for Raster Grid Cells Varies (see field 11) N/A









Table 4-2. Presence vector for create raster knowledge store objects message

Vector Bit 7 6 5 4 3 2 1 0
Data Field R R R R R R R 12
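Reading the presence vector of Table 4-2 reduces to a single bit test; the helper below is an illustrative sketch, not a specified API.

```python
def field12_present(presence_vector: int) -> bool:
    """Bit zero set means the optional field 12 (initial cell value) is included."""
    return bool(presence_vector & 0x01)

# A presence vector of 0x01 includes the initial value; 0x00 omits it.
```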

4.2.2.2 Code F001h: Set raster knowledge store feature class metadata

As described in Section 4.1, metadata are data about data. The Code F001h: Set

Raster Knowledge Store Feature Class Metadata (Table 4-3) message allows a user to

create, modify, and erase feature class metadata. At the present time the format of these

metadata is not specified. It is left to the system designer to develop a convention for

doing this. Initially these data are to be used by the human operators. In the future a

schema may be defined so as to provide a standard metadata format that may be parsed

and the data used by unmanned systems without human intervention.

Table 4-3. Set raster knowledge store feature class metadata message format

Field # Name Type Units Interpretation
1 Metadata Options Byte N/A Enumeration
    0: Append
    1: Prepend
    2: Overwrite
    3 - 254: Reserved
    255: Erase All
2 Feature Class Short Integer N/A Enumeration
    0 ... 65,534: See Feature Class Table
    65,535: Reserved
3 Number of String Characters Unsigned Short Integer N/A 0 ... 65,535
    This field should be equal to zero only when Field 1 is equal to 255 (Erase All)
4 Metadata String N/A Variable length string


4.2.2.3 Code F002h: Modify raster knowledge store object (cell update)

The Code F002h: Modify Raster Knowledge Store Object (Cell Update) message

(Table 4-4) is used to change data within a raster knowledge store feature class layer.









This message can only be used on a layer that has been created within the raster

knowledge store. This method is specified as a cell update version because it allows

modification of the raster grid on a cell by cell basis. This message has no optional

fields.

The raster grid cell updates must be geo-referenced by specifying their

origin in fields 2 and 3 as a point of latitude and longitude. Both the data types that

describe the update row and column and cell attribute are variable and must also be

specified in fields 4 and 7, respectively. The grid cell update resolution is also specified

in field 5. Because this message is used to modify a feature class layer, the feature class

must be specified using field 6. The data type for the field that specifies the number of

cell updates included in the message (field 9) is also variable and is defined in field 8.

Each cell update is a three-tuple representing the cell update's row, column, and update

attribute value.
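The 3-tuple layout can be sketched with Python's struct module, assuming unsigned short row/column coordinates and single-byte cell values; the little-endian byte order is also an assumption made for illustration.

```python
import struct

def pack_cell_updates(updates):
    """Pack (row, col, value) 3-tuples preceded by a 2-byte update count."""
    out = struct.pack('<H', len(updates))           # number of cell updates
    for row, col, value in updates:
        out += struct.pack('<HHB', row, col, value) # row, col, attribute value
    return out

packet = pack_cell_updates([(10, 20, 255), (11, 20, 128)])
# 2 bytes for the count plus 5 bytes per update.
```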

Table 4-4. Modify raster knowledge store object (cell update) message format

1 Local Byte N/A Request identifier to be used
Request ID when returning confirmation to
requesting component
2 Origin Integer Degrees Scaled Integer
Latitude Lower Limit = -90
(WGS84) Upper Limit = 90
3 Origin Integer Degrees Scaled Integer
Longitude Lower Limit = -180
(WGS84) Upper Limit = 180
4 Raster Data Byte N/A Enumeration
Row and 0: Byte
Column 1: Short Integer
Data Type 2: Integer
3: Long Integer
4: Unsigned Short Integer
5: Unsigned Integer
6: Unsigned Long Integer
7 255: Reserved









Table 4-4. Continued

Field # Name Type Units Interpretation
5 Cell Resolution Float Meters


6 Feature Unsigned N/A Enumeration
Class Short 0 ... 65,534 See Feature
Integer Class Table
65,535 Reserved
7 Raster Cell Byte N/A Enumeration
Data Type 0: Byte
1: Short Integer
2: Integer
3: Long Integer
4: Unsigned Short Integer
5: Unsigned Integer
6: Unsigned Long Integer
7: Float
8: Long Float
9: RGB (3 Bytes)
10- 255: Reserved

8 Data Type Byte N/A Enumeration
for Number 0: Byte
of Cell 1: Short Integer
Updates 2: Integer
3: Long Integer
4: Unsigned Short Integer
5: Unsigned Integer
6: Unsigned Long Integer
7- 255: Reserved

9 Number of Varies (see N/A
Cell field 8)
Updates
10 Raster Cell Varies (see N/A
Update 1 field 4)
Row
11 Raster Cell Varies (see N/A
Update 1 field 4)
Col
12 Raster Cell Varies (see Varies with
Update 1 field 7) Feature
Data Class










Table 4-4. Continued

Field # Name Type Units Interpretation
3n + 7 Raster Cell Update n Row Varies (see field 4) N/A
3n + 8 Raster Cell Update n Col Varies (see field 4) N/A
3n + 9 Raster Cell Update n Data Varies (see field 7) Varies with Feature Class

4.2.2.4 Code F003h: Modify raster knowledge store object (grid update)

The Code F003h: Modify Raster Knowledge Store Object (Grid Update) message

(Table 4-5) is similar to the Code F002h: Modify Raster Knowledge Store Object (Cell

Update) message in that it permits change of grid cell values. It differs from that method

in that rather than transmitting single cell updates, an entire rectangular patch of cells is

updated. As the number of cells that need to be modified increases, this method becomes

more efficient than the cell update method.
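A back-of-the-envelope sketch of this efficiency argument, assuming 2-byte row/column coordinates and 1-byte cell values and ignoring JAUS header overhead; the byte sizes are assumptions for illustration.

```python
def cell_update_bytes(n_cells, coord=2, value=1):
    """Cell update: each cell carries its own row, col, and value."""
    return n_cells * (2 * coord + value)

def grid_update_bytes(n_cells, coord=2, value=1):
    """Grid update: rows and columns are sent once, then raw values follow."""
    return 2 * coord + n_cells * value

# For a 10x10 patch: 100 * 5 = 500 bytes versus 4 + 100 = 104 bytes.
```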

The raster grid update must be geo-referenced by specifying its origin

in fields 2 and 3 as a point of latitude and longitude. Both the data types that describe the

update row and column and cell attribute are variable and must also be specified in fields

4 and 9, respectively. Fields 5 and 6 specify the number of rows and columns of raster

grid updates being transmitted. The grid cell update resolution is also specified in field 7.

Because this message is used to modify a feature class layer, the feature class must be

specified in field 8.

Table 4-5. Modify raster knowledge store object (grid update) message format

1 Local Byte N/A Request identifier to be used
Request ID when returning confirmation to
requesting component











Table 4-5. Continued

Field # Name Type Units Interpretation
2 Origin Latitude (WGS84) Integer Degrees Scaled Integer
    Lower Limit = -90
    Upper Limit = 90


3 Origin Integer Degrees Scaled Integer
Longitude Lower Limit = -180
(WGS84) Upper Limit = 180
4 Raster Data Byte N/A Enumeration
Row and 0: Byte
Column 1: Reserved
Data Type 2: Reserved
3: Reserved
4: Unsigned Short Integer
5: Unsigned Integer
6: Unsigned Long Integer
7 255: Reserved
5 Raster Grid Varies (See Grid Cells
Update field 4)
Rows
6 Raster Grid Varies (See Grid Cells
Update field 4)
Columns
7 Cell Float Meters
Resolution
8 Feature Unsigned N/A Enumeration
Class Short 0 ... 65,534 See Feature
Integer Class Table
65,535 Reserved
9 Raster Cell Byte N/A Enumeration
Data Type 0: Byte
1: Short Integer
2: Integer
3: Long Integer
4: Unsigned Short Integer
5: Unsigned Integer
6: Unsigned Long Integer
7: Float
8: Long Float
9: RGB (3 Bytes)
10- 255: Reserved
10 Raster Cell Varies (see N/A
Update 1 field 9)
11 Raster Cell Varies (see N/A
Update 2 field 9)









Table 4-5. Continued

9 + n Raster Cell Varies (see N/A
Update n field 9)
10 + n Raster Cell Varies (see N/A
Update n+1 field 9)


4.2.2.5 Code F004h: Delete raster knowledge store objects

The Code F004h: Delete Raster Knowledge Store Object message (Table 4-6) is

used to free all resources allocated to a feature class layer within the raster knowledge

store. In order to resume accumulation of data within the deleted feature class, the

feature class layer must be recreated using the Create Raster Knowledge Store Object

message. The message allows a single feature class or all feature classes to be deleted in

one message.

Table 4-6. Delete raster knowledge store objects message format

Field # Name Type Units Interpretation
1 Presence Vector Byte N/A See mapping table below
2 Local Request ID Byte N/A Request identifier to be used when
    returning confirmation to requesting component
3 Number of Feature Classes Byte N/A
4 Feature Class 1 Short Integer N/A Enumeration
    0 ... 65,534: See Feature Class Table
    65,535: ALL

3 + n Feature Class n Short Integer N/A Enumeration
    0 ... 65,534: See Feature Class Table
    65,535: Reserved

4.2.2.6 Code F200h: Query raster knowledge store objects

The Code F200h: Query Raster Knowledge Store Objects message (Table 4-7)

provides access to data within the raster knowledge store. Field 1 of this message is the









message presence vector (Table 4-8). The optional fields in this message are fields 4, 5,

and 6. Field 2 is the Query Response Properties bit field. When bit zero is set, the

response to the query should only include the number of records that would be returned.

When bit one is set, the query response shall be the Code F402h: Report Raster

Knowledge Store Objects (Cell Update) message. Otherwise, the Code F403h: Report

Raster Knowledge Store Objects (Grid Update) message shall be sent. Field 3 is the

message Local Request Identifier. This field allows synchronization of message

responses. Field 4 is the Raster Query Resolution. This field allows the querying

component to specify the cell resolution to be used in the response to the query. If this

resolution does not match the native resolution of the queried knowledge store, then the

knowledge store should either sub-sample or interpolate the data to obtain the desired

resolution. This field is optional. Field 5 specifies a specific feature class to be queried.

This field is optional. If a feature class is not specified, then the query should be done on

all feature classes within the knowledge store. Fields 6 through 9 specify two points of

latitude and longitude that limit the range of the query. These fields are optional. If

presence vector bit two is set, then fields 6 through 9 shall all be included. Otherwise,

they should not be included.

Table 4-7. Query raster knowledge store objects message format

1 Presence Unsigned N/A See mapping table below
Vector Short Integer
2 Query Byte N/A Bit Field
Response 0: Only return number of
Properties responses that would be
transmitted
1: Return cell update 3 tuples
or raster scan (active low)
2 7: Reserved









Table 4-7. Continued

Field # Name Type Units Interpretation
3 Local Request ID Byte N/A Request identifier to be used when
    returning data to requesting component


4 Raster Float Meters
Query
Resolution
5 Feature Unsigned N/A Enumeration
Class Short Integer 0 ... 65,534 See Feature
Class Table
65,535 All Feature Classes
6 Query Integer Degrees Scaled Integer
Region Lower Limit = -90
Point 1 Upper Limit = 90
Latitude
(WGS84)
7 Query Integer Degrees Scaled Integer
Region Lower Limit = -180
Point 1 Upper Limit = 180
Longitude
(WGS84)
8 Query Integer Degrees Scaled Integer
Region Lower Limit = -90
Point 2 Upper Limit = 90
Latitude
(WGS84)
9 Query Integer Degrees Scaled Integer
Region Lower Limit = -180
Point 2 Upper Limit = 180
Longitude
(WGS84)


Table 4-8. Presence vector for query raster knowledge store objects message

Vector Bit 7 6 5 4 3 2 1 0
Data Field R R R R R 6 5 4

4.2.2.7 Code F201h: Query raster knowledge store feature class metadata

The Code F201h: Query Raster Knowledge Store Feature Class Metadata message

(Table 4-9) should cause the Raster Knowledge Store to reply to the requestor with the

Code F401h: Report Raster Knowledge Store Feature Class Metadata. There is a single









field associated with this message. This field specifies the feature class metadata to

return in the reply. There is also an option to return metadata for all feature classes

present in the queried raster knowledge store.

Table 4-9. Query raster knowledge store feature class metadata message format

1 Feature Unsigned N/A Enumeration
Class Short 0 ... 65,534 See Feature
Integer Class Table
65,535 All


4.2.2.8 Code F202h: Query raster knowledge store bounds

The Code F202h: Query Raster Knowledge Store Bounds message (Table 4-10) is

used to request the spatial extents of a single feature class or of all feature classes within

a raster knowledge store. The knowledge store should respond with the Code F404h:

Report Raster Knowledge Store Bounds message. The bounds are represented by two

points that represent the rectangular region that just covers all of the data within the

feature class layer or layers.
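The reported bounds can be sketched as a simple min/max over populated cell coordinates; the function is an illustrative assumption.

```python
def layer_bounds(cells):
    """cells: iterable of (col, row) pairs for every populated cell.

    Returns the two corner points ((min_col, min_row), (max_col, max_row))
    of the smallest rectangle covering all of them.
    """
    cols = [c for c, _ in cells]
    rows = [r for _, r in cells]
    return (min(cols), min(rows)), (max(cols), max(rows))

# Cells at (1, 2), (4, 0), (3, 5) are just covered by (1, 0) .. (4, 5).
```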

Table 4-10. Query raster knowledge store bounds message format

1 Local Byte N/A Request identifier to be used
Request ID when returning data to
requesting component

2 Feature Unsigned N/A Enumeration
Class Short 0 ... 65,534 See Feature
Integer Class Table
65,535 All Feature Classes


4.2.2.9 Code F600h: Raster knowledge store event notification request

The Code F600h: Raster Knowledge Store Event Notification Request message is

used to establish an event triggered query within the knowledge store. Therefore, this









message is formatted exactly the same as the Code F200h: Query Raster Knowledge

Store Objects message. That message should be referenced for the format of this

message. Whenever the criteria established in this message are met, depending on the

query response field of the event notification request, the raster knowledge store should

transmit either the Code F800h: Raster Knowledge Store Event Notification (Cell

Update) message or the Code F801h: Raster Knowledge Store Event Notification (Grid

Update) message.

4.2.2.10 Code F601h: Raster knowledge store bounds change event notification
request

The Code F601h: Raster Knowledge Store Bounds Change Event Notification

Request message is used to establish an event triggered response to notify the requesting

component when the data in a feature class extend past the bounds that the data occupied when

the initial request was sent. When the extents of the data change, the raster knowledge

store will transmit the Code F802h: Raster Knowledge Store Bounds Change Event

Notification message.

4.2.2.11 Code F005h: Terminate raster knowledge store data transfer

This Code F005h: Terminate Raster Knowledge Store Data Transfer message is a

command class message that should cause the raster knowledge store to immediately

terminate the transfer of all current and outstanding data destined to the requesting

component. Upon termination, the raster knowledge store should send the requestor the

Code F405h: Report Raster Knowledge Store Data Transfer Termination message.

4.2.3 Raster Knowledge Store Output Message Set

In the following subsections are the messages that define the output of the raster

version of the world model knowledge store. These inform and event notification class









messages are transmitted in response to the command, query, and event setup class of

input messages presented in Section 4.2.2.

The outputs of the raster knowledge store are:

* The JAUS core output message set
* Code F400h: Report raster knowledge store object creation
* Code F401h: Report raster knowledge store feature class metadata
* Code F402h: Report raster knowledge store objects (cell update)
* Code F403h: Report raster knowledge store objects (grid update)
* Code F404h: Report raster knowledge store bounds
* Code F800h: Raster knowledge Store Event Notification (cell update)
* Code F801h: Raster knowledge store event notification (grid update)
* Code F802h: Raster knowledge store bounds change event notification
* Code F405h: Report raster knowledge store data transfer termination

4.2.3.1 Code F400h: Report raster knowledge store object creation

The Code F400h: Report Raster Knowledge Store Object Creation message (Table

4-11) is used to confirm creation of raster objects in the raster knowledge store. This

message is sent only when confirmation of object creation is requested by setting bit zero in

the Code F000h: Create Raster Knowledge Store Object message. If this bit is set, this

message will be transmitted and the local request identifier (field 1) is set to the value sent

with the Code F000h: Create Raster Knowledge Store Object message.

Table 4-11. Report raster knowledge store object creation message format

1 Local Byte N/A Local request identifier sent by
Request ID creating component



4.2.3.2 Code F401h: Report raster knowledge store feature class metadata

The Code F401h: Report Raster Knowledge Store Feature Class Metadata message

(Table 4-12) allows access to feature class metadata stored within the raster knowledge store.

It is transferred in response to the Code F201h: Query Raster Knowledge Store Feature









Class Metadata message. If the query message requests all feature classes, a separate

message should be sent for each feature class.

These metadata are entered using the Code F001h: Set Raster Knowledge Store

Feature Class Metadata message.

Table 4-12. Report raster knowledge store feature class metadata message format

1 Feature Short N/A Enumeration
Class Integer 0 ... 65,534 See Feature
Class Table

2 Number of Unsigned N/A 0 ... 65,535
String Short
Characters Integer

3 Metadata String N/A Variable length string


4.2.3.3 Code F402h: Report raster knowledge store objects (cell update)

The Code F402h: Report Raster Knowledge Store Objects (Cell Update) message

(Table 4-13) is sent in direct response to a Code F200h: Query Raster Knowledge Store

Objects message if and only if bit one of the bit field in message field two is set.

Otherwise, the Code F403h: Report Raster Knowledge Store Objects (Grid Update)

message is transmitted. If bit zero of field two of the Code F200h: Query Raster

Knowledge Store Objects message is set, then only the first two fields of this message

shall be transmitted. Field 1 of this message is Local Request Identifier sent with the

query that initiated this report message. Field 2 notifies the receiving component of the

number of records included in the report message. Fields 3 and 4 establish the geodetic

origin (latitude and longitude) of the cell updates included in the message. Both the data

types that describe the update row and column and cell attribute are variable and are

specified in fields 5 and 8, respectively. Field 6 is the resolution of the raster grid