Tracking and state estimation of an unmanned ground vehicle system using an unmanned aerial vehicle system

University of Florida Institutional Repository
DAITSS ingest report: package UFE0017511_00001, IEID E20101209_AAAACK, ingested 2010-12-09T15:18:11Z (account UF, project UFDC).
07b01e8755848f269d4485f109c0e9c5
f646c1f8bb806fd08358236de6e9e0600f14e9ec
13447 F20101209_AABMOM macarthur_d_Page_004.jp2
83eb7879b7eb09288506d0e73047f339
d58c7a1911e05a86ceeb46d99411f02fd66f8b82
122008 F20101209_AABMNX macarthur_d_Page_099.jpg
f13973e632cfb885602da4344d2df657
0e0efe72c138ad4a7f3833f3249f2fb6a923c008
928496 F20101209_AABMPB macarthur_d_Page_019.jp2
540e6802edc8aee5f85f0663c29ef6a0
4326d0ddcc9404e63097a84fe3adad07a9d0b293
1051978 F20101209_AABMON macarthur_d_Page_005.jp2
1eea2d7b99161998a261c0478f7c1b00
325734ec6882680a802ea31cd979c5be32a0e1fc
116990 F20101209_AABMNY macarthur_d_Page_100.jpg
7700227f2b1eac63c9d1aefd64a49cf5
03040ca813c27f82b69e94dee9de1214d6b19a1c
1017733 F20101209_AABMPC macarthur_d_Page_020.jp2
67b5ea56c3242fdbe029b461551603d1
99bc77140f51eab7848b60e5e570aca88865337b
1051983 F20101209_AABMOO macarthur_d_Page_006.jp2
96971fde12d80b44113945bcddd37fdd
69cad8c6e109a136d25537eb66b91f393e71079d
29745 F20101209_AABMNZ macarthur_d_Page_101.jpg
473a73522542c9da3df6cff48ebc42e9
e6a7469675761c53bc0cc5b154135aedfffe1193
113210 F20101209_AABMPD macarthur_d_Page_021.jp2
2bcbffed635325f9cfdb18342a58aad6
b6caf7408a49cbf10e21594ad384e873458e2b77
980377 F20101209_AABMOP macarthur_d_Page_007.jp2
eeb64f6dd52f0a084d8602138315e097
a4aa41db915965979132df20112cdab69eebf1a4
120759 F20101209_AABMPE macarthur_d_Page_022.jp2
b66b41dee6d9a425fbfc6d8a88ec5f4c
406511e5d6c18d60c323ca47d73fcd05ec4630a3
1051971 F20101209_AABMOQ macarthur_d_Page_008.jp2
01b215a5f7cf9a2f94adcc8093df4ec6
21e5673eedbb8493cd3dd52879b0c2b10f584d43
22744 F20101209_AABMPF macarthur_d_Page_023.jp2
a810083b1bdb6454a8f88b833e5c9159
27e18ac4ac78e0928fe9d151634871bba3973910
77693 F20101209_AABMPG macarthur_d_Page_024.jp2
ab015722dd4c4feb1079204037710949
5f75dbdc7cd59a3706ff7fd69538aef35dfaf55c
1051903 F20101209_AABMOR macarthur_d_Page_009.jp2
b5ac485d3a420986e5c4603648513979
f7a67ce3ac1d54c5f1034b1e00c821010842e826
971058 F20101209_AABMPH macarthur_d_Page_025.jp2
b1c5dad5a335ef88e9258edfa8cebb50
473388b2930d909a85997115b7a5e38ade07ad6b
376824 F20101209_AABMOS macarthur_d_Page_010.jp2
4df4ff45461a8464f9d7504fe20949e1
d78fbcb0e3b6421ff7a5e5bd3972f5c9277eeb84
990984 F20101209_AABMPI macarthur_d_Page_026.jp2
d98110bc999f15407e7a905bb8b6ceca
3b9afdd7654557dc0ca6bbb6751f7e7451dd3904
107858 F20101209_AABMOT macarthur_d_Page_011.jp2
bda2362f2a97c5597075ee9a202656be
1af200743e20aabfd0a00de4d58cf253ffe47ad8
1051945 F20101209_AABMPJ macarthur_d_Page_027.jp2
51e39599922618615b5cfd5a2864a8cd
950adb8dd3fc22a14989b0eb0f6c4d4785e7b886
26869 F20101209_AABMOU macarthur_d_Page_012.jp2
13e7b45e0124968126c7f644df6cff2d
954eae2dc7110cc1ba2ca25fa2244c715a6acbc8
755417 F20101209_AABMPK macarthur_d_Page_028.jp2
fe24e5c3f9c8af1647132fe12dfd90c3
22e951a54260209f1f4228143980b7134e2c0354
99449 F20101209_AABMOV macarthur_d_Page_013.jp2
0d305d9f556d0f07143afcb56f2734a3
d38ef85cde8cc86cef623802ec7dbf5c15e3affc
781854 F20101209_AABMPL macarthur_d_Page_029.jp2
3c2803401bca7d0c31f42c4e8692c37f
ef8441dab761bb70dc996588805841a1b898177b
105347 F20101209_AABMOW macarthur_d_Page_014.jp2
298fa2f4f5f899ce246688c8d3745de8
ff082510524885680856d2b8de41ee0e3d701b9f
1051895 F20101209_AABMPM macarthur_d_Page_030.jp2
4458accf3acc615732f7f61d769b50e6
a2475a1b9b42c8c2e9ea892a315599dba0bbf687
1051919 F20101209_AABMOX macarthur_d_Page_015.jp2
187f499b1c5657b25c23290ca7010a88
384fe7d0154c4f7e0a510ea123632ef61319c7eb
34951 F20101209_AABMQA macarthur_d_Page_044.jp2
777c94fe4c344867db0cc1a8917c5235
5a559a0d6276fc1c8554910e4430368a2a481e75
F20101209_AABMPN macarthur_d_Page_031.jp2
446bb5baeaaca4f896bf3dcdad6bb4b3
841b37a3007804e47458875d8e837840acd0b81e
1051979 F20101209_AABMOY macarthur_d_Page_016.jp2
95c14d8e82790e0ddf05bfef63c85d36
4bbcf2d33baaaa68cc5a2b71d1e5d592017cac95
57075 F20101209_AABMQB macarthur_d_Page_045.jp2
6d7c1444095ad7537382b103bb440aa2
f234f098de08ee2a1ac757ac1cba9716f0669521
1030695 F20101209_AABMPO macarthur_d_Page_032.jp2
78259090fa3e414b35e2cb3e37fafd9a
f8e6e0f2092a9136e166f47cfc65a2928bee8ea0
1051969 F20101209_AABMOZ macarthur_d_Page_017.jp2
1415475b8583f7bfc03a13cd8c2c8b5e
6d3a7d9e9544e378df6e77dcdf0d5282298273e8
52606 F20101209_AABMQC macarthur_d_Page_047.jp2
2e7239fe7b12aea19ddaebec800f3de0
ff46dd8c5fb89afe274e30fce2b2059011b48a2b
911376 F20101209_AABMPP macarthur_d_Page_033.jp2
400ed072af24c9349c56802690ab6e61
fe8fb856d6d234eda5036d96459cc0a6a64e393f
34080 F20101209_AABMQD macarthur_d_Page_048.jp2
06581136bffb7940b69caa202fef804d
0bd78e5c3154a9c4c82aaec205615ca7c30a85dd
748374 F20101209_AABMPQ macarthur_d_Page_034.jp2
52934fbbc2f2fa312f391a011e071ab7
0972c780e0bc9f27a6ef0730b463879264bd0c9f
57544 F20101209_AABMQE macarthur_d_Page_049.jp2
e74bdc860bb847611c160f53174e454c
a22b192f2add24f6db527c19e2c01eaa9b7df197
76999 F20101209_AABMPR macarthur_d_Page_035.jp2
e0ef51ebec31e8c612dbb25f6f0ee85c
ea3bee090ba8b57968d4aa6b61b54305663c8eff
76986 F20101209_AABMQF macarthur_d_Page_050.jp2
5546981fe63fa1c0e926f71672aaa8c8
c3f4084535dda38f5c58e2f18d6eb740cba56fb0
61449 F20101209_AABMQG macarthur_d_Page_051.jp2
e5b0859164d4d38e8ac54684c1ec0e87
2f5ec5bb917d1f62f2ba142e3e7fee988cac1d28
83910 F20101209_AABMPS macarthur_d_Page_036.jp2
6944fc2b0aad92843f9fe1837ca6aaf9
eb993f9fcd9e96e24de6d9e02e6017fa541f12dd
63698 F20101209_AABMQH macarthur_d_Page_052.jp2
f66d0aaef03cd441aac3f617bd427828
2b65890f0abd74edda19d079cab0531467e7c7ff
553898 F20101209_AABMPT macarthur_d_Page_037.jp2
320e81220c468340f343ee1421461333
884a99ab91a24d77802a968d8207b23a15f365fd
63291 F20101209_AABMQI macarthur_d_Page_053.jp2
2d4e0fe158ec09655a48f51ce0414828
ebc62f91f4dff8e4be6a4679d7cc9ef858a30288
359162 F20101209_AABMPU macarthur_d_Page_038.jp2
9eef3eba21403c076316a2dd55684ef3
09fd530303032b80973e4be4a156168ebd2f280a
36200 F20101209_AABMQJ macarthur_d_Page_054.jp2
5ea3136b00d9e2567d958625c7202959
9cbf94c81d2737666d55a10ea04539edb4650414
73910 F20101209_AABMPV macarthur_d_Page_039.jp2
267fb2335fcb3b2048aec22038c98f2c
ff71d7ef84fff89c0f40b1d40472349e189afaaa
89753 F20101209_AABMQK macarthur_d_Page_055.jp2
0726cb1f2a1eeddf8aeca273eb790c60
42a6f0595c7a0af0550fa76752d61e41fe2b39bb
1044033 F20101209_AABMPW macarthur_d_Page_040.jp2
a06cecc3a4a00b994887e745ba1e83ab
8b14b2b20518e3125c771ad8a5aa7b39ca48cf7b
331185 F20101209_AABMQL macarthur_d_Page_056.jp2
1a006c5d0e98f3dca8f7974118b03a4e
416f2b1359183fee6573c8ee236d77148874e920
1051982 F20101209_AABMPX macarthur_d_Page_041.jp2
c41973bf49b78401f6b319ca311b10d7
f525ef8a52519b6607a461a3de422fef7bfa3eb1
847430 F20101209_AABMRA macarthur_d_Page_071.jp2
f94330fb7927b7be0d53d23dd0038a7c
5f04d20fdb86b6f300d589af1cbab1223231bd9d
75795 F20101209_AABMQM macarthur_d_Page_057.jp2
81d168eee1b83703556eb1633abdd6bf
e25cc1e97fadc32fe6d367d7b9f2e108440eecd7
840616 F20101209_AABMPY macarthur_d_Page_042.jp2
eef7f8ca1e911a0c82a89a42512bb486
9bdd9fe6fcc51fac2c1400f73ff5cb68d38c6e45
1051786 F20101209_AABMRB macarthur_d_Page_072.jp2
64ad2d0c463592ada188408baaec5d32
eb565aa65c086ff455547dbc651db87453a8ffe5
417220 F20101209_AABMQN macarthur_d_Page_058.jp2
9635a5467cab18495f7efd7b4cb4c835
eb4db7956f542b2356b4a4b393b2a7d34ed9d29b
64987 F20101209_AABMPZ macarthur_d_Page_043.jp2
7ae6e3407783f5b3869005ce28202a55
477f5f9d38ba2d172b08c2f91b5bc0e519bc2689
71152 F20101209_AABMRC macarthur_d_Page_073.jp2
d9012f67fdb323a6abddf23a55533e5d
8b702db94a4a521026cd2f0292b51d7a79525a87
62145 F20101209_AABMQO macarthur_d_Page_059.jp2
428d0a8f649be0e347ea943cd8a286df
8b0840b9fa655405c2185eff9e0a3f029bcf87de
97533 F20101209_AABMRD macarthur_d_Page_074.jp2
d7e8325c16fb5770f2c26eb8076c2c15
13f9dd30fb149ed89f2f1bea1c7e5e0f66243f36
98723 F20101209_AABMQP macarthur_d_Page_060.jp2
ecc9932748dbaa03832ae319bf176fe7
8f66a0565a8071ff69d13cc1f845360075d4537b
72684 F20101209_AABMRE macarthur_d_Page_075.jp2
ac4814e2f2a2cd23cc77a00bd74f5c48
b03cf980a414d9cc0527b9c38a3d874e2cb58c90
111358 F20101209_AABMQQ macarthur_d_Page_061.jp2
d869dab5f46bb96f1f92ce52e8031c5c
0caf5053e55ad7bbd220120bf454bec54920aa38
61368 F20101209_AABMRF macarthur_d_Page_077.jp2
ec7a9bad0c3f6b87ca267e51bdc352ca
61689da064ca12f6e264a5856f57fcd033b63583
912840 F20101209_AABMQR macarthur_d_Page_062.jp2
ca8a70944ae2276cca3fdeca0f8aacde
f0c9cce083997b77943009e9c4824afb363c3468
859689 F20101209_AABMRG macarthur_d_Page_078.jp2
0e6b18f62f360fa81cc0687846054c13
5a6f8d7012c87226358683d9ac8d4ab14bae1934
1051974 F20101209_AABMQS macarthur_d_Page_063.jp2
b1450e8af3798da235a76bf2ffab44c8
4f8630ec1596216ecb75058d35ab1ea381a42fa2
1051738 F20101209_AABMRH macarthur_d_Page_079.jp2
c369dc77ff3400449d2be541565437f1
1651d7097e357a56229fa14181dc0cb5822633b7
893450 F20101209_AABMRI macarthur_d_Page_080.jp2
740cee28616e0e30ee221d91d3c11b72
4843135f2270cfc78ba104f772267fd6159fe6f0
1041222 F20101209_AABMQT macarthur_d_Page_064.jp2
3db91b6e2ccb7849c15bcb5b72518221
550ce46184cafc9be32c04d196b13c9b4701172b
81605 F20101209_AABMRJ macarthur_d_Page_081.jp2
8adf57dc762210ba1181025fd77a767d
9241f18abcd1cd2c6b7b79dc62ca10e5ccf2864d
1037860 F20101209_AABMQU macarthur_d_Page_065.jp2
b7e2a4f5442a9932d3b50cba8a14e0bd
9d0fe8aa5c4a63596a74cbeba9ae9f5d09e9e407
1051970 F20101209_AABMRK macarthur_d_Page_082.jp2
852cc46472aabb166be9fee9f47b37af
1c7f64ac76398adda14b472389a515bc0e22b4cf
61053 F20101209_AABMQV macarthur_d_Page_066.jp2
3cff7c2881205d21576abba2d90bcbdb
daddd5fce9db7b2448964d763b78a1acce0985fa
112424 F20101209_AABMRL macarthur_d_Page_083.jp2
26689ab8094b4307d7e65804340f9429
e079e97bae18be771b6aa46a6c6b019b3903dee4
69945 F20101209_AABMQW macarthur_d_Page_067.jp2
218a0876956f4c27a5e225b55636ba50
977c830ca3704f24c017f8b5f7d42a901d217a74
1051956 F20101209_AABMRM macarthur_d_Page_084.jp2
6cc6d912937f593a103080b22c10928a
6367eec372f6e687a5d597caeeb5feab67b3fed8
49426 F20101209_AABMQX macarthur_d_Page_068.jp2
3e159135e0a316267ff344467e2adfdb
899cb4bab260a3006c7d11e5409bde5b5e7a8a4f
1051930 F20101209_AABMSA macarthur_d_Page_098.jp2
a3c37c151b62603457a0dfe9a8b6e1da
66c5900b9e72af78865588f8053b05d0e6179751
1051851 F20101209_AABMRN macarthur_d_Page_085.jp2
89d1ea552db666369f9a3e37e67f0216
8e2d1f3262fc064fbcc93b258ea276bda05fb21b
103424 F20101209_AABMQY macarthur_d_Page_069.jp2
c288629fff4dbc168747da38adde1490
8a1208246453d3b528d5e1e8c3ae339c78d537cb
1051984 F20101209_AABMSB macarthur_d_Page_099.jp2
c355411256a7f2296eb36e2f4e69826c
2634b2d07d6b98e74404577a37b13399b0b75fc6
1051937 F20101209_AABMRO macarthur_d_Page_086.jp2
c64a9daf1d1ac72175aba07643010ed0
e454ea0180eb36a7252d139bb3d69f712af78adb
1051951 F20101209_AABMQZ macarthur_d_Page_070.jp2
cdaba53edabafce7cfc9a19fd7a404bb
e3a01fd424dd3790407812fbbc3c98d00ff56bb0
1051952 F20101209_AABMSC macarthur_d_Page_100.jp2
040f6a4794702bd95d38f72d5d92cd70
298b2157d5ce11253789070df06b80803f4cfab7
1051816 F20101209_AABMRP macarthur_d_Page_087.jp2
2d0691774d9e49b26bb78706e5829aed
f258671d0f5290e8afcdc2225825b833c418d835
31870 F20101209_AABMSD macarthur_d_Page_101.jp2
9fd151087ae079c058434cb58960cd3f
597700fb96520b3cfe789458b72d74685bacfe71
644918 F20101209_AABMRQ macarthur_d_Page_088.jp2
3a823cf73ce82d7bb7e4ea607ac653f1
9f62c3cefde94bc1aec98c2a0a631521af281a8a
105054 F20101209_AABMSE macarthur_d_Page_102.jp2
28d608aa25c9020da807d65b6afeaa07
5bcb47685bf1b2d28ac75960c4732587298c7099
98512 F20101209_AABMRR macarthur_d_Page_089.jp2
2ea01e1808cf796c89db4c0bd867e3f8
793ed30e1d784d9654c77c4dcf4a962c2e54ac1b
979181 F20101209_AABMSF macarthur_d_Page_103.jp2
7d507e6c7d1fefb87cd9df45d4395f6b
6f3cb62ea95739f106d348c977734e7419309a02
699595 F20101209_AABMRS macarthur_d_Page_090.jp2
a5077fe747ed0a683bce26201685a068
ad93c51a8bebeb6eceb4afdb9180196fd0222b7d
909486 F20101209_AABMSG macarthur_d_Page_104.jp2
c57d4d214cf73851b322671320eae94e
f3c87183d044cacb0123cb67e35a2f8560cfafd6
699523 F20101209_AABMRT macarthur_d_Page_091.jp2
0422f59efa89f1ed1dac28858b44bd76
ba1709cc58f41de8bb911d4c6baabeb3b9b5673e
24669 F20101209_AABMSH macarthur_d_Page_105.jp2
269cd286802f7b7cc80d4e923b852d8f
cf646a938210f9d3cadc63439dd05fd945969305
135634 F20101209_AABMSI macarthur_d_Page_106.jp2
2de56b4d4a07bfee5b15cb3b756b55d5
4a73ad92fa38170609b2c1df0b53ee023b3ca084
97422 F20101209_AABMRU macarthur_d_Page_092.jp2
ee70d11ee7ecbc84f582059550dad292
07adaafa776a0284290a387cf5b24250e70003b4
F20101209_AABMSJ macarthur_d_Page_107.jp2
b56b98ae99703bd0d4e77d95592754ff
3a2974351bd9cf2a2c334ec7d9e6a62859e496a2
116836 F20101209_AABMRV macarthur_d_Page_093.jp2
a0dfb4984fb17f8de45a3bd3a557c053
7de58df912d6f3db48298ea94a4479ba71b83a05
128209 F20101209_AABMSK macarthur_d_Page_108.jp2
78644ad5ca14d99bff80715bc1c53c74
f891579fd0492ad54f894ec38ee19eae3dec6b74
93967 F20101209_AABMRW macarthur_d_Page_094.jp2
615d6bcc45dce803d935e3f2e2395d29
221a7909cc058ec0081c1005962f21c99865543c
35120 F20101209_AABMSL macarthur_d_Page_109.jp2
58fa1a059e201019a0ed3634907d26bd
b27c5114879feefa05c291561f3680ad6fb891bf
118930 F20101209_AABMRX macarthur_d_Page_095.jp2
13018c6a5596d0aec0c2ef8ee23d92b3
9f84cf7a0665fe753eeff05d72abdac638d6e8da
F20101209_AABMTA macarthur_d_Page_014.tif
898fcb0ae74d835505d9e92c8f576fb3
4f53edbfad7966f512e2ca31995f36f327c97248
43476 F20101209_AABMSM macarthur_d_Page_110.jp2
ba23e19fa74aa98b9eb4e392d1005d90
2e5af269fa19377332720c7fb42927a1643d00e0
629026 F20101209_AABMRY macarthur_d_Page_096.jp2
8adf804aa524b4c6745b4c16dbb4b792
ffc0cf22f58039b1e76cfd0a718e80d9a892aa0f
25271604 F20101209_AABMTB macarthur_d_Page_015.tif
14b550ecdb77e93c16b197f191038ec0
425d36f204f38ba06113f1765ddf2f99a7ed8ee2
F20101209_AABMSN macarthur_d_Page_001.tif
57552c6772776b0a3978b769c752fdeb
8b39909e5fe3550955937918ff45e0d4a0879868
727918 F20101209_AABMRZ macarthur_d_Page_097.jp2
8e22c5868acabc11560e0285df83c713
6fa0427cd1ae9ca7fca156cf91ceec882374835f
F20101209_AABMTC macarthur_d_Page_016.tif
517ece6bb997a335d082bb7d85a6d220
42853497a1b9c1c48900244c0c17027fc2c37360
F20101209_AABMSO macarthur_d_Page_002.tif
a640016aa9527a1e23da3d756a2f897a
50ccff9a634a91b95f84805fd116cb670e631a75
F20101209_AABMTD macarthur_d_Page_017.tif
06c24517c743907fefc67914545c4fd3
be961754c2c83161258db3f6118d52a1e9e1a707
F20101209_AABMSP macarthur_d_Page_003.tif
5f0525ce5e63da788391ada1db1b48ac
61c40c39fb29b9f322e8da1b0571ecdd2e52c7a2
F20101209_AABMTE macarthur_d_Page_018.tif
9fcccc9e55b5af54f89b299c90444178
859fe1df5c691f1f44e4679802fa8bbf09cae893
F20101209_AABMSQ macarthur_d_Page_004.tif
8be660d164496bf838e56643b6d2b117
3a34a5732374476f2be7ad421f73c8c450a4b938
F20101209_AABMTF macarthur_d_Page_019.tif
9b99b8c6971a99ffed2e1b7953c27d1a
ae914bd73e59c9eb29178e0cef727d61252abd0e
F20101209_AABMSR macarthur_d_Page_005.tif
740c8237cc05f2b909cfb7ccbc2bdfdf
a955e68696f717482377fd65343d647dce77805a
F20101209_AABMTG macarthur_d_Page_020.tif
639d5dd32bd5100051de3039e5332a33
750730b3c638833676969396456e5903a5a57a3f
F20101209_AABMSS macarthur_d_Page_006.tif
de86ed87437d7e9dfff354df793a5348
0fdc04a62150778fb7ce0d54fc03adf0ba79681b
F20101209_AABMTH macarthur_d_Page_021.tif
f7d88b653537d18b8948a87a9d156198
c7e5fb186e6676c1c33ccc24f746632f19878b6c
F20101209_AABMST macarthur_d_Page_007.tif
51abec63896568258da64fb7a03fd879
49b329cb65931eebb63d902661ad03a556da1068
F20101209_AABMTI macarthur_d_Page_022.tif
591616dc1f533d2d65342c19815eb57a
2a15416ec8534f9ceb5bc14ba8a52fab728cb30b
F20101209_AABMSU macarthur_d_Page_008.tif
6eb861770d797e725d4db29939ab364e
fd7eca7836b6765308d2690052f20fbac9f32e74
F20101209_AABMTJ macarthur_d_Page_023.tif
38a5dfe2e5b12bacd1e168c484d9b9a5
ad41d6b7f554588a8ce97ff528a16ad29409f3ef
F20101209_AABMTK macarthur_d_Page_024.tif
dfd831e2052139c35057029c41812692
9d2adc1ab4b8ebb12a20c437ebd4e40645f48426
F20101209_AABMSV macarthur_d_Page_009.tif
9fcd837d5612c68a608ad7e2e3490863
3f2cf834964ecfcb58a4332af649d45c82a1ed4c
F20101209_AABMTL macarthur_d_Page_025.tif
9ce282d1112762175b8b745331fe0f06
a8cbedac9f61d850630bd866a509cb03d69d269d
F20101209_AABMSW macarthur_d_Page_010.tif
bad48d1320e29b450a6962eb989cbb3c
1f76bbe22696fe334eafddeb27ae11dd8679d9f2
F20101209_AABMUA macarthur_d_Page_040.tif
1e3d9cc5cfcf7b8ab0df02863fe051b5
41b185b90507b4bb92e71792bcc73226c030c9cb
F20101209_AABMTM macarthur_d_Page_026.tif
627163b1dbea151f5a0176aa600ead61
bb72a014127cfa99e433bb5ea8c8b358c285143a
F20101209_AABMSX macarthur_d_Page_011.tif
73bd40be60cd63abbd9cac341945129f
fca5f71b1551c53e58c021ad6889d9ccd4ac034f
F20101209_AABMUB macarthur_d_Page_041.tif
b9c7223c2c6a5394eb4e46a6e7db3562
1cc0616a3e5559f7229faa62ef4548badb54027b
F20101209_AABMTN macarthur_d_Page_027.tif
adfd7a099f648607f84709045bc9e28c
4e0a31375b35d1794933824388654e1635dbe002
F20101209_AABMSY macarthur_d_Page_012.tif
05cb711abdf0986ccfd60b08f234c75d
42b26c0174a9206b693a461b58e4d4776f0777a7
F20101209_AABMUC macarthur_d_Page_042.tif
97f54de2e5f2e0d2e406f4050e2ba277
75d96465d3e7ddfc6fd1b20e37a5b7ddd2ee6ebc
F20101209_AABMTO macarthur_d_Page_028.tif
4c7ae048bf029d2a5428678404460afc
4ffbf978a1dc2142f4eecc23470cb232d9ad79e5
F20101209_AABMSZ macarthur_d_Page_013.tif
45ddb5016e4edf29d398aa7f09ae29ac
449cb7c60866c40cb31f4303b31f698485a8bc0b
F20101209_AABMUD macarthur_d_Page_043.tif
9bf59fc16a47bec61cbfa34bc6e69dd2
c9930fb96d1867b01d203ea4ba7ddab910265975
F20101209_AABMTP macarthur_d_Page_029.tif
f5e5c4ab06173ba71e72aa7e33c21c2d
c6dc49407accb7b5ecfec96668a67ac4912bf5f5
F20101209_AABMUE macarthur_d_Page_044.tif
1347b948c3dd8b5542f403e069330df0
013c50c43a2e7156d88e4b215320b7896782bfc5
F20101209_AABMTQ macarthur_d_Page_030.tif
5683d4c11e194892b70056cfeb82e0e6
3ffb6850c79b1574ab730a5a0f87030153a121ac
F20101209_AABMUF macarthur_d_Page_045.tif
6e1a749dfee976e86ade82d13173cf39
e8752baedf5bde4e2701336c746ae2a4010ec8ae
F20101209_AABMTR macarthur_d_Page_031.tif
1391dbe97bd68b1c0c8221eb22489f05
5cdb83ab3ae530bd45d076ca8568dea770da3c23
45260 F20101209_AABNAA macarthur_d_Page_089.pro
8d4dc0ec20f9d3b9f599967b11a5c2a2
06ca4b96b1b1ebf191010f89adb16f5d36990871
F20101209_AABMUG macarthur_d_Page_046.tif
ba06acf7a8e753044fa52ff44de24068
bb2c89492a9f59179327620425b4410ad658d594
F20101209_AABMTS macarthur_d_Page_032.tif
d74f3ca046bf2f9eb34c74e68e800631
2a3995dce415dccaf63bc29848222734cfb1c10c
21447 F20101209_AABNAB macarthur_d_Page_090.pro
a974a50fd650c9049db1460b0776f917
a95f427214702fc94b2eebbf52a663a995983e9c
F20101209_AABMUH macarthur_d_Page_047.tif
08b9890c76aa10c3132e3245f0a43c9d
18bdcd22085999d8d9b82c2288bce3d3286bfd58
F20101209_AABMTT macarthur_d_Page_033.tif
5b50c549419a70396fab4d3417f8a5ca
da01d2149a6bb0ef9d48092423c9bea7cdd9d592
F20101209_AABMUI macarthur_d_Page_048.tif
ebb93ebc92a363c562f8fa688b57b278
ed59e8d552712578849caea914f7a2b916e072b7
F20101209_AABMTU macarthur_d_Page_034.tif
481cba9d04103b1c6fccedb834c35c78
7364ee2520b6a374da1319fd2b6ef5021d5a4b51
28759 F20101209_AABNAC macarthur_d_Page_091.pro
0df354292ab503f5bcfe8e9b7a6607b4
1711f4c79e0c7eb1ac73948939957c8b832877c1
F20101209_AABMUJ macarthur_d_Page_049.tif
05b8a54f1f434808f89ede655fa7d2dc
017c5d529c7bba5c9d00b50a8817f1dd1b6c8fc4
F20101209_AABMTV macarthur_d_Page_035.tif
babd88d5e12603765da5a2a069d3de1e
e3433e6a713cad4771cfba46a23a85f31e70b614
41306 F20101209_AABNAD macarthur_d_Page_092.pro
ad880b06a06065fb477fa5960d9b52a3
07a500f105c072e2a101a80f2a9413a7734ed08f
F20101209_AABMUK macarthur_d_Page_050.tif
5cc314b938e3b3fea9828e4a54237552
db53986b72212362d86cb1db01dd071eed4185e4
55012 F20101209_AABNAE macarthur_d_Page_093.pro
145c4a9428def7feabbb13ca00a3393b
f5feab21a95b4d36e6b957136f5085e8a46cea34
F20101209_AABMUL macarthur_d_Page_051.tif
712d1cfd2c1a79d74cf1854b7cf40c66
daa8c7af1f962b2ba7fc102f17b829d297f9cd80
F20101209_AABMTW macarthur_d_Page_036.tif
54a2ef950c571d83b5bb7b25b2d74247
9ed80219ff6391427096c4e1be18cb45b9b06193
42296 F20101209_AABNAF macarthur_d_Page_094.pro
ff2c48539a40b2d8469d52482ec09463
be7801b674fa902e22acc88ad75835b4f23c4d6b
F20101209_AABMUM macarthur_d_Page_052.tif
64274217f8612039aed174304914fc14
45b6a4fc2e6252f5a014cbc9f8551129fb022c44
F20101209_AABMTX macarthur_d_Page_037.tif
8a9dcbda7975f8ef6f43427d3016f071
f51a960a59bf8b43a257e8748689eb54b4aafe85
55075 F20101209_AABNAG macarthur_d_Page_095.pro
dfb16aa532c9dfccb0d694ac2cc18bb5
44c042d228ac2c198cd5a178c562b87835f10288
F20101209_AABMVA macarthur_d_Page_066.tif
396d50a7bd934f5f234d94c10b6df8dc
0075fd2c4d2e3729477f99e745bfcde8f1e5c522
F20101209_AABMUN macarthur_d_Page_053.tif
5c9ce12191b636707b610e8cac97622c
9509f3e81245a3942dbe5da8305322f5967b6b01
F20101209_AABMTY macarthur_d_Page_038.tif
7ac381db4f1d6bd96608a2097718c10d
2482f6845ba4edcb877036a1877b38a962620e0d
23363 F20101209_AABNAH macarthur_d_Page_096.pro
b36cabe1c6e679b18b60142201d14bee
61344f599f78dfc716122bdf8cbc164229e20e08
F20101209_AABMVB macarthur_d_Page_067.tif
1c98fb58f1d86f6bcb8a87023f657fb5
3900241ea6e362f9729131ca86b87add9a19ca41
F20101209_AABMUO macarthur_d_Page_054.tif
b87e7f707f2393eba517688f0ee7a7e4
42ede0d00f09066a1f231c0c9d79eb2fd1f9065f
F20101209_AABMTZ macarthur_d_Page_039.tif
aefac93c3444f799cb27f2f1c6b2fdc8
ab91670061a666322c068ffe847abb8d02eba73f
25883 F20101209_AABNAI macarthur_d_Page_097.pro
1ed4b8abf9098d2aed06eddce9c88a7a
d963e98ccce82105c386117273b53b4dc862f7a3
F20101209_AABMVC macarthur_d_Page_068.tif
1a820cbf80e107c513dbb1b425c6e752
f5270e3862f2179db43b7e70ac1693eef94fdd12
F20101209_AABMUP macarthur_d_Page_055.tif
d543914cccc5353d759b06707a43899f
4417d6ead4c29968a61a972980e93133978df8b0
18757 F20101209_AABNAJ macarthur_d_Page_098.pro
9287df06c992536cc479aaa59d74e709
294adabf34e14e862aee7c246c3b5fee6ca9d22c
F20101209_AABMVD macarthur_d_Page_069.tif
f0acae8d846781c44eb2eff3faf5d75a
ae7dfb25e385b320afc1171fdeacb379e46a97c3
F20101209_AABMUQ macarthur_d_Page_056.tif
9c7cafb3992f500df51f9217d0842262
6ac08139e7dd8a5a9149757596ce92fa85928810
34225 F20101209_AABNAK macarthur_d_Page_099.pro
62eb691edb9ee686110524787bd9765a
9fd13a7379094cbb68102c79ced95b25bc99f9b3
F20101209_AABMVE macarthur_d_Page_070.tif
30426622b6cdf45a8fa4a42bdbaaadd3
9cc7c9bd41e980a4dcac794ee53ae341d683f8d8
F20101209_AABMUR macarthur_d_Page_057.tif
f4a8bce923bc13c3435b4fda3e63544c
70fba4937e0a3b0756815927c94a7e9eb1a2ba96
2466 F20101209_AABNBA macarthur_d_Page_006.txt
9d90ee2dd42251465eb3903df3a00260
fd109dc176458b14a6aa8196eaa76f4157ca3b01
15674 F20101209_AABNAL macarthur_d_Page_100.pro
c6c987600ccb4b25f46afa095b9cb561
4c209bbde29c96f3d871be2c10250a5eb4b612e0
F20101209_AABMVF macarthur_d_Page_071.tif
80c201b43f3509b8ccc727ddd4f2842b
05ec5709a0a291e057011a2226dd9ffbede5af66
F20101209_AABMUS macarthur_d_Page_058.tif
0cab315cd277d604da3c8e84f1edbbfb
0c6570df307c1aa00fa377fc9bfab113f5aa2dd4
1004 F20101209_AABNBB macarthur_d_Page_007.txt
bfce6ae499a5b60eaedf02bc16db44ec
a4285a341da5db7efba3dd748faceec99f12e470
13439 F20101209_AABNAM macarthur_d_Page_101.pro
0d0bb9e49f0d8f8d8a7f77c2b563021b
3515f190d376197679fa6198319538424adff562
F20101209_AABMVG macarthur_d_Page_072.tif
c87c4286cebd45d0d430078f9e0b40ed
cb09b5384daa6a39754d1afcdcb951b26f8fc016
F20101209_AABMUT macarthur_d_Page_059.tif
5264b58c6cfdac61f810a4b070492365
35eb13c59921b1094047ad7bfa4d930a5e6fe2f5
2457 F20101209_AABNBC macarthur_d_Page_008.txt
a97f43df29017ec941c9d42d6f9d67dc
0b391e7d8bfa27c9231afd0340c7442948318dd4
48479 F20101209_AABNAN macarthur_d_Page_102.pro
f43d4d67dfe0d442e11b1daa93ca270d
03804905965caf3fe66b7a72057a573767c42547
F20101209_AABMVH macarthur_d_Page_073.tif
e41c3c8f143b10bafe25319f20563837
22ccb348fd92fa6b61a3fcce3d9db7638250cb54
F20101209_AABMUU macarthur_d_Page_060.tif
d7ea743970afb083b18c34812c8f500c
675df19c4405396faffc59347a8cb893aa1f3aa6
29165 F20101209_AABNAO macarthur_d_Page_103.pro
365252e31b17c75aefc04cf7bd83f17a
56434d127c0f20e446daa75d754fd4ea09c77b07
F20101209_AABMVI macarthur_d_Page_074.tif
ee5c304edbf9f2ea1c5efdcd48455531
88a6e4bdf4455c91e6d617526dd627b7b10fe828
F20101209_AABMUV macarthur_d_Page_061.tif
e0ff2c61448a318a2abf5565587e0941
f6d3eb507213e13ac28a68d7d3bc798b60e2edd0
2570 F20101209_AABNBD macarthur_d_Page_009.txt
cc74332e34f879e04dd6fe4d83a45fe9
e4b61ba868fecaf97b6c082d15f71ad81f5dc6dc
F20101209_AABNAP macarthur_d_Page_104.pro
d431a91904a887d16815edac20dcc1a2
eac7d61fb7844cbcd9f70715436a011f6d6abeed
F20101209_AABMVJ macarthur_d_Page_075.tif
d5b77f1a36ce6dcd210b9622bf5c1423
b7de5b9fe69fdcd35add3d95708a83f31a16e8fb
F20101209_AABMUW macarthur_d_Page_062.tif
974a370c8abbc57f03b2086d1a60e0ab
2bc66ae4e94419ca0033b39fcd41c134432bfb07
392 F20101209_AABNBE macarthur_d_Page_010.txt
24a30558a0b3d54ebaa5b7bbaaafb894
70e3f92ac0cd5480dd84d8241e2c47080fe52104
9769 F20101209_AABNAQ macarthur_d_Page_105.pro
451ba7a49491b29df433bade007f83ed
383fb0e062a5eb13e69fdf96f7c3e63e594d9e71
F20101209_AABMVK macarthur_d_Page_076.tif
faa83424d8bfeedecc68f69484d0f9a8
81328ae89a6bc90af48af6ff65093abb8adb9aec
2075 F20101209_AABNBF macarthur_d_Page_011.txt
f96b6082343ea2203fbe52791f797364
7910349cdfdff0ff48d4b6f29f77c17e4ad12eda
64404 F20101209_AABNAR macarthur_d_Page_106.pro
4bda86b11eaf25ef6d7e9205c3c34a77
afd55868e9838bafe5c22d22704f81d236d80cff
F20101209_AABMVL macarthur_d_Page_077.tif
bc249e49a22a37af2c8f3227e4f6af6a
d15b4c1431f8eeea1e9f90598208212300e15d81
F20101209_AABMUX macarthur_d_Page_063.tif
46ba368e97d80709348a0ec6fe890a8e
9b3e08ab9037f620ab9fd8ed7bbb4fa29244a6d0
427 F20101209_AABNBG macarthur_d_Page_012.txt
527c1057f40be079023306a3debe2c7b
9e705301752497e147fa77f3790aca5f0c937da7
F20101209_AABMWA macarthur_d_Page_092.tif
3be7326b3df3dd0c1528161535c22aae
210b0f8598f5ea378d61949da50f7d7b89d6c88d
61967 F20101209_AABNAS macarthur_d_Page_107.pro
0244faa20e73d6c8b8af3fab51932471
e3f9676849d0bc8830ddfccabbf0f31a613c5e73
F20101209_AABMVM macarthur_d_Page_078.tif
162a16132162e588a3ff4da6894cbc87
48310663b54b1eef34064dbd7f5e29a770565e40
1828 F20101209_AABNBH macarthur_d_Page_013.txt
56a9868eb4938bd0bdf9375a753ae142
898ac9a9ff6717340877f9c57b6792e1897bdfda
F20101209_AABMWB macarthur_d_Page_093.tif
aff2d9067b3d7d22c13fbeafd8eb1a0f
fb45de6c2a105ab96bfe31fd38041c82af157fe1
58334 F20101209_AABNAT macarthur_d_Page_108.pro
6622e8be524e8031033acb562e13242e
a591ea38301a712a0c4d165d68f7685eccf5b3d3
F20101209_AABMVN macarthur_d_Page_079.tif
65f0d2953e79f9945ac448e93148950b
242ca90ce820791da1423a59acce95258d826a8a
F20101209_AABMUY macarthur_d_Page_064.tif
68ae71adfab52b2302cacc3f9e61347e
d0481d1d08df62a9076618c2c1f830c6390e3b34
1976 F20101209_AABNBI macarthur_d_Page_014.txt
b500aa183863c14a65078093d9bce127
d9ebef0eb93a128fa76b23f573eefecd0ffcb7f0
F20101209_AABMWC macarthur_d_Page_094.tif
a4d602557c5d817c2e2d92a4d9f9d4ba
686716c6a9ce0a56ea06c83436ffce51dd4399dc
14561 F20101209_AABNAU macarthur_d_Page_109.pro
856a5035f91c830c3b35d618e86fd87c
1b309217f94d04b7ab722653d5ca922550a80c5d
F20101209_AABMVO macarthur_d_Page_080.tif
efbab1bc4334e8a8edb92a1ff958c49e
85932a5792ee7cf70093fea652552033036f760b
F20101209_AABMUZ macarthur_d_Page_065.tif
530531e9a578c5f7ee7565a13a851f8b
ae64cf57fac63dff81e539c4a86d88fe343ddb76
1503 F20101209_AABNBJ macarthur_d_Page_015.txt
a250e359bc243f970c5f4bbdaa42218a
986dc6af675c25c5926db296d475a5176127384b
F20101209_AABMWD macarthur_d_Page_095.tif
060e2ec8f9fcf80fd828634ec43c7bbb
e03c03e75b2e26641d00b33ce5e097438ce78de5
17827 F20101209_AABNAV macarthur_d_Page_110.pro
ac852fc6f1b9e9b1730e3a9eabe0133b
7ded3b2b24d89b14f8c7d9ba6392db2d5ac6b7d1
F20101209_AABMVP macarthur_d_Page_081.tif
8927daded92cf162d958d17c4c1cc986
36e0339651423f5c635d366d554eec61701d8d16
1758 F20101209_AABNBK macarthur_d_Page_016.txt
46921166c2f03c7d53acdfd49efb5b6b
251bfd862657f712a707ce38724fdfdd957ff584
F20101209_AABMWE macarthur_d_Page_096.tif
975bc551ca3d49c3b7c02102110eeb85
1c0a2ad8d8053eaef68690dcecff3b20a15a73f7
496 F20101209_AABNAW macarthur_d_Page_001.txt
a849100cff90f6a20c6b6d90eaf99764
41cbc4b93e72bee78ced69db85f9994726ada279
F20101209_AABMVQ macarthur_d_Page_082.tif
8584f09026cc9a9851e5990df2de98dc
a2e2b4e7168fe69512c487c3f60a9d651fa0ac6d
1779 F20101209_AABNBL macarthur_d_Page_017.txt
09e7c145384cbe969e47599b8c2a9047
7c32baa9ad8e9bdb26426f1deeae9dbdbe971dbb
F20101209_AABMWF macarthur_d_Page_097.tif
3bf4e8989291a4c1717a91747b68dcc2
0ff3875154691fd9e517be69c18f175348747c7c
99 F20101209_AABNAX macarthur_d_Page_002.txt
c087875ff4d9d21604a7a9bac16f6397
59aad4b593646ca2ca25ca9fd5559227ecf4c0ba
F20101209_AABMVR macarthur_d_Page_083.tif
480624cd61471cc1995b87258fce29da
2191d97e0ecf64c68769b27ffb2391d311574934
1401 F20101209_AABNCA macarthur_d_Page_033.txt
524d6537045d5e0da82bc0021f3c1bbf
5f0cc5845037e6b0552d26ba6720243c9c839534
1463 F20101209_AABNBM macarthur_d_Page_018.txt
b786b9d2ac33b450053ece831ac193c8
58bb5adec8034354c79ceb028f12f48a0a2e6bf7
F20101209_AABMWG macarthur_d_Page_098.tif
d578ee0f0d99c68a2921ffcb4bd480dd
f52bc000b6ed7990ccd904c91ae78dad8949cfe7
224 F20101209_AABNAY macarthur_d_Page_003.txt
05e760a69400b27277927328c17087b1
96633961353f11ca70d55ec08346c573957d9257
F20101209_AABMVS macarthur_d_Page_084.tif
b1d48954476417d143cf9f5eb18430e0
3b03081dd73cd33e7bfaa30e04d9a475daf70ab2
1365 F20101209_AABNCB macarthur_d_Page_034.txt
f92ada56256cc84a2db993f627b107a9
3592112ae153c42f513043baa411d4596086ecfb
1162 F20101209_AABNBN macarthur_d_Page_019.txt
0a684c3197b59343d3fd231e2445df35
fb0b684eae89c263d1d27dd5caac3cb48cf42fba
F20101209_AABMWH macarthur_d_Page_099.tif
b8d0426aea48927d22018704ad8dfbaa
ead0762d2367be1147a399f3087de2d296e53962
3150 F20101209_AABNAZ macarthur_d_Page_005.txt
c3d4246ada7d06721c35cc9f5bf8075a
b1a210d92571092551572d3b77709742e687b2fc
F20101209_AABMVT macarthur_d_Page_085.tif
dc0b987b1c75bb8e6172a20fd3dd6ce1
5882b75a2b9c260640eabad568da500a6ce9944a
1899 F20101209_AABNCC macarthur_d_Page_035.txt
ad96b4901f188822f4495d3ff7c4b59e
f3f63afc12cc2394da7d47f986cabb0dc2c5b8d1
1736 F20101209_AABNBO macarthur_d_Page_020.txt
605234dda0dcb3bf9f970c466a6e4de1
f518bf4baa25cc2149b557c314660c4fe3900f41
F20101209_AABMWI macarthur_d_Page_100.tif
99d861ba531d744ad32cc9d721ec009c
1d2e44236cfb977bd64f811c01cbcb83dcb8f586
F20101209_AABMVU macarthur_d_Page_086.tif
809ec9b205cd8b8bedce097ab3d3e3ad
a8af9908e01f0fd3cd1276706e3cd956771a7831
1663 F20101209_AABNCD macarthur_d_Page_036.txt
b15392f1cf96a3ea561241951e098c5e
d83d15e2b322e437388df31a75fe4833c52d8d02
2102 F20101209_AABNBP macarthur_d_Page_021.txt
66ae7123d805ecd290f00dd3eebc0aed
2d47288f76cbda7d6461f2b125bcefdbf0e170f0
F20101209_AABMWJ macarthur_d_Page_101.tif
694260c2ea1ea506e9898bbd3a99030f
7fa353e1e1efde59dd59e649ad9605582a5b6887
F20101209_AABMVV macarthur_d_Page_087.tif
39c45c112a30c7b333eb07e430211b1b
0bc68566925f3b779afda5f085996322a2ad43b1
2216 F20101209_AABNBQ macarthur_d_Page_022.txt
dbe7bf920008002d164f878f65d30cfc
3a40cec4cf59293e1c8edf4e8e4bc0cbc0276afd
F20101209_AABMWK macarthur_d_Page_103.tif
006737a1c912afa187be7a991747be59
f20b305ae48b25e361345819a25d08bdd7b70015
F20101209_AABMVW macarthur_d_Page_088.tif
6446561a2a1a8200849f014afc208123
9c8d0a80ff43727520c1730d4f936f828f461a88
1113 F20101209_AABNCE macarthur_d_Page_037.txt
b2988da6eb20657a20553db3a356af0b
6c4b272c77ea238a9bd00ef444e3f285f4a0991c
348 F20101209_AABNBR macarthur_d_Page_023.txt
a315d752498aef2095f3e53cb5966861
c612a92adec02b4a62c940f1337918d509999516
F20101209_AABMWL macarthur_d_Page_104.tif
c2a6898d93a20f3feeb09985723b2875
3dfd057a07a0fa809dc3f3d82d149dba2f561871
F20101209_AABMVX macarthur_d_Page_089.tif
efd688ad943c021297b987cff8cde842
d1648377eb0dd47cf1007ddc881fb5be660072dc
520 F20101209_AABNCF macarthur_d_Page_038.txt
80cff59d6753f975c1d6f16b921ee699
2a2b863d0831b38b77f6b5cc5cbadae05c277358
1570 F20101209_AABNBS macarthur_d_Page_024.txt
19e29b68c3bfa0f3c909cf611fb310e0
911f623f439e406a79c035f82452acfdcbc83e19
F20101209_AABMWM macarthur_d_Page_105.tif
0a75357724dc908684b379013aba08ee
bef78ec0ba85581f895c99e5dd36b0a246032849
1610 F20101209_AABNCG macarthur_d_Page_039.txt
8b1b5e24e605f1c57cbbf33505d17664
b53f2fcdd870774d57b4a1cec979cb2355cae1e7
63782 F20101209_AABMXA macarthur_d_Page_009.pro
e0012f94d8c32f632d04de1706aa16bb
e86f464b9561e2278bae71418599e26c30582eee
1058 F20101209_AABNBT macarthur_d_Page_025.txt
406b8ee848acb35ee507560bfe416886
3a326a8d57892e8d81104c6bf11b5358108d28c2
F20101209_AABMWN macarthur_d_Page_106.tif
d26e7b00294db4a3fe0a5d7f89a1565a
e5f520ef9c14e6d5cce0ebbdd6c167812cd058cf
F20101209_AABMVY macarthur_d_Page_090.tif
722fe446771e8b27e49d443286a08092
a36423a6dd607b3e9f56ff5fbf7d51e9f707bbaa
1445 F20101209_AABNCH macarthur_d_Page_040.txt
5a623b41c7b66689c90a9b59cd8851e2
25595abe01212d1c1a0ce3d741c477783b6a15ac
9292 F20101209_AABMXB macarthur_d_Page_010.pro
dd3a506074fc788cb7466acdd6083632
9b895b0e18f373699dbeb0cae65652d68f56d428
1638 F20101209_AABNBU macarthur_d_Page_026.txt
bc3625d551545b48aadf8361d33f0036
c988b0130cc177e136b5a994468aa6398bba901b
F20101209_AABMWO macarthur_d_Page_107.tif
20f47d4631c5d5585294f317f0edcec7
7f7de4ec28407aeef51bee680089aadd02278ab5
F20101209_AABMVZ macarthur_d_Page_091.tif
2a35c99ddf445aebeaabf01aedf8a35e
efe4d8ec320501719e56f681ccaa048b0466373d
1235 F20101209_AABNCI macarthur_d_Page_041.txt
97c7e1e866a5d8479fb1a10394183c5e
ada2f108ee7558d93fa4c6924cfdc3e68a4e214b
48260 F20101209_AABMXC macarthur_d_Page_011.pro
7ca184dd6a87e1ec769a96b06aad6b85
7c1e5b808b92a35a49a2e1943de6d1bdf85ea91d
676 F20101209_AABNBV macarthur_d_Page_027.txt
e120045e8b1d59b168e82fe19001e671
84635f56c7dd7719ac3b201cbc4d941fd0142aeb
F20101209_AABMWP macarthur_d_Page_108.tif
af962358ae8e280e4578abe43f4b408a
9fcee1e122b9373944c74e8579bd2cd0843ca6e4
1231 F20101209_AABNCJ macarthur_d_Page_042.txt
6f5cf82241c1459c7de968a0ca9c1b52
b1c28ecf2aaec05f786aad8e9bc1449a320b83e3
10694 F20101209_AABMXD macarthur_d_Page_012.pro
c47a733e288816d35bc117308f0a18ba
a25b166c13a683cd0ee21b4a6cd9562f9651d93b
F20101209_AABMWQ macarthur_d_Page_109.tif
35955c9290ae7b514549cae7e57c03b9
0bf43cd7de71cdb9ba872fa9a4f83beb73dcebff
1411 F20101209_AABNCK macarthur_d_Page_043.txt
3e0ca149de288d6a7489851d01aacbb5
e3e5604be5440af67ee96e60f6c319fa36b11944
43961 F20101209_AABMXE macarthur_d_Page_013.pro
09235585e634211a88dd24f7facc4534
94584a362a71149a87bd28422f4aa67d4b6feac3
1254 F20101209_AABNBW macarthur_d_Page_028.txt
500d06e384166cc93c705f96b1e087ec
1a16fd0f1037ca9fb594c08d7a31817346d5c2be
F20101209_AABMWR macarthur_d_Page_110.tif
4a21c630ca7116e109b1c4cb6ef2dc71
e4696a1b737d2a031e4c1009bf19b8c3a0865924
1996 F20101209_AABNDA macarthur_d_Page_060.txt
78c5f7adea7b446b81b5d09c3c2beff5
bccc06129a0e40eb1303aad22067df475439a40f
836 F20101209_AABNCL macarthur_d_Page_044.txt
dbb9a94feb9dadcab24c834e510eebb0
eb0e907b0a658f5485306a13e06c0121f4843767
47549 F20101209_AABMXF macarthur_d_Page_014.pro
37605c36a9f356d8af2512a8aaeefd3b
36ecee9bb3223064250b0c3394d427a1966d68f6
1293 F20101209_AABNBX macarthur_d_Page_029.txt
d504f2b9413ef0b35b24debefd2079fd
57234b69e980bca2be0cfd96ffeb47e723c1c7e1



TRACKING AND STATE ESTIMATION OF AN UNMANNED GROUND VEHICLE SYSTEM USING AN UNMANNED AIR VEHICLE SYSTEM

By

DONALD KAWIKA MACARTHUR

A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2007

2007 Donald K. MacArthur

I proudly dedicate my life and this work to my wonderful wife Erica. Many trials, we both have suffered through this process.

ACKNOWLEDGMENTS

I would like to thank my father Donald Sr., my mother Janey, and my brother Matthew for their support through my many years of schooling.

TABLE OF CONTENTS

ACKNOWLEDGMENTS
LIST OF TABLES
LIST OF FIGURES
ABSTRACT

CHAPTER
1 INTRODUCTION
2 BACKGROUND
    Position and Orientation Measurement Sensors
        Global Positioning Systems
        Inertial Measurement Units
        Magnetometers
        Accelerometer
        Rate Gyro
    Unmanned Rotorcraft Modeling
    Unmanned Rotorcraft Control
3 EXPERIMENTAL TESTING PLATFORMS
    Electronics and Sensor Payloads
        First Helicopter Electronics and Sensor Payload
        Second Helicopter Electronics and Sensor Payload
        Third Helicopter Electronics and Sensor Payload
        Micro Air Vehicle Embedded State Estimator and Control Payload
    Testing Aircraft
        UF Micro Air Vehicles
        ECO 8
        Miniature Aircraft Gas Xcell
        Bergen Industrial Twin
        Yamaha RMAX
4 GEO-POSITIONING OF STATIC OBJECTS USING MONOCULAR CAMERA TECHNIQUES
    Simplified Camera Model and Transformation
        Simple Camera Model
        Coordinate Transformation
    Improved Techniques for Geo-Positioning of Static Objects
    Camera Calibration
    Geo-Positioning Sensitivity Analysis
5 UNMANNED ROTORCRAFT MODELING
6 STATE ESTIMATION USING ONBOARD SENSORS
    Attitude Estimation Using Accelerometer Measurements
    Heading Estimation Using Magnetometer Measurements
    UGV State Estimation
7 RESULTS
    Geo-Positioning Sensitivity Analysis
    Comparison of Empirical Versus Simulated Geo-Positioning Errors
    Applied Work
        Unexploded Ordnance (UXO) Detection and Geo-Positioning Using a UAV
            Experimentation VTOL aircraft
            Sensor payload
            Maximum likelihood UXO detection algorithm
            Spatial statistics UXO detection algorithm
        Collaborative UAV/UGV Control
            Waypoint surveying
            Local map
        Citrus Yield Estimation
            Materials and methods
            Results
            Discussion
8 CONCLUSIONS

LIST OF REFERENCES
BIOGRAPHICAL SKETCH

LIST OF TABLES

7-1 Parameter standard deviations for the horizontal and vertical position
7-2 Parameter standard deviations for the roll, pitch, and yaw angles
7-3 Normalized pixel coordinate standard deviations used during sensitivity analysis
7-4 Parameter standard deviations used during sensitivity analysis
7-5 Comparison of Monte Carlo Method results
7-6 Production of Oranges (1000s metric tons) (based on NASS, 2006)
7-7 Production of Grapefruit (1000s metric tons) (based on NASS, 2006)
7-8 Irrigation Treatments
7-9 Results from Image Processing and Individual Tree Harvesting

LIST OF FIGURES

2-1 Commercially available GPS units
2-2 Commercially available GPS antennas
2-3 Commercially available IMU systems
2-4 MicroMag3 magnetometer sensor from PNI Corp
2-5 HMC1053 tri-axial analog magnetometer from Honeywell
2-6 ADXL330 tri-axial SMT accelerometer from Analog Devices Inc.
2-7 ADXRS150 rate gyro from Analog Devices Inc.
4-1 Image coordinates to projection angle calculation
4-2 Diagram of coordinate transformation
4-3 Normalized focal and projective planes
4-4 Relation between a point in the camera and global reference frames
4-5 Calibration checkerboard pattern
4-6 Calibration images
4-7 Calibration images
5-1 Top view of the body fixed coordinate system
5-2 Side view of the body fixed coordinate system
5-3 Main rotor blade angle
5-4 Main rotor thrust vector
6-1 Fast Fourier Transform of raw accelerometer data
6-2 Fast Fourier Transform of raw accelerometer data after low-pass filter
6-3 Roll and Pitch measurement prior to applying low-pass filter
6-4 Roll and Pitch measurement after applying low-pass filter
6-5 Magnetic heading estimate
7-1 Roll and Pitch measurements used for defining error distribution
7-2 Heading measurements used for defining error distribution
7-3 Image of triangular placard used for geo-positioning experiments
7-4 Results of x and y pixel error calculations
7-5 Error Variance Histograms for the respective parameter errors
7-6 Experimental and simulation geo-position results
7-7 BLU97 Submunition
7-8 Miniature Aircraft Gas Xcell Helicopter
7-9 Yamaha RMAX Unmanned Helicopter
7-10 Sensor Payload System Schematic
7-11 Segmentation software
7-12 Pattern Recognition Process
7-13 Raw RGB and Saturation Images of UXO
7-14 Segmented Image
7-15 Raw Image with Highlighted UXO
7-16 TailGator and HeliGator Platforms
7-17 Aerial photograph of all simulated UXO
7-18 Local map generated with Novatel differential GPS
7-19 A comparison of the UGV's path to the differential waypoints
7-20 UAV waypoints vs. UGV path
7-21 Individual Tree Yields as Affected by Irrigation Depletion Treatments
7-22 Individual Tree Yield as a Function of Orange Pixels in Image
7-23 Individual Tree Yield as a Function of Orange Pixels with Nonirrigated Removed
7-24 Image of Tree 2C Before and After Image Processing
7-25 Image of Tree 2F Before and After Image Processing
7-26 Image of Tree 6D Before and After Image Processing
7-27 Ground Images of Tree 6D and Tree 2E
8-1 Simulated error calculation versus elevation
8-2 Geo-Position error versus elevation

Abstract of Dissertation Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy

TRACKING AND STATE ESTIMATION OF AN UNMANNED GROUND VEHICLE SYSTEM USING AN UNMANNED AERIAL VEHICLE SYSTEM

By

Donald Kawika MacArthur

May 2007

Chair: Carl Crane
Major: Mechanical Engineering

Unmanned Air Vehicles (UAVs) have several advantages and disadvantages compared with Unmanned Ground Vehicles (UGVs). The two systems have different mobility and perception abilities: UAVs have extended perception, tracking, and mobility capabilities, while UGVs have more intimate mobility and manipulation capabilities. This research investigates the collaboration of UAV and UGV systems and applies the derived theory to a heterogeneous unmanned multiple-vehicle system. It also demonstrates the use of UAV perception and tracking abilities to extend the capabilities of a multiple ground vehicle system.

This research is unique in that it presents a comprehensive system description and analysis from the sensor and hardware level to the system dynamics. The work also couples the dynamics and kinematics of two agents to form a robust state estimate using completely passive sensor technology. A general sensitivity analysis of the geo-positioning algorithm was performed; this analysis derives the sensitivity equations for determining the passive positioning error of the target UGV. The research therefore provides a framework for analyzing passive target positioning and the error contribution of each parameter used in the positioning algorithms. This framework benefits the research and industrial community by providing a method of quantifying positioning error due to sensor noise. Finally, this research presents a framework by which a given UAV payload configuration can be evaluated using an empirically derived sensor noise model. Using these data, the interaction between sensor noise and positioning error can be compared, allowing the researcher to focus attention on the sensors that have the greatest effect on position error and to quantify the expected positioning error.

CHAPTER 1
INTRODUCTION

The Center for Intelligent Machines and Robotics at the University of Florida has been performing autonomous ground vehicle research for over 10 years. In that time, research has been conducted in the areas of sensor fusion, precision navigation, precision positioning systems, and obstacle avoidance. Researchers have used small unmanned helicopters for remote sensing purposes for various applications. Recently, experimentation with unmanned aerial vehicles has been conducted in collaboration with the Tyndall Air Force Research Laboratory at Tyndall AFB, Florida.

Recently, unmanned aerial vehicles (UAVs) have been used more extensively for military and commercial operations. The improved perception abilities of UAVs compared with unmanned ground vehicles (UGVs) make them more attractive for surveying and reconnaissance applications. A combined UAV/UGV multiple vehicle system can provide aerial imagery, perception, and target tracking along with ground target manipulation and inspection capabilities. This research investigates collaborative UAV/UGV systems and also demonstrates the application of a UAV/UGV system for various task-based operations.

The Air Force Research Laboratory at Tyndall Air Force Base has worked toward improving EOD and range clearance operations by using unmanned ground vehicle systems. This research incorporates the abilities of UAV/UGV systems to support these operations. The research vision for the range clearance operations is to develop an autonomous multi-vehicle system that can perform surveying, ordnance detection/geo-positioning, and disposal operations with minimal user supervision and effort.

PAGE 14

14 CHAPTER 2 BACKGROUND Researchers have used small unmanned heli copters for remote sensing purposes for various applications [1,2,3]. These applications range from agricultura l crop yield estimation, pesticide and fertilizer app lication, explosive reconnaissanc e and detection, and aerial photography and mapping. This research effort will strive to estimate the states of a UGV system using monocular camera techniques and the extrinsic parameters of the camera sensor. The extrinsic parameters can be reduced to the transformation from the cam era coordinate system to the global coordinate system. Position and Orientation Measurement Sensors Global Positioning Systems Global Position Systems (GPS) are widely be coming the position system of choice for autonomous vehicle navigation. This technology allo ws for an agent to determine its location using broadcasted signals from satellites overhead. The Navigation Signal Timing and Ranging Global Positioning System (NAVST AR GPS) was established in 1978 and is maintained by the United States Department of Defense to provide a positioning service for use by the U.S. military and is utilized by the public as a public good. Sinc e its creation, the service has been used for commercial purposes such as na utical, aeronautical, and gr ound based navigation, and land surveying. The current U.S. based GPS satellite constellation system consists of over 24 satellites. The number of satellites in operation for this system can vary du e to satellites being taken in and out of service. Ot her countries are leading efforts to develop alternative satellite systems for their own GPS systems. A simila r GPS system is the GLONASS constructed by Russia. The GALILEO GPS system is being developed by a European consortium. This system

is to be maintained by the Europeans and will provide capabilities similar to those of the NAVSTAR and GLONASS systems.

Each satellite maintains its own specific orbit and circumnavigates the Earth once every 12 hours. The orbit of each satellite is timed and coordinated so that five to eight satellites are above the horizon at any location on the surface of the Earth at any time. A GPS receiver calculates position by first receiving the microwave RF signals broadcast by each visible satellite. The signals broadcast by the satellites are complex high-frequency signals with encoded binary information. The encoded binary data contain a large amount of information, but mainly the time that the data were sent and the location of the satellite in orbit. The GPS receiver processes this information to solve for its position and the current time. GPS receivers typically provide position solutions at 1 Hz, but receivers can be purchased that output position solutions at up to 20 Hz. The accuracy of a commercial GPS system without any augmentation is approximately 15 meters. Several types of commercially available GPS units are shown in Figure 2-1. Some units are equipped with antennas and some are not; the Garmin GPS unit in Figure 2-1 contains the antenna and receiver, whereas the other two units are simply receivers. Several types of antennas are shown in Figure 2-2.

Figure 2-1. Commercially available GPS units

Figure 2-2. Commercially available GPS antennas

Differential GPS is an alternative method by which GPS signals from multiple receivers can be used to obtain higher-accuracy position solutions. Differential GPS operates by placing a specialized GPS receiver in a known location and measuring the errors in the position solution and the associated satellite data. The information is then broadcast in the form of correction data so that other GPS receivers in the area can calculate a more accurate position solution. This approach is based on the fact that there are inherent delays as the satellite signals are transmitted through the atmosphere, and localized atmospheric conditions cause the satellite signals within that area to have the same delays. By calculating and broadcasting the correction values for each visible satellite, a differential GPS system can attain accuracy from 1 mm to 1 cm [4].

Since 2002, a new type of GPS correction system has been deployed so that a land-based correction signal is not required to improve position solutions. Satellite-based augmentation systems (SBAS) transmit localized correction signals from orbiting satellites [5]. The SBAS implementation for North America is the Wide Area Augmentation System (WAAS). This system has been used in this research, and position solutions with errors of less than three meters have been observed.

In 2005, the first in a series of new satellites was introduced into the NAVSTAR GPS system. These satellites provide a new GPS signal referred to as L2C. This enhancement is intended to improve the accuracy and reliability of the NAVSTAR GPS system for military and public use.

Inertial Measurement Units

Inertial Measurement Unit (IMU) systems are used extensively in vehicles where accurate orientation measurements are required. Typical IMU systems contain accelerometers and angular rate gyroscopes. These sensors allow the rigid body motion of the IMU to be measured and state estimates to be made. These systems can vary greatly in cost and performance.

When coupled with a GPS system, the position and orientation of the system can be accurately estimated. A coupled IMU/GPS combines the position and velocity measurements based on satellite RF signals with inertial motion measurements. These systems complement each other: the GPS is characterized by low-frequency global position measurements, while the IMU provides higher-frequency relative position and orientation measurements. Some commercially available IMU systems are shown in Figure 2-3.

Figure 2-3. Commercially available IMU systems

Other sensors, such as fluidic tilt sensors, imaging sensors, light sensors, and thermal sensors, also allow for orientation measurements. Each of these sensors has different advantages and disadvantages for implementation. Fluidic tilt sensors provide high-frequency noise rejection and decent attitude estimation for low-dynamic vehicles; in high-g turns and extreme dynamics these sensors fail to provide usable data. Imaging sensors have the advantage of not being affected by vehicle dynamics; however, advanced image processing algorithms can require significant computational overhead, and these sensors are highly affected by lighting conditions. Thermopile attitude sensors have been used for attitude estimation and are not affected by vehicle dynamics.
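Returning to the coupled IMU/GPS arrangement described earlier in this section, the low-frequency/high-frequency split can be illustrated with a simple one-dimensional complementary filter that blends a low-rate GPS position with integrated accelerometer data. This is only an illustrative sketch; the blending gain, sample rate, and signal layout are assumptions, not a description of any particular commercial product or of the fusion scheme used in this work.

    import numpy as np

    def complementary_position(gps_pos, accel, dt=0.01, alpha=0.98):
        """Blend low-frequency GPS position with high-frequency accelerometer data (1-D sketch).

        gps_pos : GPS position samples at the IMU rate (NaN where no GPS fix is available)
        accel   : accelerometer samples at the same rate, m/s^2
        alpha   : weight on the inertial prediction; (1 - alpha) pulls the estimate toward GPS
        """
        pos_est = np.zeros(len(accel))
        vel = 0.0
        valid = ~np.isnan(gps_pos)
        pos = gps_pos[valid][0] if np.any(valid) else 0.0
        for k in range(1, len(accel)):
            # High-frequency update: integrate the accelerometer
            vel += accel[k] * dt
            pos += vel * dt
            # Low-frequency correction: pull toward the GPS fix when one is available
            if not np.isnan(gps_pos[k]):
                pos = alpha * pos + (1.0 - alpha) * gps_pos[k]
            pos_est[k] = pos
        return pos_est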

Thermopile sensors provide excellent attitude estimates but are affected by reflective surfaces and changes in environment temperature.

Magnetometers

A magnetometer is a device that allows for the measurement of a local or distant magnetic field. This device can be used to measure the strength and direction of a magnetic field. The heading of an unmanned vehicle may be determined by detecting the magnetic field created by the Earth's magnetic poles. The magnetic north direction can aid in navigation and geo-spatial mapping. For applications where the vehicle orientation is not restricted to planar motion, the magnetometer is typically coupled with a tilt sensor to provide a horizontal north vector independent of the vehicle orientation.

There are several commercially available magnetometer sensors. The MicroMag3 from PNI Corp., shown in Figure 2-4, provides magnetic field measurements in three axes with a digital serial peripheral interface (SPI).

Figure 2-4. MicroMag3 magnetometer sensor from PNI Corp.

The Honeywell Corporation also manufactures a line of magnetic field detection sensors. These products vary from analog linear/vector sensors to integrated digital compass devices. The HMC1053 from Honeywell is a three-axis magneto-resistive sensor for multi-axial magnetic field detection and is shown in Figure 2-5.

Figure 2-5. HMC1053 tri-axial analog magnetometer from Honeywell

Accelerometer

An accelerometer measures the acceleration of the device along single or multiple measurement axes. MEMS-based accelerometers provide accurate and inexpensive devices for the measurement of acceleration. The ADXL330 from Analog Devices Inc. provides analog three-axis acceleration measurements in a small surface-mount package, shown in Figure 2-6.

Figure 2-6. ADXL330 tri-axial SMT accelerometer from Analog Devices Inc.

A two- or three-axis accelerometer can be used as a tilt sensor. The off-horizontal angles can be determined by measuring the projection of the gravity vector onto the sensor axes. These measurements relate to the roll and pitch angles of the device and, when properly compensated to account for effects from vehicle dynamics, can provide accurate orientation information.

Rate Gyro

A rate gyro is a device that measures the angular rate of change about a single axis or multiple axes. The ADXRS150 is a single-axis MEMS rate gyro manufactured by Analog

Devices Inc. that provides analog measurements of the angular rate of the device; it is shown in Figure 2-7.

Figure 2-7. ADXRS150 rate gyro from Analog Devices Inc.

This device uses the Coriolis effect to measure the angular rate of the device. An internally resonating frame in the device is coupled with capacitive pickoff elements. The response of the pickoff elements changes with the angular rate; this signal is then conditioned and amplified. When coupled with an accelerometer, these devices allow for enhanced orientation solutions.

Unmanned Rotorcraft Modeling

For this research, the aircraft operating region will mostly be in hover. The flight characteristics of the full flight envelope are very complex and involve extensive dynamic, aerodynamic, and fluid mechanics analysis. Previously, researchers performed extensive instrumentation of a Yamaha R-50 remotely piloted helicopter [6]. These researchers outfitted the R-50 with a sensor suite for in-flight measurements of rotor blade motion and loading. The system was equipped with sensors along the length of the main rotor blades, measuring strain, acceleration, tilt, and position. This research was unique due to the detail of the instrumentation, not to mention the difficulties of instrumenting rotating components. This work provided structural, modal, and load characteristics for this airframe and demonstrates the extensive lengths required for obtaining in-flight aircraft properties. In addition, extensive work has been conducted in system identification for the Yamaha R-50 and the Xcell 60 helicopters

[7,8,9]. These researchers performed extensive frequency-response-based system identification and flight testing and compared their modeling results with the scaled dynamics of the UH-1H helicopter. They conducted an extensive analysis of small unmanned helicopter dynamic equations and system identification, resulting in complete dynamic modeling of a model-scale helicopter. The results showed great promise in that they demonstrated a close relation between the UH-1H helicopter dynamics and the tested aircraft. This research also showed that the aircraft modeling technique used was valid and that the system identification techniques used for larger rotorcraft are extensible to smaller rotorcraft.

Other researchers present a more systems-level approach to the aircraft automation discussion [10]. They present the instrumentation equipment and architecture, along with the modeling and simulation derivations, and go on to present their work involving hardware-in-the-loop simulation and image processing.

Unmanned Rotorcraft Control

Many researchers have become actively involved in the control and automation of unmanned rotorcraft. The research has involved numerous control topics including robust controller design, fuzzy control, and full-flight-envelope control.

Robust H-infinity controllers have been developed using loop shaping and gain scheduling to provide rapid and reliable high-bandwidth controllers for the Yamaha R-50 UAV [11,12]. In this research, the authors sought to incorporate the use of high-fidelity simulation modeling into the control design to improve performance. Coupled with the use of multivariable control design techniques, they also sought to develop a controller that would provide fast and robust performance and could better utilize the full flight envelope of small unmanned helicopters. Anyone who has observed experienced competition-level radio-controlled (RC) helicopter

pilots and their escapades during flight has observed the awesome capabilities of small RC helicopters during normal and inverted flight. It is these capabilities that draw researchers toward using helicopters for their research. But with increased capabilities come increased complexities in aircraft mechanics and dynamics. These researchers have attempted to incorporate the synergistic use of a high-fidelity aircraft model with robust multivariable control strategies, and they validated their findings by implementing and flight testing their control algorithms on their test aircraft.

H-infinity controller design has also been applied to highly flexible aircraft [13]. As will be shown later, helicopter airframes are significantly prone to failures caused by vibration modes, and disastrous consequences can occur if these vibration modes are not considered and compensated. In this research, a highly flexible aircraft model is used for control design and validation, and the controller is specifically designed to compensate for the high flexibility of the airframe. The authors present the aircraft model and its uncertainties and discuss the control law synthesis algorithm. These results demonstrate the meshing of aircraft structural modeling/analysis with control design and stability. This concept is important not only from a system performance perspective but also from a safety perspective: as UAVs become more prevalent in domestic airspace, the public can benefit from the improved system safety provided by more sophisticated modeling and analysis techniques.

Previous researchers have also conducted research on control optimization for small unmanned helicopters [14]. In this research, the authors focus on the problem of attitude control optimization for a small-scale unmanned helicopter. By using an identified model of the helicopter system that incorporates the coupled rotor/stabilizer/fuselage dynamic effects, they improve the overall model accuracy. This research is unique in that it incorporates the stabilizer bar dynamic effects commonly not included in previous work. The system model is validated by

performing flight tests using a Yamaha RMAX helicopter test-bed system. The authors go on to compensate for the performance reduction induced by the stabilizer bar and to optimize the proportional-derivative (PD) attitude controller using an established control design methodology with a frequency response envelope specification.

CHAPTER 3
EXPERIMENTAL TESTING PLATFORMS

Electronics and Sensor Payloads

In order to perform testing and evaluation of the theory and concepts involved in this research, several electronics and sensor payloads were developed. The purpose of these payloads was to provide perception and aircraft state measurements as well as onboard processing capabilities. These systems were developed to operate modularly and enable transfer of the payload to different aircraft. The payloads were developed with varying capabilities and sizes; the host aircraft ranged from a 6-inch fixed-wing micro air vehicle to a 3.1-meter rotor diameter agricultural mini-helicopter.

First Helicopter Electronics and Sensor Payload

The first helicopter electronics and sensor payload was constructed to provide an initial testing platform to ensure proper operation of the electronics and aircraft during flight. The system schematic is shown in Figure 3-1.

Figure 3-1. First helicopter payload system schematic (LiPo battery and DC/DC converters for power, industrial CPU, digital stereovision cameras for imaging, laptop hard drive for data storage, and wireless Ethernet for communication)

The system consisted of five subsystems:

1. Main processor
2. Imaging
3. Communication
4. Data storage
5. Power

The main processor provides the link between all of the sensors, the data storage device, and the communication equipment. The imaging subsystem consists of a Videre stereovision system linked via two FireWire connections. The data storage subsystem consists of a 40 GB laptop hard drive which runs a Linux operating system and was used for sensor data storage. The power subsystem consists of a 12 V to 5 V DC-to-DC converter, a 12 V power regulator, and a 3 Ah LiPo battery pack. The power regulators condition and supply power to all electronics. The LiPo battery pack served as the main power source and was selected based on the low weight and high power density of the LiPo battery chemistry. The first helicopter payload attached to the aircraft is shown in Figure 3-2.

Figure 3-2. First payload mounted on helicopter

The first prototype system was equipped on the aircraft and was tested during flight. Although image data could be gathered during flight, it was found that the laptop hard drive could not withstand the vibration of the aircraft. Figure 3-3 shows in-flight testing of the first prototype payload.

Figure 3-3. Helicopter testing with first payload

Second Helicopter Electronics and Sensor Payload

The payload design was refined in order to provide a more robust testing platform for this research. In order to improve the design, vibration isolation of the payload from the aircraft was required, as well as a data storage method that could withstand the harsh environment onboard the aircraft. The system schematic for the second prototype payload is shown in Figure 3-4.

Figure 3-4. Second helicopter payload system schematic (LiPo battery and DC/DC converters for power, industrial CPU, digital stereovision cameras for imaging, compact flash cards for data storage, wireless Ethernet for communication, and an OEM Garmin GPS and digital compass as pose sensors)

The system consisted of six subsystems:

1. Main processor
2. Imaging
3. Pose sensors
4. Communication
5. Data storage
6. Power

The second prototype payload contained components similar to those of the first prototype, but instead of a laptop hard drive it utilized two compact flash drives for storage, and in addition two pose sensors were added. The OEM Garmin GPS provided global position, velocity, and altitude data at 5 Hz. The digital compass provided heading, roll, and pitch angles at 30 Hz. The second prototype payload is shown in Figure 3-5.

Figure 3-5. Second payload mounted to helicopter

Flight tests showed that the second payload could reliably collect image and pose data during flight and maintain wireless communication at all times. Figure 3-6 shows the second prototype payload equipped on the aircraft during flight testing.

Figure 3-6. Flight testing with second helicopter payload

Third Helicopter Electronics and Sensor Payload

The helicopter electronics and sensor payload was redesigned slightly to include a high-accuracy differential GPS (Figure 3-7). This system has a vendor-stated positioning accuracy of 2 cm in differential mode and allows precise helicopter positioning. This system further improves the overall system performance and allows for comparison of the standard versus RT2 differential GPS systems.

Figure 3-7. Third helicopter payload system schematic (as in Figure 3-4, with the Garmin GPS replaced by a Novatel RT2 differential GPS)

Micro Air Vehicle Embedded State Estimator and Control Payload

An embedded state estimator and control payload was developed to support the micro air vehicle research being performed at the University of Florida. This system provides control stability and video data. The system schematic is shown in Figure 3-8.

Figure 3-8. Micro air vehicle embedded state estimator and control system schematic (LiPo battery and DC/DC converters for power, Atmel Mega128 processor, CMOS camera with RF video transmitter for imaging, Aerocomm 900 MHz RF modem for communication, and a two-axis accelerometer with altitude and airspeed pressure sensors as pose sensors)

Testing Aircraft

UF Micro Air Vehicles

Several MAVs have been developed for reconnaissance and control applications. This platform provides a payload capability of less than 30 grams with a wingspan of 6 inches (Figure 3-9). This system is a fixed-wing aircraft with two to three control surface actuators and an electric motor. System development for this platform requires small size and weight, and low power consumption.

Figure 3-9. Six-inch micro air vehicle

ECO 8

This aircraft was the first helicopter built in the UF laboratory. The aircraft is powered by a brushed electric motor with an eight-cell nickel cadmium battery pack. The aircraft is capable of flying for approximately 10 minutes under normal flight conditions. This system has a payload capacity of less than 60 grams and uses CCPM swashplate mixing, as shown in Figure 3-10.

Figure 3-10. Eco 8 helicopter

Miniature Aircraft Gas Xcell

A Miniature Aircraft Gas Xcell was the first gas-powered helicopter purchased for testing and experimentation (Figure 3-11). This aircraft is equipped with a two-stroke gasoline engine and 740 mm main rotor blades, and has an optimal rotor head speed of 1800 rpm. The payload capacity is approximately 15 lbs with a runtime of 20 minutes.

Figure 3-11. Miniature Aircraft Gas Xcell

Bergen Industrial Twin

A Bergen Industrial Twin was purchased for testing with heavier payloads (Figure 3-12). This aircraft is equipped with a dual-cylinder two-stroke gasoline engine and 810 mm main rotor blades, and has an optimal rotor head speed of 1500 rpm. The payload capacity is approximately 25 lbs with a runtime of 30 minutes.

Figure 3-12. Bergen Industrial Twin helicopter

Yamaha RMAX

Several agricultural Yamaha RMAX helicopters were purchased by the AFRL robotics research laboratory at Tyndall Air Force Base in Panama City, Florida. The aircraft is shown in Figure 3-13. This system has a two-stroke engine with internal power generation and a control stabilization system, and a 60 lb payload capability. The system is typically used for small-area pesticide and fertilizer spraying.

These aircraft were used to conduct various experiments involving remote sensing, sensor noise analysis, system identification, and various applied rotorcraft tasks. These experiments and their results are discussed in the subsequent chapters. Each aircraft has varying costs,

payload capabilities, and runtimes. As with the various sensors available for UAV research, the aircraft should be selected to suit the needs of the particular project or task.

Figure 3-13. Yamaha RMAX helicopter

CHAPTER 4
GEO-POSITIONING OF STATIC OBJECTS USING MONOCULAR CAMERA TECHNIQUES

Two derivations were performed which allow the global coordinates of an object in an image to be found. Both derivations perform the transformation from a 2D coordinate system, referred to as the image coordinate system, to the 3D global coordinate system. The first derivation utilizes a simplified camera model and calculates the position of the static object using the intersection of a line and a plane. The second derivation utilizes intrinsic and extrinsic camera parameters and uses projective geometry and coordinate transformations.

Simplified Camera Model and Transformation

Simple Camera Model

The cameras were modeled by linearly scaling the horizontal and vertical projection angles with the x and y positions of the pixel, respectively, as illustrated in Figure 4-1. This allowed the relative angle of the static object to be calculated with respect to a coordinate system fixed in the aircraft.

Figure 4-1. Image coordinates to projection angle calculation
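The linear pixel-to-angle model described above can be written as a small helper function, sketched below. The field-of-view values, image size, and function name are placeholders for illustration, not the parameters of the cameras used in this work.

    import math

    def pixel_to_projection_angles(px, py, width, height, hfov_deg, vfov_deg):
        """Linearly map a pixel position to horizontal/vertical projection angles.

        (px, py) is the pixel location, (width, height) the image size, and
        hfov_deg / vfov_deg the assumed horizontal and vertical fields of view.
        Returns angles in radians, measured from the optical axis.
        """
        # Offset from the image center, normalized to [-0.5, 0.5]
        nx = (px - width / 2.0) / width
        ny = (py - height / 2.0) / height
        # Linear scaling of the pixel offset to a projection angle
        angle_x = nx * math.radians(hfov_deg)
        angle_y = ny * math.radians(vfov_deg)
        return angle_x, angle_y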

Coordinate Transformation

A coordinate transformation is performed on the static object location from image coordinates to global coordinates, as shown in Figure 4-2. The image data provide the relative angle of the static object with respect to the aircraft reference frame. In order to find the position of the static object, the solution of the intersection of a line and a plane was used.

Figure 4-2. Diagram of coordinate transformation

The equation of a plane used for this problem is

    Ax + By + Cz + D = 0                                                           (4-1)

where x, y, and z are the coordinates of a point in the plane. The equation of a line used in this problem is

    \tilde{p} = \tilde{p}_1 + u(\tilde{p}_2 - \tilde{p}_1)                         (4-2)

where p_1 and p_2 are points on the line. Substituting (4-2) into (4-1) results in the solution

    u = \frac{A x_1 + B y_1 + C z_1 + D}{A(x_1 - x_2) + B(y_1 - y_2) + C(z_1 - z_2)}    (4-3)

where x_1, y_1, and z_1 are the coordinates of point p_1, and x_2, y_2, and z_2 are the coordinates of point p_2.
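A direct implementation of Equations 4-1 through 4-3 is sketched below as a minimal example; the function and variable names are illustrative, not taken from the project code.

    import numpy as np

    def line_plane_intersection(p1, p2, plane):
        """Intersect the line through p1 and p2 with the plane A*x + B*y + C*z + D = 0.

        p1, p2 : length-3 arrays, points on the line (Equation 4-2)
        plane  : (A, B, C, D) plane coefficients (Equation 4-1)
        Returns the intersection point, or None if the line is parallel to the plane.
        """
        p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
        A, B, C, D = plane
        normal = np.array([A, B, C])
        denom = normal @ (p2 - p1)
        if abs(denom) < 1e-12:          # line parallel to the plane
            return None
        u = -(normal @ p1 + D) / denom  # equivalent form of Equation 4-3
        return p1 + u * (p2 - p1)       # Equation 4-2 evaluated at u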

For this problem the ground plane is defined in the global reference frame by A = 0, B = 0, C = 1, and D = -(ground elevation). The point p_1 is the focal point of the camera and is determined in the global reference frame from the sensed GPS data. The point p_2 is calculated in the global reference frame as the coordinates of p_1 plus a unit distance along the static object projection ray. This ray is known from the static object image angle and the camera orientation as measured by the attitude and heading sensors. In other words, the direction of the static object projection ray in the global reference frame was found by transforming the projection vector from the aircraft to the static object, as measured in the aircraft frame, into the global frame. This entailed taking a downward vector in the aircraft frame and rotating it about the yaw, pitch, and roll axes by the pose angles and projection angles. The rotation matrices are

    R_1^2 = \begin{bmatrix} \cos\psi & \sin\psi & 0 \\ -\sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{bmatrix}    (4-4)

    R_2^3 = \begin{bmatrix} \cos\theta & 0 & -\sin\theta \\ 0 & 1 & 0 \\ \sin\theta & 0 & \cos\theta \end{bmatrix}    (4-5)

    R_3^4 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\phi & \sin\phi \\ 0 & -\sin\phi & \cos\phi \end{bmatrix}    (4-6)

where
    \psi   = yaw of the aircraft
    \theta = pitch of the aircraft plus the projection pitch angle
    \phi   = roll of the aircraft plus the projection roll angle.

The downward vector r = (0, 0, -1)^T was transformed using the compound rotation matrix

    R_1^4 = R_1^2 R_2^3 R_3^4                                                      (4-7)

The new projection vector was found as

    \tilde{r} = R_1^4\, r                                                          (4-8)

where r is the projection ray measured in the aircraft reference frame and \tilde{r} is the projection ray measured in the global reference frame. Using the solution found for the intersection of a line and a plane, and using the aircraft position as a point on the line, the position of the static object in the global reference frame was found. Thus, for each object identified in an image, the coordinates of p_1 and p_2 are determined in the global reference frame, and (4-2) and (4-3) are then used to calculate the position of the object in the global reference frame.
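The simplified method can be strung together end to end as sketched below, rotating the downward vector by the pose and projection angles and intersecting the resulting ray with the ground plane. It reuses line_plane_intersection from the earlier sketch; the rotation sign conventions and the pairing of image angles with pitch and roll are assumptions of this example.

    import numpy as np

    def rotz(a):   # yaw rotation, Equation 4-4 convention
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, s, 0], [-s, c, 0], [0, 0, 1]])

    def roty(a):   # pitch rotation, Equation 4-5 convention
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, 0, -s], [0, 1, 0], [s, 0, c]])

    def rotx(a):   # roll rotation, Equation 4-6 convention
        c, s = np.cos(a), np.sin(a)
        return np.array([[1, 0, 0], [0, c, s], [0, -s, c]])

    def simple_geo_position(cam_pos, yaw, pitch, roll, ang_x, ang_y, ground_z=0.0):
        """Simplified geo-positioning of an image point (Equations 4-1 to 4-8).

        cam_pos        : camera focal point in the global frame (from GPS), point p1
        yaw/pitch/roll : aircraft pose angles in radians
        ang_x, ang_y   : projection angles of the object from the image (Figure 4-1)
        ground_z       : ground-plane elevation in the global frame
        """
        # Compound rotation of the downward unit vector (Equations 4-7 and 4-8);
        # adding the projection angles to pitch and roll follows the text above,
        # but the exact sign convention is an assumption of this sketch.
        R = rotz(yaw) @ roty(pitch + ang_y) @ rotx(roll + ang_x)
        ray = R @ np.array([0.0, 0.0, -1.0])
        p1 = np.asarray(cam_pos, float)
        p2 = p1 + ray                               # unit distance along the projection ray
        plane = (0.0, 0.0, 1.0, -ground_z)          # ground plane: z = ground elevation
        return line_plane_intersection(p1, p2, plane)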

Improved Techniques for Geo-Positioning of Static Objects

A precise camera model and an image-to-global coordinate transformation were developed. This involved finding the intrinsic and extrinsic parameters of the camera system attached to the aerial vehicle. A relation between the normalized pixel coordinates and coordinates in the projective coordinate plane was used:

    \begin{bmatrix} u_n \\ v_n \end{bmatrix} = \begin{bmatrix} X_C / Z_C \\ Y_C / Z_C \end{bmatrix}    (4-9)

The normalized pixel coordinate vector \tilde{m} and the projective plane coordinate vector \tilde{M} are related using Equation 4-9 and form the projection relationship between points in the image plane and points in the camera reference frame, as shown in Figure 4-3, where

    \tilde{m} = \begin{bmatrix} u_n & v_n & 1 \end{bmatrix}^T                      (4-10)

    \tilde{M} = \begin{bmatrix} X_C & Y_C & Z_C & 1 \end{bmatrix}^T                (4-11)

Figure 4-3. Normalized focal and projective planes

The transformation from image coordinates to global coordinates was determined using the normalized pixel coordinates and the camera position and orientation with respect to the global coordinate system (Figure 4-4). The transformation of a point M expressed in the camera reference system C to the same point expressed in the global system G is shown in Equation 4-12:

    {}^G P_M = {}^G_C T \; {}^C P_M                                                (4-12)

    {}^G P_M = \begin{bmatrix} X_G \\ Y_G \\ Z_G \\ 1 \end{bmatrix}
             = \begin{bmatrix} {}^G_C R & {}^G P_{Co} \\ 0_{1\times3} & 1 \end{bmatrix}
               \begin{bmatrix} X_C \\ Y_C \\ Z_C \\ 1 \end{bmatrix}                 (4-13)

Dividing both sides of Equation 4-13 by Z_C and substituting Z_G = 0 (the camera elevation is taken as height above ground level, and the target location is assumed to lie on the Z_G = 0 global plane) results in Equation 4-14.

Figure 4-4. Relation between a point in the camera and global reference frames

    \begin{bmatrix} X_G/Z_C \\ Y_G/Z_C \\ 0 \\ 1/Z_C \end{bmatrix}
      = \begin{bmatrix} {}^G_C R & {}^G P_{Co} \\ 0_{1\times3} & 1 \end{bmatrix}
        \begin{bmatrix} X_C/Z_C \\ Y_C/Z_C \\ 1 \\ 1/Z_C \end{bmatrix}              (4-14)

Substituting X_C/Z_C = u_n and Y_C/Z_C = v_n:

    \begin{bmatrix} X_G/Z_C \\ Y_G/Z_C \\ 0 \\ 1/Z_C \end{bmatrix}
      = \begin{bmatrix} {}^G_C R & {}^G P_{Co} \\ 0_{1\times3} & 1 \end{bmatrix}
        \begin{bmatrix} u_n \\ v_n \\ 1 \\ 1/Z_C \end{bmatrix}                      (4-15)

This leads to three equations in the three unknowns X_G, Y_G, and Z_C:

    X_G = Z_C \left( R_{11} u_n + R_{12} v_n + R_{13} \right) + {}^G P_{Cox}        (4-16)

    Y_G = Z_C \left( R_{21} u_n + R_{22} v_n + R_{23} \right) + {}^G P_{Coy}        (4-17)

    0 = Z_C \left( R_{31} u_n + R_{32} v_n + R_{33} \right) + {}^G P_{Coz}          (4-18)

where the scalar R_{ij} represents the element in the ith row and jth column of the rotation matrix {}^G_C R. Using Equations 4-16, 4-17, and 4-18, the unknowns Z_C, X_G, and Y_G can be determined explicitly:

    Z_C = \frac{-{}^G P_{Coz}}{R_{31} u_n + R_{32} v_n + R_{33}}                    (4-19)

    X_G = \frac{-{}^G P_{Coz}\left( R_{11} u_n + R_{12} v_n + R_{13} \right)}{R_{31} u_n + R_{32} v_n + R_{33}} + {}^G P_{Cox}    (4-20)

    Y_G = \frac{-{}^G P_{Coz}\left( R_{21} u_n + R_{22} v_n + R_{23} \right)}{R_{31} u_n + R_{32} v_n + R_{33}} + {}^G P_{Coy}    (4-21)

Equations 4-20 and 4-21 provide the global coordinates of the static object.
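Equations 4-19 through 4-21 can be evaluated directly once the camera-to-global rotation and camera position are known. The sketch below is a minimal implementation; the rotation matrix is taken as an input rather than built from a particular Euler-angle convention, and the function name is illustrative.

    import numpy as np

    def geo_position_from_normalized_pixel(u_n, v_n, R_gc, P_co):
        """Evaluate Equations 4-19 to 4-21.

        u_n, v_n : normalized pixel coordinates of the target
        R_gc     : 3x3 rotation from the camera frame to the global frame
        P_co     : camera position in the global frame, [P_Cox, P_Coy, P_Coz],
                   with P_Coz measured above the Z_G = 0 ground plane
        Returns (X_G, Y_G), the target position on the ground plane.
        """
        R = np.asarray(R_gc, float)
        P = np.asarray(P_co, float)
        m = np.array([u_n, v_n, 1.0])
        denom = R[2] @ m                      # R31*u_n + R32*v_n + R33
        Z_c = -P[2] / denom                   # Equation 4-19
        X_g = Z_c * (R[0] @ m) + P[0]         # Equation 4-20 (via Equation 4-16)
        Y_g = Z_c * (R[1] @ m) + P[1]         # Equation 4-21 (via Equation 4-17)
        return X_g, Y_g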

Camera Calibration

In order to calculate the normalized pixel coordinates from raw imaging sensor data, a calibration procedure was performed using a camera calibration toolbox for MATLAB [15]. The calibration procedure determines the extrinsic and intrinsic parameters of the camera system. During the calibration procedure, several images of a checkerboard pattern of known size are used to estimate the different parameters, as shown in Figure 4-5.

The extrinsic parameters define the position and orientation characteristics of the camera system. These parameters are affected by the mounting and positioning of the camera relative to the body-fixed coordinate system.

Figure 4-5. Calibration checkerboard pattern

The intrinsic parameters define the optical projection and perspective characteristics of the camera system. These parameters are affected by the camera lens properties, imaging sensor properties, and lens/sensor placement. The camera lens properties are generally characterized by the focal length and the prescribed imaging sensor size. The focal length is a measure of how strongly the lens focuses the light energy; in essence, it correlates to the zoom of the lens given a fixed sensor size and distance. The imaging sensor properties are generally characterized by the physical size and horizontal/vertical resolution of the imaging sensor, which help to define the dimensions and geometry of the image pixels. The lens/sensor placement properties are generally characterized by the misalignment of the lens and image sensor, and by the lens-to-sensor planar distance. For our analysis we are mostly concerned with determining the intrinsic parameters of the camera system, since they are used for calculating the normalized pixel coordinates from the raw pixel coordinates.

The intrinsic parameters used for generating the normalized pixel coordinates are the focal length, principal point, skew coefficient, and image distortion coefficients.

The focal length, as described earlier, characterizes the linear projection of points observed in space onto the focal plane. The focal length has components in the x and y axes, and these values are not assumed to be equal. The principal point estimates the center pixel position; all normalized pixel coordinates are referenced to this point. The skew coefficient estimates the angle between the x and y axes of each pixel. In some instances the pixel geometry is not square or even rectangular; this coefficient describes how off-square the pixel x and y axes are and allows for compensation. The image distortion coefficients estimate the radial and tangential distortions typically caused by the camera lens. Radial distortion causes a changing magnification effect at varying radial distances; these effects are apparent when a straight line appears curved through the camera system. Tangential distortions are caused by poor centering or defects of the lens optics; they displace points perpendicular to the radial imaging field.

Figure 4-6. Calibration images

The camera calibration toolbox allows all of the intrinsic parameters to be estimated using several images of the predefined checkerboard pattern. Once the calibration procedure is completed, the intrinsic parameters are used in the geo-positioning algorithm. A selection of images was used that captured the checkerboard pattern at different ranges and orientations, as shown in Figure 4-6. The boundaries of the checkerboard pattern were then selected manually for each image. The calibration algorithm used the gradient of the pattern to find all of the vertices of the checkerboard, as shown in Figure 4-7.

Figure 4-7. Calibration images

Once the boundaries for all of the images were selected, the algorithm calculated the intrinsic camera parameter estimates using a gradient descent search. Using the selected images, the following parameters were calculated:

Focal length: fc = [ 1019.52796 1022.12290 ] +/- [ 20.11515 20.62667 ]
Principal point: cc = [ 645.66333 527.72943 ] +/- [ 13.60462 10.92129 ]
Skew: alpha_c = [ 0.00000 ] +/- [ 0.00000 ] => angle of pixel axes = 90.00000 +/- 0.00000 degrees

Distortion: kc = [ -0.17892 0.13875 -0.00128 0.00560 0.00000 ] +/- [ 0.01419 0.02983 0.00158 0.00203 0.00000 ]
Pixel error: err = [ 0.22613 0.14137 ]
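With the intrinsic parameters listed above, raw pixel coordinates can be converted to the normalized coordinates used in Equations 4-19 to 4-21. The sketch below applies the focal length and principal point and, optionally, an iterative correction for radial distortion; it is a simplified version of the full toolbox camera model (tangential terms are ignored), and the iteration count is an arbitrary choice.

    import numpy as np

    def normalize_pixel(px, py, fc, cc, kc=None, iterations=5):
        """Convert raw pixel coordinates to normalized pixel coordinates (u_n, v_n).

        fc : (fx, fy) focal lengths in pixels
        cc : (cx, cy) principal point in pixels
        kc : optional distortion coefficients [k1, k2, p1, p2, k3]; if given,
             radial distortion is removed by fixed-point iteration.
        """
        u = (px - cc[0]) / fc[0]
        v = (py - cc[1]) / fc[1]
        if kc is not None:
            k1, k2, _, _, k3 = kc
            ud, vd = u, v                      # distorted normalized coordinates
            for _ in range(iterations):        # undo radial distortion iteratively
                r2 = u * u + v * v
                factor = 1.0 + k1 * r2 + k2 * r2**2 + k3 * r2**3
                u, v = ud / factor, vd / factor
        return u, v

    # Example using the calibration results listed above
    u_n, v_n = normalize_pixel(700.0, 500.0,
                               fc=(1019.52796, 1022.12290),
                               cc=(645.66333, 527.72943),
                               kc=(-0.17892, 0.13875, -0.00128, 0.00560, 0.00000))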

Geo-Positioning Sensitivity Analysis

In this section the sensitivity of the position solution is derived with respect to the measurable parameters used in the geo-positioning algorithm. This analysis shows the general sensitivity of the positioning solution, and also the sensitivity at common operating conditions.

Equation 4-15 is used to determine the global position of the target from the global position and orientation of the camera and the normalized target pixel coordinates. Multiplying Equation 4-15 through by Z_C produces:

    \begin{bmatrix} x_G \\ y_G \\ 0 \\ 1 \end{bmatrix}
      = Z_C \begin{bmatrix} {}^G_C R & {}^G P_{Co} \\ 0_{1\times3} & 1 \end{bmatrix}
            \begin{bmatrix} u_n \\ v_n \\ 1 \\ 1/Z_C \end{bmatrix}                  (4-22)

Since the geo-positioning coordinates are of primary concern for the sensitivity analysis, Equation 4-22 is reduced to its first two rows:

    \begin{bmatrix} x_G \\ y_G \end{bmatrix}
      = Z_C \begin{bmatrix} R_{11} & R_{12} & R_{13} & {}^G P_{Cox} \\
                            R_{21} & R_{22} & R_{23} & {}^G P_{Coy} \end{bmatrix}
            \begin{bmatrix} u_n \\ v_n \\ 1 \\ 1/Z_C \end{bmatrix}                  (4-23)

Equation 4-23 is rewritten in the form

    \begin{bmatrix} x_G \\ y_G \end{bmatrix} = Z_C\, A\, b                          (4-24)

where

    A = \begin{bmatrix} R_{11} & R_{12} & R_{13} & {}^G P_{Cox} \\
                        R_{21} & R_{22} & R_{23} & {}^G P_{Coy} \end{bmatrix}       (4-25)

with the elements R_{ij} expanded in terms of the sines and cosines of the camera pose angles, and

    b = \begin{bmatrix} u_n & v_n & 1 & 1/Z_C \end{bmatrix}^T                       (4-26)

The geo-positioning process is modeled by assuming that there are errors in the parameters used in the calculation. The parameter vector is defined as

    p = \begin{bmatrix} {}^G P_{Cox} & {}^G P_{Coy} & {}^G P_{Coz} & \psi & \theta & \phi & u_n & v_n \end{bmatrix}^T    (4-27)

The modeled process is

    \begin{bmatrix} x_G \\ y_G \end{bmatrix}_{actual+error} = \left. Z_C\, A\, b \right|_{p+\delta p}    (4-28)

where

    \delta p = \begin{bmatrix} \delta P^G_{Cox} & \delta P^G_{Coy} & \delta P^G_{Coz} & \delta\psi & \delta\theta & \delta\phi & \delta u_n & \delta v_n \end{bmatrix}^T    (4-29)

The positioning error from Equation 4-28 reduces to

    e = \begin{bmatrix} e_x \\ e_y \end{bmatrix}
      = \left. Z_C A b \right|_{p} - \left. Z_C A b \right|_{p+\delta p}            (4-30)

In order to establish a metric for the error in the sensitivity analysis, the inner product of Equation 4-30 is used:

    e^T e = \left( Z_C A b\big|_{p} - Z_C A b\big|_{p+\delta p} \right)^T
            \left( Z_C A b\big|_{p} - Z_C A b\big|_{p+\delta p} \right)             (4-31)

Upon using Equation 4-24 to substitute for Z_C A b\big|_{p}, the generic form of the error variance becomes

    e^T e = \left. Z_C^2\, b^T A^T A\, b \right|_{p}
            - 2 \begin{bmatrix} x_G & y_G \end{bmatrix} \left( Z_C A b \right)\big|_{p+\delta p}
            + \left. Z_C^2\, b^T A^T A\, b \right|_{p+\delta p}                     (4-32)

The partial derivative of the generic error variance with respect to an arbitrary parameter error \delta_\theta is shown in Equation 4-33.

    \frac{\partial (e^T e)}{\partial \delta_\theta}
      = -2 \begin{bmatrix} x_G & y_G \end{bmatrix}
           \left( \frac{\partial Z_C}{\partial \delta_\theta} A b
                + Z_C \frac{\partial A}{\partial \delta_\theta} b
                + Z_C A \frac{\partial b}{\partial \delta_\theta} \right)\bigg|_{p+\delta p}
        + 2 \left( Z_C A b \right)^T
           \left( \frac{\partial Z_C}{\partial \delta_\theta} A b
                + Z_C \frac{\partial A}{\partial \delta_\theta} b
                + Z_C A \frac{\partial b}{\partial \delta_\theta} \right)\bigg|_{p+\delta p}    (4-33)

In order to reduce the complexity of the analysis and to provide a more concise representation of the effects of the parameter errors on the error variance, and without loss of generality, the target position is set as the origin of the global coordinate system. Equation 4-30 then reduces to:

    e = -\left. Z_C A b \right|_{p+\delta p}                                        (4-34)

    e^T e = \left. Z_C^2\, b^T A^T A\, b \right|_{p+\delta p}                       (4-35)

This quantity equates to the error variance of the positioning solution given the system configuration and error values. It is desirable to determine the effect of errors in each parameter used in the geo-positioning solution; hence the partial derivative of the inner product is calculated with respect to each parameter error. The partial derivative of the reduced error variance with respect to an arbitrary parameter error \delta_\theta is

    \frac{\partial (e^T e)}{\partial \delta_\theta}
      = 2 Z_C \frac{\partial Z_C}{\partial \delta_\theta}\, b^T A^T A\, b
      + Z_C^2 \left( \frac{\partial b^T}{\partial \delta_\theta} A^T A\, b
      + b^T \frac{\partial A^T}{\partial \delta_\theta} A\, b
      + b^T A^T \frac{\partial A}{\partial \delta_\theta}\, b
      + b^T A^T A \frac{\partial b}{\partial \delta_\theta} \right)                 (4-36)

with all quantities evaluated at the perturbed parameter values.

Equation 4-25 is restated along with its partial derivatives with respect to {}^G P_{Cox}, {}^G P_{Coy}, {}^G P_{Coz}, \psi, \theta, \phi, u_n, and v_n (Equations 4-37 through 4-44). Because only the last column of A contains the camera position, the derivatives with respect to {}^G P_{Cox} and {}^G P_{Coy} are matrices that are zero except for a single unit entry in that column (Equations 4-37 and 4-38), the derivative with respect to {}^G P_{Coz} is the zero matrix (Equation 4-39), the derivatives with respect to the pose angles \psi, \theta, and \phi contain the corresponding partial derivatives of the rotation elements R_{ij} (Equations 4-40 through 4-42), and the derivatives with respect to u_n and v_n are zero (Equations 4-43 and 4-44).

Equation 4-26 is restated along with its partial derivatives with respect to the same parameters (Equations 4-45 through 4-53). Since b = [u_n \; v_n \; 1 \; 1/Z_C]^T, each derivative has at most two nonzero entries: a unit entry in the first or second position for u_n or v_n, and the derivative of the 1/Z_C term, which depends on the parameter through Equation 4-19,

    \frac{\partial b}{\partial \delta_\theta}
      = \begin{bmatrix} \dfrac{\partial u_n}{\partial \delta_\theta} &
                        \dfrac{\partial v_n}{\partial \delta_\theta} &
                        0 &
                        -\dfrac{1}{Z_C^2}\dfrac{\partial Z_C}{\partial \delta_\theta} \end{bmatrix}^T

Equation 4-19 is likewise restated along with its partial derivatives with respect to each parameter (Equations 4-54 through 4-62). Writing Z_C = -{}^G P_{Coz} / (R_{31} u_n + R_{32} v_n + R_{33}), the derivative with respect to {}^G P_{Cox} or {}^G P_{Coy} is zero, the derivative with respect to {}^G P_{Coz} is -1/(R_{31} u_n + R_{32} v_n + R_{33}), and the derivatives with respect to \psi, \theta, \phi, u_n, and v_n follow from the quotient rule applied to the denominator.

With the partial derivatives for the components of the error inner product defined, the sensitivity of the error can be quantified for each parameter: the sensitivity of the error variance with respect to each parameter error is obtained by substituting the corresponding derivatives of A, b, and Z_C into Equation 4-36. The error sensitivity with respect to {}^G P_{Cox} is shown in Equations 4-63 and 4-64, the sensitivity with respect to {}^G P_{Coy} in Equations 4-65 and 4-66, and the sensitivity with respect to {}^G P_{Coz} in Equations 4-67 and 4-68.

The error sensitivity with respect to \psi is shown in Equations 4-69 and 4-70, the sensitivity with respect to \theta in Equations 4-71 and 4-72, the sensitivity with respect to \phi in Equations 4-73 and 4-74, the sensitivity with respect to u_n in Equations 4-75 and 4-76, and the sensitivity with respect to v_n in Equations 4-77 and 4-78. In each case the sensitivity has the general form of Equation 4-36 with the partial derivatives of A, b, and Z_C for the corresponding parameter substituted in; the resulting expressions are lengthy trigonometric functions of the camera pose angles, the normalized pixel coordinates, and the camera height {}^G P_{Coz}.

This derivation provides the general sensitivity equations for target geo-positioning from a UAV. These equations provide the basis for the sensitivity analysis conducted in the following chapters. These results will be combined with empirically derived sensor data to determine the significance of each parameter relative to the induced geo-positioning error.
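Expressions of this kind can also be generated directly from Equations 4-24 through 4-26 with a computer algebra system, as sketched below using SymPy. The rotation convention follows the matrices of Equations 4-4 through 4-6 as an assumption, and the script is a reproducibility aid rather than the derivation used in this work.

    import sympy as sp

    # Parameters of the geo-positioning solution
    psi, theta, phi = sp.symbols('psi theta phi')       # camera yaw, pitch, roll
    Px, Py, Pz = sp.symbols('P_Cox P_Coy P_Coz')        # camera position in the global frame
    un, vn = sp.symbols('u_n v_n')                      # normalized pixel coordinates

    # Camera-to-global rotation built from the assumed yaw-pitch-roll sequence
    Rz = sp.Matrix([[sp.cos(psi), sp.sin(psi), 0], [-sp.sin(psi), sp.cos(psi), 0], [0, 0, 1]])
    Ry = sp.Matrix([[sp.cos(theta), 0, -sp.sin(theta)], [0, 1, 0], [sp.sin(theta), 0, sp.cos(theta)]])
    Rx = sp.Matrix([[1, 0, 0], [0, sp.cos(phi), sp.sin(phi)], [0, -sp.sin(phi), sp.cos(phi)]])
    R = Rz * Ry * Rx

    # Z_C from Equation 4-19, A and b from Equations 4-25 and 4-26
    Zc = -Pz / (R[2, 0] * un + R[2, 1] * vn + R[2, 2])
    A = sp.Matrix([[R[0, 0], R[0, 1], R[0, 2], Px],
                   [R[1, 0], R[1, 1], R[1, 2], Py]])
    b = sp.Matrix([un, vn, 1, 1 / Zc])

    xy = Zc * A * b                  # target position, Equation 4-24
    err2 = (xy.T * xy)[0, 0]         # squared positioning error with the target at the origin

    # Analytic sensitivity of the error variance to each parameter
    for param in (Px, Py, Pz, psi, theta, phi, un, vn):
        print(param, ':', sp.simplify(sp.diff(err2, param)))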

CHAPTER 5
UNMANNED ROTORCRAFT MODELING

In order to derive the equations of motion of the aircraft and to perform further analysis, an aircraft model was developed based on previous work [7,8,9,10,16,17]. For this research, the scope of the rotorcraft mechanics was limited to Bell-Hiller mixing and a flapping rotor head design. A simplified aircraft model was developed previously [16,17] for simulation and controller development; a similar approach is used here for the derivations. Mettler et al. [7,8,9] use a more complex analysis when deriving their dynamic equations, including dynamic factors such as fly-bar paddle mixing, main blade drag/torque effects, and fuselage/stabilizer aerodynamic effects.

The actuator inputs commonly used for control of RC rotorcraft are:

    \delta_{lon}: longitudinal cyclic control
    \delta_{lat}: lateral cyclic control
    \delta_{col}: collective pitch control
    \delta_{rud}: tail rudder pitch control
    \delta_{thr}: throttle control

A body-fixed coordinate system was used in order to relate sensor and motion information in the inertial and relative reference frames. Figures 5-1 and 5-2 show the body-fixed coordinate system.

A transformation matrix was derived which relates the position and orientation of the body-fixed frame to the inertial frame. The orientation of the body-fixed frame is related to the inertial frame using a 3-1-2 rotation sequence. The inertial frame is initially in the North-East-Down orientation. The coordinate system undergoes a rotation about the Z axis, then a rotation

about the X axis, and then a rotation about the Y axis. The compound rotation is given in Equation 5-1 and the individual rotations are shown in Equations 5-2, 5-3, and 5-4.

Figure 5-1. Top view of the body fixed coordinate system

Figure 5-2. Side view of the body fixed coordinate system

    R_1^4 = R_1^2 R_2^3 R_3^4                                                      (5-1)

    R_1^2 = \begin{bmatrix} \cos\psi & \sin\psi & 0 \\ -\sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{bmatrix}    (5-2)

    R_2^3 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\phi & \sin\phi \\ 0 & -\sin\phi & \cos\phi \end{bmatrix}    (5-3)

    R_3^4 = \begin{bmatrix} \cos\theta & 0 & -\sin\theta \\ 0 & 1 & 0 \\ \sin\theta & 0 & \cos\theta \end{bmatrix}    (5-4)

The final compound rotation matrix is

    R_1^4 = R_1^2 R_2^3 R_3^4
          = \begin{bmatrix} R_{11} & R_{12} & R_{13} \\ R_{21} & R_{22} & R_{23} \\ R_{31} & R_{32} & R_{33} \end{bmatrix}    (5-4)

where each element R_{ij} is a sum of products of the sines and cosines of \psi, \phi, and \theta (the notation C_i and S_i denotes the cosine and sine of the angle i, respectively).

The transformation matrix which converts a point measured in the body-fixed frame to the same point measured in the inertial frame is shown in Equation 5-5:

    {}^{Inertial}_{Body} T = \begin{bmatrix} {}^{Inertial}_{Body} R & {}^{Inertial} P_{Bodyo} \\ 0_{1\times3} & 1 \end{bmatrix}    (5-5)

where {}^{Inertial} P_{Bodyo} represents the position of the body-fixed frame origin measured in the inertial frame.

The lateral and longitudinal motion of the aircraft is primarily controlled by the lateral and longitudinal cyclic control inputs. For a flapping rotor head, the motions of the main rotor blades form a disk whose orientation with respect to the airframe is controlled by these inputs. The orientation of the main rotor disk is illustrated in Figure 5-3. In this analysis, a represents the lateral rotation of the main rotor blade disk and b represents the longitudinal rotation of the main rotor blade disk. In a report by Heffley and Mnich [17], the motion of the main rotor disc is approximated by a first-order system as shown below:

Figure 5-3. Main rotor blade angle

    \begin{bmatrix} \dot{a} \\ \dot{b} \end{bmatrix}
      = \begin{bmatrix} -\lambda_{lat} & 0 \\ 0 & -\lambda_{lon} \end{bmatrix}
        \begin{bmatrix} a \\ b \end{bmatrix}
      + \begin{bmatrix} a_{max}\lambda_{lat} & 0 \\ 0 & b_{max}\lambda_{lon} \end{bmatrix}
        \begin{bmatrix} \delta_{lat} \\ \delta_{lon} \end{bmatrix}                  (5-6)

where \lambda_{lat} is the lateral cyclic damping coefficient and \lambda_{lon} is the longitudinal cyclic damping coefficient.

The angular velocity as measured in the body-fixed frame B can be translated into angular velocity in the inertial frame by using the compound rotation of Equation 5-4:

    \omega_G = R_1^4\, \omega_B                                                     (5-7)

Figure 5-4. Main rotor thrust vector
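The first-order rotor disc approximation of Equation 5-6 can be stepped forward in time with a simple explicit integrator, as sketched below. The damping coefficients, deflection limits, and time step are illustrative values, not identified parameters of the aircraft used in this work.

    import numpy as np

    def simulate_rotor_disc(delta_lat, delta_lon, lam_lat=10.0, lam_lon=10.0,
                            a_max=0.1, b_max=0.1, dt=0.01):
        """Integrate the first-order rotor disc model of Equation 5-6.

        delta_lat, delta_lon : arrays of lateral/longitudinal cyclic inputs over time
        lam_lat, lam_lon     : cyclic damping coefficients (1/s), assumed values
        a_max, b_max         : maximum disc deflections (rad), assumed values
        Returns arrays of the lateral (a) and longitudinal (b) disc angles.
        """
        n = len(delta_lat)
        a = np.zeros(n)
        b = np.zeros(n)
        for k in range(n - 1):
            # a_dot = -lam_lat*a + lam_lat*a_max*delta_lat (and similarly for b)
            a_dot = lam_lat * (a_max * delta_lat[k] - a[k])
            b_dot = lam_lon * (b_max * delta_lon[k] - b[k])
            a[k + 1] = a[k] + dt * a_dot      # explicit Euler step
            b[k + 1] = b[k] + dt * b_dot
        return a, b

    # Example: step input on the lateral cyclic
    t = np.arange(0, 2, 0.01)
    a, b = simulate_rotor_disc(np.ones_like(t) * 0.5, np.zeros_like(t))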

The main rotor induces a moment and a linear force on the body of the aircraft. These induce lateral and longitudinal motion, and roll and pitch rotations, of the aircraft. The main rotor thrust vector T_{MR} is illustrated in Figure 5-4. The main rotor thrust vector as measured in the body-fixed frame is

    T_{MR} = T_{MR} \begin{bmatrix} \sin(b) \\ \sin(a) \\ -\sqrt{1 - \sin^2(b) - \sin^2(a)} \end{bmatrix}    (5-7)

The equations of motion of the aircraft were derived in the inertial frame using

    {}^G_B R \sum F_B = m\, a_G, \qquad
    {}^G_B R \sum M_B = \left( {}^G_B R\, I_B\, {}^G_B R^T \right) \dot{\omega}_G    (5-8)

This derivation has resulted in a simplified helicopter dynamic model. This model provides a foundation for simulation of the aircraft in the absence of an experimental platform. The derivation was performed to provide the reader with basic helicopter dynamic principles and an introduction to helicopter control mechanics. Now that a background on helicopter mechanics and dynamics has been presented, the next chapter discusses the use of onboard sensors and signal processing for aircraft state estimation.

CHAPTER 6
STATE ESTIMATION USING ONBOARD SENSORS

This research proposes to derive and demonstrate the estimation of UGV states using a UAV. In order to estimate the UGV states, estimates of the UAV states are required. In this research, sensor measurements from the UAV are used to perform the state estimation of both the UAV and the UGV. This research is primarily concerned with developing a remote sensing system; where it stands out is in how UAV dynamics and state measurements are used to passively determine the states of the UGV.

Attitude Estimation Using Accelerometer Measurements

As discussed earlier, a two- or three-axis accelerometer can be used for determining the attitude of an aircraft. Simple equations for determining the roll and pitch angles of an aircraft from the acceleration measurements in the x and y body-fixed axes are shown in Equations 6-1 and 6-2:

    roll = \sin^{-1}\left( \frac{a_y}{g} \right)                                    (6-1)

    pitch = \sin^{-1}\left( \frac{a_x}{g} \right)                                   (6-2)

where a_x and a_y are the measured accelerations in the body-fixed x and y axes.
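A direct implementation of Equations 6-1 and 6-2 is shown below; the clipping of the ratio to [-1, 1] is a small practical addition so that vibration spikes do not produce an invalid arcsine argument.

    import numpy as np

    G = 9.81  # gravitational acceleration, m/s^2

    def roll_pitch_from_accel(a_x, a_y):
        """Estimate roll and pitch (radians) from body-fixed accelerations, Equations 6-1 and 6-2."""
        roll = np.arcsin(np.clip(a_y / G, -1.0, 1.0))
        pitch = np.arcsin(np.clip(a_x / G, -1.0, 1.0))
        return roll, pitch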

A major problem with using accelerometers for attitude estimation is the effect of the high-frequency vibration inherent to rotary-wing aircraft. There are several characteristic frequencies in the rotorcraft system to consider when analyzing the accelerometer signals. The main characteristic frequencies are the rotational speeds of the main rotor blades, the tail rotor blades, and the engine/motor. The highest-frequency vibration comes from the engine/motor. The main gear of the power transmission reduces the frequency at the main rotor head by a factor of about 9.8 for the Gas Xcell helicopter, and the frequency is further changed by the tail rotor transmission at the tail rotor blades. Any imbalance in the motor or motor fan, transmission gears, or rotor heads/blades can cause significant vibration, as can any bent or misaligned shaft. Due to the speed and number of moving parts in a helicopter, these aircraft have significant vibration at the engine, main rotor, and tail rotor frequencies and their harmonics.

Extreme care must be taken to ensure balance and proper alignment of all elements of the drive train. Time spent balancing and inspecting components pays off in the long run in system performance. The airframe and payload structure must also be carefully considered: because of the energy content at specific frequencies, any structural element with a natural frequency at or near the engine or main/tail rotor frequencies or their harmonics could produce disastrous effects. Rigid mounting of the payload is highly discouraged, as there would be no element other than the aircraft structure to dissipate the cyclic loading.

Prospective researchers are forewarned that small and large unmanned aircraft systems should be treated like any other piece of heavy machinery. In this case the payload was rigidly attached to the base of the aircraft frame. Upon spool-up of the engine, the head speed transitioned into the natural frequency of the airframe, with the most flexible components of the system being the side frames of the aircraft. In less than a second the aircraft entered a resonant vibration mode, which resulted in a tail-boom strike by the main blades. The main shaft shattered, projecting the upper main bearing block, which struck the pilot over thirty feet away. Airframe resonance is particularly dangerous in all rotary-wing aircraft, from small unmanned systems to large heavy-lift commercial and military helicopters.

A Fast Fourier Transform (FFT) of the accelerometer measurements shows very distinct spikes on all axes at specific frequencies, as shown in Figure 6-1.

Figure 6-1. Fast Fourier Transform of raw accelerometer data

Figure 6-2. Fast Fourier Transform of raw accelerometer data after low-pass filtering

Strategic filtering at the major vibration frequencies can improve the attitude estimates while still allowing the aircraft dynamics to be measured. By attenuating only specific frequency bands, the noise can be reduced while still producing a fast signal response.

A discrete low-pass Butterworth IIR filter was used, with a 5 Hz pass band and a 10 Hz stop band, to remove the high-frequency noise evident between 15 and 25 Hz. The FFT of the accelerometer data after the low-pass filter is shown in Figure 6-2; the high-frequency content beyond 5 Hz is attenuated, thereby eliminating the major effects caused by the power train and high-frequency electrical interference. Before the low-pass filter was applied, the roll and pitch measurements were almost unusable, as shown in Figure 6-3.

Figure 6-3. Roll and pitch measurement prior to applying the low-pass filter
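A filter matching that description can be designed with standard signal-processing tools, as sketched below. The sample rate, the passband ripple and stopband attenuation values, and the use of SciPy's buttord/butter design route are assumptions of the example, not a description of the original implementation.

    import numpy as np
    from scipy import signal

    fs = 100.0                      # assumed accelerometer sample rate, Hz
    # Lowest-order Butterworth filter with a 5 Hz pass band and 10 Hz stop band
    order, wn = signal.buttord(wp=5.0, ws=10.0, gpass=1.0, gstop=40.0, fs=fs)
    b, a = signal.butter(order, wn, btype='low', fs=fs)

    def lowpass(raw):
        """Apply the discrete low-pass filter to a raw accelerometer channel."""
        return signal.lfilter(b, a, raw)

    # Example: filter a noisy synthetic signal with a 20 Hz vibration component
    t = np.arange(0, 5, 1 / fs)
    raw = 0.2 * np.sin(2 * np.pi * 0.5 * t) + 0.5 * np.sin(2 * np.pi * 20.0 * t)
    clean = lowpass(raw)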

After the low-pass filter was applied, the measurements produced much more usable results, as shown in Figure 6-4.

Figure 6-4. Roll and pitch measurement after applying the low-pass filter

These results indicate the importance of proper vehicle maintenance and assembly. More rigorous balancing and tuning of the vehicle can produce much better system performance and reduce the work required to compensate for vibration in the sensor data.

Heading Estimation Using Magnetometer Measurements

The heading of unmanned ground and air vehicles is commonly estimated by measuring the local magnetic field of the Earth. The magnetic north or compass bearing has been used for hundreds of years for navigation and mapping. By measuring the local magnetic field, an

estimate of the northern magnetic field vector can be obtained. The error between true north, as measured relative to latitude and longitude, and magnetic north varies depending on the location on the globe. These variations are known for a given location and can be compensated. Alternative methods for determining the heading of unmanned systems exist, including the use of highly accurate rate gyros: by precisely measuring the angular rate of a static vehicle, the angular rate induced by the rotation of the Earth can be used to estimate heading. This requires extremely high-precision rate gyros, which are currently too expensive, large, and sensitive for small unmanned systems.

Normally all three axes of the magnetometer would be used for heading estimation, but because the aircraft does not perform any radical roll or pitch maneuvers, only the lateral and longitudinal magnetometer measurements are required, as shown in Figure 6-5.

Figure 6-5. Magnetic heading estimate
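Using only the lateral and longitudinal magnetometer channels, the heading can be computed with a two-argument arctangent, as sketched below. The sign convention and the optional declination correction are assumptions that depend on the sensor mounting and the operating location.

    import numpy as np

    def magnetic_heading(m_x, m_y, declination_rad=0.0):
        """Estimate heading (radians) from longitudinal (m_x) and lateral (m_y) magnetometer readings.

        Assumes the aircraft is near level, so no tilt compensation is applied.
        declination_rad corrects from magnetic north to true north if known.
        """
        heading = np.arctan2(-m_y, m_x) + declination_rad
        return np.mod(heading, 2 * np.pi)   # wrap to [0, 2*pi)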

UGV State Estimation

The geo-positioning equations derived in the previous chapters are restated below:

    \begin{bmatrix} x_G \\ y_G \end{bmatrix} = Z_C\, A\, b                          (6-3)

where

    A = \begin{bmatrix} R_{11} & R_{12} & R_{13} & {}^G P_{Cox} \\
                        R_{21} & R_{22} & R_{23} & {}^G P_{Coy} \end{bmatrix}       (6-4)

    b = \begin{bmatrix} u_n & v_n & 1 & 1/Z_C \end{bmatrix}^T                       (6-5)

    Z_C = \frac{-{}^G P_{Coz}}{R_{31} u_n + R_{32} v_n + R_{33}}                    (6-6)

By identifying two unique points fixed to the UGV, the direction vector can be defined:

    \begin{bmatrix} \cos\theta \\ \sin\theta \end{bmatrix} = h
      = \frac{ Z_C A \begin{bmatrix} u_{n2} - u_{n1} & v_{n2} - v_{n1} & 0 & 0 \end{bmatrix}^T }
             { \sqrt{ (x_{G2} - x_{G1})^2 + (y_{G2} - y_{G1})^2 } }
      = \frac{1}{\sqrt{(x_{G2}-x_{G1})^2+(y_{G2}-y_{G1})^2}}
        \begin{bmatrix} x_{G2}-x_{G1} \\ y_{G2}-y_{G1} \end{bmatrix}                (6-7)

The heading of the vehicle can then be found using

    \theta = \mathrm{atan2}\big(\sin\theta,\; \cos\theta\big)                       (6-8)

The kinematic motion of the vehicle can be described by the linear and angular velocity terms. In the 2D case, the UGV is constrained to move in the x-y plane with only a z component in the angular velocity vector. Hence the state equations are:

    \begin{bmatrix} \dot{x} \\ \dot{y} \\ \dot{\theta} \end{bmatrix}
      = \begin{bmatrix} \cos\theta & 0 \\ \sin\theta & 0 \\ 0 & 1 \end{bmatrix}
        \begin{bmatrix} v \\ \omega \end{bmatrix}                                   (6-9)
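The sketch below strings together the earlier geo-positioning helper with Equations 6-7 and 6-8: two image points fixed to the UGV are mapped to the ground plane and the heading is taken from the direction between them. It reuses geo_position_from_normalized_pixel from the earlier sketch, and the choice of marker points is illustrative.

    import numpy as np

    def ugv_heading_from_two_points(pix1, pix2, R_gc, P_co):
        """Estimate the UGV heading angle (Equations 6-7 and 6-8).

        pix1, pix2 : normalized pixel coordinates (u_n, v_n) of two points fixed
                     to the UGV, e.g. tail and nose markers (illustrative choice)
        R_gc, P_co : camera-to-global rotation and camera position, as before
        Returns the heading angle in radians in the global x-y plane.
        """
        x1, y1 = geo_position_from_normalized_pixel(pix1[0], pix1[1], R_gc, P_co)
        x2, y2 = geo_position_from_normalized_pixel(pix2[0], pix2[1], R_gc, P_co)
        d = np.array([x2 - x1, y2 - y1])
        d = d / np.linalg.norm(d)                  # unit direction vector, Equation 6-7
        return np.arctan2(d[1], d[0])              # Equation 6-8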

In [27] the researchers define the kinematic equations for an Ackermann-style UGV. Using those equations, the kinematics are restated in our notation:

    \begin{bmatrix} \dot{x} \\ \dot{y} \end{bmatrix}
      = \begin{bmatrix} \cos\theta & -\dfrac{L}{2}\sin\theta \\[4pt]
                        \sin\theta & \dfrac{L}{2}\cos\theta \end{bmatrix}
        \begin{bmatrix} v \\ \omega \end{bmatrix}                                   (6-10)

Equation 6-10 follows the structure outlined in [28] and is rewritten in the form

    z = Hx + v                                                                      (6-11)

where z is the measurement vector, x is the state vector, and v is the additive measurement error. The measurement error can be isolated and the squared error written as

    v = z - Hx, \qquad v^T v = (z - Hx)^T (z - Hx)                                  (6-12)

The measurement estimate is written in the form

    \hat{z} = H\hat{x}                                                              (6-13)

Hence the sum of the squares of the measurement variations z - \hat{z} is represented by

    J = (z - H\hat{x})^T (z - H\hat{x})                                             (6-14)

The sum of the squares of the measurement variations is minimized with respect to the state estimate:

    \frac{\partial J}{\partial \hat{x}} = \frac{\partial}{\partial \hat{x}} (z - H\hat{x})^T (z - H\hat{x}) = 0
    \;\Rightarrow\; -2 H^T (z - H\hat{x}) = 0
    \;\Rightarrow\; H^T H \hat{x} = H^T z
    \;\Rightarrow\; \hat{x} = \left( H^T H \right)^{-1} H^T z                       (6-15)
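A compact numerical version of Equations 6-11 through 6-15 is given below, estimating the UGV speed and turn rate from measured global velocities. The use of finite-difference velocities and the wheelbase value are assumptions of the example.

    import numpy as np

    def estimate_v_omega(x_dot, y_dot, theta, L=1.0):
        """Least-squares estimate of UGV speed v and turn rate omega (Equations 6-10 to 6-15).

        x_dot, y_dot : measured global velocities of the tracked point
                       (e.g. from differencing successive geo-positions)
        theta        : UGV heading (radians), e.g. from Equation 6-8
        L            : assumed vehicle length parameter from Equation 6-10
        """
        H = np.array([[np.cos(theta), -(L / 2.0) * np.sin(theta)],
                      [np.sin(theta),  (L / 2.0) * np.cos(theta)]])
        z = np.array([x_dot, y_dot])
        # x_hat = (H^T H)^{-1} H^T z, Equation 6-15 (solved here with lstsq for stability)
        x_hat, *_ = np.linalg.lstsq(H, z, rcond=None)
        v, omega = x_hat
        return v, omega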

Carrying out the matrix algebra in Equation 6-15, the state estimate can be expressed as

    \hat{x} = \left( H^T H \right)^{-1} H^T z
            = \begin{bmatrix} 1 & 0 \\ 0 & \dfrac{4}{L^2} \end{bmatrix}
              \begin{bmatrix} \cos\theta & \sin\theta \\ -\dfrac{L}{2}\sin\theta & \dfrac{L}{2}\cos\theta \end{bmatrix} z
            = \begin{bmatrix} \cos\theta & \sin\theta \\ -\dfrac{2}{L}\sin\theta & \dfrac{2}{L}\cos\theta \end{bmatrix} z    (6-16)

Equation 6-16 can be rewritten in the form

    \begin{bmatrix} v \\ \omega \end{bmatrix}
      = \begin{bmatrix} \cos\theta & \sin\theta \\ -\dfrac{2}{L}\sin\theta & \dfrac{2}{L}\cos\theta \end{bmatrix}
        \begin{bmatrix} \dot{x} \\ \dot{y} \end{bmatrix}                            (6-17)

This chapter has discussed the use of onboard sensors for aircraft and ground vehicle state estimation. These techniques are used in the following chapter to determine the sensor noise models for the sensitivity analysis. They are also used for the validation of the geo-positioning algorithm and for comparison with simulation results.

CHAPTER 7
RESULTS

This chapter presents the results derived from the experiments performed using the experimental aircraft and payload systems. It also presents the results of applying this research to solve several engineering problems.

Geo-Positioning Sensitivity Analysis

The sensitivity of the error variance to the error in each parameter is highly coupled with the other parameters in the positioning solution, and the mapping of the error variance sensitivity to the parameters is highly nonlinear. In order to achieve a concise qualitative representation of the effect of each parameter's error on the positioning solution, a localized sensitivity analysis is performed. This entails using common operating parameters and experimentally observed values for the parameter errors. When these are substituted into the error variance partial derivatives, the respective sensitivity of each parameter is observed for common testing conditions.

The error models for the various geo-positioning parameters were obtained using manufacturer specifications and empirically derived noise measurements. The main aircraft used for the empirical noise analysis was the Miniature Aircraft Gas Xcell platform. This aircraft was equipped with the testing payload discussed previously, and sensor measurements were recorded with the aircraft on the ground but with the engine on and the head speed just below takeoff speed.

The sensor used for measuring the global position of the camera was a WAAS-enabled Garmin GPS 16A. This GPS provides a 5 Hz position solution. The manufacturer-specified horizontal and vertical positioning accuracy is less than 3 meters. For the sensitivity analysis, the lateral and longitudinal error distribution was defined using a uniform

radial error distribution bounded by a three-meter range. The error distribution parameters for the horizontal and vertical positioning measurements are stated in Table 7-1.

Table 7-1. Parameter standard deviations for the horizontal and vertical position
    Parameter                          Value
    P_Cox^G, P_Coy^G                   3 m
    P_Coz^G                            1.5 m

The sensor used for measuring the orientation of the camera was a Microstrain 3DM-G orientation sensor. This sensor provides three-axis measurements of linear acceleration, angular rate, and magnetic field. The sensor measurements used for determining the roll and pitch angles of the camera were the lateral and longitudinal linear accelerations; the roll and pitch angles were calculated from these measurements as described previously. The roll and pitch measurements used for defining the error distribution are shown in Figure 7-1.

Figure 7-1. Roll and pitch measurements used for defining the error distribution

The Microstrain 3DM-G contains a three-axis magnetometer for estimating vehicle heading. The measurements used to estimate the heading error distribution for the sensitivity analysis are shown in Figure 7-2.

Figure 7-2. Heading measurements used for defining the error distribution

Using this data set, the standard deviations for the roll, pitch, and yaw were calculated and are shown in Table 7-2.

Table 7-2. Parameter standard deviations for the roll, pitch, and yaw angles
    Parameter        Value
    Roll (phi)       4.4
    Pitch (theta)    6.8
    Yaw (psi)        0.9

The error distributions for the normalized pixel coordinates were calculated using a series of images of a triangular placard taken from various elevations, as shown in Figure 7-3.

Figure 7-3. Image of the triangular placard used for geo-positioning experiments

Figure 7-4. Results of x and y pixel error calculations

It was difficult to quantify the expected error distribution for the normalized pixel coordinates. The error distributions for the x and y components of the normalized pixel coordinates were estimated by comparing the detected vertex points of the placard with the calculated centroid of the volume. The resulting variation is shown in Figure 7-4. The pixel errors were then converted to normalized pixel errors and are shown in Table 7-3.

Table 7-3. Normalized pixel coordinate standard deviations used during sensitivity analysis
    Parameter    Value
    u_n          0.0021
    v_n          0.0070

A summary of the parameters for the sensor error distributions used in the following sensitivity analysis is shown in Table 7-4.

Table 7-4. Parameter standard deviations used during sensitivity analysis
    Parameter    Value
    P^G_COx      3 m
    P^G_COy      3 m
    P^G_COz      1.5 m
    φ            4.4
    θ            6.8
    ψ            0.9
    u_n          0.0021
    v_n          0.0070

The Monte Carlo method was used to evaluate each sensitivity equation. In order to demonstrate the significance of each parameter in the Monte Carlo analysis, each parameter is perturbed by a uniform error distribution based on experimentally derived measurements.

This analysis seeks to show the difference between the positioning errors attributable to each varying parameter. The key element of this analysis is that the error sensitivity for each parameter is calculated including the errors from the other parameters. This allows the nonlinear and coupled relationships between the parameters to propagate through the sensitivity analysis. The results of this analysis determine the rank of each parameter's dominance in causing positioning error. The error sensitivity is used in the subsequent analysis and is restated in Equation 7-1.

    S_{\tilde{e}\tilde{e}} = \tilde{e}^{T}\tilde{e}    (7-1)

The error sensitivity is evaluated using the common parameter values perturbed by a uniform error distribution, with the range of the error distribution defined using experimentally derived data. A uniform distribution was chosen instead of a normal distribution for the Monte Carlo simulation because the normal distribution took too long to converge during testing. The normal distribution also has a larger search space which, combined with the nonlinear coupling between the parameters, made the processing times unmanageable. The uniform distribution provides firm limits on the error distribution and traverses the search space quickly, providing a fast yet fruitful analysis.

In order to quantify the errors in position attributable to each parameter, Equation 7-1 was modified as shown in Equation 7-2.

    S_{\tilde{e}\tilde{e}}^{e_{p}} = S_{\tilde{e}\tilde{e}}(\tilde{p}) - S_{\tilde{e}\tilde{e}}(\tilde{p}^{*})    (7-2)

where
    \tilde{p} : parameter vector with all elements perturbed by their associated uniform error distributions,
    \tilde{p}^{*} : parameter vector with all elements except the parameter of interest perturbed by their associated uniform error distributions,
    e_{p} : parameter error.

This formulation allows the Monte Carlo simulation to calculate the error variance distribution associated with each parameter while using all of the parameter error distributions. This allows not only the coupling between the different parameters to affect the positioning error, but also the various parameter error distributions to affect the results. As with many complex systems, not only does the inherent relationship between the various parameters affect the observations, but so do the measurement errors of the various parameters.

The simulation uses the empirically derived error distributions and the target system configuration. For this analysis, the target system is a hovering unmanned rotorcraft operating at a 10 meter elevation. The parameter values used for this analysis are shown in Equation 7-3.

    \tilde{p} = [\, P^{G}_{CO_x} \;\; P^{G}_{CO_y} \;\; P^{G}_{CO_z} \;\; \phi \;\; \theta \;\; \psi \;\; u_n \;\; v_n \,]^{T}
              = [\, 0.0\,\mathrm{m} \;\; 0.0\,\mathrm{m} \;\; 10\,\mathrm{m} \;\; 0.0 \;\; 0.0 \;\; 0.0 \;\; 0.0 \;\; 0.0 \,]^{T}    (7-3)

The error distribution is defined as a uniform distribution with bounds at the standard deviations defined previously for the geo-positioning algorithm parameters. The histograms of the error variance relative to each parameter are shown in Figure 7-5.

Figure 7-5. Error variance histograms for the respective parameter errors

The bounds of the results with respect to the parameter standard deviations are shown in Table 7-5. For the given parameter error distributions and system configuration, the results show that the order of significance is as follows: P^G_COz, P^G_COx, P^G_COy, φ, θ, v_n, u_n, and ψ. The most significant term, P^G_COz, demonstrates the importance of the altitude data in the geo-positioning calculations.

This simulation has shown the process by which the geo-positioning parameter ranking was calculated using empirically derived sensor noise distributions and a specified system configuration. By simply adapting the sensor noise distributions and system configuration values, this process can be applied to any given system to provide insight into the dominant geo-positioning error sources.

Table 7-5. Comparison of Monte Carlo Method results
    Parameter             Value
    max S_ee^(P^G_COx)    18.00 m²
    max S_ee^(P^G_COy)    18.00 m²
    max S_ee^(P^G_COz)    20.53 m²
    max S_ee^(φ)          1.594 m²
    max S_ee^(θ)          0.3130 m²
    max S_ee^(ψ)          0.0001524 m²
    max S_ee^(u_n)        0.001240 m²
    max S_ee^(v_n)        0.01316 m²

Comparison of Empirical Versus Simulated Geo-Positioning Errors

The experimental results obtained using the Gas Xcell aircraft equipped with a downward-facing camera and the experimental payload discussed earlier were compared with simulation results that used the estimated error distributions from the Monte Carlo analysis. The testing conditions used for the simulation are shown in Equation 7-4. The results show that the geo-positioning errors from simulation closely match the geo-positioning results obtained using the experimental vehicle/payload setup. The geo-positioning results are shown in Figure 7-6.

    \tilde{p} = [\, P^{G}_{CO_x} \;\; P^{G}_{CO_y} \;\; P^{G}_{CO_z} \;\; \phi \;\; \theta \;\; \psi \;\; u_n \;\; v_n \,]^{T}
              = [\, 0.0\,\mathrm{m} \;\; 0.0\,\mathrm{m} \;\; 10\,\mathrm{m} \;\; 0.0 \;\; 0.0 \;\; 0.0 \;\; 0.0 \;\; 0.0 \,]^{T}    (7-4)

Figure 7-6. Experimental and simulation geo-position results

The use of a uniform error distribution for the simulation produces different results compared with a normal distribution. While the simulation results vary slightly from the experimental results, the uniform distribution provides more of an absolute bound for the error distribution.

Applied Work

Unexploded Ordnance (UXO) Detection and Geo-Positioning Using a UAV

This research investigated the automatic detection and geo-positioning of unexploded ordnance using VTOL UAVs. Personnel at the University of Florida, in conjunction with those at the Air Force Research Laboratory at Tyndall Air Force Base, Florida, developed a sensor payload capable of gathering image, attitude, and position information during flight. A software suite was also developed that processes the image data in order to identify unexploded ordnance (UXO). These images are then geo-referenced so that the absolute positions of the UXO can be determined in terms of the ground reference frame. This sensor payload was outfitted on a Yamaha RMAX aircraft, and several experiments were conducted on simulated and live bomb testing ranges. This section discusses the object recognition and classification techniques used to extract the UXO from the images and presents the results from the simulated and live bombing range experiments.

Figure 7-7. BLU97 Submunition

Researchers have used aerial imagery obtained from small unmanned VTOL aircraft for control, remote sensing, and mapping experiments [1,2,3]. In these experiments, it was necessary to detect a particular type of ordnance. The primary UXO of interest was the BLU97. After deployment, this ordnance has a yellow main body with a circular decelerator. The BLU97 is shown in Figure 7-7.

Experimentation VTOL Aircraft

The UXO experiments were conducted using several aircraft in order to demonstrate the modularity of the sensor payload and to determine the capabilities of each aircraft. The first aircraft used for testing was a Miniature Aircraft Gas Xcell RC helicopter. The aircraft was configured for heavy-lift applications and has a payload capacity of 10-15 lbs. The typical flight time for this aircraft is 15 minutes, and it provided a smaller VTOL aircraft for experiments at UF and the Air Force Research Laboratory. The Xcell helicopter is shown in Figure 7-8.

Figure 7-8. Miniature Aircraft Gas Xcell Helicopter

The second aircraft used for testing was a Yamaha RMAX unmanned helicopter. With a payload capacity of 60 lbs and a runtime of 20 minutes, this platform provided a more robust and capable testing platform for range clearance operations. The RMAX is shown in Figure 7-9.

Figure 7-9. Yamaha RMAX Unmanned Helicopter

Sensor Payload

Several sensor payloads were developed for the various UAV experiments. Each payload was constructed modularly so as to enable attachment to various aircraft. The system schematic for the sensor payload is shown in Figure 7-10.

Figure 7-10. Sensor Payload System Schematic (industrial CPU, digital stereovision cameras, LiPo battery, DC/DC converters, wireless Ethernet, compact flash data storage, Novatel RT-2 differential GPS, and digital compass pose sensors)

The detection sensor used for these experiments was a pair of digital cameras operating in the visible spectrum. These cameras provided high-resolution imagery in a low-weight package. These experiments also sought to explore and quantify the effectiveness of this sensor for UXO detection.

Maximum Likelihood UXO Detection Algorithm

A statistical color model was used to differentiate the pixels in the image that compose the UXO. The maximum likelihood (ML) UXO detection algorithm used a priori knowledge of the color distribution of the surface of the BLU97 in order to detect ordnance in an image. The color model was constructed using the RGB color space. The feature vector was defined as

    \tilde{x} = [\, r \;\; g \;\; b \,]^{T}    (7-5)

where r, g, and b are the eight-bit color values for each pixel.

Using the K-means segmentation algorithm [18], an image containing a UXO was segmented and a region was selected. A segmented image is shown in Figure 7-11. This implementation used a 5-D feature vector for each pixel, which allowed for clustering using both spatial and color parameters. The results varied depending on the relative scaling of the feature vector components.

Figure 7-11. Segmentation software, with the segmented ordnance indicated

The distribution of the UXO pixels was assumed to be Gaussian [18], so the maximum likelihood method was used to approximate the UXO color model. The region containing the UXO pixels was selected and the color model was calculated. The mean color vector is calculated as

    \tilde{\mu} = \frac{1}{n} \sum_{i=1}^{n} \tilde{x}_i    (7-6)

where n is the number of pixels in the selected region. The covariance matrix was then calculated as

    \Sigma = \frac{1}{n} \sum_{i=1}^{n} (\tilde{x}_i - \tilde{\mu})(\tilde{x}_i - \tilde{\mu})^{T}    (7-7)

The mean and covariance of the UXO pixels were then used to develop a classification model. This classification model describes the location and distribution of the training data within the RGB color space.

The equation used for the classification metric was

    p(\tilde{x}) = \exp\!\left( -(\tilde{x} - \tilde{\mu})^{T} \Sigma^{-1} (\tilde{x} - \tilde{\mu}) \right)    (7-8)

The classification metric is similar to the likelihood probability except that it lacks the pre-scaling coefficient required by a Gaussian pdf. The pre-scaling coefficient was removed in order to optimize the performance of the classification algorithm, and it allows the classification metric value to range from 0 to 1. The analysis was performed by selecting a threshold on the classification metric in order to classify UXO pixels in the image. This allowed images to be screened for UXO detection and the pixel coordinate location of the UXO to be identified in the image.

Initial experimentation using simulated UXO and the ML UXO detection algorithm provided successful classification of UXO in images obtained from both aircraft. As expected, the performance of the algorithm deteriorated when there were variations in the color of the surface of the UXO or in the contrast between the UXO and the background. In the data, the ML UXO detection algorithm failed when either the actual UXO color distribution fell far from the modeled distribution in RGB space or the background distribution closely encompassed the actual UXO color distribution. In these cases, the variations caused both false positives and false negatives when using the classification algorithm. The use of an expanded training data set and multiple Gaussian distributions for modeling was investigated and found to slightly improve UXO detection rates but greatly increase false positive readings from background pixels. The algorithm performance was also extremely sensitive to the likelihood threshold, thereby introducing another tunable parameter to the algorithm.

Spatial Statistics UXO Detection Algorithm

Previous experimental results showed that when the background of the image closely resembled the UXO color, the ML UXO detection performance degraded. In order to perform more robust UXO detection, an algorithm was developed whose parameters were based solely on the dimensions of the UXO and not on a trained color model.

A more sophisticated pattern recognition approach was used, as shown in Figure 7-12.

Figure 7-12. Pattern Recognition Process (capture image, pre-filtering, segmentation, classification)

The spatial statistics UXO detection algorithm was designed to segment like-colored or like-shaded objects and classify them based on their dimensions. This allows for robust performance in varying lighting, color, and background conditions. The assumptions made for this algorithm were that the UXO is of continuous color or shading, and that the UXO region in the image has the scaled spatial properties of an actual UXO. Based on the measured above-ground level of the aircraft and the projective properties of the imaging device, the algorithm parameters are auto-tuned to accommodate the scaling introduced by the imaging process.

In order to reduce the dimensionality of the data set, the color space was first converted from RGB to HSV. By inspection, it was found that the saturation channel provided the greatest contrast between the background and the UXO. The raw RGB image and the saturation channel image are shown in Figure 7-13.

Figure 7-13. Raw RGB and Saturation Images of UXO

The pre-filtering process consisted of histogram equalization of the saturation image. This improved the contrast between the UXO pixels and the background and improved the segmentation.

The segmentation process was conducted by segmenting the pre-filtered image using the K-means algorithm, as shown in Figure 7-14.

Figure 7-14. Segmented Image

Each region was then analyzed and classified using the scaled spatial statistics of the UXO. Properties such as the major and minor axis lengths of the region were used to classify the regions. Regions whose spatial properties closely matched those of the UXO were classified as UXO and highlighted in the final image, as shown in Figure 7-15.

Figure 7-15. Raw Image with Highlighted UXO

Collaborative UAV/UGV Control

Recently, unmanned aerial vehicles (UAVs) have been used more extensively in military operations. The improved perception abilities of UAVs compared with unmanned ground vehicles (UGVs) make them more attractive for surveying and reconnaissance applications. A combined UAV/UGV multiple-vehicle system can provide aerial imagery, perception, and target tracking along with ground target manipulation and inspection capabilities. This experiment was conducted to demonstrate the application of a UAV/UGV system for simulated mine disposal operations.

The experiment was conducted by surveying the target area with the UAV and creating a map of the area. The aerial map was transmitted to the base station and post-processed to extract the locations of the targets and to develop waypoints for the ground vehicle to navigate. The ground vehicle then proceeded to each of the targets, which simulated the validation and disposal of the ordnance. The results include the aerial map, processed images of the extracted ordnance, and the ground vehicle's ability to navigate to the target points.

The platforms used for the collaborative control experiments are shown in Figure 7-16.

Figure 7-16. TailGator and HeliGator Platforms

Waypoint Surveying

In order to evaluate the performance of the UAV/UGV system, the waypoints were surveyed using a Novatel RT-2 differential GPS. This system provides two-centimeter accuracy or better when provided with a base station correction signal. Accurate surveying of the visited waypoints provided a baseline for comparing the results obtained from the helicopter and the corresponding path the ground vehicle traversed.

The UXOs were simulated to resemble BLU-97 ordnance. Aerial photographs of the ordnance, as shown in Figure 7-17, were collected along with the camera position and orientation. Using the transformation described previously, the global coordinates of the UXOs were calculated, and the calculated UXO positions were compared with the precision survey data.

Figure 7-17. Aerial photograph of all simulated UXO

Local Map

A local map of the operating region was generated using the precision survey data. This local map, shown in Figure 7-18, provided the baseline for all of the position comparisons throughout this task.

Figure 7-18. Local map (UTM easting versus northing) generated with the Novatel differential GPS, showing the differential waypoints and boundaries

The data collected compares the positioning ability of the UGV with the ability of the UAV sensor system to accurately calculate the UXO positions. While both the UGV and UAV use WAAS-enabled GPS, there is some inherent error due to vehicle motion and environmental effects. The UGV's control feedback was based on waypoint-to-waypoint control rather than a path-following control algorithm.

Once the set of waypoints was provided by the UAV, the UGV was programmed to visit every waypoint, simulating the automated recovery and disposal process for the UXOs. The recovery/disposal process was optimized by ordering the waypoints in a manner that minimized the total distance traveled by the UGV. This problem is similar to the traveling salesman optimization problem, in which a set of cities must each be visited once while minimizing the total distance traveled. An A* search algorithm was implemented in order to solve this problem.

The A* search algorithm operates by creating a decision graph and traversing the graph from node to node until the goal is reached. For the problem of waypoint order optimization, the current path distance g, the estimated distance to the final waypoint h, and the estimated total distance f were evaluated for each node as

    g = length of the straight-line segments connecting all predecessor waypoints
    h = (minimum distance between any two waypoints (successors and current waypoint)) × (number of successors)
    f = g + h    (7-10)

The requirement of the A* algorithm that the heuristic h be admissible is fulfilled because there exists no path from the current node n to a goal node with a distance less than h. Therefore the heuristic provides the minimum bound required by the A* algorithm and guarantees optimality should a path exist.

The UGV was commanded to come within a specified threshold of a waypoint before switching to the next waypoint, as shown in Figure 7-19. The UGV consistently traveled within three meters or less of each of the desired waypoints, which is within the error envelope of typical WAAS GPS accuracy.

Figure 7-19. A comparison of the UGV's path (UTM easting versus northing) to the differential waypoints and boundaries

The UAV calculates the waypoints based on its sensors, and these points are compared with the surveyed waypooints. There is an offset in the UAV's data due to the GPS being used and due to error in the transformation from image coordinates to global coordinates, as shown in Figure 7-20. The UGV is able to navigate within several meters of the waypoints; however, it is limited by the vehicle kinematics. Further work involves a waypoint sorting algorithm that accounts for the turning radius of the vehicle.

Figure 7-20. UAV waypoints versus the UGV path (UTM easting versus northing)

Citrus Yield Estimation

Within the USA, Florida is the dominant state for citrus production, producing over two-thirds of the USA's tonnage, even in the hurricane-damaged 2004-2005 crop year. The citrus crops of most importance to Florida are oranges and grapefruit, with tangerines and other citrus being of less importance.

With contemporary globalization, citrus production and marketing are highly internationalized, especially for frozen juice concentrates, so there is great competition between countries. Tables 7-6 and 7-7 show the five most important countries for the production of oranges and grapefruit in two crop years. Production can vary significantly from year to year due to weather, especially hurricanes. Note the dominance of Brazil in oranges and the rise of China in both crops.

Table 7-6. Production of Oranges (1000's metric tons) (based on NASS, 2006)
    Country            2000-2001 Crop Year    2004-2005 Crop Year
    Brazil             14,729                 16,606
    USA                11,139                 8,293
    China              2,635                  4,200
    Mexico             3,885                  4,120
    Spain              2,688                  2,700
    Other Countries    9,512                  9,515
    World Total        44,588                 45,434

Table 7-7. Production of Grapefruit (1000's metric tons) (based on NASS, 2006)
    Country            2000-2001 Crop Year    2004-2005 Crop Year
    China              0                      1,724
    USA                2,233                  914
    Mexico             320                    310
    South Africa       288                    270
    Israel             286                    247
    Other Countries    680*                   330
    World Total        3,807                  3,795
    *Cuba produced a very significant 310 (1000 metric tons) in 2000-2001.

The costs of labor, land, and environmental compliance are generally lower in most of these countries than in the USA. Labor is the largest cost for citrus production in the USA, even though many workers, especially harvesters, are migrants. In order for producers in the USA to be competitive, they must have advantages in productivity, efficiency, or quality to counteract the higher costs. This need for productivity, efficiency, and quality translates into a need for better management.

One management advantage that USA producers can use to remain competitive is to utilize advanced technologies. Precision agriculture is one such set of technologies, which can be used to improve profitability and sustainability. Precision agriculture technologies were researched and applied to citrus later than to some other crops, but there has been successful precision agriculture research [19,20,21], and there has been some commercial adoption [22].

Yield maps have been a very important part of precision agriculture for over twenty years [23]. They allow management to make appropriate decisions to maximize crop value (production quantity and quality) while minimizing costs and environmental impacts [24]. However, citrus yield maps, like most yield maps, can currently only be generated after the fruit is harvested, because the production data is obtained during the harvesting process. It would be advantageous if the yield map were available before harvest, because this would allow better management, including better harvest scheduling and crop marketing.

There has been a history of using machine vision to locate fruit on trees for robotic harvesting [25]. More recent work at the University of Florida has attempted to use machine vision techniques to do on-tree yield mapping. Machine vision has been used to count the number of fruit on trees [26]. Other researchers not only counted the fruit, but used machine vision and ultrasonic sensors to determine fruit size [27]. This research has been extended to allow counting earlier in the season when the fruit is still quite green [28].

However, these methods all require vehicles to travel down the alleys between the rows of trees to take the machine vision images. Researchers have demonstrated that a small remotely piloted mini-helicopter with machine vision hardware and software can be built and operated in citrus groves [29]. They also discuss some of the recent research on using mini-helicopters in agriculture, primarily conducted at Hokkaido University and the University of Illinois.

The objective of this research was to determine whether images taken from a mini-helicopter have the potential to be used to generate yield maps. If so, there might be a possibility of rapidly and flexibly producing citrus yield maps before harvest.

Materials and Methods

The orange trees used to test this concept were located at Water Conserv II, jointly owned by the City of Orlando and Orange County. The facility, located about 20 miles west of Orlando, is the largest water reclamation project of its type in the world (over 100 million liters per day), one that combines agricultural irrigation and rapid infiltration basins (RIBs).

A block of Hamlin orange trees, an early-maturing variety (as opposed to the later-maturing Valencia variety), was chosen for study.

The spatial variability of citrus tree health and production can range from very small to extremely great depending upon local conditions. This block had some natural variability, probably due to its variable blight infestation and topography. Additional variability was introduced by the trees being subjected to irrigation depletion experiments. However, mainly due to substantial natural rainfall in the 2005-2006 growing season, the variation in the yield is within the bounds of what might be expected in contemporary commercial orange production, even with the depletion experiments.

The irrigation depletion treatment (the percentage of normal irrigation water NOT applied) was indicated by the treatment number. Irrigation depletion amounts were sometimes different for the Spring and the Fall/Winter parts of the growing season, as seen in Table 7-8 below. The replication was indicated by a letter suffix. Only 15 of the 42 trees (six treatments with seven replications each) were used for this mini-helicopter imaging effort. Treatment 6 had no irrigation except periodic fertigation, and the trees lived on rainfall alone.

Table 7-8. Irrigation Treatments
    Treatment    Spring Depletion    Fall/Winter Depletion
    1            25                  25
    2            25                  50
    3            25                  75
    4            50                  50
    5            50                  75
    6            100                 100

The mini-helicopter used for this work was a Gas Xcell model modified for increased payload by its manufacturer [30]. It was purchased in 2004 for about US$2000 and can fly up to 32 kph and carry a 6.8 kg payload. Its rotor is rated to 1800 rpm and has a diameter of less than 1.6 m. The instrumentation platform is described in MacArthur et al. (2005) and includes GPS with WAAS, two compact flash drives, a digital compass, and wireless Ethernet. The machine vision system uses a Videre model STH-MDCS-VAR-C stereovision sensor.

The mini-helicopter was flown at the Water Conserv II site on 10 January 2006, a mostly sunny day, shortly before noon. The helicopter generally hovered over each tree for a short period of time as it moved down the row taking images with the Videre camera. The images were stored on the helicopter and some were simultaneously transferred to a laptop computer over the wireless Ethernet. In addition, a Canon PowerShot S2 IS five-megapixel digital camera was used to take photos of the trees (in north-south rows) from the east and west sides.

The fruit on the individual trees were hand harvested by professional pickers on 13 February 2006. The fruit from each tree was weighed and converted to the industry-standard measurement unit of field boxes. A field box is defined as 40.8 kg (90 lbs).

The images were later processed manually. A best image of each tree was selected, generally on the basis of lighting and complete coverage of the tree. Each overhead image was cropped into a square that enclosed the entire tree and scaled to 960 by 960 pixels. The pixel data from several oranges were collected from several representative images in the data set. The data were assumed to be normally distributed, so the probability function was calculated for each orange pixel data set. Using a mixture of Gaussians to represent the orange class model, the images were analyzed and a threshold was established based on the color model. The number of "orange" pixels was then calculated in each image and used in the further analysis.

Results

The results of the image processing and the individual tree harvesting of the 15 trees studied in this work are presented in Table 7-9. As Figure 7-21 illustrates, only irrigation depletion treatment 6 had a great effect on the individual tree yields. Treatment 6 was 100% depletion, or no irrigation. The natural rainfall was such in this production year that the other treatments produced yields of at least four boxes per tree.

Table 7-9. Results from Image Processing and Individual Tree Harvesting
    Treatment    Replication    Orange Pixels    Boxes of Fruit
    1            B              13990            7
    1            G              6391             6
    2            B              11065            8
    2            C              2202             4
    2            E              5884             5
    2            F              17522            7.5
    3            B              2778             6
    4            A              4433             6.2
    4            B              5516             4.8
    4            E              5002             4
    4            F              11559            4.3
    5            B              9069             7
    5            C              17088            6.8
    6            B              5376             2.5
    6            D              6296             1

Figure 7-21. Individual Tree Yields (boxes of fruit per tree) as Affected by Irrigation Depletion Treatment

The images were processed as discussed above. The number of orange pixels varied from 2202 to 17,522. More pixels should indicate more fruit. However, as Figure 7-22 shows, there was substantial scatter in the data (linear fit y = 0.0002x + 3.5919, R² = 0.2835). The fit can be improved somewhat by the removal of the non-irrigated treatment 6, as shown in Figure 7-23 (y = 0.0002x + 4.5087, R² = 0.373).

Figure 7-22. Individual Tree Yield (boxes) as a Function of Orange Pixels in the Image

Figure 7-23. Individual Tree Yield (boxes) as a Function of Orange Pixels with the Non-irrigated Trees Removed

Discussion

This work showed that good overhead images of citrus trees could be taken by a mini-helicopter and processed to have some correlation with the individual tree yield. A tree with fewer oranges should have fewer orange-colored pixels in its image. For example, tree 2C had only 2202 pixels and 4 boxes of fruit, while tree 2F had 17,522 pixels and 7.5 boxes of fruit.

These trees are shown in Figures 7-24 and 7-25 below, in which the oranges in the "after" photo are enhanced to indicate their detection by the image processing algorithm.

Figure 7-24. Image of Tree 2C Before and After Image Processing

Figure 7-25. Image of Tree 2F Before and After Image Processing

The image processing used in this initial research was very simple. More sophisticated techniques would likely improve the ability to separate oranges from the other elements in the images. The strong sunlight likely contributed to some of the errors. Again, the use of more sophisticated techniques from other previous research, especially the techniques developed for yield mapping of citrus from the ground, would likely improve the performance of overhead yield mapping.

A major assumption in this work is that the number of visible orange pixels is proportional to the tree yield. However, the tree canopy (leaves, branches, other fruit, etc.) does hide some of the fruit, and differing percentages of the fruit may be visible on different trees. This is quite apparent with the treatment 6 trees. Figure 7-26 shows the images for tree 6D. This tree, obviously greatly affected by the lack of irrigation and a blight disease, has 6296 orange pixels but yielded only one box of fruit. The poor health of the tree meant that there were not many leaves to hide the interior oranges; hence, a falsely high estimate of the yield was given.

Figure 7-27 shows the images taken from the ground of Trees 6D and 2E. Even though they had similar numbers of orange pixels in the images taken from the helicopter, Tree 2E had five times the number of fruit. The more vigorous vegetation, especially the leaves, meant that the visible oranges on Tree 2E represented a smaller percentage of the total tree yield.

Figure 7-26. Image of Tree 6D Before and After Image Processing

Figure 7-27. Ground Images of Tree 6D and Tree 2E

Mini-helicopters are smaller and less expensive than piloted aircraft. Accordingly, the financial investment in them may be justifiable for growers and small industry firms. The mini-helicopters would give their owners the flexibility of being able to take images on their own schedule. The mini-helicopters also do not cause a big disturbance in the fruit grove; the noise and wind are moderate. They can operate in a rather inconspicuous manner, as shown in Figure 7-28.

Figure 7-28. Mini-Helicopter Operating at Water Conserv II

While the yield mapping results of this work may appear a little disappointing at first glance, they do indicate that there is potential. Getting accurate yield estimates by mini-helicopter image processing will be a somewhat complex image processing task, but this is a start. The images acquired by the helicopter are not that different, other than the viewing direction, from those acquired from the ground. Hence, the techniques developed for ground-based yield mapping might be applicable.

CHAPTER 8
CONCLUSIONS

This work has presented the theory and equipment used for tracking and state estimation of an unmanned ground vehicle system using an unmanned aerial vehicle system. This research is unique in that it presents a comprehensive system description and analysis from the sensor and hardware level to the system dynamics. This work also couples the dynamics and kinematics of two agents to form a robust state estimate using completely passive sensor technology. A sensitivity analysis of the geo-positioning algorithm was performed, which identified the significance of the parameters used in the algorithm.

The simulation results showed that the elevation error was the most dominant parameter in the geo-positioning algorithm. Assuming that the error distributions do not change dramatically across varying system configurations, it seems intuitively obvious that errors in the system position will dominate at low altitudes due to the close mapping of errors in system position to errors in target position. This was shown in the results by the dominance of the three position parameters relative to all other parameters. It was hypothesized that as the elevation of the aircraft increases, the dominance of the horizontal position errors would diminish and the orientation and pixel errors would begin to dominate. While the errors attributed to the horizontal position parameters would remain relatively constant, the errors attributed to the orientation and pixel parameters would increase due to the projective nature of the geo-positioning algorithm. This hypothesis was tested by performing the sensitivity analysis again at varying elevations.

The sensitivity analysis was performed from 10 meters to 100 meters to show the dominance trend of the various parameters. The results are shown in Figure 8-1.

Figure 8-1. Simulated error calculation versus elevation (maximum variance, m², on a logarithmic scale, versus elevation, m, for Px, Py, Pz, φ, θ, ψ, u_n, and v_n)

These results show how the geo-positioning error attributable to the horizontal position error stays constant as the elevation increases, while all of the orientation and pixel parameters increase in dominance as the elevation increases. These results validate the hypothesis that the horizontal position parameters dominate at low altitude and the orientation and pixel parameters dominate at higher altitudes. This finding is significant in that it shows the usefulness of this analysis in predicting which parameters are most dominant in a given system. Moreover, this analysis can be used to guide the prospective researcher toward the sensor specifications that would most benefit their application given the anticipated system operating conditions. For example, a low-altitude UAV application should employ a high-accuracy horizontal and vertical positioning system. Conversely, a high-altitude reconnaissance UAV should employ a high-accuracy IMU and camera system.

Depending on the application, the emphasis should shift toward the sensors that would most benefit the system performance. This analysis can also provide the anticipated system performance for a given system configuration. For the researcher, this provides a valuable tool for assisting system-level design.

These results were also validated experimentally. The experimental aircraft system was flown at various altitudes. Using the data obtained previously for the geo-position error analysis, the target position error was evaluated relative to the aircraft elevation. These results show that the error distribution increased with increasing altitude, as shown in Figure 8-2.

Figure 8-2. Geo-Position error versus elevation

This work can be extended to improve the tracking and state estimation techniques, to further reduce system errors, and to perform a sensitivity analysis given any system configuration and parameter error statistics. Future work can also include autonomous control of the aircraft by way of UGV tracking to form collaborative heterogeneous control strategies.

LIST OF REFERENCES

1. Drake, P. 1991. Fire mapping using airborne global positioning. Engineering Field Notes 23:17-24. USDA Forest Service Engineering Staff.

2. Feron, E., and J. Paduano. 2004. Vision technology for precision landing of agricultural autonomous rotorcraft. Proceedings, Automation Technology for Off-Road Equipment, Kyoto, Japan, October 7-8, pp. 64-73.

3. Iwahori, T., R. Sugiura, K. Ishi, and N. Noguchi. 2004. Remote sensing technology using an unmanned helicopter with a control pan-head. Proceedings, Automation Technology for Off-Road Equipment, Kyoto, Japan, October 7-8, pp. 220-225.

4. Dana, P. H., Global Positioning System Overview, 5/1/2001, http://www.colorado.edu/geography/gcraft/notes/gps/gps.html, 1/14/2003.

5. Federal Aviation Administration Wide Area Augmentation System, http://gps.faa.gov/Programs/WAAS/waas.htm, 4/21/2003.

6. Schrage, D., Yillikci, Y., Liu, S., Prasad, J., Hanagud, S., Instrumentation of the Yamaha R-50/RMAX Helicopter Testbeds for Airloads Identification and Follow-on Research. 25th European Rotorcraft Forum, 1999.

7. Mettler, B., Tischler, M.B., Kanade, T., "System Identification of Small-Size Unmanned Helicopter Dynamics." American Helicopter Society 55th Forum, May 1999.

8. Mettler, B., Tischler, M., Kanade, T., System Identification of a Model-Scale Helicopter. Technical Report CMU-RI-TR-0003, Robotics Institute, Carnegie Mellon University, January 2000.

9. Mettler, B., Dever, C., Feron, E., Identification Modeling, Flying Qualities, and Dynamic Scaling of Miniature Rotorcraft. NATO SCI-120 Symposium on Challenges in Dynamics, System Identification, Control and Handling Qualities for Land, Sea, Berlin, Germany, May 2002.

10. Johnson, E., DeBitetto, P., Modeling and Simulation for Small Autonomous Helicopter Development. AIAA Modeling & Simulation Technologies Conference, 1997.

11. Civita, M., Papageorgiou, G., Messner, W., Kanade, T., Design and Flight Testing of a High-Bandwidth H-infinity Loop Shaping Controller for a Robotic Helicopter. Journal of Guidance, Control, and Dynamics, Vol. 29, No. 2, March-April 2006, pp. 485-494.

12. Civita, M., Papageorgiou, G., Messner, W., Kanade, T., Design and Flight Testing of a Gain-Scheduled H-infinity Loop Shaping Controller for Wide-Envelope Flight of a Robotic Helicopter. Proceedings of the 2003 American Control Conference, pp. 4195-4200, Denver, CO, 4-6 June 2003.

13. Kron, A., Lafontaine, J., Alazard, D., Robust 2-DOF H-infinity Controller for Highly Flexible Aircraft: Design Methodology and Numerical Results. Canadian Aeronautics and Space Journal, Vol. 49, No. 1, pp. 19-29, 2003.

14. Mettler, B., Tischler, M.B., and Kanade, T., "Attitude Control Optimization for a Small-Scale Unmanned Helicopter," AIAA Guidance, Navigation and Control Conference, 2000.

15. Bouguet, J., Camera Calibration Toolbox for MATLAB, http://www.vision.caltech.edu/bouguetj/calib_doc/.

16. Koo, T., Ma, Y., Sastry, S., Nonlinear Control of a Helicopter Based Unmanned Aerial Vehicle Model. IEEE Transactions on Control Systems Technology, January 2001.

17. Heffley, R., Mnich, M., Minimum Complexity Helicopter Simulation Math Model Program. Manudyne Report 83-2-3, October 1986.

18. Duda, R., Hart, P., Stork, D., Pattern Classification, John Wiley and Sons, Inc., 2001.

19. Schueller, J.K., J.D. Whitney, T.A. Wheaton, W.M. Miller, and A.E. Turner. 1999. Low-cost automatic yield mapping in hand-harvested citrus. Computers and Electronics in Agriculture 23:145-153.

20. Whitney, J.D., W.M. Miller, T.A. Wheaton, M. Salyani, and J.K. Schueller. 1999. Precision farming applications in Florida citrus. Applied Engineering in Agriculture 15:399-403.

21. Cugati, S.A., W.M. Miller, J.K. Schueller, and A.W. Schumann. 2006. Dynamic characteristics of two commercial hydraulic flow-control valves for a variable-rate granular fertilizer spreader. ASABE Paper No. 061071.

22. Sevier, B.J., and W.S. Lee. 2005. Precision farming adoption in Florida citrus: A grower case study. ASAE Paper No. 051054.

23. Schueller, J.K., and Y.H. Bae. 1987. Spatially attributed automatic combine data acquisition. Computers and Electronics in Agriculture 2:119-127.

24. Schueller, J.K. 1992. A review and integrating analysis of spatially-variable control of crop production. Fertilizer Research 33:1-34.

25. Slaughter, D.C., and R.C. Harrell. 1987. Color vision in robotic fruit harvesting. Transactions of the ASAE 30(4):1144-1148.

26. Annamalai, P., W.S. Lee, and T.F. Burks. 2004. Color vision system for estimating citrus yield in real-time. ASAE Paper No. 043054.

27. Regunathan, M., and W.S. Lee. 2005. Citrus yield mapping and size determination using machine vision and ultrasonic sensors. ASAE Paper No. 053017.

28. Kane, K.E., and W.S. Lee. 2006. Spectral sensing of different citrus varieties for precision agriculture. ASABE Paper No. 061065.

29. MacArthur, D.K., J.K. Schueller, and C.D. Crane. 2005. Remotely-piloted mini-helicopter imaging of citrus. ASAE Paper No. 051055.

30. Miniature Aircraft of Sorrento, Florida, http://www.miniatureaircraftusa.com/.

31. DOC. 2006. CS Market Research: Unmanned Aerial Vehicles (UAVs). U.S.A. Department of Commerce, U.S. Commercial Service. www.buyusa.gov/newengland/155.pdf (accessed 25 May 2006).

32. NASS. 2006. Florida agricultural statistics: Citrus summary 2004-2005. United States Department of Agriculture, National Agricultural Statistics Service, Florida Field Office. February. 54 pp.

33. Crane, C., Duffy, J., Kinematic Analysis of Robot Manipulators, Cambridge University Press, 1998.

34. Faugeras, O., Luong, Q., The Geometry of Multiple Images, The MIT Press, 2001.

35. Center for Advanced Aviation Systems Development, The MITRE Corporation, Navigation, 2/25/2002, www.caasd.org/work/navigation.html, 4/21/2003.

36. Garmin, GPS16 OEM GPS receiver, Olathe, Kansas.

37. D. Burschka and G. Hager, "Vision-Based Control of Mobile Robots," Proc. of the IEEE International Conference on Robotics and Automation, pp. 1707-1713, 2001.

38. J. Chen, D.M. Dawson, W.E. Dixon, and A. Behal, "Adaptive Homography-Based Visual Servo Tracking for Fixed and Camera-in-Hand Configurations," IEEE Transactions on Control Systems Technology, accepted, to appear.

39. J. Chen, W.E. Dixon, D.M. Dawson, and M. McIntire, "Homography-Based Visual Servo Tracking Control of a Wheeled Mobile Robot," Proc. of the IEEE International Conference on Intelligent Robots and Systems, Las Vegas, Nevada, pp. 1814-1819, October 2003.

40. J. Chen, W.E. Dixon, D.M. Dawson, and V. Chitrakaran, "Visual Servo Tracking Control of a Wheeled Mobile Robot with a Monocular Fixed Camera," Proceedings of the IEEE Conference on Control Applications, Taipei, Taiwan, pp. 1061-1066, 2004.

41. A.K. Das, et al., "Real-Time Vision-Based Control of a Nonholonomic Mobile Robot," Proc. of the IEEE International Conference on Robotics and Automation, pp. 1714-1719, 2001.

42. W.E. Dixon, D.M. Dawson, E. Zergeroglu, and A. Behal, "Adaptive Tracking Control of a Wheeled Mobile Robot via an Uncalibrated Camera System," IEEE Transactions on Systems, Man, and Cybernetics-Part B: Cybernetics, Vol. 31, No.

43. MacArthur, E., Crane, C., Development of a Multi-Vehicle Simulator and Control Software, 2005 Florida Conference on Recent Advances in Robotics, Gainesville, FL, 2005.

44. The Analytic Sciences Corporation, Applied Optimal Estimation, The MIT Press, Cambridge, Massachusetts, and London, England, 1974.

BIOGRAPHICAL SKETCH

Donald Kawika MacArthur was born in Miami, Florida. He attended the Maritime and Science Technology (MAST) High School. He attended the University of Florida, where he graduated summa cum laude with a B.S. in mechanical engineering. He pursued graduate research at the University of Florida and received his master's degree and ultimately his Ph.D. His research has spanned various vehicle automation technologies. He has performed research in areas such as computer vision, autonomous ground vehicle control and navigation, sensor systems for guidance, navigation and control, unmanned aircraft automation, and embedded hardware and software design.


Permanent Link: http://ufdc.ufl.edu/UFE0017511/00001

Material Information

Title: Tracking and state estimation of an unmanned ground vehicle system using an unmanned aerial vehicle system
Physical Description: Mixed Material
Language: English
Creator: MacArthur, Donald K. ( Dissertant )
Crane, Carl D. ( Thesis advisor )
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2007
Copyright Date: 2007

Subjects

Subjects / Keywords: Mechanical Engineering thesis, Ph. D.
Dissertations, Academic -- UF -- Mechanical and Aerospace Engineering
Genre: bibliography   ( marcgt )
theses   ( marcgt )
non-fiction   ( marcgt )

Notes

Abstract: Unmanned Air Vehicles (UAV) have several advantages and disadvantages compared with Unmanned Ground Vehicles (UGV). Both systems have different mobility and perception abilities. These UAV systems have extended perception, tracking, and mobility capabilities compared with UGVs. Comparatively, UGVs have more intimate mobility and manipulation capabilities. This research investigates the collaboration of UAV and UGV systems and applies the theory derived to a heterogeneous unmanned multiple vehicle system. This research will also demonstrate the use of UAV perception and tracking abilities to extend the capabilities of a multiple ground vehicle system. This research is unique in that it presents a comprehensive system description and analysis from the sensor and hardware level to the system dynamics. This work also couples the dynamics and kinematics of two agents to form a robust state estimation using completely passive sensor technology. A general sensitivity analysis of the geo-positioning algorithm was performed. This analysis derives the sensitivity equations for determining the passive positioning error of the target UGV. This research provides a framework for analysis of passive target positioning and error contributions of each parameter used in the positioning algorithms. This framework benefits the research and industrial community by providing a method of quantifying positioning error due to errors from sensor noise. This research presents a framework by which a given UAV payload configuration can be evaluated using an empirically derived sensor noise model. Using this data the interaction between sensor noise and positioning error can be compared. This allows the researcher to selectively focus attention to sensors which have a greater effect on position error and quantify expected positioning error.
Subject: geopositioning, helicopter, sensitivity, UAV, UGV, unmanned
General Note: Title from title page of source document.
General Note: Document formatted into pages; contains 110 pages.
General Note: Includes vita.
Thesis: Thesis (Ph. D.)--University of Florida, 2007.
Bibliography: Includes bibliographical references.
General Note: Text (Electronic thesis) in PDF format.

Record Information

Source Institution: University of Florida
Holding Location: University of Florida
Rights Management: All rights reserved by the source institution and holding location.
System ID: UFE0017511:00001










TRACKING AND STATE ESTIMATION OF AN UNMANNED GROUND VEHICLE
SYSTEM USING AN UNMANNED AIR VEHICLE SYSTEM




















By

DONALD KAWIKA MACARTHUR


A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2007





































© 2007 Donald K. MacArthur


































I proudly dedicate my life and this work to my wonderful wife Erica. Many trials, we both have
suffered through this process.









ACKNOWLEDGMENTS

I would like to thank my father Donald Sr., my mother Janey, and my brother Matthew for their support through my many years of schooling.












TABLE OF CONTENTS


ACKNOWLEDGMENTS

LIST OF TABLES

LIST OF FIGURES

ABSTRACT

CHAPTER

1 INTRODUCTION

2 BACKGROUND
    Position and Orientation Measurement Sensors
        Global Positioning Systems
        Inertial Measurement Units
        Magnetometers
        Accelerometer
        Rate Gyro
    Unmanned Rotorcraft Modeling
    Unmanned Rotorcraft Control

3 EXPERIMENTAL TESTING PLATFORMS
    Electronics and Sensor Payloads
        First Helicopter Electronics and Sensor Payload
        Second Helicopter Electronics and Sensor Payload
        Third Helicopter Electronics and Sensor Payload
        Micro Air Vehicle Embedded State Estimator and Control Payload
    Testing Aircraft
        UF Micro Air Vehicles
        ECO 8
        Miniature Aircraft Gas Xcell
        Bergen Industrial Twin
        Yamaha RMAX

4 GEO-POSITIONING OF STATIC OBJECTS USING MONOCULAR CAMERA TECHNIQUES
    Simplified Camera Model and Transformation
        Simple Camera Model
        Coordinate Transformation
    Improved Techniques for Geo-Positioning of Static Objects












    Camera Calibration
    Geo-Positioning Sensitivity Analysis

5 UNMANNED ROTORCRAFT MODELING

6 STATE ESTIMATION USING ONBOARD SENSORS
    Attitude Estimation Using Accelerometer Measurements
    Heading Estimation Using Magnetometer Measurements
    UGV State Estimation

7 RESULTS
    Geo-Positioning Sensitivity Analysis
    Comparison of Empirical Versus Simulated Geo-Positioning Errors
    Applied Work
        Unexploded Ordnance (UXO) Detection and Geo-Positioning Using a UAV
            Experimentation VTOL aircraft
            Sensor payload
            Maximum likelihood UXO detection algorithm
            Spatial statistics UXO detection algorithm
        Collaborative UAV/UGV Control
            Waypoint surveying
            Local map
        Citrus Yield Estimation
            Materials and methods
            Results
            Discussion

8 CONCLUSIONS

LIST OF REFERENCES

BIOGRAPHICAL SKETCH










LIST OF TABLES


7-1 Parameter standard deviations for the horizontal and vertical position

7-2 Parameter standard deviations for the roll, pitch, and yaw angles

7-3 Normalized pixel coordinate standard deviations used during sensitivity analysis

7-4 Parameter standard deviations used during sensitivity analysis

7-5 Comparison of Monte Carlo Method results

7-6 Production of Oranges (1000's metric tons) (based on NASS, 2006)

7-7 Production of Grapefruit (1000's metric tons) (based on NASS, 2006)

7-8 Irrigation Treatments

7-9 Results from Image Processing and Individual Tree Harvesting











LIST OF FIGURES


2-1 Commercially available GPS units

2-2 Commercially available GPS antennas

2-3 Commercially available IMU systems

2-4 MicroMag3 magnetometer sensor from PNI Corp.

2-5 HMC1053 tri-axial analog magnetometer from Honeywell

2-6 ADXL330 tri-axial SMT accelerometer from Analog Devices Inc.

2-7 ADXRS150 rate gyro from Analog Devices Inc.

4-1 Image coordinates to projection angle calculation

4-2 Diagram of coordinate transformation

4-3 Normalized focal and projective planes

4-4 Relation between a point in the camera and global reference frames

4-5 Calibration checkerboard pattern

4-6 Calibration images

4-7 Calibration images

5-1 Top view of the body fixed coordinate system

5-2 Side view of the body fixed coordinate system

5-3 Main rotor blade angle

5-4 Main rotor thrust vector

6-1 Fast Fourier Transform of raw accelerometer data

6-2 Fast Fourier Transform of raw accelerometer data after low-pass filter

6-3 Roll and Pitch measurement prior to applying low-pass filter

6-4 Roll and Pitch measurement after applying low-pass filter

6-5 Magnetic heading estimate

7-1 Roll and Pitch measurements used for defining error distribution

7-2 Heading measurements used for defining error distribution

7-3 Image of triangular placard used for geo-positioning experiments

7-4 Results of x and y pixel error calculations

7-5 Error Variance Histograms for the respective parameter errors

7-6 Experimental and simulation geo-position results

7-7 BLU-97 Submunition

7-8 Miniature Aircraft Gas Xcell Helicopter

7-9 Yamaha RMAX Unmanned Helicopter

7-10 Sensor Payload System Schematic

7-11 Segmentation software

7-12 Pattern Recognition Process

7-13 Raw RGB and Saturation Images of UXO

7-14 Segmented Image

7-15 Raw Image with Highlighted UXO

7-16 TailGator and HeliGator Platforms

7-17 Aerial photograph of all simulated UXO

7-18 Local map generated with Novatel differential GPS

7-19 A comparison of the UGV's path to the differential waypoints

7-20 UAV waypoints vs. UGV path

7-21 Individual Tree Yields as Affected by Irrigation Depletion Treatments

7-22 Individual Tree Yield as a Function of Orange Pixels in Image

7-23 Individual Tree Yield as a Function of Orange Pixels with Nonirrigated Removed

7-24 Image of Tree 2C Before and After Image Processing

7-25 Image of Tree 2F Before and After Image Processing

7-26 Image of Tree 6D Before and After Image Processing

7-27 Ground Images of Tree 6D and Tree 2E

8-1 Simulated error calculation versus elevation

8-2 Geo-Position error versus elevation









Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy

TRACKING AND STATE ESTIMATION OF AN UNMANNED GROUND VEHICLE
SYSTEM USING AN UNMANNED AERIAL VEHICLE SYSTEM

By

Donald Kawika MacArthur

May 2007

Chair: Carl Crane
Major: Mechanical Engineering

Unmanned Aerial Vehicles (UAVs) have several advantages and disadvantages compared with Unmanned Ground Vehicles (UGVs); the two systems have different mobility and perception abilities. UAVs offer extended perception, tracking, and mobility capabilities, whereas UGVs provide closer-range mobility and manipulation

capabilities. This research investigates the collaboration of UAV and UGV systems and applies

the theory derived to a heterogeneous unmanned multiple vehicle system. This research will also

demonstrate the use of UAV perception and tracking abilities to extend the capabilities of a

multiple ground vehicle system. This research is unique in that it presents a comprehensive

system description and analysis from the sensor and hardware level to the system dynamics.

This work also couples the dynamics and kinematics of two agents to form a robust state

estimation using completely passive sensor technology. A general sensitivity analysis of the

geo-positioning algorithm was performed. This analysis derives the sensitivity equations for

determining the passive positioning error of the target UGV. This research provides a

framework for analysis of passive target positioning and error contributions of each parameter

used in the positioning algorithms. This framework benefits the research and industrial

community by providing a method of quantifying positioning error due to errors from sensor









noise. This research presents a framework by which a given UAV payload configuration can be

evaluated using an empirically derived sensor noise model. Using this data the interaction

between sensor noise and positioning error can be compared. This allows the researcher to

selectively focus attention to sensors which have a greater effect on position error and quantify

expected positioning error.









CHAPTER 1
INTRODUCTION

The Center for Intelligent Machines and Robotics at the University of Florida has been

performing autonomous ground vehicle research for over 10 years. In that time, research has

been conducted in the areas of sensor fusion, precision navigation, precision positioning systems,

and obstacle avoidance. Researchers have used small unmanned helicopters for remote sensing

purposes for various applications. Recently, experimentation with unmanned aerial vehicles has

been in collaboration with the Tyndall Air Force Research Laboratory at Tyndall AFB, Florida.

Recently, unmanned aerial vehicles (UAVs) have been used more extensively for military

and commercial operations. The improved perception abilities of UAVs compared with

unmanned ground vehicles (UGVs) make them more attractive for surveying and reconnaissance

applications. A combined UAV/UGV multiple vehicle system can provide aerial imagery,

perception, and target tracking along with ground target manipulation and inspection capabilities.

This research investigates collaborative UAV/UGV systems and also demonstrates the

application of a UAV/UGV system for various task-based operations.

The Air Force Research Laboratory at Tyndall Air Force Base has worked toward

improving EOD and range clearance operations by using unmanned ground vehicle systems.

This research incorporates the abilities of UAV/UGV systems to support these operations. The

research vision for the range clearance operations is to develop an autonomous multi-vehicle

system that can perform surveying, ordnance detection/geo-positioning, and disposal operations

with minimal user supervision and effort.









CHAPTER 2
BACKGROUND

Researchers have used small unmanned helicopters for remote sensing purposes for

various applications [1,2,3]. These applications range from agricultural crop yield estimation,

pesticide and fertilizer application, explosive reconnaissance and detection, and aerial

photography and mapping.

This research effort will strive to estimate the states of a UGV system using monocular

camera techniques and the extrinsic parameters of the camera sensor. The extrinsic parameters

can be reduced to the transformation from the camera coordinate system to the global coordinate

system.

Position and Orientation Measurement Sensors

Global Positioning Systems

Global Positioning Systems (GPS) are fast becoming the positioning system of choice for autonomous vehicle navigation. This technology allows an agent to determine its location using signals broadcast from satellites overhead. The Navigation Signal Timing and Ranging Global Positioning System (NAVSTAR GPS) was established in 1978 and is maintained by the United States Department of Defense; it provides a positioning service for the U.S. military and is available to the public as a public good. Since its creation, the service has been used for

commercial purposes such as nautical, aeronautical, and ground based navigation, and land

surveying. The current U.S. based GPS satellite constellation system consists of over 24

satellites. The number of satellites in operation for this system can vary due to satellites being

taken in and out of service. Other countries are leading efforts to develop alternative satellite

systems for their own GPS systems. A similar GPS system is the GLONASS constructed by

Russia. The GALILEO GPS system is being developed by a European consortium. This system









is to be maintained by Europeans and will provide capabilities similar to that of the NAVSTAR

and GLONASS systems.

Each satellite maintains its own specific orbit and circumnavigates earth once every 12

hours. The orbit of each satellite is timed and coordinated so that five to eight satellites are above

the horizon of any location on the surface of earth at any time. A GPS receiver calculates

position by first receiving the microwave RF signals broadcast by each visible satellite. The

signals broadcasted by the satellites are complex high frequency signals with encoded binary

information. The encoded binary data contains a large amount of information but mainly

contains information about the time that the data was sent and location of the satellite in orbit.

The GPS receiver processes this information to solve for its position and current time.

GPS receivers typically provide position solutions at 1 Hz, but receivers that output position solutions at up to 20 Hz can be purchased. The accuracy of a commercial GPS system

without any augmentation is approximately 15 meters. Several types of commercially available

GPS units are shown in Figure 2-1. Some units are equipped with or without antennas. The

Garmin GPS unit in Figure 2-1 contains the antenna and receiver whereas the other two units are

simply receivers. Several types of antennas are shown in Figure 2-2.


Figure 2-1. Commercially available GPS units


















Figure 2-2. Commercially available GPS antennas

Differential GPS is an alternative method by which GPS signals from multiple receivers

can be used to obtain higher accuracy position solutions. Differential GPS operates by placing a

specialized GPS receiver in a known location and measuring the errors in the position solution

and the associated satellite data. The information is then broadcast in the form of correction data

so that other GPS receivers in the area can calculate a more accurate position solution. This

system is based on the fact that there are inherent delays as the satellite signals are transmitted

through the atmosphere. Localized atmospheric conditions cause the satellite signals within that

area to have the same delays. By calculating and broadcasting the correction values for each

visible satellite, the differential GPS system can attain accuracies from 1 mm to 1 cm [4].

In 2002, a new type of GPS correction system was introduced that does not require a land-based correction signal to improve position solutions. Satellite based augmentation

systems (SBAS) transmit localized correction signals from orbiting satellites [5]. A SBAS

system implemented for North America is the Wide Area Augmentation System (WAAS). This

system has been used in this research and position solutions with errors of less than three meters

have been observed.

In 2005, the first in a series of new satellites was introduced into the NAVSTAR GPS

system. This system provides a new GPS signal referred to as L2C. This enhancement is

intended to improve the accuracy and reliability of the NAVSTAR GPS system for military and

public use.









Inertial Measurement Units

Inertial Measurement Unit (IMU) systems are used extensively in vehicles where accurate

orientation measurements are required. Typical IMU systems contain accelerometers and

angular gyroscopes. These sensors allow for the rigid body motion of the IMU to be measured

and state estimations to be made. These systems can vary greatly in cost and performance.

When coupled with a GPS system, the positioning and orientation of the system can be

accurately estimated. The coupled IMU/GPS combines the position and velocity measurements

based on satellite RF signals with inertial motion measurements. These systems complement

each other whereby the GPS is characterized by low frequency global position measurements

and the IMU provides higher frequency relative positioning/orientation measurements. Some of

the commercially available IMU systems are shown in Figure 2-3.










Figure 2-3. Commercially available IMU systems
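To make the complementary behavior of the two sensors concrete, the short sketch below blends low-rate GPS position fixes with high-rate accelerometer-derived position increments along a single axis. It is only an illustrative sketch: the sample rates, blend gain, and noise levels are assumed values and do not correspond to any of the units shown in Figure 2-3.

    import numpy as np

    def complementary_position_filter(gps_fixes, accel, dt_imu, gps_every_n, alpha=0.2):
        """Fuse 1-D GPS fixes (low rate, absolute) with accelerometer data
        (high rate, relative) using a simple complementary blend."""
        pos_est, vel_est, history = gps_fixes[0], 0.0, []
        for k, a in enumerate(accel):
            vel_est += a * dt_imu          # high-frequency path: integrate inertial data
            pos_est += vel_est * dt_imu
            if k % gps_every_n == 0:       # low-frequency path: pull toward the GPS fix
                pos_est = (1.0 - alpha) * pos_est + alpha * gps_fixes[k // gps_every_n]
            history.append(pos_est)
        return np.array(history)

    # Example: 100 Hz accelerometer, 1 Hz GPS, stationary vehicle with sensor noise.
    rng = np.random.default_rng(0)
    estimate = complementary_position_filter(rng.normal(0.0, 3.0, 11),
                                             rng.normal(0.0, 0.05, 1000),
                                             dt_imu=0.01, gps_every_n=100)

The GPS term keeps the estimate from drifting over time, while the inertial term supplies the short-term motion detail between fixes, which is the division of labor described above.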

Other sensors, such as fluidic tilt sensors, imaging sensors, light sensors, and thermal sensors, also allow for orientation measurements. Each of these sensors has different advantages and

disadvantages for implementation. Fluidic tilt sensors provide high frequency noise rejection

and decent attitude estimation for low dynamic vehicles. In high G turns and extreme dynamics

these sensors fail to provide usable data. Imaging sensors have the advantage of not being

affected by vehicle dynamics. However, advanced image processing algorithms can require

significant computational overhead and these sensors are highly affected by lighting conditions.

Thermopile attitude sensors have been used for attitude estimation and are not affected by










vehicle dynamics. These sensors provide excellent attitude estimations but are affected by

reflective surfaces and changes in environment temperature.

Magnetometers

A magnetometer is a device that allows for the measurement of a local or distant magnetic

field. This device can be used to measure the strength and direction of a magnetic field. The

heading of an unmanned vehicle may be determined by detecting the magnetic field created by

the Earth's magnetic poles. The "magnetic north" direction can aid in navigation and geo-spatial

mapping. For applications where the vehicle orientation is not restricted to planar motion, the

magnetometer is typically coupled with a tilt sensor to provide a horizontal north vector

independent of the vehicle orientation.
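A minimal sketch of this tilt compensation is given below: the body-frame magnetic field components are de-rotated through the measured roll and pitch before the heading is computed. The axis conventions (x forward, y right, z down) and sign choices are assumptions for illustration and would need to be matched to the actual sensor mounting.

    import math

    def tilt_compensated_heading(mx, my, mz, roll, pitch):
        """Magnetic heading (radians, clockwise from magnetic north) from
        body-frame magnetometer readings and roll/pitch angles (radians)."""
        # Project the measured field back onto the local horizontal plane.
        xh = (mx * math.cos(pitch)
              + my * math.sin(roll) * math.sin(pitch)
              + mz * math.cos(roll) * math.sin(pitch))
        yh = my * math.cos(roll) - mz * math.sin(roll)
        return math.atan2(-yh, xh) % (2.0 * math.pi)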

There are several commercially available magnetometer sensors. The MicroMag3 from

PNI Corp. provides magnetic field measurements in three axes with a digital serial peripheral

interface (SPI) shown in Figure 2-4.












Figure 2-4. MicroMag3 magnetometer sensor from PNI Corp.

The Honeywell Corporation also manufactures a line of magnetic field detection sensors.

These products vary from analog linear/vector sensors to integrated digital compass devices.

The HMC 1053 from Honeywell is a three axis magneto-resistive sensor for multi-axial magnetic

field detection and is shown in Figure 2-5.





















Figure 2-5. HMC1053 tri-axial analog magnetometer from Honeywell

Accelerometer

An accelerometer measures the acceleration of the device in single or multiple

measurement axes. MEMS based accelerometers provide accurate and inexpensive devices for

measurement of acceleration. The ADXL330 from Analog Devices Inc. provides analog three

axis acceleration measurements in a small surface mount package shown in Figure 2-6.











Figure 2-6. ADXL330 tri-axial SMT accelerometer from Analog Devices Inc.

A two or three axis accelerometer can be used as a tilt sensor. The off horizontal angles

can be determined by measuring the projection of the gravity vector onto the sensor axes. These

measurements relate to the roll and pitch angle of the device and when properly compensated to

account for effects from vehicle dynamics, can provide accurate orientation information.

Rate Gyro

A rate gyro is a device that measures the angular time rate of change about a single axis or

multiple axes. The ADXRS150 is a single axis MEMS rate gyro manufactured by Analog










Devices Inc. which provides analog measurements of the angular rate of the device and is shown

in Figure 2-7.









Figure 2-7. ADXRS150 rate gyro from Analog Devices Inc.

This device uses the Coriolis effect to measure the angular rate of the device. An

internally resonating frame in the device is coupled with capacitive pickoff elements. The

response of the pickoff elements changes with the angular rate. This signal is then conditioned

and amplified. When coupled with an accelerometer, these devices allow for enhanced

orientation solutions.

Unmanned Rotorcraft Modeling

For this research, the aircraft operating region will mostly be in hover mode. The flight

characteristics of the full flight envelope are very complex and involve extensive dynamic,

aerodynamic, and fluid mechanics analysis. Previously, researchers have performed extensive

instrumentation of a Yamaha R-50 remote piloted helicopter [6]. These researchers outfitted a

Yamaha R-50 helicopter with a sensor suite for in-flight measurements of rotor blade motion and

loading. The system was equipped with sensors along the length of the main rotor blades,

measuring strain, acceleration, tilt, and position. This research was unique due to the detail of

instrumentation not to mention the difficulties of instrumenting rotating components. This work

provided structural, modal, and load characteristics for this airframe and demonstrates the

extensive lengths required for obtaining in-flight aircraft properties. In addition, extensive work

has been conducted in system identification for the Yamaha R-50 and the Xcell 60 helicopters










[7,8,9]. These researchers performed extensive frequency response based system identification

and flight testing and compared modeling results with that of the scaled dynamics of the UH-1H

helicopter. These researchers have conducted an extensive analysis of small unmanned

helicopter dynamic equations and system identification. This work has resulted in complete

dynamic modeling of a model scale helicopter. These results showed great promise in that they

demonstrated a close relation between the UH-1H helicopter dynamics and the tested aircraft.

This research also showed that the aircraft modeling technique used was valid and that the

system identification techniques used for larger rotorcraft were extensible to smaller rotorcraft.

Other researchers present a more systems level approach to the aircraft automation

discussion [10]. They present the instrumentation equipment and architecture, and present the

modeling and simulation derivations. They go on to present their work involving hardware-in-

the-loop simulation and image processing.

Unmanned Rotorcraft Control

Many researchers have become actively involved in the control and automation of

unmanned rotorcraft. The research has involved numerous controls topics including robust

controller design, fuzzy control, and full flight envelope control.

Robust H∞ controllers have been developed using Loop Shaping and Gain-Scheduling to

provide rapid and reliable high bandwidth controllers for the Yamaha R-50 UAV [11,12]. In this

research, the authors sought to incorporate the use of high-fidelity simulation modeling into the

control design to improve performance. Coupled with the use of multivariable control design

techniques, they also sought to develop a controller that would provide fast and robust controller

performance that could better utilize the full flight envelope of small unmanned helicopters.

Anyone who has observed experienced competition level Radio Controlled (RC) helicopter










pilots and their escapades during flight has observed the awesome capabilities of small RC

helicopters during normal and inverted flight. It is these capabilities that draw researchers

towards using helicopters for their research. But with increased capabilities comes increased

complexities in aircraft mechanics and dynamics. These researchers have attempted to

incorporate the synergic use of a high-fidelity aircraft model with robust multivariable control

strategies and have validated their findings by implementing and flight testing their control

algorithms on their testing aircraft. Also, H∞ controller design has been applied to highly

flexible aircraft [13]. As will be shown later, helicopter airframes are significantly prone to

failures caused by vibration modes. Disastrous consequences can occur if these vibration modes

are not considered and compensated. In this research, a highly flexible aircraft model is used for

control design and validation. The controller is specifically designed to compensate for the high

flexibility of the airframe. The authors present the aircraft model and uncertainties and discuss

the control law synthesis algorithm. These results demonstrate the meshing of the aircraft

structure modeling/analysis and the control design/stability. This concept is important not only

from a system performance perspective but also from a safety perspective. As UAVs become

more prevalent in domestic airspace, the public can benefit from the improved system safety

provided by more sophisticated modeling and analysis techniques.

Previous researchers have also conducted research on control optimization for small

unmanned helicopters [14]. In this research, the authors focus on the problem of attitude control

optimization for a small-scale unmanned helicopter. By using an identified model of the

helicopter system that incorporates the coupled rotor/stabilizer/fuselage dynamic effects, they

improve the overall model accuracy. This research is unique in that it incorporates the stabilizer

bar dynamic effects commonly not included in previous work. The system model is validated by










performing flight tests using a Yamaha RMax helicopter test-bed system. They go on to

compensate for the performance reduction induced by the stabilizer bar and optimize the

Proportional Derivative (PD) attitude controller using an established control design methodology

with a frequency response envelope specification.










CHAPTER 3
EXPERIMENTAL TESTING PLATFORMS

Electronics and Sensor Payloads

In order to perform testing and evaluation of the theory and concepts involved in this

research, several electronics and sensor payloads were developed. The purpose of these

payloads was to provide perception and aircraft state measurements and onboard processing

capabilities. These systems were developed to operate modularly and enable transfer of the

payload to different aircraft. The payloads were developed with varying capabilities and sizes.

The host aircraft for these payloads ranged from a 6" fixed wing micro-air vehicle to a 3.1 meter

rotor diameter agricultural mini-helicopter.

First Helicopter Electronics and Sensor Payload

The first helicopter electronics and sensor payload was constructed to provide an initial

testing platform to ensure proper operation of the electronics and aircraft during flight. The

system schematic is shown in Figure 3-1.

Figure 3-1. First helicopter payload system schematic










The system consisted of five subsystems:

1. Main processor
2. Imaging
3. Communication
4. Data storage
5. Power
The main processor provides the link between all of the sensors, the data storage device,

and the communication equipment. The imaging subsystem consists of a Videre stereovision

system linked via two firewire connections. The data storage subsystem consists of a 40GB

laptop hard drive which runs a Linux operating system and was used for sensor data storage.

The power subsystem consists of a 12V to 5V DC to DC converter, 12V power regulator and a

3Ah LiPo battery pack. The power regulators condition and supply power to all electronics. The

LiPo battery pack served as the main power source and was selected based on the low weight

and high power density of the LiPo battery chemistry. The first helicopter payload attached to

the aircraft is shown in Figure 3-2.























Figure 3-2. First payload mounted on helicopter





























The first prototype system was equipped on the aircraft and was tested during flight.


Although image data could be gathered during flight, it was found that the laptop hard drive could not withstand the vibration of the aircraft. Figure 3-3 shows in-flight testing of the


first prototype payload.


Figure 3-3. Helicopter testing with first payload

Second Helicopter Electronics and Sensor Payload

The payload design was refined to provide a more robust testing platform for this research. Improving the design required vibration isolation of the payload from the aircraft, as well as a data storage method that could withstand the harsh environment onboard

the aircraft. The system schematic for the second prototype payload is shown in Figure 3-4.


Figure 3-4. Second helicopter payload system schematic










The system consisted of six subsystems:

1. Main processor
2. Imaging
3. Pose sensors
4. Communication
5. Data Storage
6. Power
The second prototype payload contained similar components as the first prototype, but

instead of a laptop hard drive it utilized two compact flash drives for storage, and in addition two

pose sensors were added. The OEM Garmin GPS provided global position, velocity, and altitude

data at 5Hz. The digital compass provided heading, roll, and pitch angles at 30Hz. The second

prototype payload is shown in Figure 3-5.


Figure 3-5. Second payload mounted to helicopter












Flight tests showed that the second payload could reliably collect image and pose data


during flight and maintain wireless communication at all times. Figure 3-6 shows the second


prototype payload equipped on the aircraft during flight testing.


















Figure 3-6. Flight testing with second helicopter payload

Third Helicopter Electronics and Sensor Payload


The helicopter electronics and sensor payload was redesigned slightly to include a high


accuracy differential GPS (Figure 3-7). This system has a vendor stated positioning accuracy of


2 cm in differential mode and allows precise helicopter positioning. This system further


improves the overall system performance and allows for comparison of the normal versus RT2


differential GPS systems.


Figure 3-7. Third helicopter payload system schematic












Micro Air Vehicle Embedded State Estimator and Control Payload


An embedded state estimator and control payload was developed to support the Micro Air


Vehicle research being performed at the University of Florida. This system provides control


stability and video data. The system schematic is shown in Figure 3-8.


Figure 3-8. Micro-Air Vehicle embedded state estimator and control system schematic

Testing Aircraft

UF Micro Air Vehicles


Several MAVs have been developed for reconnaissance and control applications. This


platform provides a payload capability of < 30 grams with a wingspan of 6" (Figure 3-9). This


system is a fixed wing aircraft with 2-3 control surface actuators and an electric motor. System


development for this platform requires small size and weight, and low power consumption.


Figure 3-9. Six inch micro air vehicle









ECO 8

This aircraft was the first helicopter built in the UF laboratory. The aircraft is powered by

a brushed electric motor with an eight cell nickel cadmium battery pack. The aircraft is capable

of flying for approximately 10 minutes under normal flight conditions. This system has a

payload capacity of less than 60 grams with CCPM swashplate mixing as shown in Figure 3-10.

















Figure 3-10. Eco 8 helicopter

Miniature Aircraft Gas Xcell

A Miniature Aircraft Gas Xcell was the first gas powered helicopter purchased for testing

and experimentation (Figure 3-11). This aircraft is equipped with a two stroke gasoline engine,

740 mm main rotor blades, and has an optimal rotor head speed of 1800 rpm. The payload

capacity is approximately 15 lbs with a runtime of 20 minutes.













Figure 3-11. Miniature Aircraft Gas Xcell










Bergen Industrial Twin

A Bergen Industrial Twin was purchased for testing with heavier payloads (Figure 3-12).

This aircraft is equipped with a dual cylinder two stroke gasoline engine, 810 mm main rotor

blades, and has an optimal rotor head speed of 1500 rpm. The payload capacity is approximately

25 lbs with a runtime of 30 minutes.




















Figure 3-12. Bergen industrial twin helicopter

Yamaha RMAX

Several agricultural Yamaha RMAX helicopters were purchased by the AFRL robotics

research laboratory at Tyndall Air Force base in Panama City, Florida. The aircraft is shown in

Figure 3-13. This system has a two-stroke engine with internal power generation and control

stabilization system. This system has a 60 lb payload capability. The system is typically used

for small area pesticide and fertilizer spraying.

These aircraft were used to conduct various experiments involving remote sensing, sensor

noise analysis, system identification, and various applied rotorcraft tasks. These experiments

and their results will be discussed in the subsequent chapters. Each aircraft has varying costs,










payload capabilities, and runtimes. As with the various sensors available for UAV research, the

aircraft should be selected to suit the needs of the particular project or task.


Figure 3-13. Yamaha RMAX helicopter









CHAPTER 4
GEO-POSITIONING OF STATIC OBJECTS USING MONOCULAR CAMERA
TECHNIQUES

Two derivations were performed which allowed for the global coordinates of an object in

an image to be found. Both derivations perform the transformation from a 2D coordinate system

referred to as the image coordinate system to the 3D global coordinate system. The first

derivation utilizes a simplified camera model and calculates the position of the static object using the concept of the intersection of a line and a plane. The second derivation utilizes intrinsic and extrinsic camera parameters and uses projective geometry and coordinate transformations.

Simplified Camera Model and Transformation

Simple Camera Model

The cameras were modeled by linearly scaling the horizontal and vertical projection angle with the x and y position of the pixel, respectively, as illustrated in Figure 4-1. This allowed for the relative angle of the static object to be calculated with respect to a coordinate system fixed in the aircraft.









Figure 4-1. Image coordinates to projection angle calculation










Coordinate Transformation

A coordinate transformation is performed on the static object location from image coordinates to global coordinates as shown in Figure 4-2. The image data provides the relative angle of the static object with respect to the aircraft reference frame. In order to find the position of the static object, a solution of the intersection of a line and a plane was used.

Figure 4-2. Diagram of coordinate transformation (helicopter reference frame and global reference frame)


The equation of a plane that is used for this problem is

Ax + By + Cz + D = 0 (4-1)

where x, y, and z are the coordinates of a point in the plane.

The equation of a line used in this problem is

p = p_1 + u\,(p_2 - p_1)    (4-2)

where p_1 and p_2 are points on the line.

Substituting (4-2) into (4-1) results in the solution

u = \frac{A x_1 + B y_1 + C z_1 + D}{A(x_1 - x_2) + B(y_1 - y_2) + C(z_1 - z_2)}    (4-3)









where x_1, y_1, and z_1 are the coordinates of point p_1, and x_2, y_2, and z_2 are the coordinates of point p_2.

For this problem the ground plane is defined in the global reference frame by A = 0, B = 0, C = 1, and D = -(ground elevation). The point p_1 is the focal point of the camera and it is determined in the global reference frame based on the sensed GPS data. The point p_2 is calculated in the global reference frame as equal to the coordinates of p_1 plus a unit distance along the static object projection ray. This is known from the static object image angle and the camera's orientation as measured by the attitude and heading sensors. In other words, the direction of the static object projection ray in the global reference frame was found by transforming the projection vector from the aircraft to the static object, as measured in the aircraft frame, to the global frame. This entailed using a downward vector in the aircraft frame and rotating about the yaw, pitch, and roll axes by the projection angles and pose angles. The rotation matrices are

{}^1R_2 = \begin{bmatrix} \cos\phi & -\sin\phi & 0 \\ \sin\phi & \cos\phi & 0 \\ 0 & 0 & 1 \end{bmatrix}    (4-4)

{}^2R_3 = \begin{bmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{bmatrix}    (4-5)

{}^3R_4 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\psi & -\sin\psi \\ 0 & \sin\psi & \cos\psi \end{bmatrix}    (4-6)

where

\phi = yaw of the aircraft,

\theta = pitch of the aircraft plus the projection pitch angle, and

\psi = roll of the aircraft plus the projection roll angle.









The downward vector r = (0\;\;0\;\;-1)^T was transformed using the compound rotation matrix

{}^1R_4 = {}^1R_2\,{}^2R_3\,{}^3R_4    (4-7)

The new projection vector was found as

r' = {}^1R_4\,r    (4-8)

where r is the projection ray measured in the aircraft reference frame and r' is the projection ray as measured in the global reference frame. Using the solution found for the intersection of a line and a plane, and using the aircraft position as a point on the line, the position of the static object in the global reference frame was found. Thus, for each object identified in an image, the coordinates of p_1 and p_2 are determined in the global reference frame and Equations 4-2 and 4-3 are then used to calculate the position of the object in the global reference frame.
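The sketch below walks through this first derivation end to end: pixel coordinates are scaled linearly to projection angles, the downward vector is rotated by the combined pose and projection angles, and the resulting ray is intersected with the ground plane. The rotation conventions, field-of-view values, and function names are assumptions made for illustration and would have to be matched to the actual camera and attitude sign conventions.

    import numpy as np

    def pixel_to_projection_angles(px, py, width, height, hfov, vfov):
        """Linearly scale a pixel position to horizontal and vertical
        projection angles (radians), per the simplified camera model."""
        ang_x = (px - width / 2.0) / (width / 2.0) * (hfov / 2.0)
        ang_y = (py - height / 2.0) / (height / 2.0) * (vfov / 2.0)
        return ang_x, ang_y

    def rot_z(a):
        return np.array([[np.cos(a), -np.sin(a), 0.0],
                         [np.sin(a), np.cos(a), 0.0],
                         [0.0, 0.0, 1.0]])

    def rot_y(a):
        return np.array([[np.cos(a), 0.0, np.sin(a)],
                         [0.0, 1.0, 0.0],
                         [-np.sin(a), 0.0, np.cos(a)]])

    def rot_x(a):
        return np.array([[1.0, 0.0, 0.0],
                         [0.0, np.cos(a), -np.sin(a)],
                         [0.0, np.sin(a), np.cos(a)]])

    def geo_position(p1, yaw, pitch, roll, proj_pitch, proj_roll, ground_z=0.0):
        """Intersect the object projection ray with the ground plane
        z = ground_z; p1 is the camera focal point in the global frame."""
        R = rot_z(yaw) @ rot_y(pitch + proj_pitch) @ rot_x(roll + proj_roll)
        ray = R @ np.array([0.0, 0.0, -1.0])        # rotated downward vector
        p2 = p1 + ray                               # second point on the line
        A, B, C, D = 0.0, 0.0, 1.0, -ground_z       # ground plane coefficients
        u = ((A * p1[0] + B * p1[1] + C * p1[2] + D)
             / (A * (p1[0] - p2[0]) + B * (p1[1] - p2[1]) + C * (p1[2] - p2[2])))
        return p1 + u * (p2 - p1)

    # Example: camera 50 m above flat ground with a small projection offset.
    cam = np.array([10.0, 20.0, 50.0])
    target = geo_position(cam, yaw=0.3, pitch=0.0, roll=0.0,
                          proj_pitch=np.radians(10.0), proj_roll=0.0)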

Improved Techniques for Geo-Positioning of Static Objects

A precise camera model and an image to global coordinate transformation were developed.

This involved finding the intrinsic and extrinsic camera parameters of the camera system

attached to the aerial vehicle. A relation between the normalized pixel coordinates and

coordinates in the projective coordinate plane was used:




\tilde{m} = \begin{bmatrix} u_n \\ v_n \\ 1 \end{bmatrix} = \frac{1}{Z_c} \begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix}    (4-9)



The normalized pixel coordinate vector \tilde{m} and the projective plane coordinate vector \tilde{M} are related using Equation 4-9 and form the projection relationship between points in the image plane and points in the camera reference frame as shown in Figure 4-3, where










{}^C\tilde{M} = \begin{bmatrix} X_c & Y_c & Z_c & 1 \end{bmatrix}^T    (4-11)

















Figure 4-3. Normalized focal and projective planes

The transformation from image coordinates to global coordinates was determined using the

normalized pixel coordinates, and the camera position and orientation with respect to the global

coordinate system (Figure 4-4). The transformation of a point M expressed in the camera

reference system C to a point expressed in the global system is shown in Equation 4-12.

{}^G\tilde{M} = {}^G T_C\,{}^C\tilde{M}    (4-12)

\begin{bmatrix} X_G \\ Y_G \\ Z_G \\ 1 \end{bmatrix} = \begin{bmatrix} {}^G R_C & {}^G P_{C_{org}} \\ 0\;\;0\;\;0 & 1 \end{bmatrix} \begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix}    (4-13)


Dividing both sides of Equation 4-13 by Z_c and substituting Z_G = 0 (assuming the camera elevation is expressed as height above ground level and the target location lies on the Z_G = 0 global plane) results in Equation 4-14.





Figure 4-4. Relation between a point in the camera and global reference frames

\begin{bmatrix} X_G/Z_c \\ Y_G/Z_c \\ 0 \\ 1/Z_c \end{bmatrix} = \begin{bmatrix} {}^G R_C & {}^G P_{C_{org}} \\ 0\;\;0\;\;0 & 1 \end{bmatrix} \begin{bmatrix} X_c/Z_c \\ Y_c/Z_c \\ 1 \\ 1/Z_c \end{bmatrix}    (4-14)

Substituting X_c/Z_c = u_n and Y_c/Z_c = v_n:

\begin{bmatrix} X_G/Z_c \\ Y_G/Z_c \\ 0 \\ 1/Z_c \end{bmatrix} = \begin{bmatrix} {}^G R_C & {}^G P_{C_{org}} \\ 0\;\;0\;\;0 & 1 \end{bmatrix} \begin{bmatrix} u_n \\ v_n \\ 1 \\ 1/Z_c \end{bmatrix}    (4-15)

This leads to three equations and three unknowns X_G, Y_G, and Z_c:

\frac{X_G}{Z_c} = R_{11} u_n + R_{12} v_n + R_{13} + \frac{{}^G P_{C_{org},x}}{Z_c}    (4-16)

\frac{Y_G}{Z_c} = R_{21} u_n + R_{22} v_n + R_{23} + \frac{{}^G P_{C_{org},y}}{Z_c}    (4-17)

0 = R_{31} u_n + R_{32} v_n + R_{33} + \frac{{}^G P_{C_{org},z}}{Z_c}    (4-18)

where the scalar R_{ij} represents the element in the i-th row and j-th column of the {}^G R_C matrix.

Using Equations 4-16, 4-17, and 4-18, the values of Z_c, X_G, and Y_G can be determined explicitly:

Z_c = -\frac{{}^G P_{C_{org},z}}{R_{31} u_n + R_{32} v_n + R_{33}}    (4-19)

X_G = Z_c\,(R_{11} u_n + R_{12} v_n + R_{13}) + {}^G P_{C_{org},x}    (4-20)

Y_G = Z_c\,(R_{21} u_n + R_{22} v_n + R_{23}) + {}^G P_{C_{org},y}    (4-21)

Equations 4-20 and 4-21 provide the global coordinates of the static object.
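A minimal sketch of Equations 4-19 through 4-21 is given below. It assumes the camera-to-global rotation matrix and camera position are already available from the pose sensors and that the normalized pixel coordinates come from the calibration described next; the function and variable names are illustrative only.

    import numpy as np

    def geo_position_from_normalized(R_gc, P_c, u_n, v_n):
        """Global X, Y of a target on the Z_G = 0 plane from normalized pixel
        coordinates, the camera-to-global rotation R_gc (3x3), and the camera
        position P_c = [X, Y, Z] in the global frame."""
        denom = R_gc[2, 0] * u_n + R_gc[2, 1] * v_n + R_gc[2, 2]
        z_c = -P_c[2] / denom                                        # Equation 4-19
        x_g = z_c * (R_gc[0, 0] * u_n + R_gc[0, 1] * v_n + R_gc[0, 2]) + P_c[0]
        y_g = z_c * (R_gc[1, 0] * u_n + R_gc[1, 1] * v_n + R_gc[1, 2]) + P_c[1]
        return x_g, y_g, z_c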

Camera Calibration

In order to calculate the normalized pixel coordinates using raw imaging sensor data, a

calibration procedure is performed using a camera calibration toolbox for MATLAB [15]. The

calibration procedure determines the extrinsic and intrinsic parameters of the camera system.

During the calibration procedure, several images are used with checkerboard patterns of specific

size that allow for the different parameters to be estimated as shown in Figure 4-5.

The extrinsic parameters define the position and orientation characteristics of the camera

system. These parameters are affected by the mounting and positioning of the camera relative to

the body fixed coordinate system.




























Figure 4-5. Calibration checkerboard pattern

The intrinsic parameters define the optic projection and perspective characteristics of the

camera system. These parameters are affected by the camera lens properties, imaging sensor

properties, and lens/sensor placement properties. The camera lens properties are generally

characterized by the focal length and prescribed imaging sensor size. The focal length is a

measure of how strongly the lens focuses the light energy. This in essence correlates to the zoom

of the lens given a fixed sensor size and distance. The imaging sensor properties are generally

characterized by the physical size, and horizontal/vertical resolution of the imaging sensor.

These properties help to define the dimensions and geometry of the image pixels. The

lens/sensor placement properties are generally characterized by the misalignment of the lens and

image sensor, and the lens to sensor planar distance. For our analysis we are mostly concerned

with determining the intrinsic parameters of the camera system. These parameters are used for

calculating the normalized pixel coordinates given the raw pixel coordinates.

The intrinsic parameters that are used for generating the normalized pixel coordinates are

the focal length, principal point, skew coefficient, and image distortion coefficients. The focal










length, as described earlier, estimates the linear projection of points observed in space onto the focal

plane. The focal length has components in the x and y axes and does not assume these values are

equal. The principal point estimates the center pixel position. All normalized pixel coordinates

are referenced to this point. The skew coefficient estimates the angle between the x and y axes

of each pixel. In some instances the pixel geometry is not square or even rectangular. This

coefficient describes how "off-square" the pixel x and y axes are and allows for compensation.

The image distortion coefficients estimate the radial and tangential distortions typically caused

by the camera lens. Radial distortion causes a changing magnification effect at varying radial

distances. These effects are apparent when a straight line appears to be curved through the

camera system. The tangential distortions are caused by ill centering or defects of the lens

optics. These cause the displacement of points perpendicular to the radial imaging field.
























Figure 4-6. Calibration images









The camera calibration toolbox allows for all of the intrinsic parameters to be estimated

using several images of the predefined checkerboard pattern. Once the calibration procedure is

completed, the intrinsic parameters are used in the geo-positioning algorithm. Selections of

images were used that captured the checker pattern at different ranges and orientations as shown

in Figure 4-6.

The boundaries of the checker pattern were then selected manually for each image. The

calibration algorithm used the gradient of the pattern to then find all of the vertices of the

checkerboard as shown in Figure 4-7.



















Figure 4-7. Calibration images

Once the boundaries for all of the images were selected, the algorithm calculated the

intrinsic camera parameter estimates using a gradient descent search. Using the selected images

the following parameters were calculated:

Focal length: fc = [1019.52796, 1022.12290] ± [20.11515, 20.62667]

Principal point: cc = [645.66333, 527.72943] ± [13.60462, 10.92129]

Skew: alpha_c = [0.00000] ± [0.00000], giving an angle between the pixel axes of 90.00000 ± 0.00000 degrees

Distortion: kc = [-0.17892, 0.13875, -0.00128, 0.00560, 0.00000] ± [0.01419, 0.02983, 0.00158, 0.00203, 0.00000]

Pixel error: err = [0.22613, 0.14137]
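As an illustration of how these estimated parameters are applied, the sketch below converts a raw pixel coordinate into the normalized coordinates used by the geo-positioning equations. Lens distortion is neglected for brevity, so the result is only approximate away from the principal point; the numeric defaults are the values reported above.

    def normalize_pixel(u_px, v_px, fc=(1019.52796, 1022.12290),
                        cc=(645.66333, 527.72943), alpha_c=0.0):
        """Raw pixel coordinates to normalized pixel coordinates using the
        estimated focal length, principal point, and skew (distortion ignored)."""
        v_n = (v_px - cc[1]) / fc[1]
        u_n = (u_px - cc[0]) / fc[0] - alpha_c * v_n
        return u_n, v_n

    # A detection near the image center maps to small normalized coordinates.
    print(normalize_pixel(700.0, 500.0))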

Geo-Positioning Sensitivity Analysis

In this section the sensitivity of the position solution is derived based on the measurable parameters used in the geo-positioning algorithm. This analysis will show the general sensitivity of the positioning solution, and also the sensitivity at common operating conditions.

Equation 4-15 is used to determine the global position of the target based on the global position and orientation of the camera, and the normalized target pixel coordinates. Multiplying Equation 4-15 through by Z_c produces:

\begin{bmatrix} X_G \\ Y_G \\ 0 \\ 1 \end{bmatrix} = \begin{bmatrix} {}^G R_C & {}^G P_{C_{org}} \\ 0\;\;0\;\;0 & 1 \end{bmatrix} \begin{bmatrix} Z_c u_n \\ Z_c v_n \\ Z_c \\ 1 \end{bmatrix}    (4-22)

Since the geo-positioning coordinates are of primary concern for the sensitivity analysis, Equation 4-22 is reduced to the form:

\begin{bmatrix} X_G \\ Y_G \end{bmatrix} = z_c \begin{bmatrix} R_{11} & R_{12} & R_{13} & {}^G P_{C_{org},x} \\ R_{21} & R_{22} & R_{23} & {}^G P_{C_{org},y} \end{bmatrix} \begin{bmatrix} u_n \\ v_n \\ 1 \\ 1/z_c \end{bmatrix}    (4-23)

Equation 4-23 is rewritten in the form:

\begin{bmatrix} X_G \\ Y_G \end{bmatrix} = z_c A b    (4-24)
where









A = \begin{bmatrix} R_{11} & R_{12} & R_{13} & {}^G P_{C_{org},x} \\ R_{21} & R_{22} & R_{23} & {}^G P_{C_{org},y} \end{bmatrix}    (4-25)

b = \begin{bmatrix} u_n \\ v_n \\ 1 \\ 1/z_c \end{bmatrix}    (4-26)

in which the elements R_{ij} of {}^G R_C are functions of the camera yaw, pitch, and roll angles \phi, \theta, and \psi. The geo-positioning process is modeled by assuming there are some errors in the parameters used in the calculation. The parameter vector is defined below:

\beta = \begin{bmatrix} {}^G P_{C_{org},x} & {}^G P_{C_{org},y} & {}^G P_{C_{org},z} & \phi & \theta & \psi & u_n & v_n \end{bmatrix}^T    (4-27)

The modeled process is shown below:

e = \begin{bmatrix} X_G \\ Y_G \end{bmatrix}_{actual} - \hat{z}_c\,\hat{A}\,\hat{b}    (4-28)

where the hatted quantities are evaluated at the perturbed parameter vector

\beta + \delta\beta = \begin{bmatrix} {}^G P_{C_{org},x} + \delta({}^G P_{C_{org},x}) & \cdots & \phi + \delta(\phi) & \theta + \delta(\theta) & \psi + \delta(\psi) & u_n + \delta(u_n) & v_n + \delta(v_n) \end{bmatrix}^T    (4-29)


The positioning error from Equation 4-28, after using Equation 4-24 to substitute for the actual position, reduces to:

e = z_c A b - \hat{z}_c\,\hat{A}\,\hat{b}    (4-30)

In order to establish a metric from the measurement of the error for the sensitivity analysis, the inner product of Equation 4-30 is used:

e^T e = \left(z_c A b - \hat{z}_c\hat{A}\hat{b}\right)^T \left(z_c A b - \hat{z}_c\hat{A}\hat{b}\right)    (4-31)

Expanding the product gives the generic form of the error variance:

e^T e = z_c^2\, b^T A^T A b - 2 z_c \hat{z}_c\, b^T A^T \hat{A}\hat{b} + \hat{z}_c^2\, \hat{b}^T \hat{A}^T \hat{A}\hat{b}    (4-32)

The partial derivative of the generic error variance with respect to an arbitrary parameter error \delta\xi follows from the chain rule:

\frac{\partial(e^T e)}{\partial\delta\xi} = 2\, e^T \frac{\partial e}{\partial\delta\xi}    (4-33)

In order to reduce the complexity of the analysis and to provide a more concise representation of the effects of the parameter errors on the error variance, and without loss of generality, the target position is set as the origin of the global coordinate system. Equation 4-30 then reduces to

e = z_c A b    (4-34)

where z_c, A, and b are understood to be evaluated with the erroneous parameter values, so that

e^T e = z_c^2\, b^T A^T A b    (4-35)

This quantity equates to the error variance of the positioning solution given the system configuration and error values. It is desirable to determine the effects of errors in each parameter used in the geo-positioning solution; hence the partial derivative of the inner product is calculated with respect to each parameter error. The partial derivative of the reduced error variance is shown for an arbitrary parameter \xi:

\frac{\partial(e^T e)}{\partial\delta\xi} = 2 z_c \frac{\partial z_c}{\partial\delta\xi}\, b^T A^T A b + z_c^2 \frac{\partial b^T}{\partial\delta\xi} A^T A b + z_c^2\, b^T \frac{\partial A^T}{\partial\delta\xi} A b + z_c^2\, b^T A^T \frac{\partial A}{\partial\delta\xi} b + z_c^2\, b^T A^T A \frac{\partial b}{\partial\delta\xi}    (4-36)








Equation 4-25 is restated below along with its partial derivatives with respect to {}^G P_{C_{org},x}, {}^G P_{C_{org},y}, {}^G P_{C_{org},z}, \phi, \theta, \psi, u_n, and v_n (Equations 4-37 through 4-44):

A = \begin{bmatrix} R_{11} & R_{12} & R_{13} & {}^G P_{C_{org},x} \\ R_{21} & R_{22} & R_{23} & {}^G P_{C_{org},y} \end{bmatrix}    (4-25)

\frac{\partial A}{\partial\delta({}^G P_{C_{org},x})} = \begin{bmatrix} 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \end{bmatrix}    (4-37)

\frac{\partial A}{\partial\delta({}^G P_{C_{org},y})} = \begin{bmatrix} 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}    (4-38)

\frac{\partial A}{\partial\delta({}^G P_{C_{org},z})} = 0    (4-39)

\frac{\partial A}{\partial\delta(\phi)} = \begin{bmatrix} \partial R_{11}/\partial\phi & \partial R_{12}/\partial\phi & \partial R_{13}/\partial\phi & 0 \\ \partial R_{21}/\partial\phi & \partial R_{22}/\partial\phi & \partial R_{23}/\partial\phi & 0 \end{bmatrix}    (4-40)

with the corresponding forms for \theta and \psi in Equations 4-41 and 4-42, obtained by differentiating the elements of {}^G R_C with respect to each angle, and

\frac{\partial A}{\partial\delta(u_n)} = \frac{\partial A}{\partial\delta(v_n)} = 0    (4-43, 4-44)

Equation 4-26 is restated below along with its partial derivatives with respect to the same parameters (Equations 4-45 through 4-53):

b = \begin{bmatrix} u_n \\ v_n \\ 1 \\ 1/z_c \end{bmatrix}    (4-26)

Because the last element of b is 1/z_c, and z_c depends on the camera position, orientation, and normalized pixel coordinates through Equation 4-19, each of these partial derivatives has the form

\frac{\partial b}{\partial\delta\xi} = \begin{bmatrix} \partial u_n/\partial\delta\xi \\ \partial v_n/\partial\delta\xi \\ 0 \\ -\dfrac{1}{z_c^2}\dfrac{\partial z_c}{\partial\delta\xi} \end{bmatrix}

where the first two entries are nonzero only for \xi = u_n and \xi = v_n, respectively.

Equation 4-19 is restated below along with the partial derivatives of z_c with respect to each parameter (Equations 4-54 through 4-62):

z_c = -\frac{{}^G P_{C_{org},z}}{R_{31} u_n + R_{32} v_n + R_{33}}    (4-54)

\frac{\partial z_c}{\partial\delta({}^G P_{C_{org},x})} = 0    (4-55)

\frac{\partial z_c}{\partial\delta({}^G P_{C_{org},y})} = 0    (4-56)

\frac{\partial z_c}{\partial\delta({}^G P_{C_{org},z})} = -\frac{1}{R_{31} u_n + R_{32} v_n + R_{33}}    (4-57)

\frac{\partial z_c}{\partial\delta(u_n)} = \frac{{}^G P_{C_{org},z}\, R_{31}}{(R_{31} u_n + R_{32} v_n + R_{33})^2}    (4-58)

\frac{\partial z_c}{\partial\delta(v_n)} = \frac{{}^G P_{C_{org},z}\, R_{32}}{(R_{31} u_n + R_{32} v_n + R_{33})^2}    (4-59)

\frac{\partial z_c}{\partial\delta(\phi)} = \frac{{}^G P_{C_{org},z}\left(u_n\,\partial R_{31}/\partial\phi + v_n\,\partial R_{32}/\partial\phi + \partial R_{33}/\partial\phi\right)}{(R_{31} u_n + R_{32} v_n + R_{33})^2}    (4-60)

with the corresponding forms for \theta and \psi in Equations 4-61 and 4-62.

With the partial derivatives for the components of the error inner product defined, the sensitivity of the error can be quantified for each parameter. Hence the sensitivity of the error variance can be derived with respect to each parameter error.

The error sensitivity with respect to {}^G P_{C_{org},x} is obtained by substituting \xi = {}^G P_{C_{org},x} into Equation 4-36 and is shown in Equation 4-63:

\frac{\partial(e^T e)}{\partial\delta({}^G P_{C_{org},x})} = 2 z_c \frac{\partial z_c}{\partial\delta({}^G P_{C_{org},x})}\, b^T A^T A b + z_c^2 \frac{\partial b^T}{\partial\delta({}^G P_{C_{org},x})} A^T A b + z_c^2\, b^T \frac{\partial A^T}{\partial\delta({}^G P_{C_{org},x})} A b + z_c^2\, b^T A^T \frac{\partial A}{\partial\delta({}^G P_{C_{org},x})} b + z_c^2\, b^T A^T A \frac{\partial b}{\partial\delta({}^G P_{C_{org},x})}    (4-63)

Substituting the partial derivatives from Equations 4-37 through 4-62 gives the explicit form of this sensitivity (Equation 4-64). The error sensitivities with respect to {}^G P_{C_{org},y}, {}^G P_{C_{org},z}, \phi, \theta, \psi, u_n, and v_n (Equations 4-65 through 4-78) are obtained in exactly the same manner: the general form for each parameter follows by substituting that parameter for \xi in Equation 4-36, and the explicit form follows by substituting the corresponding partial derivatives of A, b, and z_c.

This derivation provides the general sensitivity equations for target geo-positioning from a UAV. These equations provide the basis for the sensitivity analysis conducted in the following chapters. These results will be combined with empirically derived sensor data to determine the significance of each parameter relative to the induced geo-positioning error.
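The analytical sensitivities above can also be cross-checked numerically. The sketch below perturbs each parameter of the geo-positioning solution with zero-mean Gaussian noise and measures the resulting spread of the computed target position, which is the Monte Carlo counterpart of the error variance; the pose values, noise levels, and rotation convention are illustrative assumptions rather than the empirically derived values used later.

    import numpy as np

    def euler_to_R(phi, theta, psi):
        """Camera-to-global rotation from yaw (phi), pitch (theta), and roll
        (psi); a standard Z-Y-X composition is assumed for illustration."""
        cz, sz = np.cos(phi), np.sin(phi)
        cy, sy = np.cos(theta), np.sin(theta)
        cx, sx = np.cos(psi), np.sin(psi)
        Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
        Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
        Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
        return Rz @ Ry @ Rx

    def target_xy(params):
        """Equations 4-19 to 4-21 as a function of the parameter vector
        [Px, Py, Pz, phi, theta, psi, u_n, v_n]."""
        px, py, pz, phi, theta, psi, u_n, v_n = params
        R = euler_to_R(phi, theta, psi)
        z_c = -pz / (R[2, 0] * u_n + R[2, 1] * v_n + R[2, 2])
        x_g = z_c * (R[0, 0] * u_n + R[0, 1] * v_n + R[0, 2]) + px
        y_g = z_c * (R[1, 0] * u_n + R[1, 1] * v_n + R[1, 2]) + py
        return np.array([x_g, y_g])

    rng = np.random.default_rng(1)
    # Camera 30 m above the ground in a z-down global frame (assumed).
    nominal = np.array([0.0, 0.0, -30.0, 0.2, 0.05, 0.02, 0.10, -0.05])
    sigmas = np.array([0.5, 0.5, 0.5, 0.01, 0.01, 0.01, 0.002, 0.002])
    names = ["Px", "Py", "Pz", "phi", "theta", "psi", "u_n", "v_n"]
    truth = target_xy(nominal)
    for i, name in enumerate(names):
        errors = []
        for _ in range(2000):
            p = nominal.copy()
            p[i] += rng.normal(0.0, sigmas[i])       # perturb one parameter at a time
            errors.append(np.sum((target_xy(p) - truth) ** 2))
        print(name, "mean squared position error:", np.mean(errors))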









CHAPTER 5
UNMANNED ROTORCRAFT MODELING

In order to derive the equations of motion of the aircraft and to perform further analysis, an

aircraft model was developed based on previous work [7,8,9,10,16,17]. For this research the scope of the rotorcraft mechanics was limited to Bell-Hiller mixing and a flapping rotor head

design. A simplified aircraft model was developed previously [16, 17] for simulation and

controller development. A similar approach will be used here for the derivations.

Mettler et al. [7,8,9] use more complex analysis when deriving their dynamic equations.

Their analysis includes more complex dynamic factors such as fly-bar paddle mixing, main blade

drag/torque effects, and fuselage/stabilizer aerodynamic effects.

The actuator inputs commonly used for control of RC rotorcraft are composed of:

\delta_{lon}: Longitudinal cyclic control

\delta_{lat}: Lateral cyclic control

\delta_{col}: Collective pitch control

\delta_{rud}: Tail rudder pitch control

\delta_{thr}: Throttle control

A body fixed coordinate system was used in order to relate sensor and motion information in the inertial and relative reference frames. Figures 5-1 and 5-2 show the body fixed coordinate system.

A transformation matrix was derived which relates the position and orientation of the body fixed frame to the inertial frame. The orientation of the body fixed frame is related to the inertial frame using a 3-1-2 rotation sequence. The inertial frame is initially in the North-East-Down orientation. The coordinate system undergoes a rotation \psi about the Z axis, then a rotation \phi









about the X' axis, and then a rotation \theta about the Y'' axis. The compound rotation is equated below in Equation 5-1 and the subsequent rotations are shown in Equations 5-2, 5-3, and 5-4.





















Figure 5-1. Top view of the body fixed coordinate system










Figure 5-2. Side view of the body fixed coordinate system


{}^1R_4 = {}^1R_2\,{}^2R_3\,{}^3R_4    (5-1)

{}^1R_2 = \begin{bmatrix} \cos\psi & \sin\psi & 0 \\ -\sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{bmatrix}    (5-2)

{}^2R_3 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\phi & \sin\phi \\ 0 & -\sin\phi & \cos\phi \end{bmatrix}    (5-3)

{}^3R_4 = \begin{bmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{bmatrix}    (5-4)

The final compound rotation matrix is equated below in Equation 5-4.

{}^1R_4 = \begin{bmatrix} C_\psi C_\theta + S_\psi S_\phi S_\theta & S_\psi C_\theta - C_\psi S_\phi S_\theta & S_\theta C_\phi \\ -C_\phi S_\psi & C_\phi C_\psi & S_\phi \\ S_\psi S_\phi C_\theta - C_\psi S_\theta & -S_\psi S_\theta - C_\psi S_\phi C_\theta & C_\theta C_\phi \end{bmatrix}    (5-4)

where the notation C_i and S_i represent the cosine and sine of the angle i, respectively.

The transformation matrix which converts a point measured in the body fixed frame to the point measured in the inertial fixed frame is shown in Equation 5-5.

{}^{inertial}T_{body} = \begin{bmatrix} {}^{inertial}R_{body} & {}^{inertial}P_{body_{org}} \\ 0\;\;0\;\;0 & 1 \end{bmatrix}    (5-5)

where {}^{inertial}P_{body_{org}} represents the position of the body fixed frame origin measured in the inertial frame.
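The sketch below assembles this 3-1-2 rotation sequence and the corresponding homogeneous transformation numerically. The elemental sign conventions are the ones assumed in the matrices above; depending on the autopilot and sensor conventions they may need to be transposed or have signs flipped.

    import numpy as np

    def R_z(psi):
        return np.array([[np.cos(psi), np.sin(psi), 0.0],
                         [-np.sin(psi), np.cos(psi), 0.0],
                         [0.0, 0.0, 1.0]])

    def R_x(phi):
        return np.array([[1.0, 0.0, 0.0],
                         [0.0, np.cos(phi), np.sin(phi)],
                         [0.0, -np.sin(phi), np.cos(phi)]])

    def R_y(theta):
        return np.array([[np.cos(theta), 0.0, np.sin(theta)],
                         [0.0, 1.0, 0.0],
                         [-np.sin(theta), 0.0, np.cos(theta)]])

    def body_to_inertial_T(psi, phi, theta, p_body_org):
        """Homogeneous transform taking points measured in the body-fixed
        frame into the inertial frame for a 3-1-2 (Z, then X', then Y'')
        rotation sequence."""
        R_inertial_body = (R_y(theta) @ R_x(phi) @ R_z(psi)).T
        T = np.eye(4)
        T[:3, :3] = R_inertial_body
        T[:3, 3] = p_body_org        # body-frame origin expressed in the inertial frame
        return T

    # Example: express a point measured 1 m ahead of the body origin in the inertial frame.
    T = body_to_inertial_T(psi=0.4, phi=0.05, theta=0.10,
                           p_body_org=np.array([100.0, 50.0, -20.0]))
    p_inertial = T @ np.array([1.0, 0.0, 0.0, 1.0])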


The lateral and longitudinal motion of the aircraft is primarily controlled by the lateral and

longitudinal cyclic control inputs. For a flapping rotor head, the motions of the main rotor blades

form a disk whose orientation with respect to the airframe is controlled by these inputs. The

orientation of the main rotor disk is illustrated in Figure 5-3:

In this analysis, a represents the lateral rotation of the main rotor blade disk and b

represents the longitudinal rotation of the main rotor blade disk. In a report by Heffley and

Munich [17], motion of the main rotor disc is approximated by a first order system as shown

below:
















Figure 5-3. Main rotor blade angle (front and port side views)


1;,O ir~0 (ah am 0 d'"' (5-6)



where z_{lat} is the lateral cyclic damping coefficient and z_{lon} is the longitudinal cyclic damping coefficient.


The angular velocity as measured in the body fixed frame B can be translated into angular

velocity in the inertial frame by using Equation 5-4:

{}^B R_G = {}^1R_4, \qquad {}^G R_B = ({}^B R_G)^T    (5-7)

{}^G\omega = {}^G R_B\,{}^B\omega










Figure 5-4. Main rotor thrust vector (front and port side views)









The main rotor induces a moment and linear force on the body of the aircraft. These

induce lateral and longitudinal motion, and roll and pitch rotations of the aircraft. The main rotor

thrust vector Tm is illustrated in Figure 5-4:

The main rotor thrust vector as measured in the body fixed frame is:

{}^B T_M = T_M \begin{bmatrix} -\sin(b) \\ \sin(a) \\ -\cos(a)\cos(b) \end{bmatrix}    (5-7)



The equations of motion of the aircraft were derived in the inertial frame using the following Newton-Euler equations:

$$
\sum {}^{G}F = m\,{}^{G}a, \qquad
\sum {}^{G}M = \left({}^{G}R^{B}\,{}^{B}I\,{}^{B}R^{G}\right){}^{G}\dot{\omega}
\tag{5-9}
$$

This derivation has resulted in a simplified helicopter dynamic model. This model

provides a foundation for simulation of the aircraft in the absence of an experimental platform.

This derivation was performed to provide the reader with basic helicopter dynamic principles

and an introduction to helicopter control mechanics. Now that a background on helicopter

mechanics and dynamics has been presented, the next chapter will discuss the use of onboard

sensors and signal processing for aircraft state estimation.









CHAPTER 6
STATE ESTIMATION USING ONBOARD SENSORS

This research proposes to derive and demonstrate the estimation of UGV states using a

UAV. In order to estimate the UGV states, the estimates of the UAV states are required. In this

research, sensor measurements from the UAV will be used to perform the state estimation of the UAV and UGV. This research is primarily concerned with developing a remote sensing system. Its novelty lies in how the UAV dynamics and state measurements are used to passively determine the states of the UGV.

Attitude Estimation Using Accelerometer Measurements

As discussed earlier, a two or three axis accelerometer can be used for determining the

attitude of an aircraft. A simple equation for determining the roll and pitch angles of an aircraft

using the acceleration measurements in the x and y body fixed axes is shown in Equations 6-1 and 6-2.


$$
\phi_{roll} = \sin^{-1}\!\left(\frac{a_y}{g}\right)
\tag{6-1}
$$

$$
\theta_{pitch} = \sin^{-1}\!\left(\frac{a_x}{g}\right)
\tag{6-2}
$$

where $a_x$ and $a_y$ are the measured accelerations in the body fixed x and y axes and $g$ is the gravitational acceleration.
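A minimal sketch of Equations 6-1 and 6-2 is shown below. It assumes a quasi-static vehicle so the accelerometer senses only gravity; the sign convention and function name are illustrative rather than taken from the onboard code.

```python
import numpy as np

def roll_pitch_from_accel(ax, ay, g=9.81):
    """Estimate roll and pitch (radians) from body-axis accelerations ax, ay (m/s^2),
    assuming the vehicle is not accelerating so only gravity is sensed."""
    # Clip so that sensor noise cannot push the argument outside the domain of arcsin.
    roll = np.arcsin(np.clip(ay / g, -1.0, 1.0))     # Equation 6-1
    pitch = np.arcsin(np.clip(ax / g, -1.0, 1.0))    # Equation 6-2
    return roll, pitch

roll, pitch = roll_pitch_from_accel(ax=1.2, ay=-0.5)
print(np.rad2deg(roll), np.rad2deg(pitch))
```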

A major problem with using accelerometers for attitude estimation is the effect of high

frequency vibration inherent to rotary wing aircraft. There are several characteristic frequencies

in the rotorcraft system to consider when analyzing the accelerometer signals. The main

characteristic frequencies are the speed of the main rotor blades, tail rotor blades, and

engine/motor. The highest frequency vibration will come from the engine/motor. The main gear

of the power transmission reduces the frequency to the main rotor head by about a factor of 9.8

for the Gas Xcell helicopter. The frequency is then further reduced by the tail rotor transmission









to the tail rotor blades. Any imbalances in the motor/motor fan, transmission gears, and rotor

heads/blades can cause significant vibration. Also any bent or misaligned shafts can cause

vibration in the system.

Due to the speed and number of moving parts in a helicopter, these aircraft have significant

vibration at the engine, main and tail rotor frequencies and harmonics. Extreme care must be

taken to ensure balance and proper alignment of all elements of the drive train. Time taken

balancing and inspecting components can pay off in the long run in system performance. The

airframe and payload structure must be carefully considered. Due to the energy content at

specific frequencies, any structural element with a natural frequency at or around the engine, or

main/tail rotor frequencies or harmonics could produce disastrous effects. Rigid mounting of

payload is highly discouraged as there would be no element other than the aircraft structure to

dissipate the cyclic loading.

Prospective researchers are forewarned that small and large unmanned aircraft systems

should be treated like any other piece of heavy machinery. In this case the payload was rigidly

attached to the base of the aircraft frame. Upon spool-up of the engine the head speed

transitioned into the natural frequency of the airframe with the most flexible component of the

system being the side frames of the aircraft. In less than a second the aircraft entered a resonant

vibration mode. This resulted in a tail-boom strike by the main blades. The main shaft shattered, projecting the upper main bearing block into the pilot standing over thirty feet away. Airframe resonance is particularly dangerous in all rotary aircraft, from small

unmanned systems to large heavy lift commercial and military helicopters.

A Fast Fourier Transform (FFT) of the accelerometer measurements shows distinct spikes on all axes at specific frequencies, as shown in Figure 6-1.
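The spectral check described above can be reproduced with a few lines of NumPy; the sample rate and the synthetic vibration used here are assumed values, not the actual logged data.

```python
import numpy as np

def accel_spectrum(samples, sample_rate_hz):
    """Return (frequencies, magnitudes) of one accelerometer channel using a real FFT."""
    samples = np.asarray(samples, dtype=float)
    samples = samples - samples.mean()                  # drop the DC component
    magnitudes = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(samples.size, d=1.0 / sample_rate_hz)
    return freqs, magnitudes

# Synthetic example: an 18 Hz vibration plus noise, sampled at an assumed 50 Hz.
fs = 50.0
t = np.arange(0.0, 20.0, 1.0 / fs)
accel_y = 0.3 * np.sin(2 * np.pi * 18.0 * t) + 0.05 * np.random.randn(t.size)
freqs, mags = accel_spectrum(accel_y, fs)
print(freqs[np.argmax(mags)])                           # dominant vibration frequency (~18 Hz)
```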
























Figure 6-1. Fast Fourier Transform of raw accelerometer data




Figure 6-2. Fast Fourier Transform of raw accelerometer data after low-pass filter

Strategic filtering at the major vibration frequencies can improve the attitude estimates


while still allowing for the aircraft dynamics to be measured. Also by attenuating only specific










frequency bands, the noise can be reduced while still producing a fast signal response. A discrete low-pass Butterworth IIR filter was used, with a 5 Hz passband and a 10 Hz stopband, which filtered the high frequency noise evident between 15 and 25 Hz. The FFT of the accelerometer data after applying the low-pass filter is shown in Figure 6-2.
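A sketch of how such a filter could be designed and applied with SciPy is shown below. The sample rate, attenuation targets, and the use of `buttord`/`filtfilt` are assumptions standing in for whatever discrete implementation ran onboard; `filtfilt` in particular is an offline, zero-phase variant.

```python
import numpy as np
from scipy import signal

fs = 50.0                                   # assumed accelerometer sample rate (Hz)
nyq = fs / 2.0

# Lowest-order Butterworth design meeting a 5 Hz passband / 10 Hz stopband specification.
order, wn = signal.buttord(wp=5.0 / nyq, ws=10.0 / nyq, gpass=1.0, gstop=40.0)
b, a = signal.butter(order, wn, btype='low')

def lowpass(samples):
    """Zero-phase low-pass filtering of one accelerometer channel (offline analysis)."""
    return signal.filtfilt(b, a, samples)

# Demonstration: a slow attitude motion corrupted by an 18 Hz vibration component.
t = np.arange(0.0, 10.0, 1.0 / fs)
clean = np.sin(2 * np.pi * 0.5 * t)
raw = clean + 0.3 * np.sin(2 * np.pi * 18.0 * t)
filtered = lowpass(raw)
print(np.std(raw - clean), np.std(filtered - clean))    # residual error before vs. after filtering
```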

The FFT of the raw accelerometer data shows that the high frequency noise is attenuated

beyond 5 Hz, thereby eliminating the major effects caused by the power train or high frequency

electrical interference. Before the low-pass filter was applied, the roll and pitch measurements

were almost unusable as shown in Figure 6-3.


Figure 6-3. Roll and pitch measurements prior to applying the low-pass filter

























After the low-pass filter was applied, the measurements produced much more viable results


as shown in Figure 6-4.



Figure 6-4. Roll and pitch measurements after applying the low-pass filter


These results indicate the importance of proper vehicle maintenance and assembly. More


rigorous balancing and tuning of the vehicle can produce much better system performance and


reduce the work required to compensate for vibration in sensor data.


Heading Estimation Using Magnetometer Measurements


The heading of unmanned ground and air vehicles is commonly estimated by measuring the local magnetic field of the earth. Magnetic north, or the compass bearing, has been used for hundreds of years for navigation and mapping. By measuring the local magnetic field, an estimate of the northern magnetic field vector can be obtained. The error between true north, as measured relative to latitude and longitude, and magnetic north varies depending on the location on the globe. For a given location, this variation between true and magnetic north is known and can be compensated for. Alternative methods for determining the

heading of unmanned systems exist including using highly accurate rate gyros. By precisely


measuring the angular rate of a static vehicle, the angular rate induced from the rotation of the


earth can be used to estimate heading. This requires extremely high precision rate gyros which


are currently too expensive, large, and sensitive for small unmanned systems.


Normally all three axes of the magnetometer would be used for heading estimation, but because the aircraft does not perform any radical roll or pitch maneuvers, only the lateral and longitudinal magnetometer measurements are required, as shown in Figure 6-5.
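A minimal sketch of the two-axis heading computation is shown below, assuming a level vehicle and a NED-style body frame; the declination argument and the example field values are placeholders.

```python
import numpy as np

def heading_from_mag(mx, my, declination_deg=0.0):
    """Heading in degrees (0 = north, increasing clockwise) from the longitudinal (mx)
    and lateral (my) magnetic field components of a level vehicle."""
    heading = np.degrees(np.arctan2(-my, mx)) + declination_deg
    return (heading + 360.0) % 360.0

print(heading_from_mag(mx=0.2, my=-0.2))    # -> 45.0 (roughly north-east)
```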


x 10"
Lateral Magnetic Field Measurement
14-




0 50 100 150 200 250 300 350
x 10" time (s)
Longitudinal Magnetic Field Measurement


$2

0 50 100 150 200 250 300 350
time (s)
100
Heading Estimate



~-100

-200
0 50 100 150 200 250 300 350
time (s)



Figure 6-5. Magnetic heading estimate









UGV State Estimation

The geo-positioning equations derived in the previous chapters are restated below:


$$
\begin{bmatrix} {}^{G}P_{obj_x}\\ {}^{G}P_{obj_y} \end{bmatrix} = z_c\,A\,b
\tag{6-3}
$$

where

$$
A =
\begin{bmatrix}
C_\theta C_\psi + S_\theta S_\phi S_\psi & C_\theta S_\psi - S_\theta S_\phi C_\psi & S_\theta C_\phi & {}^{G}P_{co_x}\\
-C_\phi S_\psi & C_\phi C_\psi & S_\phi & {}^{G}P_{co_y}
\end{bmatrix}
\tag{6-4}
$$

$$
b = \begin{bmatrix} u_n\\ v_n\\ 1\\ 1/z_c \end{bmatrix}
\tag{6-5}
$$

$$
z_c = \frac{{}^{G}P_{co_z}}{R_{31}\,u_n + R_{32}\,v_n + R_{33}}
\tag{6-6}
$$

By identifying two unique points fixed to the UGV, the direction vector can be defined:

$$
\begin{bmatrix} \cos(\psi)\\ \sin(\psi) \end{bmatrix}
= \frac{1}{\sqrt{(x_{G2}-x_{G1})^2 + (y_{G2}-y_{G1})^2}}
\begin{bmatrix} x_{G2}-x_{G1}\\ y_{G2}-y_{G1} \end{bmatrix}
\tag{6-7}
$$

The heading of the vehicle can be found using:

$$
\psi = \operatorname{atan2}\!\left(\sin(\psi), \cos(\psi)\right)
\tag{6-8}
$$
The kinematic motion of the vehicle can be described by the linear and angular velocity terms. In the 2D case, the UGV is constrained to move in the x-y plane with only a z component in the angular velocity vector. Hence the kinematic model is shown below:

$$
\begin{bmatrix} \dot{x}\\ \dot{y}\\ \dot{\psi} \end{bmatrix}
=
\begin{bmatrix} v\cos(\psi)\\ v\sin(\psi)\\ \dot{\psi} \end{bmatrix}
=
\begin{bmatrix} \cos(\psi) & 0\\ \sin(\psi) & 0\\ 0 & 1 \end{bmatrix}
\begin{bmatrix} v\\ \dot{\psi} \end{bmatrix}
\tag{6-9}
$$








In [27] the researchers define the kinematic equations for an Ackermann style UGV. Using these equations, the kinematic equations are restated using our notation:

$$
\begin{bmatrix} \dot{x}\\ \dot{y} \end{bmatrix}
=
\begin{bmatrix}
\cos(\psi) & -\frac{L}{2}\sin(\psi)\\
\sin(\psi) & \frac{L}{2}\cos(\psi)
\end{bmatrix}
\begin{bmatrix} v\\ \dot{\psi} \end{bmatrix}
\tag{6-10}
$$

Equation 6-10 follows the structure outlined in [28] and is rewritten in the form:

$$
z = H x + \vartheta =
\begin{bmatrix}
\cos(\psi) & -\frac{L}{2}\sin(\psi)\\
\sin(\psi) & \frac{L}{2}\cos(\psi)
\end{bmatrix} x + \vartheta
\tag{6-11}
$$

where $z$ is the measurement vector, $x$ is the state vector, and $\vartheta$ is the additive measurement error. The measurement error can be isolated and the squared error can be written in the form:

$$
J = \vartheta^{T}\vartheta = \left(z - Hx\right)^{T}\left(z - Hx\right)
\tag{6-12}
$$

The measurement estimate is written in the form:

$$
\hat{z} = H\hat{x}
\tag{6-13}
$$

Hence the sum of the squares of the measurement variations $z - \hat{z}$ is represented by:

$$
J = \left(z - H\hat{x}\right)^{T}\left(z - H\hat{x}\right)
\tag{6-14}
$$

The sum of the squares of the measurement variations is minimized with respect to the state estimate as shown:

$$
\begin{aligned}
\frac{\partial J}{\partial \hat{x}} &= \frac{\partial}{\partial \hat{x}}\left[\left(z - H\hat{x}\right)^{T}\left(z - H\hat{x}\right)\right] = 0\\
0 &= \left(-H\right)^{T}\left(z - H\hat{x}\right) + \left(z - H\hat{x}\right)^{T}\left(-H\right)\\
0 &= -H^{T}z + H^{T}H\hat{x} - z^{T}H + \hat{x}^{T}H^{T}H\\
0 &= -2H^{T}z + 2H^{T}H\hat{x}\\
\hat{x} &= \left(H^{T}H\right)^{-1}H^{T}z
\end{aligned}
\tag{6-15}
$$
Therefore the state estimate can be expressed as:









$$
\hat{x} =
\left(
\begin{bmatrix}
\cos(\psi) & \sin(\psi)\\
-\frac{L}{2}\sin(\psi) & \frac{L}{2}\cos(\psi)
\end{bmatrix}
\begin{bmatrix}
\cos(\psi) & -\frac{L}{2}\sin(\psi)\\
\sin(\psi) & \frac{L}{2}\cos(\psi)
\end{bmatrix}
\right)^{-1}
\begin{bmatrix}
\cos(\psi) & \sin(\psi)\\
-\frac{L}{2}\sin(\psi) & \frac{L}{2}\cos(\psi)
\end{bmatrix}
z
=
\begin{bmatrix}
1 & 0\\
0 & \frac{L^{2}}{4}
\end{bmatrix}^{-1}
\begin{bmatrix}
\cos(\psi) & \sin(\psi)\\
-\frac{L}{2}\sin(\psi) & \frac{L}{2}\cos(\psi)
\end{bmatrix}
z
\tag{6-16}
$$

Equation 6-16 can be rewritten in the form:

$$
\begin{bmatrix} \hat{v}\\ \hat{\dot{\psi}} \end{bmatrix}
=
\begin{bmatrix}
\cos(\psi) & \sin(\psi)\\
-\frac{2}{L}\sin(\psi) & \frac{2}{L}\cos(\psi)
\end{bmatrix}
\begin{bmatrix} \dot{x}\\ \dot{y} \end{bmatrix}
\tag{6-17}
$$
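A small numerical check of Equations 6-11 through 6-17 is sketched below. It estimates the UGV speed and yaw rate from a simulated measurement; the value of L, the heading, and the measurement itself are made up for illustration.

```python
import numpy as np

def estimate_ugv_state(z, psi, L):
    """Least-squares estimate x_hat = (H^T H)^-1 H^T z of [v, psi_dot] from the measured
    planar velocity z = [x_dot, y_dot], using the H matrix of Equation 6-11."""
    H = np.array([[np.cos(psi), -(L / 2.0) * np.sin(psi)],
                  [np.sin(psi),  (L / 2.0) * np.cos(psi)]])
    x_hat, *_ = np.linalg.lstsq(H, z, rcond=None)   # numerically safer than forming the inverse
    return x_hat

L = 0.6                                  # assumed vehicle length parameter (m)
psi = np.deg2rad(30.0)
v_true, psidot_true = 1.5, 0.4
z = np.array([v_true * np.cos(psi) - (L / 2) * psidot_true * np.sin(psi),
              v_true * np.sin(psi) + (L / 2) * psidot_true * np.cos(psi)])
print(estimate_ugv_state(z, psi, L))     # recovers approximately [1.5, 0.4]
```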
This chapter has discussed the use of onboard sensors for aircraft and ground vehicle state

estimation. These techniques will be used in the following chapter to determine the sensor noise

models for the sensitivity analysis. They will also be used for the validation of the geo-

positioning algorithm and comparison with simulation results.









CHAPTER 7
RESULTS

This chapter presents the results derived from the experiments performed using the

experimental aircraft and payload systems. It also presents the results of applying this research to several engineering problems.

Geo-Positioning Sensitivity Analysis

The sensitivity of the error variance to the errors of each parameter is highly coupled with

the other parameters in the positioning solution. The mapping of the error variance sensitivity to

the parameters is highly nonlinear. In order to achieve a concise qualitative representation of the

effects of each parameter's error in the positioning solution, a localized sensitivity analysis is

performed. This entails using common operating parameters and experimentally observed values

for the parameter errors. When substituted into the error variance partial differential equations,

the respective sensitivity of each parameter is observed for common testing conditions.

The error models for the various geo-positioning parameters were obtained using

manufacturer specifications and empirically derived noise measurements. The main aircraft used

for empirical noise analysis was the Miniature Aircraft Gas Xcell platform. This aircraft was

equipped with the testing payload as discussed previously, and sensor measurements were recorded with the aircraft on the ground but with the engine on and the head speed just below takeoff speed.

The sensor used for measuring the global position of the camera was a WAAS enabled

Garmin 16A model GPS. This GPS provides a 5 Hz positioning solution. The manufacturer

specifications for horizontal and vertical positioning accuracy are less than 3 meters. For the

sensitivity analysis the lateral and longitudinal error distribution was defined using a uniform











radial error distribution bounded by a three meter range. The error distribution parameters for


the horizontal and vertical positioning measurements are stated in Table 7-1.


Parameter                  Value
σ(GPco,x), σ(GPco,y)       3 m
σ(GPco,z)                  1.5 m

Table 7-1. Parameter standard deviations for the horizontal and vertical position



The sensor used for measuring the orientation of the camera was a Microstrain 3DMG


orientation sensor. This sensor provides three axis measurements of linear acceleration, angular


rate, and magnetic field. The sensor measurements used for determining the roll and pitch angle

of the camera were the lateral and longitudinal linear accelerations. The roll and pitch angles


were calculated using these measurements as described previously. The roll and pitch


measurements used for defining the error distribution are shown in Figure 7-1.



























































Parameter   Value
σ(φ)        4.4°
σ(θ)        6.8°
σ(ψ)        0.9°

Table 7-2. Parameter standard deviations for the roll, pitch, and yaw angles


Figure 7-1. Roll and Pitch measurements used for defining error distribution

The Microstrain 3DMG contains a three axis magnetometer for estimating vehicle heading.

The measurements made to estimate the heading error distribution for the sensitivity analysis are shown in Figure 7-2.




Figure 7-2. Heading measurements used for defining error distribution

Using this data set, the standard deviations for the roll, pitch, and yaw angles were calculated and are shown in Table 7-2.



























































The error distributions for the normalized pixel coordinates were calculated using a series


of images of a triangular placard taken from various elevations as shown in Figure 7-3.


Figure 7-3. Image of triangular placard used for geo-positioning experiments




Figure 7-4. Results of x and y pixel error calculations












It was difficult to quantify the expected error distribution for the normalized pixel

coordinates. The error distributions for the x and y components of the normalized pixel

coordinates were estimated by comparing the detected vertex points of the placard with the

calculated centroid of the volume. The resulting variation is shown in Figure 7-4. The pixel

errors were then converted to normalized pixel errors and are shown in Table 7-3.

Parameter   Value
σ(u_n)      0.0021
σ(v_n)      0.0070

Table 7-3. Normalized pixel coordinate standard deviations used during sensitivity analysis



A summary of the parameters of the sensor error distributions that are used in the following sensitivity analysis is shown in Table 7-4.

Parameter                  Value
σ(GPco,x), σ(GPco,y)       3 m
σ(GPco,z)                  1.5 m
σ(φ)                       4.4°
σ(θ)                       6.8°
σ(ψ)                       0.9°
σ(u_n)                     0.0021
σ(v_n)                     0.0070

Table 7-4. Parameter standard deviations used during sensitivity analysis



The Monte Carlo method was used to evaluate each sensitivity equation. In order to

demonstrate the significance of each parameter in the Monte Carlo analysis, each parameter is

perturbed by a uniform error distribution based on experimentally derived measurements. This










analysis seeks to show the difference between the positioning errors attributable to each varying parameter. The key element of this analysis is that the error sensitivity for each parameter is

calculated including errors from other parameters. This allows for the nonlinear and coupled

relationship between the parameters to propagate through the sensitivity analysis. The results of

this analysis determine the rank of the dominance of each parameter in causing positioning error.

The error sensitivity is used in the subsequent analysis and is restated in Equation 7-1.

$$
S_{i} = \frac{\partial \bar{e}^{2}}{\partial p_{i}}
\tag{7-1}
$$


The error sensitivity is evaluated using the common parameter values perturbed by a

uniform error distribution. The range of the error distribution is defined using experimentally

derived data. A uniform distribution was chosen instead of a normal distribution for the Monte

Carlo simulation. It was found that the normal distribution took too long to converge during

testing. The normal distribution also had a larger search space, which, combined with the nonlinear coupling between the parameters, made the processing times unmanageable. The uniform distribution provides firm limits on the error distribution and allows the search space to be traversed quickly. This provided a quick yet useful analysis. In order to quantify the errors in

position attributable to each parameter, Equation 7-1 was modified as shown in Equation 7-2.





where p : parameter vector with all elements perturbed by the associated uniform error distributions

p̃_i : parameter vector with all elements except the i-th perturbed by the associated uniform error distributions

e_i : parameter error.
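The per-parameter Monte Carlo comparison described above can be sketched as follows. The `geo_position` function is only a flat-ground stand-in for the full geo-positioning solution of Equations 6-3 through 6-6, the bounds follow Table 7-4, and the names and sample count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Nominal operating point (Equation 7-3) and uniform error bounds (Table 7-4).
NOMINAL = dict(px=0.0, py=0.0, pz=10.0, phi=0.0, theta=0.0, psi=0.0, un=0.0, vn=0.0)
BOUNDS = dict(px=3.0, py=3.0, pz=1.5,
              phi=np.deg2rad(4.4), theta=np.deg2rad(6.8), psi=np.deg2rad(0.9),
              un=0.0021, vn=0.0070)

def geo_position(p):
    """Stand-in geo-positioning solution: project the normalized pixel (un, vn) through the
    camera pose onto a flat ground plane and return the (x, y) hit point."""
    c, s = np.cos, np.sin
    R = (np.array([[c(p['theta']), 0, s(p['theta'])], [0, 1, 0], [-s(p['theta']), 0, c(p['theta'])]]) @
         np.array([[1, 0, 0], [0, c(p['phi']), s(p['phi'])], [0, -s(p['phi']), c(p['phi'])]]) @
         np.array([[c(p['psi']), s(p['psi']), 0], [-s(p['psi']), c(p['psi']), 0], [0, 0, 1]])).T
    ray = R @ np.array([p['un'], p['vn'], 1.0])
    z_c = p['pz'] / ray[2]                      # scale so the viewing ray reaches the ground
    return np.array([p['px'], p['py']]) + z_c * ray[:2]

def error_variance_samples(param, n=2000):
    """Squared positioning error attributable to `param`: compare the solution with every
    parameter perturbed against the same draw with `param` returned to its nominal value."""
    samples = []
    for _ in range(n):
        p = {k: v + rng.uniform(-BOUNDS[k], BOUNDS[k]) for k, v in NOMINAL.items()}
        p_without = dict(p, **{param: NOMINAL[param]})
        samples.append(np.sum((geo_position(p) - geo_position(p_without)) ** 2))
    return samples

for name in NOMINAL:
    print(name, max(error_variance_samples(name)))
```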








This formulation allows for the Monte Carlo simulation to calculate the error variance

distribution associated with each parameter using all parameter error distributions. This allows

not only for the coupling between the different parameters to affect the positioning error but also

the various parameter error distributions to affect the results. As with many complex systems,

not only does the inherent relationship between the various parameters effect the observations

but also the measurement errors of the various parameters.

The simulation uses the empirically derived error distribution and the target system

configuration. For this analysis, the target system is a hovering unmanned rotorcraft, operating

at a 10 meter elevation. The parameter values used for this analysis are shown in Equation 7-3.


$$
p =
\begin{cases}
{}^{G}P_{co_x} = 0.0\ \mathrm{m}\\
{}^{G}P_{co_y} = 0.0\ \mathrm{m}\\
{}^{G}P_{co_z} = 10\ \mathrm{m}\\
\phi = 0.0^{\circ}\\
\theta = 0.0^{\circ}\\
\psi = 0.0^{\circ}\\
u_n = 0.0\\
v_n = 0.0
\end{cases}
\tag{7-3}
$$

The error distribution is defined as a uniform distribution with bounds at the standard deviations

defined previously for the geo-positioning algorithm parameters. The histograms of the error

variance relative to each parameter are shown in Figure 7-5.











Figure 7-5. Error variance histograms for the respective parameter errors (error variance in m²)

The bounds of the results with respect to the parameter standard deviations are shown in

Table 7-5. For the given parameter error distributions and system configuration, the results show that the order of significance is as follows: σ(GPco,z), σ(GPco,x), σ(GPco,y), σ(φ), σ(θ), σ(ψ), σ(v_n), and σ(u_n). The most significant term, σ(GPco,z), demonstrates the importance of the altitude data in the geo-positioning calculations. This simulation has shown the process by

which the geo-positioning parameter rank was calculated using empirically derived sensor noise

distributions and a specified system configuration. By simply adapting the sensor noise

distributions and system configuration values, this process can be applied to any given system to

provide insight in geo-positioning error source dominance.









Parameter          Value
max S(σ(GPco,x))   18.00 m²
max S(σ(GPco,y))   18.00 m²
max S(σ(GPco,z))   20.53 m²
max S(σ(φ))        1.594 m²
max S(σ(θ))        0.3130 m²
max S(σ(ψ))
max S(σ(u_n))      0.0001524 m²
max S(σ(v_n))      0.01316 m²

Table 7-5. Comparison of Monte Carlo Method results


Comparison of Empirical Versus Simulated Geo-Positioning Errors

The experimental results obtained using the Gas Xcell Aircraft equipped with a downward facing

camera and the experimental payload discussed earlier were compared with simulation results

using the estimated error distributions used in the Monte Carlo analysis. The testing conditions

used for the simulation analysis are shown in Equation 7-4. The results show that the geo-

positioning errors from simulation closely match the geo-positioning results obtained using the

experimental vehicle/payload setup. The geo-positioning results are shown in Figure 7-6.











$$
p =
\begin{cases}
{}^{G}P_{co_x} = 0.0\ \mathrm{m}\\
{}^{G}P_{co_y} = 0.0\ \mathrm{m}\\
{}^{G}P_{co_z} = 10\ \mathrm{m}\\
\phi = 0.0^{\circ}\\
\theta = 0.0^{\circ}\\
\psi = 0.0^{\circ}\\
u_n = 0.0\\
v_n = 0.0
\end{cases}
\tag{7-4}
$$

Figure 7-6. Experimental and simulation geo-positioning results (UTM easting in meters)

The use of a uniform error distribution for the simulation produces different results

compared with a normal distribution. While the simulation results vary slightly from the










experimental results, the uniform distribution provides more of an absolute bound for the error distribution.

Applied Work

Unexploded Ordnance (UXO) Detection and Geo-Positioning Using a UAV

This research investigated the automatic detection and geo-positioning of unexploded

ordnance using VTOL UAVs. Personnel at the University of Florida in conjunction with those at

the Air Force Research Laboratory at Tyndall Air Force Base, Florida, have developed a sensor

payload capable of gathering image, attitude, and position information during flight. A software

suite has also been developed that processes the image data in order to identify unexploded

ordnance (UXO). These images are then geo-referenced so that the absolute positions of the

UXO can be determined in terms of the ground reference frame. This sensor payload was

outfitted on a Yamaha RMAX aircraft and several experiments were conducted in simulated and

live bomb testing ranges. This section discusses the object recognition and classification techniques used to extract the UXO from the images and presents the results from the simulated and live bombing range experiments.















Figure 7-7. BLU97 Submunition

Researchers have used aerial imagery obtained from small unmanned VTOL aircraft for

control, remote sensing and mapping experiments [1,2,3]. In these experiments, it was necessary









to detect a particular type of ordnance. The primary UXO of interest in these experiments was

the BLU97. After deployment, this ordnance has a yellow main body with a circular decelerator.

The BLU97 is shown in Figure 7-7.

Experimentation VTOL Aircraft

The UXO experiments were conducted using several aircraft in order to demonstrate the

modularity of the sensor payload and to determine the capabilities of each aircraft. The first

aircraft that was used for testing was a Miniature Aircraft Gas Xcell RC helicopter. The aircraft was configured for heavy lift applications and has a payload capacity of 10-15 lbs. The typical flight time for this aircraft is 15 minutes, and it provides a smaller VTOL aircraft for experiments at UF and the Air Force Research Laboratory. The Xcell helicopter is shown in Figure 7-8.










Figure 7-8. Miniature Aircraft Gas Xcell Helicopter

The second aircraft used for testing was a Yamaha RMAX unmanned helicopter. With a

payload capacity of 60 lbs and a runtime of 20 minutes, this platform provided a more robust and

capable testing platform for range clearance operations. The RMAX is shown in Figure 7-9.










Figure 7-9. Yamaha RMAX Unmanned Helicopter











Sensor Payload

Several sensor payloads were developed for various UAV experiments. Each payload was

constructed modularly so as to enable attachment to various aircraft. The system schematic for

the sensor payload is shown in Figure 7-10.


Figure 7-10. Sensor Payload System Schematic (digital stereovision cameras, compact flash storage, Ethernet, pose sensors, Novatel RT-2 differential GPS, digital compass, LiPo battery, and DC/DC converters)

The detection sensor used for these experiments consisted of dual digital cameras operating in the

visible spectrum. These cameras provided high resolution imagery with low weight packaging.

These experiments sought to also explore and quantify the effectiveness of this sensor for UXO

detection.


Maximum Likelihood UXO Detection Algorithm

A statistical color model was used to differentiate pixels in the image that compose the

UXO. The maximum likelihood (ML) UXO detection algorithm used a priori knowledge of the

color distribution of the surface of the BLU97s in order to detect ordnance in an image. The

color model was constructed using the RGB color space. The feature vector was defined as



$$
x = \begin{bmatrix} r\\ g\\ b \end{bmatrix}
\tag{7-5}
$$


where r, g, and b are the eight bit color values for each pixel.










Using the K-means Segmentation Algorithm [18], an image containing a UXO was

segmented and selected. A segmented image is shown in Figure 7-11. This implementation

used a 5D feature vector for each pixel which allowed for clustering using spatial and color

parameters. Results varied depending on relative scaling of the feature vector components.












Figure 7-11. Segmentation software

The distribution of the UXO pixels was assumed to be Gaussian [18], and thereby the maximum likelihood method was used to approximate the UXO color model. The region

containing the UXO pixels was selected and the color model was calculated. The mean color

vector is calculated as


$$
\mu = \frac{1}{n}\sum_{i=1}^{n} x_{i}
\tag{7-6}
$$


where n is the number of pixels in the selected region.

The covariance matrix was then calculated as


$$
\Sigma = \frac{1}{n-1}\sum_{i=1}^{n}\left(x_{i}-\mu\right)\left(x_{i}-\mu\right)^{T}
\tag{7-7}
$$


The mean and covariance of the UXO pixels were then used to develop a classification

model. This classification model described the location and the distribution of the training data

within the RGB color space. The equation used for the classification metric was











$$
m(x) = \exp\!\left(-\tfrac{1}{2}\left(x-\mu\right)^{T}\Sigma^{-1}\left(x-\mu\right)\right)
\tag{7-8}
$$


The classification metric is similar to the likelihood probability except that it lacks the pre-

scaling coefficient required by Gaussian pdfs. The pre-scaling coefficient was removed in order

to optimize the performance of the classification algorithm. This allows for the classification

metric value to range from 0 to 1. The analysis was performed by selecting a threshold for the

classification metric in order to classify UXO pixels from the image. This allowed for images to

be screened for UXO detection and for the pixel coordinate location of the UXO to be identified

in the image.
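A sketch of the color model of Equations 7-6 through 7-8 and the thresholding step is shown below. The training pixels are random placeholders standing in for the hand-segmented BLU97 region, and the threshold value is illustrative.

```python
import numpy as np

def fit_color_model(pixels):
    """Mean vector (Eq. 7-6) and inverse covariance (Eq. 7-7) of RGB training pixels (n x 3)."""
    mu = pixels.mean(axis=0)
    sigma = np.cov(pixels, rowvar=False)        # uses the 1/(n-1) normalization
    return mu, np.linalg.inv(sigma)

def classification_metric(pixels, mu, sigma_inv):
    """Unscaled Gaussian likelihood of Eq. 7-8, in [0, 1], for each pixel."""
    d = pixels - mu
    maha = np.einsum('ij,jk,ik->i', d, sigma_inv, d)   # squared Mahalanobis distance
    return np.exp(-0.5 * maha)

# Placeholder training data: yellowish pixels with some spread.
rng = np.random.default_rng(1)
train = rng.normal(loc=[200, 180, 60], scale=[15, 15, 20], size=(500, 3))
mu, sigma_inv = fit_color_model(train)

image = rng.integers(0, 256, size=(480, 640, 3)).astype(float)
scores = classification_metric(image.reshape(-1, 3), mu, sigma_inv).reshape(480, 640)
uxo_mask = scores > 0.1                          # illustrative likelihood threshold
print(uxo_mask.sum(), "candidate UXO pixels")
```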

Initial experimentation using simulated UXO and the ML UXO detection algorithm provided successful classification of UXO in images obtained from both aircraft. As expected, the performance of the algorithm deteriorated when there were variations in the color of

the surface of the UXO or the contrast between the UXO and the background. Relating to the

data, the ML UXO detection algorithm failed when either the actual UXO color distribution fell

far from the modeled distribution in RGB space or the background distribution closely

encompassed the actual UXO color distribution. In these cases, the variations caused both false

positives and negatives when using the classification algorithm. The use of an expanded training data set and multiple Gaussian distributions for modeling was investigated; this slightly improved UXO detection rates but greatly increased false positive readings from background pixels. The algorithm performance was also extremely sensitive to the likelihood threshold,

thereby introducing another tunable parameter to the algorithm.

Spatial Statistics UXO Detection Algorithm

Previous experimental results showed that when the background of the image closely

resembled the UXO color, the ML UXO performance degraded. In order to perform more robust









UXO detection, an algorithm was developed whose parameters were based solely on the

dimensions of the UXO and not a trained color model. A more sophisticated pattern recognition

approach was used as shown in Figure 7-12.


Figure 7-12. Pattern recognition process (capture image, pre-filtering, segmentation, classification)

The spatial statistics UXO detection algorithm was designed to segment like

colored/shaded objects and classify them based on their dimensions. This would allow for robust

performance in varying lighting, color, and background conditions. The assumptions made for

this algorithm were that the UXO was of continuous color/shading, and the UXO region would

have the scaled spatial properties of an actual UXO. Based on the measured above-ground-level altitude of the aircraft and the projective properties of the imaging device, the algorithm parameters would

be auto-tuned to accommodate the scaling from the imaging process.

In order to reduce the dimensionality of the data set, the color space was first converted

from RGB to HSV. By inspection, it was found that the saturation channel provided the greatest

contrast between the background and the UXO. The raw RGB image and the saturation channel

images are shown in Figure 7-13.










Figure 7-13. Raw RGB and Saturation Images of UXO

The pre-filtering process consisted of histogram equalization of the saturation image. This

improved the contrast between the UXO pixels and the background and improved

segmentation.

The segmentation process was conducted by segmenting the pre-filtered image using the k-

means algorithm as shown in Figure 7-14.


Figure 7-14. Segmented Image

Each region was analyzed and classified using the scaled spatial statistics of the UXO.

Properties such as the major/minor axis length for the region were used to classify the regions.

Regions whose spatial properties closely matched those of the UXO were classified as UXO and

highlighted in the final image as shown in Figure 7-15.


Figure 7-15. Raw Image with Highlighted UXO
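The pipeline of Figure 7-12 could be sketched as follows using scikit-image and scikit-learn; the cluster count and the major/minor axis bounds are placeholders for the altitude-scaled UXO dimensions, and the synthetic test image simply stands in for an aerial frame.

```python
import numpy as np
from skimage.color import rgb2hsv
from skimage.exposure import equalize_hist
from skimage.measure import label, regionprops
from sklearn.cluster import KMeans

def detect_uxo_regions(rgb_image, k=3, major_range=(40, 120), minor_range=(10, 40)):
    """Segment the saturation channel with k-means and keep regions whose major/minor
    axis lengths (in pixels) fall inside bounds scaled from the UXO dimensions."""
    saturation = rgb2hsv(rgb_image)[..., 1]        # saturation gives the best UXO contrast
    saturation = equalize_hist(saturation)         # pre-filtering: histogram equalization
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(
        saturation.reshape(-1, 1)).reshape(saturation.shape)

    detections = []
    for cluster in range(k):
        for region in regionprops(label(labels == cluster)):
            if (major_range[0] <= region.major_axis_length <= major_range[1]
                    and minor_range[0] <= region.minor_axis_length <= minor_range[1]):
                detections.append(region.bbox)     # (min_row, min_col, max_row, max_col)
    return detections

# Synthetic test frame: low-saturation background with one saturated, UXO-sized blob.
frame = np.full((240, 320, 3), 0.5)
frame[100:130, 150:220] = [0.9, 0.8, 0.1]
print(detect_uxo_regions(frame, k=2))              # -> [(100, 150, 130, 220)]
```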









Collaborative UAV/UGV Control

Recently unmanned aerial vehicles (UAVs) have been used more extensively in military

operations. The improved perception abilities of UAVs compared with unmanned ground

vehicles (UGVs) make them more attractive for surveying and reconnaissance applications. A

combined UAV/UGV multiple vehicle system can provide aerial imagery, perception, and target

tracking along with ground target manipulation and inspection capabilities. This experiment was

conducted to demonstrate the application of a UAV/UGV system for simulated mine disposal

operations.

The experiment was conducted by surveying the target area with the UAV and creating a

map of the area. The aerial map was transmitted to the base station and post-processed to extract

the locations of the targets and develop waypoints for the ground vehicle to navigate. The

ground vehicle then proceeded to each of the targets, which simulated the validation and

disposal of the ordnance. Results include the aerial map, processed images of the extracted

ordnances, and the ground vehicle's ability to navigate to the target points.

The platforms used for the collaborative control experiments are shown in Figure 7-16.

















Figure 7-16. TailGator and HeliGator Platforms









Waypoint Surveying

In order to evaluate the performance of the UAV/UGV system, the waypoints were

surveyed using a Novatel RT-2 differential GPS. This system provided two centimeter accuracy

or better when provided with a base station correction signal. Accurate surveying of the visited

waypoints provided a baseline for comparison of the results obtained from the helicopter and the

corresponding path the ground vehicle traversed.

The UXOs were simulated to resemble BLU-97 ordnance. Aerial photographs as shown in

Figure 7-17 of the ordnance along with the camera position and orientation were collected.

Using the transformation described previously the global coordinates of the UXOs were

calculated. The calculated UXO positions were compared with the precision survey data.


Figure 7-17. Aerial photograph of all simulated UXO



























Local Map

A local map of the operating region was generated using the precision survey data. This

local map as shown in Figure 7-18 provided a baseline for all of the position comparisons

throughout this task.




Figure 7-18. Local map generated with Novatel differential GPS

The data collected compares the positioning ability of the UGV and the ability of the UAV

sensor system to accurately calculate the UXO positions. While both the UGV and UAV use

WAAS enabled GPS, there is some inherent error due to vehicle motion and environmental effects. The UGV's control feedback was based on waypoint-to-waypoint control versus a path following control algorithm.









Once a set of waypoints was provided by the UAV, the UGV was programmed to visit

every waypoint as if to simulate the automated recovery/disposal process of the UXOs. The

recovery/disposal process was optimized by ordering the waypoints in a manner that would

minimize the total distance traveled by the UGV. This problem was similar to the traveling

salesman optimization problem in which a set of cities must all be visited once while minimizing

the total distance traveled. An A* search algorithm was implemented in order to solve this

problem.

The A* search algorithm operates by creating a decision graph and traversing the graph

from node to node until the goal is reached. For the problem of waypoint order optimization, the

current path distance g, the estimated distance remaining h, and the estimated total distance f were evaluated for each node by

g = length of the straight line segments connecting all predecessor waypoints

h = (minimum distance between any two waypoints among the successors and the current waypoint) x (number of successors)

f = g + h (7-10)

The requirement of the A* algorithm that the heuristic be admissible is fulfilled because there exists no path from the current node n to a goal node with a distance less than h. Therefore the heuristic provides a lower bound as required by the A* algorithm

and guarantees optimality should a path exist.
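A compact sketch of the search is given below. It uses the g, h, and f definitions of Equation 7-10 directly; the waypoint coordinates are made-up local positions, and no claim is made that this matches the original implementation beyond the stated heuristic.

```python
import heapq
import math
from itertools import combinations

def a_star_waypoint_order(start, waypoints):
    """Order `waypoints` so the total path length from `start` is minimized, using A* with
    h = (minimum pairwise distance among the current and unvisited waypoints) * (number unvisited)."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def heuristic(current, remaining):
        pts = [current] + list(remaining)
        if len(pts) < 2:
            return 0.0
        return min(dist(a, b) for a, b in combinations(pts, 2)) * len(remaining)

    # Each node is (f, g, current point, visited order, frozenset of remaining waypoints).
    frontier = [(heuristic(start, waypoints), 0.0, start, (), frozenset(waypoints))]
    while frontier:
        f, g, current, order, remaining = heapq.heappop(frontier)
        if not remaining:
            return list(order), g
        for nxt in remaining:
            g2 = g + dist(current, nxt)
            rem2 = remaining - {nxt}
            heapq.heappush(frontier, (g2 + heuristic(nxt, rem2), g2, nxt, order + (nxt,), rem2))

waypoints = [(5.0, 1.0), (2.0, 8.0), (9.0, 4.0), (1.0, 2.0)]
order, length = a_star_waypoint_order((0.0, 0.0), waypoints)
print(order, round(length, 2))
```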

The UGV was commanded to come within a specified threshold of a waypoint before

switching to the next waypoint as shown in Figure 7-19. The UGV consistently traveled within

three meters or less of each of the desired waypoints which is within the error envelope of typical

WAAS GPS accuracy.













Figure 7-19. A comparison of the UGV's path to the differential waypoints

The UAV calculates the waypoints based on its sensors and these points are compared with

the surveyed waypoints. There is an offset in the UAV's data due to the GPS being used and due to error in the transformation from image coordinates to global coordinates, as shown in Figure 7-20.

The UGV is able to navigate within several meters of the waypoints; however, it is limited by the vehicle kinematics. Further work involves a waypoint sorting algorithm that accounts

for the turning radius of the vehicle.














Figure 7-20. UAV waypoints vs. UGV path

Citrus Yield Estimation


Within the USA, Florida is the dominant state for citrus production, producing over two-


thirds of the USA's tonnage, even in the hurricane-damaged 2004-2005 crop year. The citrus


crops of most importance to Florida are oranges and grapefruit, with tangerines and other citrus


being of less importance.

With contemporary globalization, citrus production and marketing is highly


internationalized, especially for frozen juice concentrates, so there is great competition between


various countries. Tables 7-6 and 7-7 show the five most important countries for production of


oranges and grapefruit in two crop years. Production can vary significantly from year-to-year

due to weather, especially due to hurricanes. Note the dominance of Brazil in oranges and the


rise of China in both crops.










Country 2000-2001 Crop Year 2004-2005 Crop Year
Brazil 14,729 16,606
USA 11,139 8,293
China 2,635 4,200
Mexico 3,885 4,120
Spain 2,688 2,700
Other Countries 9,512 9,515
World Total 44,588 45,434
Table 7-6. Production of Oranges (1000's metric tons) (based on NASS, 2006)

Country 2000-2001 Crop Year 2004-2005 Crop Year
China 0 1,724
USA 2,233 914
Mexico 320 310
South Africa 288 270
Israel 286 247
Other Countries 680* 330
World Total 3,807 3,795
*Cuba produced a very significant 310 (1000 metric tons) in 2000-2001
Table 7-7. Production of Grapefruit (1000's metric tons) (based on NASS, 2006)


The costs of labor, land, and environmental compliance are generally less in most of these

countries than in the USA. Labor is the largest cost for citrus production in the USA, even

though many workers, especially harvesters, are migrants. In order for producers from the USA

to be competitive, they must have advantages in productivity, efficiency, or quality to counteract

the higher costs.

This need for productivity, efficiency, and quality translates into a need for better

management. One management advantage that USA producers can use to remain competitive is

to utilize advanced technologies. Precision agriculture is one such set of technologies which can

be used to improve profitability and sustainability. Precision agriculture technologies were

researched and applied later to citrus than some other crops, but there has been successful

precision agriculture research [19,20,21]. There has been some commercial adoption [22].









Yield maps have been a very important part of precision agriculture for over twenty years

[23]. They allow management to make appropriate decisions to maximize crop value

(production quantity and quality) while minimizing costs and environmental impacts [24].

However, citrus yield maps, like most yield maps, can currently only be generated after the fruit

is harvested because the production data is obtained during the harvesting process. It would be

advantageous if the yield map was available before harvest because this would allow better

management, including better harvest scheduling and crop marketing.

There has been a history of using machine vision to locate fruit on trees for robotic

harvesting [25]. More recent work at the University of Florida has attempted to use machine

vision techniques to do on-tree yield mapping. Machine vision has been used to count the

number of fruit on trees [26]. Other researchers not only counted the fruit, but used machine

vision and ultrasonic sensors to determine fruit size [27]. This research has been extended to

allow for counting earlier in the season when the fruit is still quite green [28].

However, these methods all require vehicles to travel down the alleys between the rows of

trees to take the machine vision images. Researchers have demonstrated that a small remotely-

piloted mini-helicopter with machine vision hardware and software could be built and operated

in citrus groves [29]. They also discuss some of the recent research on using mini-helicopters in

agriculture, primarily conducted at Hokkaido University and the University of Illinois.

The objective of this research was to determine if images taken from a mini-helicopter

would have the potential to be used to generate yield maps. If so, there might be a possibility of

rapidly and flexibly producing citrus yield maps before harvest.

Materials and Methods

The orange trees used to test this concept were located at Water Conserv II, jointly owned

by the City of Orlando and Orange County. The facility, located about 20 miles west of









Orlando, is the largest water reclamation project (over 100 million liters per day) of its type in

the world, one that combines agricultural irrigation and rapid infiltration basins (RIBs). A block

of 'Hamlin' orange trees, an early maturing variety (as opposed to the later maturing 'Valencia'

variety), was chosen for study.

The spatial variability of citrus tree health and production can range from very small to

extremely great depending upon local conditions. This block had some natural variability,

probably due to its variable blight infestation and topography. Additional variability was

introduced by the trees being subjected to irrigation depletion experiments. However, mainly

due to substantial natural rainfall in the 2005-2006 growing season, the variation in the yield is

within the bounds of what might be expected in contemporary commercial orange production,

even with the depletion experiments.

The irrigation depletion treatment (percent of normal irrigation water NOT applied) was

indicated by the treatment number. Irrigation depletion amounts were sometimes different for

the Spring and the Fall/Winter parts of the growing season, as seen in Table 7-8 below. The

replication was indicated by a letter suffix. Only 15 of the 42 trees (six treatments with seven

replications each) were used for this mini-helicopter imaging effort. Treatment 6 had no

irrigation except periodic fertigation, and the trees lived on rainfall alone.


Treatment Spring Depletion (%) Fall/Winter Depletion (%)
1 25 25
2 25 50
3 25 75
4 50 50
5 50 75
6 100 100
Table 7-8. Irrigation Treatments









The mini-helicopter used for this work was a Gas Xcell model modified for increased

payload by its manufacturer [30]. It was purchased in 2004 for about US$ 2000 and can fly up to

32 kph and carry a 6.8 kg payload. Its rotor is rated to 1800 rpm and has a diameter of less than

1.6 m. The instrumentation platform is described in MacArthur et al. (2005) and includes GPS

with WAAS, two compact flash drives, a digital compass, and wireless Ethernet. The machine

vision system uses a Videre model STH-MDCS-VAR-C stereovision sensor.

The mini-helicopter was flown at the Water Conserv II site on 10 January 2006, a mostly

sunny day, shortly before noon. The helicopter generally hovered over each tree for a short

period of time as it moved down the row taking images with the Videre camera. The images

were stored on the helicopter and some were simultaneously transferred to a laptop computer

over the wireless Ethernet. In addition, a Canon Power Shot S2 IS five-megapixel digital camera

was used to take photos of the trees (in north-south rows) from the east and west sides.

The fruit on the individual trees were hand harvested by professional pickers on 13

February 2006. The fruit from each tree was weighed and converted to the industry-standard

measurement unit of"field boxes". A field box is defined as 40.8 kg (90 lbs.).

The images were later processed manually. A "best" image of each tree was selected,

generally on the basis of lighting and complete coverage of the tree. Each overhead image was

cropped into a square that enclosed the entire tree and scaled to 960 by 960 pixels. The pixel

data from several oranges were collected from several representative images in the data set. The

data was assumed to be normally distributed, thus the probability function was calculated for

each orange pixel dataset. Using a "Mixture of Gaussians" to represent the orange class model,

the images were analyzed and a threshold was established based on our color model. The number of "orange" pixels was then calculated in each image and used in the further analysis.










Results

The results of the image processing and the individual tree harvesting of the 15 trees

studied in this work are presented in Table 7-9. As Figure 7-21 illustrates, only irrigation

depletion treatment 6 had a great effect on the individual tree yields. Treatment 6 was 100%

depletion, or no irrigation. The natural rainfall was such in this production year that the other

treatments produced yields of at least four boxes per tree.



Treatment Replication Orange Pixels Boxes of Fruit
1 B 13990 7
1 G 6391 6
2 B 11065 8
2 C 2202 4
2 E 5884 5
2 F 17522 7.5
3 B 2778 6
4 A 4433 6.2
4 B 5516 4.8
4 E 5002 4
4 F 11559 4.3
5 B 9069 7
5 C 17088 6.8
6 B 5376 2.5
6 D 6296 1
Table 7-9. Results from Image Processing and Individual Tree Harvesting




Figure 7-21. Individual Tree Yields as Affected by Irrigation Depletion Treatments





















The images were treated by the process discussed above. The number of "orange" pixels varied from 2202 to 17,522. More pixels should indicate more fruit. However, as Figure 7-22 shows, there was substantial scatter in the data. It can be improved somewhat by the removal of the nonirrigated treatment 6, as shown in Figure 7-23.


Figure 7-22. Individual Tree Yield as a Function of Orange Pixels in Image (linear fit: y = 0.0002x + 3.5919, R² = 0.2835)


Figure 7-23. Individual Tree Yield as a Function of Orange Pixels with Nonirrigated Removed (linear fit: y = 0.0002x + 4.5087, R² = 0.373)


Discussion

This work showed that good overhead images of citrus trees could be taken by a mini-

helicopter and processed to have some correlation with the individual tree yield. A tree with

fewer oranges should have fewer pixels in the image of the "orange" color. For example, tree









2C had only 2202 pixels and 4 boxes of fruit while tree 2F had 17,522 pixels and 7.5 boxes of

fruit. These are shown as Figures 7-24 and 7-25 below in which the "oranges" in the "After"

photo are enhanced to indicate their detection by the image processing algorithm.


Figure 7-24. Image of Tree 2C Before and After Image Processing


Figure 7-25. Image of Tree 2F Before and After Image Processing


The image processing used in this initial research was very simple. More sophisticated

techniques would likely improve the ability to better separate oranges from other elements in the

images. The strong sunlight likely contributed to some of the errors. Again, the use of more










sophisticated techniques from other previous research, especially the techniques developed for

yield mapping of citrus from the ground, would likely improve the performance in overhead

yield mapping.

A major assumption in this work is that the number of orange pixels visible is proportional to the tree yield. However, the tree canopy (leaves, branches, other fruit, etc.) does hide some of

the fruit. Differing percentages of the fruit may be visible on differing trees. This is quite

apparent with the treatment 6 trees. Figure 7-26 shows the images for tree 6D. This tree,

obviously greatly affected by the lack of irrigation and a blight disease, has 6296 "orange" pixels

but only yielded one box of fruit. The poor health of the tree meant that there were not many

leaves to hide the interior oranges. Hence, a falsely high estimate of the yield was given.

Figure 7-27 shows the images taken from the ground of Trees 6D and 2E. Even though they had

similar numbers of "orange" pixels on the images taken from the helicopter, Tree 2E had five

times the number of fruit. The more vigorous vegetation, especially the leaves, meant that the

visible oranges on Tree 2E represented a smaller percentage of the total tree yield.


Figure 7-26. Image of Tree 6D Before and After Image Processing

























Figure 7-27. Ground Images of Tree 6D and Tree 2E


Mini-helicopters are smaller and less expensive than piloted aircraft. Accordingly, the

financial investment in them may be justifiable to growers and small industry firms. The mini-

helicopters would give their owners the flexibility of being able to take images on their own

schedule. The mini-helicopters also do not cause a big disturbance in the fruit grove. The noise

and wind are moderate. They can operate in a rather inconspicuous manner, as shown by Figure

7-28.




















Figure 7-28. Mini-Helicopter Operating at Water Conserv II