
Development of Sensor Component for Terrain Evaluation and Obstacle Detection for an Unmanned Autonomous Vehicle

f668fbb3f470c4392a7642b41695ed2499d1c455
594 F20101115_AABQTD solanki_s_Page_121.txt
479c1588a44d62b75d1ef75b9248abda
dd21c500aaf40d638351c84bc2654075f107ea7c
F20101115_AABQSO solanki_s_Page_099.txt
fb81701ca65e9da0e51480bd62241cc3
dae7883073856899b14a404d9839df36ecdc7b65
373 F20101115_AABQTE solanki_s_Page_123.txt
93486a06339a2683f8772314e2a26229
5f508cb3e52f47ac6c0753ec45aafecb248c8c6d
452 F20101115_AABQSP solanki_s_Page_100.txt
31756cabdb8b518f0c4d606b1162d33a
1efe9eccef197e3024cece3212cafc8945ba50f4
2575 F20101115_AABQTF solanki_s_Page_124.txt
acf92c3beadb4a509a169236910da3cb
d7b844bc9fda7c47fe3929b1184a17b6d38a567d
260 F20101115_AABQSQ solanki_s_Page_101.txt
065b2eaef79046422f2be991a64f4287
f8c10d95a32aaa54039caa12ebe77c89a67ac75c
2537 F20101115_AABQTG solanki_s_Page_125.txt
6f81d434ae8cd9b025406453de1f160e
5e709ef66073ed748a3600c1863ea698ddd6fe67
298 F20101115_AABQSR solanki_s_Page_102.txt
438aec2bab0afc253c5220251f801683
ffd33d634c6ef0b1b3113a590a74938539a7af62
2611 F20101115_AABQTH solanki_s_Page_126.txt
e0e8c0957fd87116ef1793f1de8f75c0
496f0f9493f61c9f59fbe7aac6e440dffbaa8465
359 F20101115_AABQSS solanki_s_Page_103.txt
fa39293edc3751bc13280c209cc55cee
7165a0d74a4b147b07adb17a7f89c3d5b85bbbc0
796 F20101115_AABQTI solanki_s_Page_127.txt
ad9e91e9f58dc009ad89c3b1a27b47cb
83806171d015de3b1265115df453da9f682089df
216 F20101115_AABQST solanki_s_Page_108.txt
853fa68c31ff3e32ceed4a315d4fb8b5
e2f34fd93a4537df460d4b627dd42e5ae46ff169
1272 F20101115_AABQTJ solanki_s_Page_128.txt
6d78d349148f5051eae21e578885ee5e
02da78ac460cb239d45368ed8ce2ac032179991c
152 F20101115_AABQSU solanki_s_Page_109.txt
55d38e9bbd3a06e986579000c76730be
65e7576ebc25beea5f09847a614973fe2ef1c5a2
31250330 F20101115_AABQTK tefusiontest1_10mph.avi
00d7827e0531b25bad31a40a613d37ce
a3c01757b9daf1a32f4714323be35e73c2e865df
1338 F20101115_AABQSV solanki_s_Page_110.txt
f32f0e6c1bfab709a9d238560d8b188d
63cf5845dda12baf498c92564c3bf5ffc9c72120
27483456 F20101115_AABQTL videoforwardpath.avi
735b11676db72b5b539e00e2afbfb99b
3b1126729876d50e0381081552863cbae42e2f16
1999 F20101115_AABQSW solanki_s_Page_111.txt
44f08656d48488e89b794ec3d9f84b5f
4b7151db158ec6aa76bbffa4ebd9643d02d7616b
7642 F20101115_AABQUA solanki_s_Page_006thm.jpg
8082568b00bd1fb380561470873e980e
e8d3d9a79a846e85e7a576667bc30e4022b43d94
1697 F20101115_AABQSX solanki_s_Page_112.txt
b91f7ce6d6b1ce79fd2cdf4e1319e09f
0c7f5002cb4d22a19238ab330d7fdcb78187b638
6368 F20101115_AABQUB solanki_s_Page_007.QC.jpg
8afa4fa57b0a9f81ce7870bb62f8631f
8ab0bea895bd967f797fd7d1b9f36efb91ee53d6
19617022 F20101115_AABQTM videoreturnpath.avi
f66492f0f5e9c452b18007edc18b3fe1
bed559720677415b2ca4b85e7eee8fc2c41f9229
2156 F20101115_AABQSY solanki_s_Page_113.txt
c8478118b0ed9335a41b45f5a5bdaecc
01e6af99a9110a57b3284e7266a4190981740588
1744 F20101115_AABQUC solanki_s_Page_007thm.jpg
9f40a598a64ca6e3f3d663e430035afa
355f6642594970dcf9a96d910e259b89cda31456
8939364 F20101115_AABQTN tefusiontest_22mph.avi
32d4c334160aae3d07fc78faf32566fb
3d26c0e44b477cf8e66d2999dbe318206b0b4234
1834 F20101115_AABQSZ solanki_s_Page_114.txt
e7e8ce5dace6ab79ee8287cac67b6f52
562ad9594108a02a6f9672038d6b451b437267aa
122135 F20101115_AABPRB solanki_s_Page_046.jp2
32fd1931166c22c3112453f1f55a7036
0918e60ff9cf94e029f68589b4f002cd9ffd19db
15683 F20101115_AABQUD solanki_s_Page_008.QC.jpg
6913cf4843152d3e37ed1af98f817f41
036ed87132da78c589970582ccded40a79397e95
25804980 F20101115_AABQTO tefusiontest2_10mph.avi
39e29940ef0c4f7a889957b872df8184
b631a5d1de9052aa49cadaf527d67d2d6db53e3d
10260 F20101115_AABPRC solanki_s_Page_098.QC.jpg
ba9c8a260282992d70bbccc4b1fcdee5
573de57b67f12c403ea08590dd8b517e083cdec1
4005 F20101115_AABQUE solanki_s_Page_008thm.jpg
4b235ee34aafce332ed5aba2f35294c6
ce5c3f45711883d9de4192d7c332e3145f4d2708
19593770 F20101115_AABQTP ltsstest2_10mph.avi
2d643a66c029318fc1a6983bb1bb8759
409a994a1b62f0a8ad50c7d4f56d41b034ba2032
F20101115_AABPRD solanki_s_Page_070.tif
6cdc64f31e22bfcc4abc8f86a28c15d0
c20b631c641bdce5760b7aa575a64ef811f42646
26428 F20101115_AABQUF solanki_s_Page_009.QC.jpg
5190780e24d7ea38ce044b9343abdbed
7658b3b538e38f514ecfafa1bce3fcc746e1d30c
5757822 F20101115_AABQTQ odbarreltest1_22mph.avi
227a177e867d1e4ea36d417078186f9c
62270ce73bcc458260b136d8024f721d544e42a9
8957 F20101115_AABRAA solanki_s_Page_124thm.jpg
0d41e38817cb8329be3e09bead2ca91a
fb7233b775c45c4152dce63a6cdbd490b95848a2
4644 F20101115_AABPRE solanki_s_Page_052thm.jpg
cd02a0f077aae82ddb42887c1255fc84
c5f9a3d3c81474a6c4cd52584fbd48066471e2bf
6952 F20101115_AABQUG solanki_s_Page_009thm.jpg
cfa658dc359416326e79368f2c16f631
ba2437fff1a0c576148126040fe037dee5ff0624
25127446 F20101115_AABQTR odfusiontest2_10mph.avi
2e4c380efdd6726e5dc81ab0abb5b352
c19a6b6eaef67298ed9f23b378526698c6d1eea8
36032 F20101115_AABRAB solanki_s_Page_125.QC.jpg
faa0b45f76002ff337b9630dfd6e9a86
85b7e68d8868ab09a49f0c2f84736051978acfb2
105429 F20101115_AABPRF solanki_s_Page_113.jpg
f295bd441964ada5c9f141ed7e34c23e
76858813d48b4486fd4249301603e0b034a6d273
25059 F20101115_AABQUH solanki_s_Page_010.QC.jpg
8ea56a39dfaebd410e04a1bbc252d1dd
8ebf513a7789b3d68257ae46c3e9dbd1f0671693
7890258 F20101115_AABQTS odbarreltest1_10mph.avi
bb0c158dac189d981eea2462e1707812
ab01b17c5f4a6122186f897e832bdba74abd93c6
8692 F20101115_AABRAC solanki_s_Page_126thm.jpg
9f2b9bebfd84b20920cd2bfdc3cd7cd2
4c693dda1648fb7e94b07a29ba1fbf6d8f81af45
F20101115_AABPRG solanki_s_Page_007.tif
ff495ed0e439d30497413e90372e4033
ac5d52d5fa00d27f90fb7775343fed8055191d71
6068 F20101115_AABQUI solanki_s_Page_010thm.jpg
6c9b8951a96791e8a3e3765ca70440c3
1b03330ae86fda8bcd3793f99379a4b318c1575d
7063480 F20101115_AABQTT odbarreltest2_10mph.avi
841b4d2840b02fd3e971fe3a6b3faefc
abd958e8da104e9b37fe3af5ab2e05a063cfc44d
12230 F20101115_AABRAD solanki_s_Page_127.QC.jpg
c1f4c23d6387135a3920676261c923e9
f83384a90bd6ff741a28eccef34a2b6349a3ff3f
46414 F20101115_AABPRH solanki_s_Page_068.pro
603fa6c9e72bb616cc4752bd0c11c78a
7bd9e40da327537839b67176c279a77d7caf3aac
31407 F20101115_AABQUJ solanki_s_Page_011.QC.jpg
7f041ffb1e39c67b4c383fb486893411
a421cff03428f9be3dfbe412a0dffe94ad8ee74f
11065054 F20101115_AABQTU ltsstest_22mph.avi
2b13bb75472b4adceafa7dfe864d7850
e8773114e5cc5c1f8026acce6a4b38eea2a2fd6c
2883 F20101115_AABRAE solanki_s_Page_127thm.jpg
e1e8182cb6a2bd9ec7afed78693190c6
b6f89a13f10b02e9846981c237b4668767a257e4
67599 F20101115_AABPRI solanki_s_Page_128.jpg
eb9594ae8c73ea804dc7e83415b23d5b
864342c4300ce4b21690c806c2161fd6a58e0f66
7190 F20101115_AABQUK solanki_s_Page_011thm.jpg
2df0df01cdcd5a0fee2908a595164bb2
6cfa81c84ade0c89efbc85743afabd08bf1af811
5951554 F20101115_AABQTV odbarreltest2_16mph.avi
2d85b32108a783947504e98bed4d200c
cbff0093b63e81e341b91f794e784f4543b9a4b0
21569 F20101115_AABRAF solanki_s_Page_128.QC.jpg
c02160a90bdee0d751251ecea2ca30fe
aa124fd2e4dcd60007b6c17319c3b3905f07d5a7
838 F20101115_AABPRJ solanki_s_Page_013.txt
fa838e7df7bdfccd5ac35b6abb2fdbfe
7eb252522804b1dac29ce8835b0864f8d89b187b
29893 F20101115_AABQUL solanki_s_Page_012.QC.jpg
b69a832af66edf67151d984cae93128b
e89d9f052713c55d0b5faa62116df380c30669bb
2242 F20101115_AABQTW solanki_s_Page_002.QC.jpg
86ed3926b85a81c0801eafe20e59c29e
2734400480659f5624c0b38378f3f3d29680937f
5363 F20101115_AABRAG solanki_s_Page_128thm.jpg
92bd9880c5e30dbfa3ec185e24380943
871c2b1c0cb00b9de5e6b5c52a0e304336fbe666
7465 F20101115_AABPRK solanki_s_Page_090thm.jpg
bb5551731fbd3685c792f4930af2df14
d349c6655352b0d6e6d00fe4acc24ee88bf2fa77
7884 F20101115_AABQUM solanki_s_Page_012thm.jpg
8d93ba5cefda2d62d69d499dccb61b0f
8015941db890160cca5839983a1759cdb6c3c49c
492 F20101115_AABQTX solanki_s_Page_003thm.jpg
d55d1645733835666763e835c0aa8d1b
6d9a54b6fc3658f429254eabb6ccad68f8004c7f
8152 F20101115_AABQVA solanki_s_Page_020thm.jpg
beefebfec4cebcbb080945f9ce2e2f57
80d5fbf44e079be5655e88ff1733684a7a6da1e0
152635 F20101115_AABRAH UFE0018321_00001.mets
f46b8a3e20253dfecb26c89a6d0d2d62
ca756649fa6b02f03d86fee037684f21a6c46308
solanki_s_Page_040.tif
4781 F20101115_AABQTY solanki_s_Page_004thm.jpg
43895664db4ce4230b9e734acda70ee4
4735c927c979c543a63ba027a929dd4cee84fa2c
8302 F20101115_AABQVB solanki_s_Page_021thm.jpg
dab39938091304627f130a1d60b2c82e
7d37204d5c8f72dc04fbb7256a114ba718c362e9
4624 F20101115_AABPRL solanki_s_Page_109thm.jpg
4c912dec37b7d3de25d72267f45103cb
9a6caef28f1a92c894c86d106a1540e564f100e5
14714 F20101115_AABQUN solanki_s_Page_013.QC.jpg
b69ad280bdd5a8817412fb686b413b86
1593828c75640d6ee7a209150e80160ffb6834ee
32302 F20101115_AABQTZ solanki_s_Page_006.QC.jpg
40f63bf198874424c548094bdd861477
23ead2c2fd0e79b8346a06435658e40d7e63f908
108858 F20101115_AABPSA solanki_s_Page_118.jpg
1a04a82e9e7ea592bf968e011b460c4e
ac1547e32d403f9f29a97633e30e468503bc6c97
6289 F20101115_AABQVC solanki_s_Page_022thm.jpg
856a14e5d57a1e1264d6e7ecec5c5c71
7477539dc386d4a78a65341e902f2b9d8f238dd4
109888 F20101115_AABPRM solanki_s_Page_113.jp2
3a36f460319005c7a2607ccf2d521ee0
b3b291ab24d78b54475be7ee6d53b55209f696f3
3810 F20101115_AABQUO solanki_s_Page_013thm.jpg
f8f7c41f0cbe8e68cc9c9ceab9bc6ee1
ca4d8744858d143f2250602645e49253fced738a
27624326 F20101115_AABPSB ltsstest1_10mph.avi
d2af64ec255cbd55e42031c85b0d2e0f
daa06f310ecd8527dd2cb665711e1d3d633004cf
30724 F20101115_AABQVD solanki_s_Page_023.QC.jpg
76d1bf54aab2cbd34d974d12c985fceb
12e22d24770428de7756d71d18678383f9daaf9b
48444 F20101115_AABPRN solanki_s_Page_057.pro
5417e1cfce960b945342514f67bdd01f
d7cefc8497bfc1372122313be5825252baf9b3b6
35060 F20101115_AABQUP solanki_s_Page_014.QC.jpg
2ffc2042c0081b05ce42254f3ffc9389
0242177c77a238fc1a7353443ab2c7c1aa4ff4e6
F20101115_AABPSC solanki_s_Page_011.tif
cab2fc6c0b5df62aa71f58cd16ae925f
50c283980e1af61ead34d0bfdb1aa412c9902a77
7801 F20101115_AABQVE solanki_s_Page_023thm.jpg
494ceb99cb2f7e214dde6a66c9fc1fba
f321309839863dfbe410a343a3f901f7490cdaa5
F20101115_AABPRO solanki_s_Page_025.tif
9483d6e61fc9a00809a1412a235e6ae6
58374f2178ee76ad95f7768d3fb06115bc92e612
8337 F20101115_AABQUQ solanki_s_Page_014thm.jpg
c060dc8991000a75b9b960d85247451e
a97652ec18c569c8fda634b374067fae7f2efa02
74143 F20101115_AABPSD solanki_s_Page_022.jpg
1c81a5aefb8dc7787f229cd2ef150992
aec74d28264a3c4732da8f2a3296957923f73d11
28670 F20101115_AABQVF solanki_s_Page_024.QC.jpg
bfd0658d63ea7581c34911bc2772c99e
97fe25e277b68bf1734316055bcb98292beab2c4
8457 F20101115_AABPRP solanki_s_Page_079thm.jpg
33dc14b5cc85404386266d9096717086
f05cbfb7a684702ad865388cbdee24f971a5f2b7
35283 F20101115_AABQUR solanki_s_Page_015.QC.jpg
98d1b92e00355bacf3cadebb728a0548
89874f1e2705d2074ff81371c22c1cc3d73d2eec
6924 F20101115_AABPSE solanki_s_Page_100.pro
24a2d502e1bbc9a34ec661cdcbfcb9d2
6cc33486feaf6a350b109cb865a6d8605793bbcd
7296 F20101115_AABQVG solanki_s_Page_024thm.jpg
c519ba5fed893961e5d48835b8528436
479cff25583e5d346c365c26bb1f95b6ebf7457b
33499 F20101115_AABQVH solanki_s_Page_025.QC.jpg
cb28e6ad7222fad8b6f5991eb20b89cb
eac21a468c1fe01940e6b60b367b7d1b5cb6d9ca
F20101115_AABPRQ solanki_s_Page_124.tif
0c6010897282d7f666fa859d7bfcfe7c
8bb90c97f1f8d96acd8c988d7f8f9f17c9645603
7303 F20101115_AABQUS solanki_s_Page_016.QC.jpg
9035584e879a9b8d5db38bbd703b61c6
2e986a8ea1c45c2b12437928ee7b21fe1af3c4ac
47130 F20101115_AABPSF solanki_s_Page_080.pro
cc561d26d88e04506fa1bd3ded080794
46624de6a149e23c62324ace8be7def2c8696310
7834 F20101115_AABQVI solanki_s_Page_025thm.jpg
b2419d06dc0810d69f1c317f007e9e36
53f207b96aa826e5fa11d6ee59f9f8618ae0e216
115687 F20101115_AABPRR solanki_s_Page_071.jp2
d3a9fdfad44e4897e77c03dc0deac77b
474331edc6c5ef203e627cc71823a9268b348268
2007 F20101115_AABQUT solanki_s_Page_016thm.jpg
72107388e4e212093df3c88194100525
09a5bd043f1ecce739974c0d8f9e593ff605068f
934768 F20101115_AABPSG solanki_s_Page_008.jp2
0b2331948f85ecf0fab9dca0ac227a18
659a582a6052b700f4fe161cecdc40fcb7f90863
37017 F20101115_AABQVJ solanki_s_Page_026.QC.jpg
59a166e609b6c74a564b0971be16bda8
04437052492a4f15dff7018d6f1f8756d39b8a13
89141 F20101115_AABPRS solanki_s_Page_112.jp2
67bd8055536abe4ef2dca8bb1029e727
af01acf2790685594e2b7dc0c4ebe19eea67ecbd
9111 F20101115_AABQUU solanki_s_Page_017.QC.jpg
2e8cad70a364920e09dcfd2d737f9bf8
6eb332b34e0073d6ba3faa8a201e8ad1dbf718e3
690041 F20101115_AABPSH solanki_s_Page_099.jp2
a1b0993a28ff445ebcc23cac358da346
50f3fbb47b57cbbd59a102a3f1a4d16e04a3874c
8833 F20101115_AABQVK solanki_s_Page_026thm.jpg
e52bf06ef0c1861e1dc96d8666a45084
09c177085f569141ec3d8d9342299d491c2be134
35169 F20101115_AABPRT solanki_s_Page_113.QC.jpg
2ed4b315500d11e9c55cfecb503c6e78
2470a155c174a70804cfc2864b87e48784693cd9
3040 F20101115_AABQUV solanki_s_Page_017thm.jpg
c05e3775be589a73684ac2233a7da4d9
a18380729b2149e5ff86d0173850ce0184c5ded1
5822 F20101115_AABPSI solanki_s_Page_103thm.jpg
cdae9176849bfec6dbbc90f3d2cef501
4df755ecc32e1e5f09ceab04bd4c6b5b187f8ed2
28250 F20101115_AABQVL solanki_s_Page_027.QC.jpg
724798e45e872e79b7a9bd2844506ddf
cc515a83004da20c26bef56d2f3edbf7e4c4d17c
44008 F20101115_AABPRU solanki_s_Page_023.pro
0d6f84b80bc971345de798754478dcfe
0197c24666e74a3d3108f08ebd7f7f2ea09b7e38
30434 F20101115_AABQUW solanki_s_Page_018.QC.jpg
129d2ee9fe9b3bd8f1911ed2bc16c734
c2128896adc42253dc9f1667cd16657be57a409e
35359 F20101115_AABPSJ solanki_s_Page_115.pro
13ee9ce94aa8975865f22d60607e000e
0ec11742ec8664dccfe7c17fdd72fa0c71f25274
30928 F20101115_AABQWA solanki_s_Page_040.QC.jpg
66c57e3bc14f6d339933ce9c384dd11a
33da0e95514e2e155b5a5f131206f6d640f0aec9
34144 F20101115_AABQVM solanki_s_Page_028.QC.jpg
87bf9f088778d0c1c433786e03d7e2db
038472288b8e0291aada59abf30cd50ae8456d93
7751 F20101115_AABQUX solanki_s_Page_018thm.jpg
46b224d94387edf5f96ece115ea22433
74cf3f377c2035db27fc1f3ec16fd65e1bd30bcd
108525 F20101115_AABPSK solanki_s_Page_053.jp2
9fb84b55e99bf89811dcd74f4616bfe2
eb21b7e4ceddc1c753e186104b67f7398ed9903c
5137 F20101115_AABPRV solanki_s_Page_050thm.jpg
2b6d063fb89a625e7c7e41df6c24a479
406850c1e4f875db8056aaf60185a3b17d2d8683
7923 F20101115_AABQWB solanki_s_Page_040thm.jpg
944ec20f059b4ff63148c345547c08a8
11af68039723008fd5e0424afad4f551860cb5b7
8602 F20101115_AABQVN solanki_s_Page_028thm.jpg
cac4c3fb389f6165c8e7100c7fd6081b
c64cc7b3d171582fd6d5652af3771bd944b744ca
8341 F20101115_AABQUY solanki_s_Page_019thm.jpg
bf2d956d6b11674a458eb335659e3ef7
c57cc3fdac9b40ee9b7a94aa7ce0beef8a2cb651
2194474 F20101115_AABPSL solanki_s.pdf
cd818a6e17a88183f5d9a7d698189925
249f23d682964375a65659ee59c16fe5e8999343
TEFusionTest1_10mph.avi
ODBarrelTest2_16mph.avi
ODBarrelTest2_10mph.avi
TEFusionTest_22mph.avi
ODFusionTest1_10mph.avi
ODFusionTest2_10mph.avi
ODBarrelTest1_22mph.avi
ODFusionTest_22mph.avi
VideoForwardPath.avi
VideoReturnPath.avi
ODBarrelTest1_16mph.avi
LTSSTest2_10mph.avi
TEFusionTest2_10mph.avi
LTSSTest_22mph.avi
LTSSTest1_10mph.avi
ODBarrelTest2_22mph.avi
ODBarrelTest1_10mph.avi
VideoForwardPath.avi
ODFusionTest1_10mph.avi
TEFusionTest2_10mph.avi
LTSSTest1_10mph.avi
VideoReturnPath.avi
ODFusionTest2_10mph.avi
TEFusionTest1_10mph.avi
ODBarrelTest2_16mph.avi
ODBarrelTest2_22mph.avi
ODBarrelTest1_16mph.avi
ODBarrelTest1_10mph.avi
ODBarrelTest2_10mph.avi
LTSSTest2_10mph.avi
ODBarrelTest1_22mph.avi
VideoForwardPath.avi
ODFusionTest1_10mph.avi
TEFusionTest2_10mph.avi
LTSSTest_22mph.avi
ODBarrelTest1_16mph.avi
ODBarrelTest1_10mph.avi
ODBarrelTest2_10mph.avi
LTSSTest2_10mph.avi
ODBarrelTest1_22mph.avi
LTSSTest1_10mph.avi
VideoReturnPath.avi
ODFusionTest2_10mph.avi
ODFusionTest_22mph.avi
TEFusionTest1_10mph.avi
ODBarrelTest2_16mph.avi
ODBarrelTest2_22mph.avi
TEFusionTest_22mph.avi
ODBarrelTest2_16mph.avi
ODBarrelTest1_16mph.avi
ODBarrelTest1_10mph.avi
ODBarrelTest2_10mph.avi
ODBarrelTest1_22mph.avi
ODBarrelTest2_22mph.avi
VideoForwardPath.avi
TEFusionTest2_10mph.avi
LTSSTest1_10mph.avi
VideoReturnPath.avi
ODFusionTest2_10mph.avi
TEFusionTest1_10mph.avi
ODFusionTest1_10mph.avi
LTSSTest2_10mph.avi
LTSSTest_22mph.avi
TEFusionTest_22mph.avi
ODFusionTest_22mph.avi
7510 F20101115_AABPRW solanki_s_Page_114thm.jpg
e4fe9c0f2ad9c01baaa2ec88575018f9
436e89c96a4968620831d0fb9eeafdf70be824cc
32887 F20101115_AABQWC solanki_s_Page_041.QC.jpg
5a25edcb267da07f5446e42a48f60bb5
c05394683a3081bb7bb9e5343fb6b225fb49235c
32906 F20101115_AABQUZ solanki_s_Page_020.QC.jpg
de5aced2b380a76ef747bba9eb7baf39
6b7dd159986052855d8585105421b59f89ffb7f5
110692 F20101115_AABPTA solanki_s_Page_119.jpg
9b553f1c4a81cf57bf574c1235cdc9a9
4c731e198194af97802598e3319e1b57bc5d7fd1
47358 F20101115_AABPRX solanki_s_Page_062.pro
b014c15544181458c26c83b8fac02656
e3863f921166c2f20345ded87e4f9453a7f200cc
8320 F20101115_AABQWD solanki_s_Page_041thm.jpg
c969b174673e59cdb942a0af03dec439
93cef7584ba29d80147736c9ed06272a0926c108
34070 F20101115_AABQVO solanki_s_Page_029.QC.jpg
5f6ff7a233e01913247469575d6c52de
bf9316e86b706ad1dca15507d5ed7a00fddb9e16
95042 F20101115_AABPTB solanki_s_Page_072.jp2
dd8c51a7c2dd9fa05675414454ba0ef8
a6a3eda02e24517f8dc689d0ba938cba1e29f75a
37517 F20101115_AABPSM solanki_s_Page_067.pro
a0b02e64c1c25c43e4a08aaf89c544a2
9ebc7bd365c2ff51f49c9046129df3c6da4a8c43
364 F20101115_AABPRY solanki_s_Page_007.txt
1a61d17b54746522df58068e266742c8
39b64e60dad9fe0f80e8540723ee0faa237a998f
2940 F20101115_AABQWE solanki_s_Page_043thm.jpg
2f64ae337a5926a178da218c84b8a000
5bb2f5f5c373e5956ff3c4eb3768e76a02c5d6ca
36240 F20101115_AABQVP solanki_s_Page_030.QC.jpg
ea3ce630039f76377516a3a6ed51b8a3
0b423e848aa55ba757da9be67bec181adf7eeb32
29388 F20101115_AABPTC solanki_s_Page_072.QC.jpg
a157e3b76f154c4fb9ab6922000c3a4a
1d9454e5650fa486e3336a29dad08e59572a4b47
25718 F20101115_AABPSN solanki_s_Page_115.QC.jpg
20cd171a7a06efc5a5e7b34f1da1e712
6ebb63b8612b303ec5c1157b54941225c3866ef2
1824 F20101115_AABPRZ solanki_s_Page_072.txt
c52e50f18fd1d35ce98caf3905842b3d
822c48dfd537fea8a2ce5ed969ad8f1acd5bb51c
25624 F20101115_AABQWF solanki_s_Page_044.QC.jpg
59ec19393e77ba4b07944a816bf29be0
5e5142b2480ec8669b4847081da407c707dbe2c9
8881 F20101115_AABQVQ solanki_s_Page_030thm.jpg
2db58ecb9d295ab004797e4a83543bb3
2776bb489e775012a9baeb080a9dc2efde6ae1ab
32659 F20101115_AABPTD solanki_s_Page_084.jpg
5106c3bdd1f618f6370eb1a6f55ee237
bce311730f680e707d91af9abca3c8978a4994c8
1051941 F20101115_AABPSO solanki_s_Page_044.jp2
1f976c0415b1d33a4349ecda7f4a7e99
ee21332e83195782244547061f2e23d9a16e592f
7076 F20101115_AABQWG solanki_s_Page_044thm.jpg
ffebd22607e298deb08f9c1b0ee5f971
b03f95867bfdc2d167562528f224ebc974ad9366
36026 F20101115_AABQVR solanki_s_Page_031.QC.jpg
2f12763b637beb38ead3b550de321d74
e0fca5ebea7599802c62c922e8f4d93bfd8786d5
1030 F20101115_AABPTE solanki_s_Page_038thm.jpg
68a75581ca8cb500fcc332572906afbc
fa2138d4ba86b386086583d77917081fb1b47e03
1902 F20101115_AABPSP solanki_s_Page_018.txt
8adf82ae488ebdd60f157179d6345458
9ecde415f5c6d0aa91dc49cf6397762bf08efd31
35976 F20101115_AABQWH solanki_s_Page_045.QC.jpg
9bc67757c2b1b85c8840fbf087292fbc
91d06aaede76dfe139776e532c23ffc7a7557aff
9000 F20101115_AABQVS solanki_s_Page_031thm.jpg
7030e8f905858f981ff175616e976300
7dbc4473bcea21e58acce20cf55b9477f02a294a
F20101115_AABPTF solanki_s_Page_066.tif
9e4b01955106c0721a4474af38497862
2242b1e6a108b545bbda5898dc734cc30541ccb4
1051980 F20101115_AABPSQ solanki_s_Page_036.jp2
37c97f72421b1ef8d944595863ab7729
f39ecdc73b64a12353da97173ba69d359a2c8801
9068 F20101115_AABQWI solanki_s_Page_046thm.jpg
880ef12adf12635de38e5d030002ef5e
99fb7e526619d8901640b2cb25a1ba705d1da0b2
35602 F20101115_AABQVT solanki_s_Page_032.QC.jpg
be4a03475bb9fe926611d4105a8ab751
da2114a78d3cb2f24381ad589cfb1077a85ddbd9
112311 F20101115_AABPTG solanki_s_Page_035.jpg
fdd214a41550361725fb9844ef3e0523
276a48850cf7de2250610d19947eb34bdba81d39
35904 F20101115_AABPSR solanki_s_Page_119.QC.jpg
c90dad522510a0be292a9cb7d1c06976
588a3abd034b005939c81e58497a5d89f1c5659c
35712 F20101115_AABQWJ solanki_s_Page_048.QC.jpg
6e02643cf5eb09327c4734b87335be10
8c500ee4931c8415feafad9a9d29d2d0a9eb3521
8705 F20101115_AABQVU solanki_s_Page_033thm.jpg
f9722064bfd6b7025010cd23b0fbd14f
dc9ed58ee3279eb8d040a583d765d22a915fcbed
103791 F20101115_AABPTH solanki_s_Page_095.jpg
d59154e0421a4ea4dc3685e3df8a7ee5
6130a44a123aced3f3059fcd10f8063eec489007
17645 F20101115_AABPSS solanki_s_Page_004.QC.jpg
cc6e7950052628de5e45f3b6db50785d
fa58f98146925540aabdf4d24ee47a16315137e5
3265 F20101115_AABQWK solanki_s_Page_049.QC.jpg
36d7e709245d0ca09e42c341674c07f5
49fa0765eb93ab6d639d6da58fbe0ecc075abed7
36933 F20101115_AABQVV solanki_s_Page_034.QC.jpg
6643f5cea6ef24d7b874cc4ffb81215d
55057a63f5c7040c81afd1d41d1a51afd94f6165
F20101115_AABPTI solanki_s_Page_101.tif
e272864f871cac8285d5ed39b1a25ce2
e47f9a7ee30a3fdfc27bac49de431bda62da7bf7
96004 F20101115_AABPST solanki_s_Page_090.jpg
e6d6a0a92de581a9660c343bc429a43b
8a6154ecc643f3270d296ac13d9d436c9cb8dbf2
21092 F20101115_AABQWL solanki_s_Page_051.QC.jpg
5bb1f345abef9d8de78b3f3a28170f21
531cf076f03ccffd652972100edd748509b4ec02
9127 F20101115_AABQVW solanki_s_Page_034thm.jpg
5d8cc5e9c2f767df45fd624b3aee6b06
bf342a7e47fd5f610be58ece9abbcda22c4f6772
33385 F20101115_AABPTJ solanki_s_Page_033.QC.jpg
9b886ea131e45afd5d6967e0e30c3267
86740ccfaf358205d83a739ae629ecf48ee4420f
4122 F20101115_AABPSU solanki_s_Page_117.pro
ec381e4f7c0cce4d8b1d354a95276684
a234db261a7c55afb1ff4f550bbb781b5a980e22
6511 F20101115_AABQXA solanki_s_Page_063thm.jpg
31d612dc26c9b7b315423123f68d4fce
2f9592d0bbeeabb9287aaf63204250637b540232
8507 F20101115_AABQWM solanki_s_Page_053thm.jpg
f72c07006a6fed277d62a8240f5a7aca
d2b6f8858c9f0c112dfbb80ba7f30aaf7b94bb80
36942 F20101115_AABQVX solanki_s_Page_036.QC.jpg
7cbf0f026115fe27c38b2240d6bb2563
6c541ea553ffd7a20a8d3bba534e3a59e05742ab
F20101115_AABPTK solanki_s_Page_087.tif
95610d3fca4bdea0b2e0a0d678cce35f
707f87541f91eb9be610bee583db88e96a86e735
195 F20101115_AABPSV solanki_s_Page_107.txt
8ff2e7fc7f4b3fe13390e8e59ec6648d
35b349d2b9ace7c01e69dc858c4abd2f5a90b56d
7162 F20101115_AABQXB solanki_s_Page_064thm.jpg
ad382ec90fe2f492cddfa39b1cc4285e
6800a303017d1d0366c2ec422db284818af90238
34359 F20101115_AABQWN solanki_s_Page_054.QC.jpg
586b44693440540c44d530a5d9a306c4
f4adb5c297c77ef34f47cfc3e98df3580304de0b
8090 F20101115_AABQVY solanki_s_Page_037thm.jpg
32265ae8208dc924f49a063530bf1c7a
bfb09429f9eb2871962062788e818ec875e6e3ce
30177 F20101115_AABPTL solanki_s_Page_001.jpg
86d64f06d05fe147057cb694aae7e549
a67372bf31623c1d00e87d6caa139e822f28e8ef
6436 F20101115_AABPSW solanki_s_Page_039.QC.jpg
92adc1605a3698da90ef53c9a594439e
4bec32251c3229d139f318a2d339fc98f8bdc28d
27635 F20101115_AABQXC solanki_s_Page_065.QC.jpg
44f32ec44517a3e60189d1393372dc5a
0ff4d24bbde7cf00a5ec18dcf5aa118806c74c01
8078 F20101115_AABQWO solanki_s_Page_054thm.jpg
264261477e6b38f71d50c8ac45b6e8d9
e7455a05fcfa77b065e46de7874e270e89419125
3766 F20101115_AABQVZ solanki_s_Page_038.QC.jpg
fb91e4ab7705ef66a257fad6d42f1ec7
86dd630de2ba4eadebf1ca083409cf156c5677d6
1407 F20101115_AABPTM solanki_s_Page_002.pro
3a5a2a48ece6506a5a80fda200c11545
92380b35c07bfc553a0416e92eae7e0e147f1f64
6426 F20101115_AABPSX solanki_s_Page_051thm.jpg
ba3e0a7b471464159fe574f6cee6fdfc
7ec6d66a8ce65e90a34d1776c0b89a97c1cc6aef
56004 F20101115_AABPUA solanki_s_Page_035.pro
81ae254de78460d6edc89be57655bb51
8beab314280e60c6df69ce566a2444a7f3f6fa7d
36082 F20101115_AABQXD solanki_s_Page_066.QC.jpg
8996073e2490a52da689bb1e22fcb3a1
7b38e6e20f5b3c94b9260dd0be4777ab60d6d3db
F20101115_AABPSY solanki_s_Page_123.tif
aaf78e862f2e1373127739538cb9ec7c
7a129b19b67cd835875eb2b8c63e33da2ebe31c0
F20101115_AABPUB solanki_s_Page_021.tif
5e0c012dacd597a00d969ac8028f5986
a7ba686a411966ccdf9a7385d2acde5b6fc6cb7b
8967 F20101115_AABQXE solanki_s_Page_066thm.jpg
6b55f67897a9964017ba5ae0a976201e
ef9271c63d16807ad81865d307230d0469c37507
34857 F20101115_AABQWP solanki_s_Page_055.QC.jpg
2e54ca44027986b9c4bf5ae0f13f36c9
2e098531143d35b65959868b8f0673baa4c21cc5
101613 F20101115_AABPTN solanki_s_Page_020.jpg
d1f0aab5a5049101ef042379f0abf4fd
1949ae8697472708c06d3a07eced188471b55d8d
23738 F20101115_AABPSZ solanki_s_Page_022.QC.jpg
f407ec9b6dc311440c1e05cf62e0628c
799d09109c3d094bbf64bbf808100b991eac41eb
49698 F20101115_AABPUC solanki_s_Page_077.pro
d351ea4265c71fe3d4ff6a3624ece315
701d06fd3335765e5d2220ef092763d17635ab97
26264 F20101115_AABQXF solanki_s_Page_067.QC.jpg
7bfb95b6e664430bdf4b839acbf0a9fe
18d584e17bba173beb7c2463093a760fbfe527bb
27205 F20101115_AABQWQ solanki_s_Page_056.QC.jpg
7295965ae83bf1e63cfdfd362d70ca16
e023add47b3ff29510cfe88c41e2f5e12da25c1f
F20101115_AABPTO solanki_s_Page_026.txt
2756e7feccc47430a78e794c6ff14595
56f2284a7fd5bc0d72cc81249a728cd7d2a6ee3c
F20101115_AABPUD solanki_s_Page_073.tif
0b8152cf5666f71a02b46b949e933d47
8e635315becaf05d6608840b061246043a73f3d2
6643 F20101115_AABQXG solanki_s_Page_067thm.jpg
9708115fc9aa7d4d64eb30d108a5ad9f
32875b0beb372d818185d43e41db1c186e0628aa
32866 F20101115_AABQWR solanki_s_Page_057.QC.jpg
24e566048324bc04d3e4073beae2f807
a7f3a19508c434acde6d6d38226b5a63dd26522e
25026 F20101115_AABPTP solanki_s_Page_122.jpg
e3e6187f919dc8906053981f63e425ad
c9b39d8612a51125b3fc571e648a859809f6ad01
38374 F20101115_AABPUE solanki_s_Page_043.jpg
1f237089c3bdf7db10ab4b50111645d1
208d11527ad6fb425995fb59226c216ea72ad703
30190 F20101115_AABQXH solanki_s_Page_068.QC.jpg
2308c361777accb508e97d93821faad0
83b4634fe1a02180e6073561b623d364a59443f1
7906 F20101115_AABQWS solanki_s_Page_057thm.jpg
d7d0577c3848f029f0a8f71ae34d64b7
84f590061f18bb6eb508a1de2d23fdcc4484fae4
5224 F20101115_AABPTQ solanki_s_Page_039.pro
dc91faa659f66f2e27dfbbcc8cac6ec7
f02f7d2d367c4e2118c465b21e79d6b811d155b7
1051910 F20101115_AABPUF solanki_s_Page_101.jp2
448c728f2fb0b4ff45db755c683ea392
8e360c8cdfc2424dff92609ee65e1a4dceee9859
30843 F20101115_AABQXI solanki_s_Page_069.QC.jpg
6d2f05714a5c15fe3cb60593aa53fbae
760d6149757c0c6e7577466dcc7a10286d734c75
32583 F20101115_AABQWT solanki_s_Page_058.QC.jpg
8999d23dc337bc228cc1e78abef5a4b1
6b9364822e4c3c918f6ef2def5599ae525743c21
49989 F20101115_AABPTR solanki_s_Page_097.jpg
43a84f5d89bafa532eb28cc9415d82b1
f7fd549ad7f8d69ff59a3846e407d6bce7a472ae
2003 F20101115_AABQAA solanki_s_Page_037.txt
738c384adfca859b547fc36fc4c06dcb
454b992c406a55afd664015c193d7eb06da4d71d
33815 F20101115_AABPUG solanki_s_Page_091.QC.jpg
4079e76140dda893b7962b1ddd97fd82
eefc35298b0392432d960757694bc053fb3633bf
7658 F20101115_AABQXJ solanki_s_Page_069thm.jpg
2bc5dc2ed93d3b2461e1b1bde97ce600
053f3ac4ac3876b8256d2b8b5d9fc04e14af7b87
8283 F20101115_AABQWU solanki_s_Page_058thm.jpg
3e0f4e11d4d497eb6b81fe4696893e3a
31328bce21fe680d533d2755c1715ebc60995b75
55768 F20101115_AABPTS solanki_s_Page_026.pro
439ca3301133dc51f8ebb66df29dda80
d0d947005c9f923a0f6ed24b8273a6b0d49de7b5
F20101115_AABQAB solanki_s_Page_118.tif
be3d9be494d87b68e94c6b1335882a78
9fbe089aed768b1fe2b0d6c481e0bba06e5eff11
F20101115_AABPUH solanki_s_Page_113.tif
6cdaeca15e4e1b68610a0fcf2cdc5b9c
b59d4eeb7dfc7d96f603291c2a195370c9202bfa
9211 F20101115_AABQXK solanki_s_Page_070thm.jpg
60b970738552c92117f680ed262d7b34
11373a4df6ea1d25d50371f4c01fd3956f278975
8101 F20101115_AABQWV solanki_s_Page_059thm.jpg
c77a5203514a84130dbe4969cd07171e
be2fe435d2ed5b2e33e23ad3e6354d05e9fc2aab
F20101115_AABPTT solanki_s_Page_014.tif
909c4c85b5a55b5facb204218537ef54
91b675611dfb63da539c336a91161bdd40812478
F20101115_AABQAC solanki_s_Page_076.tif
9982a8b662bb539c46c9008058f472bd
4eddcdc22e86089f0c4a735e8ae135cc90a31196
64637 F20101115_AABPUI solanki_s_Page_081.pro
b627cbda9423c756afdaade236048127
6fb870a8d06b5bb670fc7751bf6fc94c4ce34f81
37324 F20101115_AABQXL solanki_s_Page_071.QC.jpg
14bdcf19e8d46a9260f9811ae82200cf
305d8ab39c5d071989825425133827fb3a391ac3
27346 F20101115_AABQWW solanki_s_Page_060.QC.jpg
bc11cee10dbb7c1610a4ba3ef17325a0
8ce01361d3ce9f795aed96d35933f8c0926ae87e
6906 F20101115_AABPTU solanki_s_Page_065thm.jpg
690eba175ca0a2db54a40d08ad5b8093
2e03440cc794084586cf197f0fd5c3241774714a
F20101115_AABQAD solanki_s_Page_092.tif
a4c7653ffb770ccf5055c80e82b8a66b
7518f3d41c3d0df0581b56b5c61af4debaf87e47
F20101115_AABPUJ solanki_s_Page_091.tif
3fc49ac62e96fc0e9d0cf65921e8e8da
8452b7205fe62386fcdef69fe44d441a8af7f976
6998 F20101115_AABQYA solanki_s_Page_083thm.jpg
46124fbee061642ed1d6c5bdc0fd5f1e
b709d5149304bbdd172ce52dc7a5a714cbfb4805
7317 F20101115_AABQXM solanki_s_Page_072thm.jpg
5a5c1a45ac22e7bd5e860f8257cb7078
3d776c629bdc0eb485ab16f0d57272a8e5e0eec3
7059 F20101115_AABQWX solanki_s_Page_060thm.jpg
a2256021b6ed0e9f3ba23de25bf43303
ba51a0789590cb6c6565ae5e78ee1e2929234b6c
52856 F20101115_AABPTV solanki_s_Page_059.pro
c39df932438ecce6055b01bf4e024405
8079884526337884f6a9419e327ac9f88227836d
36421 F20101115_AABQAE solanki_s_Page_035.QC.jpg
aeb6410c2f0c4d01d33be46494d59f3b
69df57bd1711c4c426224f29c8f1f9046b1c7389
F20101115_AABPUK solanki_s_Page_006.tif
60393e042b61a0c0e9db4f094e2e2edd
877801cde89632f952e1c261de8bc442408ac86e
11280 F20101115_AABQYB solanki_s_Page_084.QC.jpg
f911d245d18a4c9100138f2255c224a1
6245849c804331bf256388c5a10a0fc4fb0e237d
37126 F20101115_AABQXN solanki_s_Page_073.QC.jpg
b1e2b4ef14fef5adc2e0b7c74492f9ea
1fdf858af118998f6803af3937e93211e954b16f
5659 F20101115_AABQWY solanki_s_Page_061thm.jpg
988b73be8689595e9f28efed294c7072
0f1681f2b80e4dda525375d8f81b6f16a160a8c5
2255 F20101115_AABPTW solanki_s_Page_070.txt
5efe1fa27f7074ce2152d9bd75aff38d
b9d489552382343e3d1b5d83b378cb1608ef3829
29221 F20101115_AABQAF solanki_s_Page_110.pro
14fde41526cce485109776a86eca27aa
7831b9559f520c131473403a8804cdcdbebdf385
F20101115_AABPUL solanki_s_Page_032.tif
f06c8d9c456485aaa75f248606efc842
b314f95129b3557b6227039cd9a98e6c02e1e326
3823 F20101115_AABQYC solanki_s_Page_084thm.jpg
42c2608026ccf957c5c2aa36ee6af908
366a201ded0610e85c61400d85567131ae0aed72
8783 F20101115_AABQXO solanki_s_Page_073thm.jpg
e5e981a752377663a0888d630294edb5
173d4ae9923172498d5e1c3d159831d7a9c08f76
32210 F20101115_AABQWZ solanki_s_Page_062.QC.jpg
3290eee6923b41689d360cdd77e0b439
83ba40a3491252dcc136ad1d2193ca08127e9e2e
1991 F20101115_AABPTX solanki_s_Page_012.txt
94841a9f48e4c6eac9c26064bd912f3e
052daaf1b34e33766d73b493885b57a8ab1a31ca
62951 F20101115_AABQAG solanki_s_Page_125.pro
f82a9e8add6972073ee8583a724cd73b
64c49537f7ad5a0e11de1859d540f219cc07313a
2028 F20101115_AABPVA solanki_s_Page_094.txt
b1978ebd09575f915d110d5d2ebdba66
818adf573bf0ca3996af119ea53e87a3f6761b0b
F20101115_AABPUM solanki_s_Page_072.tif
bce269db619479fa6a2cbe12fc26d47f
d0185a3c8310d3106a7b4e43ab655258fca29a23
8760 F20101115_AABQYD solanki_s_Page_085.QC.jpg
4dec08cc2599a03eed6c392bb7bf674d
b691d978782e2af12c1c624b3df5776863fd0893
20871 F20101115_AABQXP solanki_s_Page_075.QC.jpg
4f1ca5088e5dc8147806c234bed3d8a6
2be79b9a677c41a98c4f042c40c181d8fc43dbe5
F20101115_AABPTY solanki_s_Page_108.tif
a15dfc2784885d8f2a7fa6659ff5856d
11ffee621e8c7d8eafeed661fd21a5aac60b7391
1722 F20101115_AABQAH solanki_s_Page_065.txt
65d67ec375657e7edfd01767f7868099
eeb6a84f909e899ba0524ef124dc4ce815dfa5d6
20373 F20101115_AABPVB solanki_s_Page_050.QC.jpg
42c8265db0ced22975d45e78f8c18635
c7ee75c8092ae39a70f65629b780d6d582ec7a0a
32899 F20101115_AABPUN solanki_s_Page_081.QC.jpg
48bb84d19e0c12e2f613b1b9fe4f0b14
c3089b1374c34ae15eb75ba0841f0912e54992dd
15869 F20101115_AABQYE solanki_s_Page_086.QC.jpg
d2c80f09033df455f34f2e403401d0bf
904dfe3cce1df3b600dc7d2532b2dcfadcf809e2
25913 F20101115_AABPTZ solanki_s_Page_063.QC.jpg
8dd4c6e0da00043c3e7cc39d1abbbb14
e031fb594f19e0d15e4b89a49c51976643cd8d55
34038 F20101115_AABQAI solanki_s_Page_019.QC.jpg
d97ee9a2c006ab932d87b3300578410c
7e40b08f6c8878053da5e8bcdaf868e3667d4df4
3009 F20101115_AABPVC solanki_s_Page_085thm.jpg
447b50630ca71b8433722aca41808bf2
fff846cd11d78968d0829658e0ffba6a94126c98
16094 F20101115_AABQYF solanki_s_Page_087.QC.jpg
5c2ba3d26ebe5d9a13c93d8add39299c
21e30b16e627ceb9e908d2269b8286a36e672daf
17548 F20101115_AABQXQ solanki_s_Page_076.QC.jpg
c7a1619d2fc24bd245b3321161536e77
2c670fe571e64eb48fdf9480b3353eb3cff272a2
77589 F20101115_AABQAJ solanki_s_Page_063.jp2
da270887a1418911ef650e3b920276b3
4316580f07d6d7747277d56216a55242a6279b9e
83485 F20101115_AABPVD solanki_s_Page_064.jpg
045fb7531780b20cbebd630841bf6d65
44556c4237f73f44f6d9f221be88e0362332b728
839 F20101115_AABPUO solanki_s_Page_052.txt
a02ade21004b94ad3a587624b43089ad
ba30929de9c1b681d7fa161cf7ae411d1beb87ac
4655 F20101115_AABQYG solanki_s_Page_087thm.jpg
05dfce93198d3b2881525bf35375630f
8a790c8d88eff46b986dbd5babbf3c213c1582f3
34988 F20101115_AABQXR solanki_s_Page_077.QC.jpg
74bde96dd2a011fb3f663c1bf6b5ffb9
dc594a55090d7bea34de3ec0940a33aeb6f2f69c
4804 F20101115_AABQAK solanki_s_Page_076thm.jpg
e6f28319a7fd9d6d30e6a74b219cbfce
fb0915a8e6609bab0149ff5912c2599dd0a79408
58611 F20101115_AABPVE solanki_s_Page_103.jpg
d7d2eae831d5cf08e17adbeb27a6b399
3feb288fc5af4348383a3c26bd88c8d7f80cd6ac
F20101115_AABPUP solanki_s_Page_050.tif
a04c3a7514aac466cf936e63e854c297
890f865791ec1e7792dcb72d23ba99039255f779
18817 F20101115_AABQYH solanki_s_Page_088.QC.jpg
bc0b4f8e25689f6b373d0da9a8fba69f
1c99daf7e2b10a303c6afc5d95f1bd0d4681d08d
8234 F20101115_AABQXS solanki_s_Page_077thm.jpg
099c26acdb856be3a0ef9152d2029ca7
dd036ebfed1d7a60559c27413ef60006dda0c8d3
89053 F20101115_AABQAL solanki_s_Page_056.jp2
ed7af4e8622ea11f09cf8c050c45345a
eccff3beff7247fe55908544f6280a9b81f7a116
4757 F20101115_AABPVF solanki_s_Page_101.pro
e9387d7816128e23d6b2fb65abcd0134
401e08c7a3bb79df1191393e039aa071ea2c8b51
F20101115_AABPUQ solanki_s_Page_049.tif
b63abbd227e3b50d91242e8a248ec07a
8a9ed2a0c19984396fc510b9dbb9a6bdd14f5c30
5639 F20101115_AABQYI solanki_s_Page_089thm.jpg
65b1c9eefa3043ae0e5849c135f23926
abf13098dcc7d19dfab39683fbfe5c50ffb53bc2
34030 F20101115_AABQXT solanki_s_Page_078.QC.jpg
922e309f01c35efbd9d35e4e32cf4907
dcb4ca22083eea346be0a9a901f0843964791ef3
F20101115_AABQAM solanki_s_Page_096.tif
d32331096f50df9807e76e6b9e15eca3
4c4fff95a147d06ef05a98a279a86cba70c8d1f3
114273 F20101115_AABPVG solanki_s_Page_070.jpg
f97f7e9b15da7c3f7809b8a2bc6dbf9b
31a0a5085561e48c3bcb12562991eb452c3beb0f
4642 F20101115_AABPUR solanki_s_Page_116.pro
6134a27dee1d73ea9a40478f74901c56
db2587ec19be15c1fdf2a9802ca70614ca7f074a
7426 F20101115_AABQBA solanki_s_Page_068thm.jpg
fb098156faef8de2d8aefa4140d29d95
83905912d5159ed996b1e7ada7b6964012c9acc1
8648 F20101115_AABQYJ solanki_s_Page_091thm.jpg
2a39d3fef16cf712415a401693942843
f235c10f7064aa379581ef69aaf3051f827cdcaa
F20101115_AABQXU solanki_s_Page_079.QC.jpg
1eec6ba1b97ccf6d4329f0c6c2d92249
cca99f81e35605551dd48ab4c7285b2349416f63
58155 F20101115_AABQAN solanki_s_Page_050.jpg
8bfe7c27a1691ad2cf15945ac6087427
d5302201b886d6227bb43dc2c25370918a5b0076
3706 F20101115_AABPVH solanki_s_Page_038.pro
aeeef2e6112e84c9350c28dfc3c6d8b7
efedd34225d9ba50427132ab956e09d953c32169
F20101115_AABPUS solanki_s_Page_037.tif
97b4a7cab73592da1be9b99337f81ca3
8963680d6444598be2571355b2a41e77a37444a2
33926 F20101115_AABQBB solanki_s_Page_053.QC.jpg
70e28bd4b990e16a7f7d8a9b1baad486
6aee12e020cf54d3175eb99f40be03a7ea69da04
36773 F20101115_AABQYK solanki_s_Page_092.QC.jpg
6d78fdadf6b870140ada094c4ef32ba3
98688315fc5c9f03a99bde6762a7fd0ea01afadd
30926 F20101115_AABQXV solanki_s_Page_080.QC.jpg
39140a6ab3e2cdcde58618314ac73d60
4faf039b210e3a4cd5073f74902016673924bce4
344139 F20101115_AABQAO solanki_s_Page_007.jp2
aa06e3cce0c19a4297a54bc2038da316
73bd729111b694260d696da4d7350975fc120b2d
37234 F20101115_AABPVI solanki_s_Page_046.QC.jpg
f7a48dbea9d8aec186f535e1a1850215
12b25630c9f14eac4f92bc2034910b3c792fa182
92185 F20101115_AABPUT solanki_s_Page_069.jpg
d6e2daf1714b31dfee88081d0de291d1
59ce8cfc41f59962935768f4dea4587a199f6a47
7621 F20101115_AABQBC solanki_s_Page_087.pro
14697f910d44e17ed1dd9faca4a0ac72
64ba7c118f32252547870f1a92177a45807ab46b
9453 F20101115_AABQYL solanki_s_Page_092thm.jpg
e5de18db2802314c4e759e77fff956d0
433272d64839c16633c2fbedc11584a22fd663c0
7641 F20101115_AABQXW solanki_s_Page_080thm.jpg
0c8c0b0bbc676e27de70d2f81650d809
6f9f7df9fd291816d494de9e6e59af11ce2e7491
31712 F20101115_AABQAP solanki_s_Page_111.QC.jpg
3f7d5adc934b34b71874fdc749b21abe
f8aa3d73a87c03cca11c5bf46633b7d9122afd78
2239 F20101115_AABPVJ solanki_s_Page_039thm.jpg
084ad6a092f6ed491f3e8a1910a34675
eff9a46cac041cf5bad5760eb10c07524ea20357
70730 F20101115_AABPUU solanki_s_Page_110.jpg
45f99c8a77c93312d3f097e48402af62
a7c6a5bd4b5ea944834c2f515c114deb3a6d6477
F20101115_AABQBD solanki_s_Page_074.tif
2e125de1e3ebe83ec17cc75ee077092b
1db68d41e64f277660f991d837be4e49dc12afde
20122 F20101115_AABQZA solanki_s_Page_103.QC.jpg
e556d944808aa94a4630b9d163a2683f
baba7d97170f51a15c818d2c88cea408b31d0480
39030 F20101115_AABQYM solanki_s_Page_093.QC.jpg
a81e562730f9ddaeebbef4d1f0d3f291
d78b951f320184fb0140f07e149ea6599aec87e4
21194 F20101115_AABQXX solanki_s_Page_082.QC.jpg
86d99f3b92c9eccc854da34b3e02f3b1
4e195f74d1870fb53f69c2f92c9631397b4e4689
31132 F20101115_AABQAQ solanki_s_Page_128.pro
da37649a9e5ea93d115cc178019b7c34
724025bd635a1dabba413709afb4e6f5159c7d95
48948 F20101115_AABPVK solanki_s_Page_058.pro
cd516698e77830f4f135c22ee5c21a83
c0c15590b6960926c678e10bd540567a94382fe0
28136 F20101115_AABPUV solanki_s_Page_001.jp2
451015a6f849b9d9378dd1e550888162
a8e900e2acd2d69945c497ccf9ef90877461ba2a
35579 F20101115_AABQBE solanki_s_Page_126.QC.jpg
7b58e9d8370a9f3d992368599e8720f6
daced6e4128453c7bb2c64df22a81f6444790fa7
4611 F20101115_AABQZB solanki_s_Page_104thm.jpg
3531b3dd4c01419242aebd8e04a70f0c
a98bb5d5d471630018e6cd8a83299e2677e70d93
9618 F20101115_AABQYN solanki_s_Page_093thm.jpg
1386f9c9cb7b306c21b82cdbbfee96e5
363db50b66e9b0785eb84a065675541159f0dad2
5952 F20101115_AABQXY solanki_s_Page_082thm.jpg
d13c7b09bd56de287ee224e108993382
861e609934b8edf470280fd6c698cc960b3ce0a5
F20101115_AABQAR solanki_s_Page_048.tif
a2aee184d9966dff4d46ccadc2a48722
3921eea58929f535e6996543bf39cc7725ff84e0
F20101115_AABPVL solanki_s_Page_001.tif
8aff6a5ef399dcf3661642df87f59b7d
94c5a284502bc2b96ba78d2c04046fccc3a2812d
F20101115_AABPUW solanki_s_Page_047.tif
ce25baa618bb4ea0283f2497b34342a8
63c82dd3f02789c0917900e5f98ca326a63bf25d
500 F20101115_AABQBF solanki_s_Page_001.txt
59f7208d30ffbfcadd13da3d1e78ab3b
933e68ead1b888b6b163dc930c1197d1c996682c
7528 F20101115_AABQZC solanki_s_Page_105thm.jpg
82afe70689009d5b850072c2ebe3ca71
3ae56dac8f63dbae5d78391ef560f489c27622ee
36375 F20101115_AABQYO solanki_s_Page_094.QC.jpg
3bfb3d4b592deb83981960f5124212ba
ed9c0c59573f721412c1df7903a4b56377b77955
22464 F20101115_AABQXZ solanki_s_Page_083.QC.jpg
b02f186fb7f24e567fd0e12a9a9e09d8
baf0793e3e4829a484b75426afacdb4eaabac071
F20101115_AABPWA solanki_s_Page_020.tif
61c0db75702cec8058817591b06ba542
983400b69e14876e6f8f33bcb156a2a23a542c0e
118339 F20101115_AABQAS solanki_s_Page_030.jp2
870ae21416ca40d71860857d81a146ae
62243c1befde1012f3de4819df9eb7e50e4af44a
24879 F20101115_AABPVM solanki_s_Page_008.pro
63ea8ed6ec802415e15661031e9e72dc
dce610ab979ee3a0b9d23579cac35c65e53846e4
F20101115_AABPUX solanki_s_Page_114.tif
0c5bad3c96b10b01915393caf57f04b9
79043c405f6c504daf8e69ed925e92d0491872bb
F20101115_AABQBG solanki_s_Page_018.tif
3c4d2245c33f741e58b0e8bfb2f9d13e
b6172bcc74e5265e6087e08691a5c57d9dd7c9a0
17376 F20101115_AABQZD solanki_s_Page_106.QC.jpg
fb4260d9847d7b12652f9ef011475f79
5c5df2e030969c9afd33a7554fae1ecab3438498
8214 F20101115_AABQYP solanki_s_Page_095thm.jpg
ca9e58d8fd55faa44be3f33956789aba
d67badd5dc9ebd76b0550cfdc8024fe2b1f0d737
F20101115_AABPWB solanki_s_Page_019.tif
ffb1749337b895d0a31972a2ac5506a6
89bb5a7904c086043fdf2a17131f20a22f8bdd3a
37291 F20101115_AABQAT solanki_s_Page_070.QC.jpg
bb516bcf8295202cdc0d3e921bb1747d
d117f7d70e0ae950ab637260239c197a030fac9e
F20101115_AABPVN solanki_s_Page_026.tif
98d28faabcb6f36bdc2f441e08d77341
ced6ad955baf8ab79a1884cd6bb221433691ea12
7417 F20101115_AABPUY solanki_s_Page_056thm.jpg
a81e33bac3a7213493d1103615a55435
341092dc9c91183859c72694b5a22a5fe06c7bc3
1011 F20101115_AABQBH solanki_s_Page_008.txt
625f39414ee40bfb62ca83792ef9f351
35236f5bc9abc11021212f162e91b2722362bca2
5013 F20101115_AABQZE solanki_s_Page_106thm.jpg
1fdaf9abf3313bce7bfa3db9bdb0a9ac
862a0a38d2f0f80c5229dde0515175ca0c6b6328
19110 F20101115_AABQYQ solanki_s_Page_096.QC.jpg
7c3dcad506efac6a72202a42364c1756
6e34a8fa33d3ee5f2e532ded0799a7f953490d47
8525 F20101115_AABPWC solanki_s_Page_015thm.jpg
7178f22b1a4077b9b98652032e355b68
cb6fd608a87ff9e363e51deea8d7e6d7713a3df9
8567 F20101115_AABQAU solanki_s_Page_055thm.jpg
90321af54d3e971263728cff86b31a32
be735a8c16a9674dac09896eb3a0f9d737f42837
F20101115_AABPVO solanki_s_Page_039.tif
9b232b580cfb5240783ce8e8b1b6a4a2
36e5f96b12c90b3ac3b15a14c4dc133468a8757c
2070 F20101115_AABPUZ solanki_s_Page_042.txt
c4044937878df3f971ebdaba053d7410
e97f1f209c5dff6ea6762f0923e44c91ba754cc7
11323 F20101115_AABQBI solanki_s_Page_038.jp2
ce45006c8ad4828427d4c75a0ec27a3b
0d490c6e357380b9a1408f32946e40d81bcb08dd
23996 F20101115_AABQZF solanki_s_Page_107.QC.jpg
5296e552c8df44a153ba5a70e13fbd93
cb71f233d8e8033fbd6f5d6a97fa1266329d9094
2232 F20101115_AABPWD solanki_s_Page_036.txt
77fd7bd82420268823af5b5713b868c0
e133486cda8196761bc37540a2caf19be0b11677
1051958 F20101115_AABQAV solanki_s_Page_093.jp2
d4f98f31ca783023929cbf2356ffa62a
4297f5ddc2b7e82fc184ab94efbc73309a76dc12
8843 F20101115_AABQBJ solanki_s_Page_032thm.jpg
1035f0e6560a10f0624615baa48416a3
c552ee4db19a47762e71206ff398856a7a034972
21042 F20101115_AABQZG solanki_s_Page_108.QC.jpg
edeb6a6c83e54b13ce69ddacfa5882df
4b025907b649c467b861ab0e6a4b9504374ed007
4638 F20101115_AABQYR solanki_s_Page_096thm.jpg
b4f462f2e83db9d1787bf2981f7ee83c
0e0703a398db45d35aa0730e126de7c91873bc79
51671 F20101115_AABPWE solanki_s_Page_042.pro
8876ed2db63a09a6dd1f4a5eee360808
7ef062868a132e9442465686989c4843f7ff9c5e
111624 F20101115_AABQAW solanki_s_Page_066.jpg
a165ff86ba715c43daeea70a79bce1fe
9e401bf313642c8fdbdba51abb704a7bd5e983a6
87294 F20101115_AABPVP solanki_s_Page_044.jpg
0401c1b995a0c652b32435a7bf78403c
5568be4f9254ada8440f80fd3babc43d3f7b28cf
8932 F20101115_AABQBK solanki_s_Page_029thm.jpg
13db6be5d5355b54bc384297fdca1e65
3f623caaa5cffdd726d8f4b36a6eeb0dd61e9103
7157 F20101115_AABQZH solanki_s_Page_108thm.jpg
eae287a7a64771e5025947811ce32af4
a53edd2bfeedd0cb4f8ee8b53de65d9100facf09
17127 F20101115_AABQYS solanki_s_Page_097.QC.jpg
bcdddba3a1ec3cfd0d698386ef2ee56c
49d41557c341e62b1f918c7e19c36d42cf1ec3e7
9007 F20101115_AABPWF solanki_s_Page_094thm.jpg
e8429c751112b8ae2cfbce8caedc721c
dcfb1c889aa5506e40b71073a9536370704cd362
18488 F20101115_AABQAX solanki_s_Page_089.QC.jpg
6b9833c2aced69cd04a4f87527bb6bc5
26786839f1edbd97c02e0fc06c3ce04ec72b4bab
F20101115_AABPVQ solanki_s_Page_069.txt
5e61386cae30204d17bf6143a380fa84
3fbba6b46241692b88055301074c8215cfcba105
F20101115_AABQBL solanki_s_Page_084.tif
b059eb01bf8b2c06bdcc92ccfdda399e
444c455d7aff7cdd6d2e55d5157426c59fee5bab
16266 F20101115_AABQZI solanki_s_Page_109.QC.jpg
1e17e0b1ec444a87fe2174529f2cc263
5c1723c7c532154cdb50f571fed14fcf6bd32b24
3812 F20101115_AABQYT solanki_s_Page_098thm.jpg
6c21d255015daf67977e26ec3c8afff5
fedd71c2f509eed8697684abf1b475e2599cf63d
4619 F20101115_AABPWG solanki_s_Page_105.pro
8ca1da48326b74f3ae7da03fe28e496d
1e42dcdd72708f880c56ac305f9d8d0423951ca0
108961 F20101115_AABQAY solanki_s_Page_042.jpg
190e7b60189cffe0d43c64ca093bb2fe
1126b64f12da335bdbef5428d57a67d8332659e9
115263 F20101115_AABPVR solanki_s_Page_092.jpg
5fb7b46b74438519d25787e2eaabd82a
3711538a1ed8535c1b094ab0ffd0ef3312df3083
15142 F20101115_AABQCA solanki_s_Page_104.QC.jpg
4f0e4adffe952eb709564389b00522fc
637f9770112109ca1a98b858fe364efe6e5b201f
93008 F20101115_AABQBM solanki_s_Page_018.jpg
c11f9ae0c293ad824db8ca3957166817
6fc6bba169723a6a01086d8e887e6fc753604260
20750 F20101115_AABQZJ solanki_s_Page_110.QC.jpg
ca61350c383072a479f2242d68699dc8
32a8002e33edee2ebdacd1b7f6bbda45cb484f9a
10996 F20101115_AABQYU solanki_s_Page_099.QC.jpg
dc6a9190a7c49782e05f81eddb6de8b0
3dc1f1285f5b9de0da40d391ab5fdca30706ada9
306 F20101115_AABPWH solanki_s_Page_097.txt
c52ec729419ae08431c15668deed5629
ce53cd9621f73fba4ce1a0d5fb68a9c449723eb5
7769 F20101115_AABPVS solanki_s_Page_062thm.jpg
eed464695251f93c4e05b7ea670fc946
24d741e0ab4d82cb5571a45477fe5f84e6e3b020
35840 F20101115_AABQCB solanki_s_Page_074.QC.jpg
303f10de0c92e676580450e62463479e
5e3558c0923f87864c1ab5443a18508a11e58171
4232 F20101115_AABQBN solanki_s_Page_083.pro
81a8b7972ea20f6f3b8a90181cb74d4c
425661855f05ac259e382039bba05a47fd26a4a4
5693 F20101115_AABQZK solanki_s_Page_110thm.jpg
0496d86efdba686913d9772cd6ead107
ce4e058527a6184e2f5dc0ab4aabc1341b65a0d8
3345 F20101115_AABQYV solanki_s_Page_099thm.jpg
f0f2f382fcb2ecec41d5ff49663d61af
91bc880bfc038dc82b6520e2c7dc6de07c242789
6904 F20101115_AABPWI solanki_s_Page_101thm.jpg
10a246b49da8c91499569c91edb5d219
801353bc1087d0a0ba5e2c82ca84063f74bc3ab7
94171 F20101115_AABQAZ solanki_s_Page_068.jpg
9d49d0aa4ea7cd1280cec5316dc968df
c3813a4904a15145f157dc96bb8536dbc2881fc3
53501 F20101115_AABPVT solanki_s_Page_092.pro
0a04298b8e3dfb46522adb6341d67378
7a54a71d7c9af43c3874414ed1afae7ca70e338a
1996 F20101115_AABQCC solanki_s_Page_020.txt
8c432851326bf3833b1288fc432a4ade
8bfbdddb944a221b9b51ca2024555228ebc1952d
F20101115_AABQBO solanki_s_Page_127.tif
cb0965a7e63b542244f5456c4077a066
dfdefe5250aa10b275cce038612039edd458a8b5
7953 F20101115_AABQZL solanki_s_Page_111thm.jpg
fe3c2265f9a7b04b5efa1a2a5a61d898
ab314568e2afcac9e445f130ee0e850dc1864fd6
11869 F20101115_AABQYW solanki_s_Page_100.QC.jpg
c8984631d2ddf05ef6f60f17f4728fd8
7cce5302b27cea8f43f95d0adfe36ca5219af9f6
4823 F20101115_AABPWJ solanki_s_Page_086thm.jpg
cfee162aeb79fe719a801292b6a75438
74e620836d33c09cebc971626c6984b466e87057
7300 F20101115_AABPVU solanki_s_Page_107thm.jpg
d91e524f75b81676ca5617b3a69e3f55
dd523b1b9cf4e3740da4d18a77226ff550b3024d
35064 F20101115_AABQCD solanki_s_Page_095.QC.jpg
b1795e516e0725993b1c2046506af833
887bf8056e8a38eb149a29c983941731282dd89d
912 F20101115_AABQBP solanki_s_Page_049thm.jpg
2af8eb30e3a7a9c4f9567239f4acd49d
804dcf022ef53d659a6f637a77ff076048fd2615
28187 F20101115_AABQZM solanki_s_Page_112.QC.jpg
1499c63aaa7134125a5553bf6a1745f1
dd082bfd608f6b996759fe11ecee882f7b5172ac
3989 F20101115_AABQYX solanki_s_Page_100thm.jpg
6a510446f628d3844ee21ea4e5fe6da3
94b58e4e327885f785d184e9f7ac03ccdeef72e4
209 F20101115_AABPWK solanki_s_Page_106.txt
9012396701bca4e66ddab9de32d6db71
aadb8c032c9b589cba842265489128305654cbb0
9203 F20101115_AABPVV solanki_s_Page_036thm.jpg
1e40cc756505396a787e817a2cb586e2
9cb9dedb3dc3ec7d8b7fff40e07771e64af8238f
4732 F20101115_AABQCE solanki_s_Page_116.QC.jpg
4574bc15d13b90016997e7142282cef9
aec4945012c43c17daf53a5e4aeda331337d147a
2136 F20101115_AABQBQ solanki_s_Page_001thm.jpg
9e6bd955daad7fd9b49b37236a1edaa6
9de43eed050c037b871b5a07aa0887f1a9bdae95
7188 F20101115_AABQZN solanki_s_Page_112thm.jpg
7b98296579164f004b6d6c4464f22f8a
48ed2c7b4d8e1d6ac505e4a77a3303b56a7aa128
21210 F20101115_AABQYY solanki_s_Page_102.QC.jpg
97efc6232ffc0e6961b17bf537f95787
7e425a8b528f5f837af2205b3e2b9e3166467c36
33508 F20101115_AABPWL solanki_s_Page_037.QC.jpg
0fdd12c9af3a50c2465c336ca6c4320d
047a8e07b6bd633623a4c7b6825d22ddb67d9811
3096 F20101115_AABPVW solanki_s_Page_106.pro
d3a097c01b09eb0dd2496993e2042017
12ceec4756b4811579c0a8147f6105f72700aec2
23276 F20101115_AABQCF solanki_s_Page_101.QC.jpg
c5f968d7602135558a32f645efa24266
773f71b2a8aece44aaf79c3bc4c7248d2f033a8e
F20101115_AABQBR solanki_s_Page_010.tif
43ab683f46617ef8c43790acf806609e
caee3be6cef31f89f6e379219b1865663717dc77
27567 F20101115_AABQZO solanki_s_Page_114.QC.jpg
60710da152d6e1190596ead9675f7396
0b74585e1a8e21dcce593e9f22bddfc005a16654
6888 F20101115_AABQYZ solanki_s_Page_102thm.jpg
9c1aeb6672f3c475648e5b518f7e0621
0791312dfb714d8ceec12ce625e94680356793b3
116935 F20101115_AABPWM solanki_s_Page_066.jp2
54de19d26fc30ba3ba6a3220ca9d8794
6fb8b34b911201787b1c808a5096c2f893be330b
53523 F20101115_AABPVX solanki_s_Page_079.pro
73dce3d98270ed7231e08699ad7721f3
0018a44c5cc6142e755479dc20722ef0e3fff3c3
85243 F20101115_AABQCG solanki_s_Page_065.jpg
b313fdbe3930f658727a4d02d5e1f282
9ab3516c73d2db2a6e0165c432ac392c3bcbbcce
102958 F20101115_AABPXA solanki_s_Page_091.jpg
083394f964f5397ebff82d8d302a2a32
c682b5a639550d634bca459ebdf6bdac878ef8ec
2165 F20101115_AABQBS solanki_s_Page_014.txt
abf511fbc6bec0e5dc01c715a038309f
e7a284d0d831057a264d5dd64425770074d956fa
14178 F20101115_AABQZP solanki_s_Page_117.QC.jpg
1994127f17095aeeedd595a9106d542f
00ad2ec05a96fd4d959ece8252fbb6fc4af97853
23073 F20101115_AABPWN solanki_s_Page_005.QC.jpg
e8d167f80e44115ad2f9bb46157d2737
a00a318d692b29b521b50cd6e9d9d3d957ef7524
F20101115_AABPVY solanki_s_Page_095.tif
e664a01d875fa1d9d441d89ac953aefb
4affa5ca682a5a43d1d56b5541a1c3b24e84d134
29009350 F20101115_AABQCH odfusiontest1_10mph.avi
c442e626366278d1303ed19ff1ccb9c3
bc56c1549c8ad163b5853a4eb57f3e9417b19766
2029 F20101115_AABPXB solanki_s_Page_091.txt
596289fe229ab36383aa94d15cc11dd0
ea789b46b1b9169c7560469cbf0ba986a0727922
F20101115_AABQBT solanki_s_Page_098.tif
734f9e0ebdd18e88fc5ca35ce0bf45e0
a12c127edd1215992aa60ea53b791ee9f9d18f43
4391 F20101115_AABQZQ solanki_s_Page_117thm.jpg
a5382563f2f133a68450a1ca67fa1fef
7ef55dd59dd7abd54e0b57cc8a7ae93cb718694b
1507 F20101115_AABPWO solanki_s_Page_084.pro
4a161fd10a8cbd0421b502551dd3274c
7e74a915e58d70dcc57bcd8e72e957bd5b9e0395
24461 F20101115_AABPVZ solanki_s_Page_004.pro
598bcb947ce74293041630f4c74a34ed
af0c70b46767f4a7847a31d89b33d8eca51fdbb4
F20101115_AABQCI solanki_s_Page_061.tif
6355a63c68436d994aa3202ac3aaa421
bb1568d0fa4c77eba11e6a053f3d3782daaa6696
105496 F20101115_AABPXC solanki_s_Page_054.jp2
f836d73173103f946b24c1b728ff7096
8d64bb87c483b4372bf572492bf9dbc2a5beebac
2202 F20101115_AABQBU solanki_s_Page_066.txt
f899bc1aba2f5fce6b8a40cf2352d6a3
601eab9ce5b2828ed043892a6f3c73622c83b94c
8749 F20101115_AABQZR solanki_s_Page_118thm.jpg
bf770058b3e44b603e60d44db9917e95
d3bb2869a6968f4c1a79aab8ae155ce341e32a69
1252 F20101115_AABPWP solanki_s_Page_116thm.jpg
2de4981422f738f9b3be93f527023d11
f372fad7093e8d2f030e03c5e3323f5217b18ceb
121761 F20101115_AABQCJ solanki_s_Page_125.jpg
29e7b9da10612f0296b735cb1befc39d
2c9bbf29cd0b9e8e4361999881350779f67adeb6
5957 F20101115_AABPXD solanki_s_Page_088thm.jpg
5bf4631499e721826445d01135e2bc42
9b1c97965d5aeb230690282e0885abd85d1c66d1
104476 F20101115_AABQBV solanki_s_Page_057.jp2
8b39e46981feb5fbc056268693c6181d
be9e65294afc2dc45bcb2023ec9fa0f573f541a4
F20101115_AABQCK solanki_s_Page_035.tif
817427f4893c9270140b988d3bcbe736
015485f5b31f8772dd8c5ece964d4818934de7a8
5942262 F20101115_AABPXE odbarreltest2_22mph.avi
06257c8277c14a094ab594334b9efa98
17ce3c79504707154da968809260f230e2d6f163
2169 F20101115_AABQBW solanki_s_Page_032.txt
b984ca72f74d86479888c7d656e71bf8
2d554d5f343d6cd625e3df91e97b5b8ffb70cb87
8877 F20101115_AABQZS solanki_s_Page_119thm.jpg
74182af857a39037ecd2ac348971e7c4
a6a1eca222c79d34cb70f9dfc999c7553a7ba2db
F20101115_AABPWQ solanki_s_Page_122.txt
b9c4792fe3abbec658032e100cf2267a
eb2d68c2c6971c825ae796cc7a5e7c1b61d4d544
107617 F20101115_AABQCL solanki_s_Page_015.jpg
366afc8a96069040ee928c8a1f50b67f
63e27ca7b3c10c726dfac91018f8f921322f3024
1982 F20101115_AABPXF solanki_s_Page_058.txt
c56da659b73a35ddee9304c42a7452b9
b805e2d3e24280995ae08efcf396342463696751
135587 F20101115_AABQBX solanki_s_Page_073.jpg
926ca6e311fca80d3ec3e610e2ace046
1ab04a0ae5159e180f57b3a1f2567cccbf78297c
14491 F20101115_AABQZT solanki_s_Page_120.QC.jpg
1d0a791b26936c2b43ee57e856f0442d
66db4270be9dd5ce06f77c3d906dd1e04d2751f0
129653 F20101115_AABPWR solanki_s_Page_081.jp2
ab3415ab63fa986b3344a34aa6bc5ec9
a099b8829d7c4064634a95659b1b71a2d32eb2e4
46294 F20101115_AABQDA solanki_s_Page_012.pro
3c7c3fe4ca3af9d99387d07c95d63422
90ac3dade6c76466b42603a9fc4413067082e508
107744 F20101115_AABQCM solanki_s_Page_048.jpg
292022ab76873229459b6cf3fb152316
0846a4d328d790513253a63aae07e7f377f1dab8
8755 F20101115_AABPXG solanki_s_Page_045thm.jpg
e9e1085ca4c8212128d9e4438e3ca015
7996e40833f7a8552d5804557b2b46a368f94df0
84006 F20101115_AABQBY solanki_s_Page_024.jpg
0619e7c3cb1dfdaff0f098651cf42228
5f3b5aea7dbf8ac9419984b84e5371f17ea180e0



PAGE 1

DEVELOPMENT OF SENSOR COMPONENT FOR TERRAIN EVALUATION AND OBSTACLE DETECTION FOR AN UNMANNED AUTONOMOUS VEHICLE

By SANJAY CHAMPALAL SOLANKI

A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA 2007

PAGE 2

2 Copyright 2007 by SANJAY CHAMPALAL SOLANKI

PAGE 3

3 To my sister, Seema.

PAGE 4

ACKNOWLEDGEMENTS

I would like to thank my committee members: Dr. Carl Crane, Dr. Warren Dixon, Dr. John Schueller, Dr. Antonio Arroyo, and Dr. Douglas Dankel. I am especially thankful to my advisor, Dr. Carl Crane, who has been a great support and inspiration throughout the entire course of my graduate studies. I would like to thank David Armstrong for his support as project manager. I thank Dr. Mike Griffis for all his efforts in building the NaviGator. I express my thanks to all the students at CIMAR, including Bob Touchton, Tom Galluzzo, Daniel Kent, Roberto Montane, Jaesang Lee, Shannon Ridgeway, Ji Hyun Yoon, Steve Velat, and Greg Garcia, who have accompanied me on this challenging and exciting path. I would like to thank everybody at the Air Force Research Laboratory at Tyndall for their support. Thanks to my wife Rita, my daughter Risha, my parents, and my sisters, who have always been there to help me.

PAGE 5

TABLE OF CONTENTS

ACKNOWLEDGEMENTS ..... 4
LIST OF TABLES ..... 8
LIST OF FIGURES ..... 9
LIST OF OBJECTS ..... 11
ABSTRACT ..... 12

CHAPTER

1 INTRODUCTION ..... 14
    Motivation ..... 14
    Background ..... 15

2 LITERATURE REVIEW ..... 18
    Sensors ..... 18
        Monocular Vision ..... 18
        Stereo Vision ..... 18
        Thermal Imaging ..... 19
        Laser ..... 19
        Ultrasonic Transducers ..... 20
        Radar ..... 20
    Environment Representation ..... 21
    Uncertainty Management ..... 21
    Uncertainty Management in Multi-Sensor Fusion ..... 23
        Certainty Factors ..... 23
        Dempster-Shafer Theory ..... 24
        Neural Networks ..... 25
    Sensor Implementations in Autonomous Vehicles ..... 26

3 RESEARCH GOALS ..... 40
    Statement of Problem ..... 40
    Research Requirements ..... 40
    Autonomous Platform ..... 41
        Mechanical Specifications ..... 41
        Power System ..... 41
        Computing Resources ..... 42
        Localization ..... 42
    Contributions of this Research ..... 42

PAGE 6

6 4 TRAVERSABILITY GRID APPROACH.............................................................................45 Traversability Grid Representation.........................................................................................45 Traversability Grid Implementation.......................................................................................47 Traversability Grid Propagation.............................................................................................48 5 SENSOR IMPLEMENTATION............................................................................................53 Sensor Hardware................................................................................................................ .....53 LADAR Sensor...............................................................................................................53 Sensor Interface...............................................................................................................54 Computing Resources and Operating System.................................................................54 Sensor Mount...................................................................................................................54 Sensor Algorithms.............................................................................................................. ....55 Obstacle Detection............................................................................................................. .....56 Terrain Evaluation............................................................................................................. .....59 Terrain Mapping..............................................................................................................60 Terrain Classification......................................................................................................62 Classification based on the slope of the best fitting plane...............................................63 Classification based on the variance................................................................................64 Weighted Neighborhood Analysis..................................................................................65 Negative Obstacle Detection...........................................................................................68 Terrain Evaluation Output...............................................................................................71 Advantages and Limitations of the OD and TE Sensor Algorithms......................................72 Fusion of the Sensor Components..........................................................................................74 Implementation of the LTSS as a JAUS component..............................................................77 Joint Architecture for Unmanned Systems......................................................................77 JAUS System Architecture on the NaviGator.................................................................78 Implementation of the LTSS as a JAUS component.......................................................79 6 EXPERIMENTAL RESULTS...............................................................................................90 Traversability Grid Visualization...........................................................................................90 Obstacle Detection Sensor Results.........................................................................................91 Obstacle Mapping 
Results...............................................................................................91 Obstacle Detection Response Time.................................................................................92 Fusion of the Obstacle Detection an d Terrain Evaluation algorithms....................................93 Scene 1........................................................................................................................ .....94 Scene 2........................................................................................................................ .....95 Scene 3........................................................................................................................ .....95 High Speed Test of the LTSS component..............................................................................95 7 GENERALIZED SENSOR COMPONENT........................................................................111 General Parameters for Obstacle Detection sensor..............................................................112 General Parameters for Terrain Evaluation sensor...............................................................113 8 CONCLUSION AND FUTURE WORK.............................................................................118

PAGE 7

7 APPENDIX. TRAVERSABILITY VALUE MAPPI NG FOR TERRAIN EVALUATION ALGORITHMS....................................................................................................................121 LIST OF REFERENCES.............................................................................................................124 BIOGRAPHICAL SKETCH.......................................................................................................128

LIST OF TABLES

4-1  Update the circular buffer to account for the grid movement ..... 52
4-2  Accessing the grid cell in the circular buffer ..... 52
6-1  Traversability Grid mapping of obstacle detection algorithm for reading 1 ..... 110
6-2  Traversability Grid mapping of obstacle detection algorithm for reading 2 ..... 110
6-3  Response time reading 1 ..... 110
6-4  Response time reading 2 with increased obstacle height ..... 110
A-1  Mapping of slope values to traversability value ..... 121
A-2  Mapping of the variance values into traversability values ..... 122
A-3  Mapping of neighborhood values to traversability value ..... 123

LIST OF FIGURES

1-1  Important components of an autonomous vehicle ..... 17
2-1  Time of flight measurement ..... 39
3-1  Testing platform NaviGator ..... 44
3-2  Computing resources ..... 44
4-1  Traversability Grid representation ..... 50
4-2  Grid movement ..... 50
4-3  Circular buffer representation of the grid ..... 51
4-4  Traversability Grid propagation ..... 51
5-1  Measurement range (top view, scan from right to left). A) Angular range 0°-180°. B) Angular range 0°-100° ..... 82
5-2  Laser sensor RS422 interface ..... 82
5-3  Sensor mounts on the NaviGator ..... 83
5-4  Sensor mount design ..... 83
5-5  Block diagram of the LTSS component ..... 84
5-6  Obstacle detection LADAR ..... 85
5-7  Traversability value mapping ..... 86
5-8  Schematic of terrain evaluation sensors ..... 86
5-9  Weighted neighborhood analysis ..... 87
5-10 Schematic working of negative obstacle detection algorithm ..... 87
5-11 Block diagram of terrain evaluation algorithms ..... 88
5-12 JAUS system topology ..... 88
5-13 JAUS compliant system architecture ..... 89
6-1  Traversability Grid color code ..... 97
6-2  Obstacle detection reading 1 experimental set-up ..... 97
6-3  Obstacle detection reading 1 output at varied speeds. A) 10 mph. B) 16 mph. C) 22 mph ..... 98
6-4  Obstacle detection reading 2 experimental set-up ..... 99
6-5  Obstacle detection reading 2 output at varied speeds. A) 10 mph. B) 16 mph. C) 22 mph ..... 100
6-6  Obstacle detection response time with increased height ..... 101
6-7  Test environment showing scene 1 ..... 101
6-8  Output results for scene 1. A) OD algorithm. B) TE algorithm ..... 102
6-9  Scene 1 LTSS component output environment representation ..... 103
6-10 Test environment showing scene 2 ..... 104
6-11 Output results for scene 2. A) OD algorithm. B) TE algorithm ..... 105
6-12 Scene 2 LTSS component output environment representation ..... 106
6-13 Test environment showing scene 3 ..... 107
6-14 Output results for scene 3. A) OD algorithm. B) TE algorithm ..... 108
6-15 Scene 3 LTSS component output environment representation ..... 109
7-1  Sensor configuration ..... 117
7-2  Minimum radius of curvature ..... 117

LIST OF OBJECTS

6-1  OD reading 1 at 10 mph [ODBarrelTest1_10mph.avi, 100892 KB] ..... 92
6-2  OD reading 1 at 16 mph [ODBarrelTest1_16mph.avi, 83265 KB] ..... 92
6-3  OD reading 1 at 22 mph [ODBarrelTest1_22mph.avi, 76214 KB] ..... 92
6-4  OD with increased barrel height at 10 mph [ODBarrelTest2_10mph.avi, 92079 KB] ..... 92
6-5  OD with increased barrel height at 16 mph [ODBarrelTest2_16mph.avi, 83265 KB] ..... 92
6-6  OD with increased barrel height at 22 mph [ODBarrelTest2_22mph.avi, 75392 KB] ..... 93
6-7  Test path [VideoForwardPath.avi, 335689 KB] ..... 93
6-8  OD result for test path [ODFusionTest1_10mph.avi, 356136 KB] ..... 94
6-9  TE result for test path [TEFusionTest1_10mph.avi, 393154 KB] ..... 94
6-10 LTSS result for test path [LTSSTest1_10mph.avi, 337804 KB] ..... 94
6-11 Return test path [VideoReturnPath.avi, 239561 KB] ..... 94
6-12 OD result for return path at 10 mph [ODFusionTest2_10mph.avi, 306897 KB] ..... 94
6-13 TE result for return path at 10 mph [TEFusionTest2_10mph.avi, 316651 KB] ..... 94
6-14 LTSS result for return path at 10 mph [LTSSTest2_10mph.avi, 239208 KB] ..... 94
6-15 Obstacle detection result at 22 mph [ODFusionTest_22mph.avi, 109706 KB] ..... 96
6-16 Terrain evaluation result at 22 mph [TEFusionTest_22mph.avi, 110646 KB] ..... 96
6-17 LTSS result at 22 mph [LTSSTest_22mph.avi, 135207 KB] ..... 96

Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy

DEVELOPMENT OF SENSOR COMPONENT FOR TERRAIN EVALUATION AND OBSTACLE DETECTION FOR AN UNMANNED AUTONOMOUS VEHICLE

By

SANJAY CHAMPALAL SOLANKI

May 2007

Chair: Carl Crane III
Cochair: Warren Dixon
Major: Mechanical Engineering

Applications of autonomous vehicles are found in various diversified fields including defense, agriculture, mining, and space exploration. For successful operation, an autonomous vehicle has to dynamically interact with the environment around it. My dissertation presents the development and implementation of a sensor component to represent the environment surrounding an autonomous vehicle. The environment is modeled as a grid, and each cell in the grid is assigned a traversability value. This grid, termed the Traversability Grid, forms an internal representation of the state of the environment in real time. The Traversability Grid helps the autonomous platform drive on a good traversable path and avoid the non-traversable regions around it.

The real-world environment is highly unstructured and dynamic. It is very difficult to discretize the world into two values, good and bad; hence the sensor component developed in this dissertation uses a scale of traversability values to classify the region around the vehicle. LADAR (Laser Detection and Ranging) sensors are used to sense the environment. The raw data from the LADAR sensors are processed using two different algorithms: obstacle detection and terrain evaluation. The obstacle detection algorithm uses a weighted sum of evidences method to determine whether a region is occupied or free. The terrain evaluation algorithm maps the terrain in front of the vehicle and, based on the geometry of the terrain, assigns a traversability value to the region. Features used to determine traversability include discontinuity in the terrain, the slope of the terrain based on the least squares method, and the variance in the point cloud.

Each of the two algorithms, obstacle detection and terrain evaluation, has certain advantages and limitations. Also, like any other sensor algorithm, these two algorithms have some uncertainties associated with their outputs. In my work these algorithms are fused together to complement the uncertainties and limitations of each, and the outputs of the two algorithms are fused using the certainty factors approach.

CHAPTER 1
INTRODUCTION

Motivation

The concept of unmanned autonomous vehicles is one of today's leading research areas. An autonomous vehicle can be defined as a vehicle that drives by itself with a limited degree of human intervention, or possibly no human intervention at all. So far these vehicles have found applications in various fields such as the military, planetary exploration, agriculture, and mining. The Defense Advanced Research Projects Agency (DARPA) has taken a lead role in encouraging and pursuing autonomous vehicle technology. The large participation and excitement associated with the DARPA Grand Challenge national events in 2004 and 2005 are an indication of the importance of autonomous vehicles in today's world.

As shown in Figure 1-1, the operation of an autonomous vehicle can be classified into four main components; of these, the perception of the environment and the vehicle control components interact with the real-world environment, while the world model and intelligence components store the information and make appropriate decisions. Perception of the environment in real time is critical for the successful operation of an autonomous vehicle, and the research in this dissertation is focused on the perception element. My research is intended towards representing the real-world environment in a format that would help the robot find and traverse the best possible terrain and avoid obstacles as it moves towards its goal. The problem of perception can be categorized into two main parts: one is sensing the environment with the use of various sensors, and the other is representing these data in a form the vehicle can use to maneuver.

To appreciate the scale of the problem, imagine driving on rough terrain. How many times would one have to ask the question: which would be the best way to go? Would it be desirable to pass through the small ditch in order to avoid a big boulder, or would it be better to traverse the uneven terrain rather than going through the ditch?

It is somewhat difficult for humans to answer these questions; consider, then, how difficult it is to represent this heterogeneous environment in a digital format for a robot to understand. The problem is not limited to this. The human perception system is very powerful, and humans are able to relate to the things around them very well; the world is defined in terms of human perception. It is not so easy in the case of sensors. Most of the sensors available today can measure only a limited category of features. For example, one can capture color information with a camera but not depth, and the same is true of other commercially available sensors such as laser, infrared, and sonar. Although there are techniques like stereo vision where one can capture both color and depth, they are limited by factors such as environmental lighting conditions and noise. The certainty of any commercially available sensor varies with the environmental conditions. The work in this research is an effort to find an effective solution to the above problems.

Background

Within the Center for Intelligent Machines and Robotics (CIMAR) lab, research in the area of obstacle detection using real-time sensors for autonomous vehicles started with simulations presented by Takao Okui [Okui 99]. Okui presented various techniques for updating local grid maps using simulated sensors. Some of the methods he used included probability theory, Bayes theorem, the histogram method, fuzzy logic, and non-homogeneous Markov chains. This work helped the research community in comparing the advantages and disadvantages of each of the above methods. David Novick [Novick 2002] took the research further by incorporating the obstacle detection component on an actual vehicle. He pursued the non-homogeneous Markov chains approach for updating grid maps. He did an excellent job in fusing the output of two non-homogeneous sensors, the laser range finder and the sonar sensor.

My work here aims to extend these previous efforts. My objective is to identify traversable regions in an off-road environment that a car-like autonomous vehicle can traverse at speeds approaching 30 mph. The next chapter contains a review of the sensors and algorithms already presented by the research community to solve this problem.

Figure 1-1. Important components of an autonomous vehicle (intelligence system, world model, perception/action data, and control system).

CHAPTER 2
LITERATURE REVIEW

Various sensors have been used to date for detecting obstacles and estimating terrain. This chapter begins with a brief discussion of the operating principles of some of these sensors. This is followed by a brief treatment of the representation of the environment and of managing the uncertainties associated with sensing. The chapter concludes with a review of the literature that has been presented and implemented in the area of real-time perception of the environment for autonomous vehicles.

Sensors

Sensors in general can be categorized as active or passive. Active sensors have an internal energy source and emit energy into the environment, while passive sensors do not emit energy and depend on their surrounding environment as the source of energy. Both types of sensors are widely used in autonomous vehicle applications. Sensing of the environment includes detecting obstacles and characterizing terrain.

Monocular Vision

Object detection and classification using color has been discussed in the research community for a long time. The primary features used for object detection in this case are the RGB color space. Some of the commonly used algorithms to detect obstacles or classify traversable terrain are edge detection and optical flow. Implementations of monocular vision are found in Davis [1995], Takeda [1996] and Royer [2005].

Stereo Vision

The slightly different perspectives from which our eyes perceive the world lead to different retinal images, with relative displacements of objects (disparities) in the two monocular views of a scene. The size and direction of the disparities of an object serve as a measure of its relative depth; absolute depth information can be obtained if the geometry of the imaging system is known.

Stereo vision can be used to construct a three-dimensional image of the surroundings using the same principle as the human eye, provided the geometry and optics of the sensor are well known. An implementation of a stereo vision system can be found in Wu [2002].

Thermal Imaging

Thermal cameras detect energy at infrared frequencies. All objects emit infrared energy; the hotter the object, the more infrared energy it emits. Thermal cameras work better at night, when there is better contrast between the surrounding objects. During the day objects absorb heat from the sunlight and tend to blend together. Algorithms such as edge detection are used for the classification of obstacles using thermal imaging. Matthies [2003] implements a thermal camera for detecting negative obstacles.

Laser

Most laser sensors use a visible or infrared laser beam to project a spot of light onto a target whose distance is to be measured. The distance from the spot on the target back to the light-detecting portion of the sensor is then measured in one of several ways [Carmer 1996]. The general factors to consider when specifying a laser distance sensor include maximum range, sensitivity, target reflectance and specularity, accuracy, resolution, and sample rate. The general methods used to measure distance are optical triangulation and time-of-flight distance measurement.

Optical triangulation is used to measure distance with an accuracy of a few microns to a few millimeters, over a range of a few millimeters to meters, at a rate of 100 to 60,000 measurements per second. A single-point optical triangulation system uses a laser light source, a lens and a linear light-sensitive sensor. The light source illuminates a point on an object, and an image of this light spot is formed on the sensor surface; as the object is moved, the image moves along the sensor, and by measuring the location of the light spot image the distance of the object from the instrument can be determined.

Laser time-of-flight instruments offer very long range distance measurement with a trade-off between accuracy and speed. Figure 2-1 shows the two different methods used to compute range distance with laser sensors. In the first, a short pulse of light is emitted and the delay until its reflection returns is timed very accurately. Since the speed of light is known, the distance to the reflecting object can be calculated; this is referred to as the time-of-flight measurement. The laser sensor used in this research works on the principle of time of flight. The second method to compute the range distance is to measure the phase difference between the emitted and reflected waves. An example of a laser sensor working on this principle can be found in Hancock [1998].

Ultrasonic Transducers

An ultrasonic transducer works on the principle of time-of-flight measurement as well. A short acoustic pulse is emitted from the sensor; if the pulse hits an object, the return echo is sensed. Some inherent properties of this sensor reduce its accuracy of measurement, such as the large wavelength of ultrasonic energy compared to the surface roughness of an object, and its wide beam angle. An implementation of ultrasonic sensors can be found in Novick [2002].

Radar

Radar emits electromagnetic radiation. As the signal propagates, objects reflect, refract and absorb the radiation. Large reflecting objects return a stronger signal, and the signal strength differs for different materials; a lower signal strength is received from a large obstacle with high absorptivity. Radars are generally used to detect large metallic obstacles at a distance. Radar-based sensor modeling can be found in Foessel-Bunting [1999].
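The time-of-flight relation that both the laser and ultrasonic sensors above rely on reduces to a single expression: range equals half the propagation speed times the measured round-trip time. The following minimal sketch illustrates this; the function name and the example numbers are illustrative and are not taken from the sensors discussed in this chapter.

    def time_of_flight_range(round_trip_s, wave_speed_m_s=299_792_458.0):
        """Return the one-way distance for a pulse whose echo arrived after round_trip_s seconds.

        wave_speed_m_s defaults to the speed of light (laser sensors); use roughly
        343 m/s for an ultrasonic transducer in air at room temperature.
        """
        return 0.5 * wave_speed_m_s * round_trip_s

    # Example: a laser echo delayed by 400 ns corresponds to a target about 60 m away.
    print(time_of_flight_range(400e-9))   # about 59.96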

Environment Representation

In order to use sensors in autonomous robotic applications, the measurements from the sensor are converted into state variables which form an internal representation of the environment. The state variables define the state of the environment at the current instant of time. The two methods generally used to model the environment in autonomous vehicle applications are vector maps and raster maps.

Two-dimensional vector maps are based on geometric primitives: points, lines, circles, polygons, etc. A set of environmental characteristics is tagged to these primitives. These characteristics could be positive or negative obstacles, moving obstacles, obstacle segmentation based on color, etc. Raster maps are built by tessellating the environment around the robot into a 2-D grid. Each cell in the 2-D grid can be classified based on the sensed environment characteristics. The main advantage of a vector map is its compactness. For example, only four vertices are needed to define a rectangle of size m by n, while representing the same rectangle in a 2-D grid map requires an entire rectangular matrix of cells whose size depends on the size of the rectangle. On the other hand, generating vector primitives requires highly accurate sensor data compared to generating 2-D raster maps. One example of a raster map implementation found in the literature is the Occupancy Grid map [Thrun et al. 2005]. Occupancy Grids tessellate the continuous space around the robot into fine-grained cells. 2-D Occupancy Grids are more common; in 2-D grids, the information from the 3-D world is represented as a 2-D slice. Each cell in the grid corresponds to a variable which represents the probability of occupancy of that location.

Uncertainty Management

The sensors discussed above each measure a particular characteristic of the environment; hence these sensors provide only partial information about the surroundings. Sensor measurements are often corrupted by noise.

Hence there is always some amount of uncertainty associated with a sensor in its representation of the environment. State estimation algorithms seek to recover the state variables (i.e., the state of the environment) from the sensor measurements. As can be inferred, most of these algorithms have to use an uncertainty management mechanism to arrive at an estimate of the environment from the sensor readings. Some of these algorithms (probability theory, the histogram method, Bayes theorem, fuzzy logic) are discussed in Novick [2003].

Thrun et al. [2005] discuss the implementation of the Bayes filter, a variant of the Bayes theorem used to update Occupancy Grids. The Bayes filter uses the log odds ratio representation instead of the probability representation of occupancy. The log odds representation l(x) is given as

    l(x) = \log \frac{p(x)}{1 - p(x)}                                        (2.1)

where p(x) is the probability of occupancy. The log odds ratio is recursively computed as the posterior value from the prior log odds ratio and the current sensor observation as follows:

    l_t = l_{t-1} + \log \frac{p(x \mid z_t)}{1 - p(x \mid z_t)} - l_0       (2.2)

where l_t is the posterior log odds ratio, l_{t-1} is the prior log odds ratio, p(x | z_t) is the inverse measurement model, which specifies the distribution over the state variable x as a function of the measurement z_t, and l_0 is a constant denoting the prior before any sensor readings.
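As a concrete illustration of Equations 2.1 and 2.2, the following sketch applies the recursive log odds update to a single grid cell; the inverse measurement model value used here is an arbitrary placeholder, not a parameter taken from this work.

    import math

    def prob_to_log_odds(p):
        return math.log(p / (1.0 - p))

    def log_odds_to_prob(l):
        return 1.0 - 1.0 / (1.0 + math.exp(l))

    def update_cell(l_prev, p_occ_given_z, l_0=0.0):
        """One application of Equation 2.2 to a single cell.

        l_prev        -- prior log odds ratio l_{t-1}
        p_occ_given_z -- inverse measurement model p(x | z_t) for this cell
        l_0           -- prior log odds before any readings (0.0 corresponds to p = 0.5)
        """
        return l_prev + prob_to_log_odds(p_occ_given_z) - l_0

    # Example: start at p = 0.5 and apply two readings that each suggest occupancy with p(x|z) = 0.7.
    l = 0.0
    for _ in range(2):
        l = update_cell(l, 0.7)
    print(round(log_odds_to_prob(l), 3))   # about 0.845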

As seen from Equation 2.2, the Bayes filter is additive, and hence any algorithm that increments and decrements a variable in response to measurements can be interpreted as a Bayes filter in log odds form.

Uncertainty Management in Multi-Sensor Fusion

Any one type of sensor measures a limited characteristic of the environment. Moreover, similar sensor hardware used in different sensor algorithms can obtain different information from the environment. The Bayes filter discussed above is not very useful for combining the outputs from different sensor algorithms. Consider, for example, an obstacle which can be detected by one sensor type but cannot be detected by another. These two sensor types would generate conflicting information, the resulting occupancy grid map would depend on the evidence brought by both sensor systems, and hence the results would be ill-defined. The information from these multiple sensors is instead combined by generating a sensor map for each sensor type and then integrating the maps using a suitable sensor fusion technique. Some of the techniques which can be used to fuse more than one sensor map into a single map, where each sensor map might provide different information about the environment, are discussed below.

Certainty Factors

The certainty factors formalism [Gonzalez 1993] assigns a belief value, called the CF value, to a hypothesis based on the evidence presented to support the hypothesis or the evidence presented to contradict it. The CF values are expressed as a set of rules having the format:

    IF evidence THEN hypothesis (with a CF value)

A number of such rules are used, each defining the belief in the hypothesis based on the evidence presented. If the evidence supports the hypothesis, the CF value is between 0 and 1. If the evidence contradicts the hypothesis, the CF value is between 0 and -1. The overall belief in the hypothesis is computed by combining the individual CF values from each rule. The following mathematical relationships are used to combine the CF values. Let CF_1 and CF_2 be the two CF values to be combined. If both CF values support the hypothesis, the combined CF value is

    CF_{combined} = CF_1 + CF_2 (1 - CF_1)                                   (2.3)

If the two CF values contradict each other, then the combined CF value is

    CF_{combined} = \frac{CF_1 + CF_2}{1 - \min(|CF_1|, |CF_2|)}             (2.4)

The resulting CF_{combined} is then used as the CF_1 value and combined with the next CF value. For the sensor fusion process, consider a number of sensors, each of which evaluates the environment and presents evidence that either supports the presence of an obstacle or supports the presence of a good traversable path (a contradiction of the presence of an obstacle). The results from these individual sensors can be combined using the certainty factors approach.
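The combination rules of Equations 2.3 and 2.4 can be applied incrementally, folding each new piece of evidence into the running value. The sketch below is only an illustration of these rules, not the implementation used later in this work; the case in which both values are negative uses the symmetric counterpart of Equation 2.3, which is not written out above.

    def combine_cf(cf1, cf2):
        """Combine two certainty factors per Equations 2.3 and 2.4."""
        if cf1 >= 0 and cf2 >= 0:            # both support the hypothesis (Eq. 2.3)
            return cf1 + cf2 * (1.0 - cf1)
        if cf1 < 0 and cf2 < 0:              # both contradict (symmetric form of Eq. 2.3)
            return cf1 + cf2 * (1.0 + cf1)
        # one supports, one contradicts (Eq. 2.4)
        return (cf1 + cf2) / (1.0 - min(abs(cf1), abs(cf2)))

    # Example: two sensors present evidence of an obstacle (0.6 and 0.5);
    # a third presents evidence of a good traversable path (-0.4).
    cf = combine_cf(0.6, 0.5)    # 0.8
    cf = combine_cf(cf, -0.4)    # (0.8 - 0.4) / (1 - 0.4) = 0.667
    print(round(cf, 3))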

Dempster-Shafer Theory

The Dempster-Shafer theory [Shafer 1976] was developed through the efforts of Arthur Dempster and his student, Glenn Shafer. It is based on a mathematical theory of evidence where a value between 0 and 1 is assigned to some fact as its degree of support. The theory allows belief values to be associated with sets of facts as well as with individual facts.

For example, consider a case where there are two possible conclusions. The set of possible conclusions is {θ1, θ2}. The frame of discernment is defined as the set of all possible conclusions; this set is {∅, {θ1}, {θ2}, {θ1, θ2}}, where ∅ represents the empty set. A mass probability function assigns a numeric value from the range [0, 1] to every set in the frame of discernment. The mass probabilities are assigned based on the support that the evidence presents for each of the conclusions. The sum of these mass probabilities (i.e., the total probability mass) is 1. The belief Bel in a subset A is the sum of all the mass probabilities m assigned to the subsets B of A:

    Bel(\{\theta_1, \theta_2\}) = m(\{\theta_1\}) + m(\{\theta_2\}) + m(\{\theta_1, \theta_2\})      (2.5)

The certainty associated with a particular subset A is defined by the belief interval [Bel(A), P*(A)], where Bel(A) is the measure of total belief in A and its subsets, and P*(A) = 1 - Bel(A^c) is a measure of the failure to doubt A. Finally, for combining two pieces of evidence, the mass probabilities from each are combined using Dempster's combination rule. Consider mass probabilities m_1 (representing a preexisting certainty state associated with hypothesis subsets X) and m_2 (representing some new evidence associated with hypothesis subsets Y); these two are combined into a mass probability m_3 (representing the certainty state associated with C, the intersection of X and Y) as follows:

    m_3(C) = \frac{\sum_{X \cap Y = C} m_1(X)\, m_2(Y)}{\sum_{X \cap Y \neq \emptyset} m_1(X)\, m_2(Y)}      (2.6)
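For a small frame of discernment such as the two-conclusion example above, Equation 2.6 can be evaluated by enumerating the subsets directly. The sketch below represents hypothesis subsets as Python frozensets; the mass values are arbitrary illustrative numbers.

    from itertools import product

    def dempster_combine(m1, m2):
        """Combine two mass functions (dict: frozenset -> mass) per Equation 2.6."""
        combined = {}
        conflict = 0.0
        for (x, mx), (y, my) in product(m1.items(), m2.items()):
            c = x & y
            if c:
                combined[c] = combined.get(c, 0.0) + mx * my
            else:
                conflict += mx * my              # mass falling on the empty set
        return {c: v / (1.0 - conflict) for c, v in combined.items()}

    # Two possible conclusions: obstacle (O) or traversable (T).
    O, T = frozenset({"O"}), frozenset({"T"})
    either = O | T                               # the ignorance set {O, T}
    m1 = {O: 0.6, either: 0.4}                   # preexisting certainty state
    m2 = {O: 0.5, T: 0.2, either: 0.3}           # new evidence
    print(dempster_combine(m1, m2))              # O: ~0.773, T: ~0.091, {O, T}: ~0.136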

Neural Networks

An artificial neural network (ANN) is an information processing model inspired by the way biological nervous systems, such as the brain, process information. The ANN model is made of strongly connected groups of simple processing elements called neurons. These neurons are activated when their input is higher than some threshold value. When the neurons fire, they communicate their value to their downstream connected neurons, which perform the same function. The main advantage of ANNs is their ability to learn from training data. The learning process may be supervised (the backpropagation method) or unsupervised (the Kohonen network). The major limitation of neural networks is the difficulty of developing a general representation from a limited amount of training data.

Sensor Implementations in Autonomous Vehicles

The previous sections introduced the various types of sensors used in autonomous vehicles for real-time perception, along with a brief theory of environment state representation techniques and of managing the uncertainty associated with sensor results. In this section some of the methods and algorithms used to implement the sensors are presented.

In most of the methods, terrain mapping is described as building a Cartesian elevation map from range sensor data. However, using this method for mobile robots has certain drawbacks: the elevation data can be sparse and non-uniform (this is more apparent as the distance from the sensor increases, and because of the occlusion of regions by obstacles). To deal with the above problems, Kweon [1991] implements a locus method for building 3-D occupancy maps using successive range images obtained from a laser range finder. In this method the elevation z at a point (x, y) is found by computing the intersection of the terrain with a hypothesized vertical line at (x, y). The basic idea of the algorithm is to compute this intersection in the image plane.

Matthies [2003] describes how a thermal camera is used to detect negative obstacles. The change in the temperature profile of the terrain due to uneven heating and cooling is accounted for; specifically, during the night the interior parts of negative obstacles tend to remain warmer compared to the surrounding terrain.
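The Cartesian elevation map mentioned above amounts to binning 3-D range returns into a 2-D grid and keeping an elevation statistic per cell. A minimal sketch follows; the cell size and the keep-the-highest-return rule are illustrative choices and are not taken from any of the systems cited in this chapter.

    import math

    CELL_SIZE = 0.5   # meters per grid cell (illustrative)

    def insert_point(elevation_map, x, y, z):
        """Bin one 3-D range return (vehicle-local coordinates) into the map.

        elevation_map is a dict keyed by (row, col); here the highest elevation
        seen in each cell is retained.
        """
        key = (int(math.floor(x / CELL_SIZE)), int(math.floor(y / CELL_SIZE)))
        if key not in elevation_map or z > elevation_map[key]:
            elevation_map[key] = z
        return key

    emap = {}
    insert_point(emap, 3.2, 1.1, 0.05)
    insert_point(emap, 3.4, 1.3, 0.42)    # same cell; the higher return is kept
    print(emap)                           # {(6, 2): 0.42}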

RGB space has been used effectively [Davis 1995] to classify terrain as vegetation or non-vegetation. Two classification algorithms, the Fisher linear discriminant (FLD) and the backpropagation neural network, are discussed. Consider an N-dimensional space where the input data could belong to one of two classes, A or B. The FLD algorithm is a linear classifier and generates a linear decision surface that maximizes the correct classifications of A and B.

In many cases robots are required to work in a dynamic environment. A method for detecting moving obstacles and estimating their velocity is discussed and implemented in [Kyriakopoulos 2003]. The paper assumes a polyhedral world, i.e., the environment around the robot is composed of linear segments. A single 2-D laser range sensor scans the environment. The range measurements from the sensor are expressed in a world coordinate frame, after accounting for the vehicle position and orientation, as follows:

    x = x_R + r\, e_r(\theta_R + \phi)                                       (2.7)

where x represents the Cartesian coordinates of the point in the world coordinate system, (x_R, \theta_R) is the position and orientation of the vehicle, \phi is the angle of the laser ray in the sensor coordinate system, r is the measured range, and e_r represents the unit vector in the direction of the laser ray. Each line segment of an obstacle can be defined by a parametric model as

    x_i = x_i^0 + u_i\, e_i                                                  (2.8)

where x_i^0 is a point on the segment, u_i is a scalar parameter, and e_i is the associated unit vector parallel to segment i.
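Equation 2.7 is essentially a planar rigid-body transform of a single range return. The sketch below illustrates it under the simplifying assumption that the sensor sits at the vehicle origin; the symbol and function names are chosen here and are not taken from the cited paper.

    import math

    def range_to_world(x_r, y_r, theta_r, ray_angle, r):
        """Project one laser range reading into the world frame (cf. Equation 2.7).

        (x_r, y_r, theta_r) -- vehicle position and heading in the world frame
        ray_angle           -- angle of the laser ray in the sensor frame
        r                   -- measured range
        """
        a = theta_r + ray_angle
        return (x_r + r * math.cos(a), y_r + r * math.sin(a))

    # A 10 m return, 30 degrees left of a vehicle heading along the world x-axis from the origin.
    print(range_to_world(0.0, 0.0, 0.0, math.radians(30), 10.0))   # about (8.66, 5.00)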

From Equations 2.7 and 2.8, the intersections of the line segments with the laser beam can be computed. Additional constraints are considered, such as the case where the laser ray intersects more than one obstacle line segment, for which the range value will correspond to the nearest obstacle. The candidate moving obstacles are then selected on the basis of the obstacle line segments obtained from successive sets of range readings.

Map building algorithms using range sensors have been developed [Huber 1999]. The authors discuss the face-based spin image algorithm to build a map from two laser range scanners, each mounted on a different vehicle.

Kelly and Stentz [1998] developed an adaptive perception algorithm for off-road autonomous navigation. The input to the algorithm is range images from laser rangefinders and stereo vision. The data from the sensors are processed only in a region of interest that contains the most useful information. The 3-D data of the terrain are stored in a 2-D ring buffer that accommodates vehicle motion through modulo arithmetic indexing. The terrain map discussed eliminates the need to copy the terrain data as the vehicle moves.

Wolf [2005] presents an online algorithm that builds a 3-D map of the terrain using 2-D laser range finder data. The terrain is classified as navigable or non-navigable using Hidden Markov models.

A radar sensor model [Foessel-Bunting 1999] is developed to build three-dimensional evidence grid maps. The grid is updated using the logarithmic representation of Bayes' rule.

A perception module which processes range images to identify non-traversable regions of the terrain is developed in [Langer et al. 1994] for autonomous off-road navigation. The sensor used is the ERIM laser, which acquires 64 by 256 range images at 2 Hz. The data from the laser range images are converted into Cartesian coordinates local to the vehicle. These data points are then stored in a 2-dimensional grid map.

Each cell of the grid map is evaluated for its traversability. The features used to evaluate traversability are the height variation of the terrain within the cell, the slope of the patch of terrain contained in the cell, and the presence of a discontinuity of elevation in the cell. The algorithm only computes untraversable cells and renders no opinion on smooth traversable terrain.

Bhatavia [2002] proposed a method to detect obstacles on outdoor terrain which is highly curved with natural rise and fall, but is assumed to be locally smooth. A cost-effective laser scanner system was developed to replace the commercially available, high-cost two-axis lasers. A single-axis laser scanner is mounted at an angle of 90° so as to scan a vertical line. The laser is provided with a rotary motion about the second axis to sweep from left to right and register a set of vertical line scans. The data registered from each vertical scan are classified as obstacle or free space using a gradient-based algorithm. Each of these single-scan classifications is stored in a time history. A nearest-neighbor fusion algorithm generates the candidate obstacles by clustering the group of scans collected in a time window. Experimental results demonstrate that the algorithm is capable of detecting obstacles of size 15 cm in hilly terrain with minimal false positives; however, the terrain has to be very smooth.

The work presented by Talukder [2002a] implements a different approach to classify obstacles based on 3-D geometry for an off-road environment. A conical region is defined around each point in 3-D space. If there is any point in the truncated portion of the cone, both the base point of the cone and the point under consideration are defined as an obstacle. This essentially means that for a point to be an obstacle, the height difference between the base point of the cone and the point under consideration should be greater than a threshold value, and the angle between the line formed by the two points and the horizontal surface should also be greater than a threshold value.

The 3-D points are projected onto an image plane, and an algorithm to implement the above strategy in the image plane is presented. In this method obstacles are represented as pairs of points, and this feature is further extended to segment obstacles. All the pairs of points which represent obstacles are defined as linked nodes of an undirected graph; thus points connected in the chain represent the same obstacle.

An extension of the above work is presented in Talukder [2002b]. Along with the geometrical properties, color and texture are used to classify obstacles. The author discusses the concept of obstacle negotiation by presenting techniques to predict the dynamic vehicle response to various natural obstacles. The obstacles in natural terrain, such as green vegetation, dry vegetation, hard rock and soil, are composed of widely different material properties and yet can be of the same size. Using the color and texture features, the obstacles can be classified based on their material properties. The compressibility of the obstacles is modeled using a spring-damper system. The vehicle is also modeled using a mass-spring system for each of its wheels. The vertical acceleration of the vehicle is computed as it traverses the evaluated path using the obstacle and vehicle models, and this predicted acceleration profile is used for obstacle negotiation.

A novel filtering method for terrain mapping, called the Certainty Assisted Spatial (CAS) filter, is proposed in [Ye 2003]. In this method an elevation map and a certainty map are built from a forward-looking 2-D laser range finder. The filter is compared with the conventional available filters for effective filtering of erroneous measurements due to mixed pixels, missing data, artifacts and noise.

An approach for traversability analysis of rough terrain utilizing an elevation map is proposed in [Ye 2004]. The elevation map is defined as E = {Z_{i,j}}, where i and j are the row and column of the cell.

A cluster of neighboring cells with non-zero elevation values is defined as an elevation-obstacle. A traversability map is created along with the elevation map to define the traversability index. The traversability index for each cell is evaluated by computing the slope and roughness of a square terrain patch with the cell under consideration as the center of the patch. The size of the patch is equivalent to the number of cells required to accommodate the vehicle in any orientation. Depending on the value of the traversability index, the originally classified elevation-obstacles are re-classified.

In contrast to the general approach of converting the disparity image from stereovision into 3-D Cartesian coordinates to detect obstacles and traversable terrain, [Wu 2002] proposes to use the raw disparity data for 3-D obstacle detection using stereovision. Two different maps are generated: the obstacle map and the slope map. The obstacle map is 75 by 75 elements, with each element representing a 0.2 m by 0.2 m area, and the slope map is 15 by 15 elements, with each element of size 1 m by 1 m. Each obstacle map element is marked with one of four possible values: undefined, traversable, negative obstacle or positive obstacle. The obstacle detection algorithm is based on the horizontal isodisparity profiles generated by the stereo image at fixed intervals and the reference lines for each of the isodisparity profiles.

To represent and classify the environment around the robot, a height map is built from range data and fused with the video image [Asada 1988]. The height map obtained from the range image is segmented into four regions: unexplored, occluded, traversable and obstacle. The region where the height information is not available and falls outside the sensor view is tagged as unexplored, and the remainder of that region is marked as occluded. The points close to the assumed ground plane, and the neighboring points with low slope and curvature, are classified as traversable. The remaining regions are labeled as obstacles. The obstacle region is further classified as natural or artificial using the height map and intensity image.

The paper assumes that artificial obstacles such as cars have planar surfaces, which yield constant slope and low curvature in the height map and linear features in the intensity image, while natural obstacles such as trees have fine structures and yield high curvatures in the height map and large variance in the intensity image. A physical simulator at a scale of 87:1 was built to perform the experiments.

Most of the commercially available laser range sensors provide two kinds of information for each laser beam reading: the distance of the object being scanned and the intensity of the returned signal. [Hancock 1998] tries to exploit the second piece of information to detect obstacles in a highway environment. Obstacle detection based on range readings, as discussed in most of the literature, involves building 3-D Cartesian maps and hence involves heavy computation, making it unsuitable for high-speed environments. Intensity-based obstacle detection for a forward-looking laser works on the simple principle that vertical surfaces in front of the vehicle return a stronger signal than horizontal surfaces. This is especially true for larger look-ahead distances, of the order of 60 m, where the tilt angle of the laser is very small (about 1° in that paper). However, the major drawback of this method is that the intensity varies significantly with the surface properties of the object.

Most autonomous vehicle development projects use off-the-shelf laser range sensors which are not specifically developed for the application. With the advancement of autonomous vehicle technology, commercially available sensors do not meet the specific requirements of range and angular resolution for evaluating a wide range of obstacles and terrain at high speed. Carnegie Mellon University has developed a high-speed, high-accuracy LADAR [Hancock 1998] in joint collaboration with K2T, Inc., and Zoller + Froehlich. The device consists of a two-axis mechanical scanning system with a single-point laser measurement system.

The scanner provides an unobstructed 360° horizontal field of view and a 70° vertical field of view. The resolution of the system is variable, with a maximum of 0.6° per pixel.

Multiple sensors have been used [Stentz 2003] to detect and avoid obstacles in natural terrain. The perception system is categorized into two sensing modules. The appearance sensing module includes color cameras and Forward Looking Infrared (FLIR) cameras, while the geometric sensing module includes LADAR sensors and stereo vision. The geometric sensing module detects geometric hazards from the range data and assigns traversability costs. The traversability costs are further modulated by the compressibility of the sensed terrain. The compressibility is assessed using the range texture, color and color texture to discriminate rigid rocks from bushes.

Sensor data from color and IR cameras are combined with range data from a laser sensor in [Dima 2003] and [Dima 2004] for obstacle detection in an off-road environment. The image space is tessellated into a grid of rectangular cells. The range data are projected into the tessellated grid. Features such as the mean and standard deviation of pixel color in each cell, the mean and standard deviation of infrared pixel values in each cell, texture information and the range reading statistics are used as input to the classifier. Instead of combining all the features into one vector and using only a simple low-level data fusion, methods that use a pool of classifiers on subsets of these features and then represent the final output as a fusion of these classifiers are discussed. The classifier fusion algorithms presented in the paper are Committees of Experts, Stacked Generalization and AdaBoost with classifier selection. One of the limitations of the classifier fusion algorithms is the need for supervised learning and hence a large set of training data.
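The per-cell statistics used as classifier inputs above (mean and standard deviation of the pixel values that project into a cell) are straightforward to compute; a small sketch follows, with NumPy used purely for illustration and the pixel values invented for the example.

    import numpy as np

    def cell_features(pixels):
        """Return (mean, standard deviation) of the pixel values falling in one grid cell.

        pixels -- array of shape (N, 3) for color channels, or (N,) for infrared values.
        """
        pixels = np.asarray(pixels, dtype=float)
        return pixels.mean(axis=0), pixels.std(axis=0)

    # Example: four RGB pixels projected into the same rectangular cell.
    mean, std = cell_features([[120, 90, 60], [130, 85, 62], [118, 95, 58], [124, 92, 61]])
    print(mean.round(1), std.round(1))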

Range imaging sensors are widely used for detecting obstacles, with a variety of obstacle detection algorithms. [Matthies 1994] proposes a model to evaluate the different obstacle detection methods. The proposed evaluation is divided into two levels: the quality of the raw range data from the sensors and the quality of the obstacle detection algorithm. The quality of the range data is evaluated based on statistical performance models where both the random errors caused by noise and the systematic errors due to artifacts are considered. To evaluate the quality of the obstacle detection algorithm, the model predicts the quality of detecting obstacles and the probability of false positives as a function of the size and distance of the obstacle, the resolution of the sensor and the level of noise in the range data.

[Murphy 1998] discusses evidential reasoning techniques to fuse sensor outputs. Dempster-Shafer (DS) theory is implemented for sensor fusion at the symbol level in the Sensor Fusion Effects (SFX) architecture. SFX consists of three distinct activities: configuration, uncertainty management and exception handling. The uncertainty management activity collects observations from the sensors and computes the total belief in the percept in three steps. First, the observations from each sensor are collected and fused at the pixel level in the collection step. The features extracted from these observations are then fused at the feature level in the preprocessing step, resulting in a belief function over each sensor's unique frame of discernment. Finally, the fusion step combines these belief functions from the different sensors into a total belief in the percept. In the case of a sensor anomaly in any of the steps, exception handling is triggered. Two components from DS theory, the weight of conflict metric and the enlargement of the frame of discernment, are used for the sensor fusion. The implementation of the above architecture is demonstrated on a mobile robot which acts as a security guard. The outputs from a color camera, a black and white camera, an infrared camera and an ultrasonic sensor are fused to guard a scene.

In an application different from autonomous vehicles, a sensor fusion architecture is designed in [Wu 2002] for context sensing. The authors point out the difficulty of fusing different sensors due to the overlaps and conflicts in their outputs and the variation in a sensor's performance with a change in situation. In the experimental example presented, a user's focus of attention is tracked using multiple evidences from different sensors. The outputs from each of the sensors, a camera and a microphone, are viewed as evidences. Two different methods, weighted probability linear combination and the Dempster-Shafer theory of evidence, are used to fuse the evidences obtained from the sensors. Since the confidence level of the sensors varies with the situation, the authors propose a new concept of a weighted Dempster-Shafer evidence combining rule. The idea is that if we know how a sensor has performed historically in similar situations, we can use this information to decide how much confidence to place in the sensor's current estimate. The results from the combined estimations are compared against the results obtained using only a single sensor and found to be better in most cases.

Rather than extracting sensor data to form an intermediate representation which can be utilized in planning, [Goodridge 1994] proposes a non-representation-based sensor fusion scheme to control robots. A mapping between the sensor data and control signals is developed using fuzzy sets. The PCFUZ fuzzy development system is developed and used on an indoor mobile robot to conduct experiments involving goal seeking, obstacle avoidance, barrier following and object docking behaviors. The sensors on board include sonar, vision and tactile sensors. Fuzzy rules are created using the outputs from these sensors to produce a continuous surface which defines the control signals. Different techniques such as summation, weighted averaging, fuzzy multiplexing and hierarchical switching are discussed to implement multiple behaviors such as the combination of obstacle avoidance and goal seeking.

Urmson [2006], Miller [2006], Trepagnier [2006] and Braid [2006] present some of the sensor implementations from the 2005 DARPA Grand Challenge competition. A brief discussion of these sensor implementations is presented below.

Urmson [2006] implements a set of five laser sensors and one radar sensor to detect obstacles and to classify terrain for off-road navigation. Three of these lasers are used to evaluate the terrain, while the other two lasers and the radar detect obvious positive obstacles. Each of the sensor processing algorithms generates a cost map. The cost map is a 2-D grid where each cell is associated with a cost value which represents the traversability of that cell. The sensor fusion algorithm generates a composite map using a weighted average of the sensor cost maps. The terrain evaluation algorithm operates on a single line scan of the laser data instead of a point cloud formed by successive line scans as the vehicle moves; this approach reduces the effect of imperfect pose estimation. The terrain evaluation algorithm operates by fitting a line to the vertical planar projection of points in vehicle-width segments. The traversability cost is computed as a weighted maximum of the slope and the line fit residual.

A grid-based terrain representation method, where each cell in the grid holds not only the elevation value of the cell but also the uncertainties associated with that elevation value, is presented in [Miller 2006]. A set of three laser range sensors is used to scan the ground in front of the vehicle. Two of these sensors are fixed on the vehicle, while one is mounted on a two-axis gimbaled platform. The gimbaled platform allows the laser to be pointed in any desired direction, and the gimbaled laser is combined in a feedback loop with the path planner to gather data along potential paths. The range data from all three sensors are fused into one map. The probability of the association of each range reading with a particular cell is computed. The probabilistic model takes into account the measurement errors in the lasers themselves and the error in the positioning system.

PAGE 37

37 the positioning system. The laser measurements assigned to a part icular cell are converted into an estimate of the elevation distribution within th at cell. The information is stored as an estimate of the cells elevation and the a ssociated conditional covariance. Each new measurement is fused with the previous measurement readings, hence there is no need to store the individual sensor measurement readings and only the posterior pr obabilities of the es timated elevation and variance are stored. A set of two Sick LMS 291 laser scanners [T repagnier 2006] are mounted on each side of the front end of the vehicle. Each of these lasers scan the surrounding in a vertical plane and are mounted on an oscillatory platform to provide a 30 oscillating angle. Range data from both the lasers are fused into a common elevation grid ma p. The data is time stamped and hence obstacles are kept in memory only for a certain amount of time. If the obstacles no longer exist after a specified time, they are removed from the map. This helps in clearing moving obstacles, once they have passed. Team TerraMax [Braid 2006] selected an ar ray of sensors including a forward looking vision system, single-plane LADAR and multi-pl ane LADAR. The vision system is based on multi stereoscopic vision and consists of three id entical cameras mounted on a rigid bar in front of the vehicle. The three cameras form three different baselines between them. Depending on the speed of the vehicle one of the three camera pairs is selected. The large baseline is used for higher speeds to obtain a deeper field of view ; the medium baseline is used for medium speed and the shorter baseline for slower speeds. The multi-plane LADAR is an IBEO ALASCA 4plane scanner that is used for positive obstacle detection. Two of the planes scan towards the ground and the other two scan towards the sky. The obstacle detection algorithm is based on

PAGE 38

38 detecting the slope of the terrain. Obstacles are re ported in terms of the closeness of the object collision to the proposed path.

Figure 2-1. Time of flight measurement.

CHAPTER 3
RESEARCH GOALS

Statement of Problem

Given a vehicle capable of maneuvering autonomously, and which can obtain its position and orientation measured with respect to a global coordinate system, develop a sensor component that will evaluate the terrain and detect obstacles to support autonomous navigation. The sensor component has to be developed for a full-size autonomous vehicle which can traverse paved and off-road conditions at maximum speeds approaching 30 mph. The information should be presented in such a way that the vehicle can use it to avoid obstacles and rough terrain and to seek out smooth terrain.

Research Requirements

Assuming the availability of an autonomous vehicle platform and a positioning system, the steps required to develop a sensor component that meets the goals mentioned in the problem statement are as follows:

1. Selection of appropriate sensor hardware which is able to give the required information about the surrounding environment.
2. Selection of proper computer resources to interface the sensor hardware.
3. Design of an environment representation model. The model should not only be able to separate the region around the vehicle into traversable and non-traversable areas but should also be able to represent a degree of traversability or non-traversability.
4. Development of algorithms to process the sensor data to detect obstacles and evaluate terrain, and combination of the individual algorithms into one fused output. The fusion process should be able to manage the uncertainties and the limitations associated with the outputs of the individual algorithms. The output should be expressed in the above-mentioned environment representation model.
5. The sensor component should be developed as a Joint Architecture for Unmanned Systems (JAUS) component.
6. Conduct experiments on an autonomous vehicle platform to validate the results of the developed sensor component.

The following sections discuss the autonomous platform and the positioning system used to develop and implement the sensor component.

Autonomous Platform

The development and experimentation work for the sensor component research presented in this dissertation has been done on the autonomous platform NaviGator (Figure 3-1), developed in the Center for Intelligent Machines and Robotics Laboratory at the University of Florida. The platform was initially developed to participate in the 2005 DARPA Grand Challenge competition. Some of the important specifications of the platform are discussed below.

Mechanical Specifications

The NaviGATOR's base platform is a custom-built all-terrain vehicle. The frame is made of mild steel roll bar with an open design. It has 9" Currie axles, Bilstein shocks, hydraulic steering, and front and rear disk brakes with an emergency brake to the rear. It has a 150 HP transverse Honda engine/transaxle mounted longitudinally, with a locked transaxle that drives front and rear Detroit Locker differentials (four-wheel drive). The vehicle was chosen for its versatility, mobility, openness, and ease of development.

Power System

The power system consists of two independent 140 A, 28 V alternator systems. Each alternator drives a 2400 W continuous, 4800 W peak inverter and is backed up by four deep-cell batteries. Each alternator feeds one of two automatic transfer switches (ATS). The output of one ATS drives the computers and electronics while the other drives the actuators and a 3/4 ton (approximately 1 kW cooling) air conditioner. Should either alternator/battery system fail, the entire load automatically switches to the other alternator/battery system. The total system power requirement is approximately 2200 W, so the power system is fully redundant.

Computing Resources

All the computing systems and electronics are housed in a NEMA 4 enclosure mounted in the rear of the vehicle, as shown in Figure 3-2. The computing system consists of single-processor computing nodes. The system uses a total of eight computers, each of them equipped with an AMD 2 GHz processor and 512 MB of RAM. Each node is targeted to perform a specific function of the autonomous system. The system architecture, discussed in detail in Chapter 5, explains the breakdown of the autonomous system functionality into these individual nodes. The sensor component developed for this research resides on one computer.

Localization

The NaviGATOR determines its current location using a combination of GPS and inertial navigation system sensor data. The processing and fusing of the navigation data is done by an Inertial Navigation System from Smiths Aerospace. This system is named the North Finding Module (NFM). The module maintains Kalman filter estimates of the vehicle's global position and orientation, as well as its linear and angular velocities. It fuses internal accelerometer and gyroscope data with data from an external NMEA GPS and an external odometer. The GPS signal provided to the NFM comes from one of the two onboard GPS systems. These include a NavCom Technologies Starfire 2050 and a Garmin WAAS-enabled GPS 16. An onboard computer simultaneously parses data from the two GPS units and routes the best-determined signal to the NFM. This is done to maintain valid information to the NFM at times when only one sensor is tracking GPS satellites. During valid tracking, the precision of the NavCom data is better than that of the Garmin, and thus the system is biased to always use the NavCom when possible.

Contributions of this Research

The previous sections described the autonomous platform and the available computing resources to be used for this research. The system explained above is the base foundation and provides all the necessary resources for successful implementation of the research discussed in this dissertation. The contributions of this research can be summarized as follows:

1. Development and implementation of an obstacle detection sensor system.
2. Development and implementation of a terrain evaluation sensor system.
3. Development and experimental evaluation of a new sensor fusion algorithm that combines the information from the obstacle detection and terrain evaluation sensor systems.
4. Development of generalized results that determine an optimal sensor system design based on specific vehicle parameters.

Figure 3-1. Testing platform NaviGator.
Figure 3-2. Computing resources.

CHAPTER 4
TRAVERSABILITY GRID APPROACH

The surrounding environment of an outdoor autonomous vehicle is highly unstructured and dynamic. The environment contains natural obstacles, which include positive obstacles such as rocks and trees and negative obstacles such as cliffs and ditches. There is also the possibility of numerous man-made obstacles such as buildings, fence posts, other vehicles and many other types of objects. The terrain characteristics are also very important for safe navigation. The robotic vehicle should avoid rough, uneven terrain as far as possible and at the same time seek out smooth traversable regions. The representation model of the environment should not only be able to distinguish clearly defined, highly non-traversable obstacles but should also be able to represent the small differences in the terrain which would, for example, distinguish a clearly defined paved or smooth dirt path from the surrounding region. This would help to keep the vehicle on the road and at the same time avoid obstacles.

In Chapter 2, two broad classifications of environment representation techniques were discussed: the vector based and the grid or raster based. The current research presents the concept of the Traversability Grid, which is a grid based representation of the environment.

Traversability Grid Representation

The Traversability Grid representation tessellates the region around the vehicle into a 2-D grid. Figure 4-1 shows the important parameters used to define the Traversability Grid. The grid is always oriented in the North-East direction with the vehicle position in the center of the grid. The grid is defined by the number of rows, the number of columns and the resolution of the grid. It consists of an odd number of rows and columns to create a center cell for the vehicle. In the current implementation, the Traversability Grid is 121 rows by 121 columns with a half meter by half meter resolution. This allows a sensor to report data at least 30 m ahead of the vehicle.

Each cell in the grid corresponds to a small region of the environment and is represented by a traversability value. The traversability value of a cell defines the measure of traversability of that cell. The range of traversability values that can be assigned to a cell is from 0 to 15. A real-time sensor component can assign a traversability value between 2 and 12 based on the sensed environmental characteristics. A traversability value of 7 is considered to be neutral; a value above 7 represents a traversable region while a value below 7 represents a non-traversable region. As the region becomes more favorable to traverse, the traversability value of that cell gradually increases from 7 to 12. For example, consider a paved path surrounded by flat terrain which is not as smooth as the path. Since the flat terrain surrounding the paved path is still traversable, it will be represented by a traversability value above 7; however, the paved path would be represented by a traversability value higher than that of the surrounding flat terrain. Thus, although the vehicle could traverse the surrounding flat terrain, it would be more favorable to stay on the paved road. Similarly, as the region starts becoming non-traversable, the traversability value of the corresponding cell in the grid gradually decreases from 7 to 2. In case the vehicle has to choose to drive through a non-traversable region, it would choose to drive in the grid cells whose traversability value is larger. The sensor component outputs a traversability value based on two factors:

1. The severity of a non-traversable region (size of the obstacle or roughness of the terrain) or the good terrain characteristics of a traversable region (smoothness of the terrain).
2. The confidence in the evaluated characteristic. The obstacle might be highly non-traversable, but what is the confidence in the presence of the obstacle?

A value of 14 is assigned to a cell whose traversability value cannot be determined by the sensor. The remaining values are reserved for specific purposes: the value of 0 represents out of bounds, 1 implies using the same value as last time, 13 is reserved for failure or error, and 15 denotes the vehicle location.
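The value conventions above translate directly into a small set of constants. The following sketch is illustrative only; the identifier names are assumptions and are not taken from the LTSS source code.

/* Traversability value conventions for the grid (names are illustrative). */
#define TRAV_OUT_OF_BOUNDS    0   /* cell lies outside the grid              */
#define TRAV_USE_LAST_VALUE   1   /* keep the value reported last time       */
#define TRAV_MIN_SENSED       2   /* worst value a sensor may assign         */
#define TRAV_NEUTRAL          7   /* neither traversable nor non-traversable */
#define TRAV_MAX_SENSED      12   /* best value a sensor may assign          */
#define TRAV_FAILURE         13   /* sensor failure or error                 */
#define TRAV_UNKNOWN         14   /* cell could not be evaluated             */
#define TRAV_VEHICLE         15   /* cell occupied by the vehicle itself     */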

Traversability Grid Implementation

The sensor algorithms, discussed in the next chapter, implement the Traversability Grid as a dynamically allocated circular buffer. The circular buffer implementation of the Traversability Grid is very efficient in updating the grid to its new position as the vehicle moves. The main advantage of using a circular buffer in place of a 2-dimensional array is that for a 2-D array, as the vehicle moves, for every new position of the grid, data from the cells in the old grid must be copied into the corresponding cell indices in the new grid; this expensive computing operation of copying the grid data every time can be avoided by using a circular buffer. Figure 4-2 shows the change in the grid position due to the movement of the vehicle. As shown in the figure, all the data corresponding to the overlapping cells in the two grids has to be copied from the old grid position to the new grid position for a 2-D array representation. The circular buffer implementation of the Traversability Grid avoids this computationally expensive operation.

The circular buffer stores the data in the grid as a 1-D array of size equal to the number of cells in the grid. The position of any cell corresponding to (row, column) in the 2-D grid is mapped into the 1-D circular buffer as follows:

arrayPosition = (row * numberOfColumns) + column     (4.1)

When the grid shifts to a new position, instead of copying data from individual cells, the circular buffer defines a pointer to a cell in the grid, which keeps track of the cell corresponding to (0, 0) in the new grid position. The position of this pointer is defined by two variables, rowBegin and columnBegin. Initially these variables are set to (0, 0). When the grid moves to a new position, the variables are updated as shown in Table 4-1.

The updated (rowBegin, columnBegin) are the indices of the cell in the old grid which corresponds to the indices (0, 0) in the new grid. Thus, each time the grid moves to a new position, the above algorithm recursively keeps track of the start cell of the grid and physically all the data remains in the same array location. The only other operation needed is to clear the data in the cells which no longer overlap. Since the actual (0, 0) position of the 2-D grid now corresponds to (rowBegin, columnBegin), a cell given by the indices (row, column) is accessed using the algorithm shown in Table 4-2.

The above algorithms define the basic functioning of the circular buffer data structure. Any number of variables of any data type can be stored in the grid using the circular buffer implementation. Each of these variables would be stored as a 1-D array. At a minimum, at least one such 1-D array is defined by a sensor component for storing the traversability values of the cells; this array is defined of type unsigned char. Other arrays may be defined to store information about the cells which is used internally by the sensor algorithm.

Traversability Grid Propagation

The Traversability Grid data structure is a common data structure representing the environment among all the components of the autonomous platform. Figure 4-4 shows the schematic diagram of the propagation of the Traversability Grid from the Smart Sensors to the Reactive Driver. Each Smart Sensor component outputs its own Traversability Grid. These grids are fused into a single Traversability Grid in the Smart Arbiter component. The Reactive Driver uses the Traversability Grid obtained from the arbiter to dynamically compute the vehicle speed and heading and accordingly alters its command to the Primitive Driver. While doing so, the Reactive Driver accounts for the traversability value of each cell in the grid and seeks to follow the path with higher traversability values (reported by the sensors as favorable to traverse) and avoid the cells with lower traversability values (reported by the sensors as non-traversable).

Figure 4-1. Traversability Grid representation (the grid is oriented North-East, with the vehicle position at the grid center; its size is defined by the number of rows, the number of columns and the resolution).
Figure 4-2. Grid movement (rows moved and columns moved between two successive grid positions).

Figure 4-3. Circular buffer representation of the grid (cells 0, 1, 2, ..., number of cells, showing the start position and the new start position after grid movement).
Figure 4-4. Traversability Grid propagation (the LADAR Traversability Smart Sensor and other Smart Sensors feed the Smart Arbiter; the Smart Arbiter output and the World Model a priori path feed the Reactive Driver, which commands the Primitive Driver).

Table 4-1. Update of the circular buffer to account for grid movement. rowsMoved and columnsMoved are the number of rows and columns the grid moved; numberOfRows and numberOfColumns define the size of the grid.

    rowBegin = rowBegin + rowsMoved
    while (rowBegin >= numberOfRows)
    {
        rowBegin = rowBegin - numberOfRows
    }
    columnBegin = columnBegin + columnsMoved
    while (columnBegin >= numberOfColumns)
    {
        columnBegin = columnBegin - numberOfColumns
    }

Table 4-2. Accessing a grid cell in the circular buffer.

    row = row + rowBegin
    while (row >= numberOfRows)
    {
        row = row - numberOfRows
    }
    column = column + columnBegin
    while (column >= numberOfColumns)
    {
        column = column - numberOfColumns
    }

The resulting (row, column) is then mapped into the 1-D circular array using Equation 4.1.
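As a concrete illustration of Equation 4.1 and the algorithms in Tables 4-1 and 4-2, a minimal C sketch of the circular-buffer grid follows. The structure and function names are assumptions made for this example; allocation, bounds checking, and the clearing of cells that no longer overlap the new grid position are omitted, and modulo arithmetic replaces the while loops of the tables.

typedef struct {
    int numberOfRows;
    int numberOfColumns;
    int rowBegin;                  /* grid row that currently maps to index 0    */
    int columnBegin;               /* grid column that currently maps to index 0 */
    unsigned char *traversability; /* 1-D backing store of Equation 4.1          */
} TraversabilityGrid;

/* Table 4-1: shift the logical origin instead of copying cell data. */
static void gridMove(TraversabilityGrid *g, int rowsMoved, int columnsMoved)
{
    g->rowBegin = (g->rowBegin + rowsMoved) % g->numberOfRows;
    if (g->rowBegin < 0)
        g->rowBegin += g->numberOfRows;
    g->columnBegin = (g->columnBegin + columnsMoved) % g->numberOfColumns;
    if (g->columnBegin < 0)
        g->columnBegin += g->numberOfColumns;
    /* cells that no longer overlap the new grid position would be cleared here */
}

/* Table 4-2 and Equation 4.1: map a logical (row, column) to the 1-D array. */
static unsigned char *gridCell(const TraversabilityGrid *g, int row, int column)
{
    row = (row + g->rowBegin) % g->numberOfRows;
    column = (column + g->columnBegin) % g->numberOfColumns;
    return &g->traversability[row * g->numberOfColumns + column];
}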


CHAPTER 5
SENSOR IMPLEMENTATION

The Smart Sensor component developed to meet the research goals of this dissertation is named the LADAR (Laser Detection and Ranging) Traversability Smart Sensor (LTSS). The LTSS component uses LADAR sensors for the real-time sensing of the environment. The terms LADAR and laser sensor are used interchangeably and mean the same thing. This chapter explains in detail the hardware and software implementation of the LTSS component.

Sensor Hardware

LADAR Sensor

The LADAR sensor model LMS291-S05 from Sick Inc. was selected. The LMS291-S05 is an optical sensor that scans its surroundings two dimensionally with infrared laser beams, similar to laser radar. An infrared laser beam is generated by the scanner's internal diode. If the beam strikes an object, the reflection is received by the scanner and the distance is calculated based on the time of flight. The pulsed laser beam is deflected by an internal rotating mirror so that a fan-shaped scan is made of the surrounding area. The laser scans an angular range of either 180° or 100°. Figure 5-1 shows the field of view for the two angular range configurations. The scanner can be configured in three angular resolutions, 1°, 0.5° and 0.25°; however, the resolution of 0.25° can only be achieved with the 100° range. Because of the smaller beam width, the laser is more susceptible to false echoes due to small particles. The maximum range distance measured by the system is 80 m with a measurement resolution of 10 mm. A higher range resolution of 1 mm is available with a maximum range of 8 m. The frequency of the scan depends on the angular resolution used. The scanner can operate at a maximum frequency of 72 Hz if the angular resolution is set to 1°; for higher resolutions the operating frequency drops to 36 Hz for a 0.5° resolution and 18 Hz for a 0.25° resolution.

The range distance measured by the sensor also depends on the reflectivity of the object. For highly reflective surfaces the sensor can measure objects at greater distances than objects with low reflectivity.

Sensor Interface

The sensor operating voltage is 24 V DC +/- 15%. The 24 V power supply is fed to the sensor through the power supply system developed for the vehicle (discussed in Chapter 3). The sensor can be interfaced to the computer using serial communications; either the RS232 or the RS422 serial interface may be used. To take advantage of the high frequency LADAR data, the data has to be transferred at a baud rate of 500 Kb. Hence the RS422 interface, which allows data transfer at the higher baud rates, is selected. The RS422 interface is achieved via a high speed USB to serial hub from Sealevel (part # 2403). Figure 5-2 shows the RS422 serial interface connection diagram.

Computing Resources and Operating System

The LTSS component was developed on a single computing node. The node is equipped with an AMD 2 GHz processor, 512 MB of RAM and a 1 GB compact flash solid state hard drive. The software is developed and tested on a Linux operating system; the Linux version Fedora Core 3 was used. All the software was developed using the C programming language. The GCC compiler with the built-in libraries for math functions, multithreading, socket communications and serial communications was used for the software development.

Sensor Mount

Figure 5-3 shows three LADARs mounted on the NaviGator. Two of these LADARs are mounted on a specially designed sensor cage on the top of the front end of the vehicle. The design of the mounts for these LADARs allows them to be mounted at different angular configurations. Figure 5-4 shows the design of these mounts. The LADARs are mounted facing towards the ground at different angles, and these two LADARs continuously scan the ground in front of the vehicle. The third LADAR is mounted at the bumper level, parallel to the ground plane.

Sensor Algorithms

So far, the concept of the representation of the environment as a Traversability Grid was discussed in Chapter 4, and the previous section discussed the actual sensor hardware used to sense the environment. Now the mathematical approach and the software implementation of the LTSS component are explained. As mentioned in the problem statement, the goal of the developed Smart Sensor component is to detect obstacles and evaluate the terrain surrounding the vehicle. To achieve this goal, the LTSS component implements a number of different algorithms. Figure 5-5 shows a block diagram overview of the LTSS component. The input to the component is the raw range data from the lasers and the position and orientation of the vehicle, and its output is the Traversability Grid which represents the state of the environment at the current time. The block diagram gives an overview of the different algorithms and the flow of the outputs of each of these algorithms to form the output Traversability Grid. As shown in Figure 5-5, the component consists of three distinct parts:

1. Obstacle Detection: The obstacle detection (OD) algorithm receives raw range data from the LADAR at the bumper level and outputs a Traversability Grid.
2. Terrain Evaluation: The terrain evaluation (TE) algorithm receives raw range data from the top two LADARs mounted on the sensor cage. Each of these two LADARs feeds data into an individual terrain evaluation grid, and the two grids are then combined.
3. Sensor Fusion: The sensor fusion algorithm combines the outputs from the OD algorithm and the TE algorithm and produces a fused Traversability Grid. The fused Traversability Grid represents the result of the LTSS component and acts as an input to the Smart Arbiter like any other Smart Sensor component.

The following sections discuss in detail the different algorithms and the fusion process used to combine them.

Obstacle Detection

The LADAR sensor used for the obstacle detection (OD) algorithm is mounted at bumper level, scanning in a plane parallel to the ground at a height of 0.6 m. This height of 0.6 m may be defined as a threshold value: an obstacle of height greater than the threshold value is a positive obstacle and anything below the threshold value is free space. The LADAR is set to scan an angular range of 180° with a 0.5° resolution. Figure 5-1A shows the field of view of this sensor. The OD algorithm can be divided into two parts, the mapping of the laser range data into the global coordinate system and the evaluation of each cell based on the mapped data.

The range data is mapped into 2-D Cartesian coordinates in the global coordinate system. As mentioned before, the global coordinate system is always oriented in the North-East direction and its origin is on the centerline of the vehicle at ground level below the rear axle (i.e., the projection of the GPS antenna onto the ground). After each scan of 180°, the range data from the laser is converted into the global coordinate system. This conversion is done in two steps. First the data is converted from polar coordinates to Cartesian coordinates local to the sensor as follows:

\[ P^{S}_{sensor} = \begin{bmatrix} x \\ y \end{bmatrix}_{sensor} = \begin{bmatrix} range \cdot \sin\theta \\ range \cdot \cos\theta \end{bmatrix} \tag{5.1} \]

where \([x \; y]^{T}_{sensor}\) is the data point in the sensor coordinate system and \(\theta\) is the scan angle of the laser beam.

The second transformation takes into account the vehicle orientation and the sensor offset distance and transforms the data point from the sensor coordinate system to the global frame.

The OD algorithm is based on a weighted sum of evidences. The traversability value of a cell is computed based on the weighted sum of the evidence of the cell being occupied or free. The evidence of a cell being an obstacle or free space is derived based on the current sensor observation and initial evidences. A sensor observation may be defined as an outcome of the sensor measurement used to evaluate the state of the system.

The sensor observation is managed internally using two variables, OccupiedHits and FreeHits, for each cell. After each laser scan the range measurements are transformed into observations. For each single coordinate generated from a range value, the cell to which this coordinate belongs in the Traversability Grid is determined, followed by all of the intervening cells between the determined cell and the sensor. Bresenham's line algorithm [Foley 1990], [Novick 2002] is used to determine the indices of the intervening cells. The OccupiedHits buffer is incremented by one for the cell which receives the hit and the FreeHits buffer is incremented by one for all the intervening cells. For cases where the received range value is beyond the Traversability Grid map, the cell at the intersection of the line formed by the range value and the sensor origin with the bounds of the grid map is found, and for all the cells on this line FreeHits is incremented by one. The evidence of a cell being occupied or free is computed as

\[ W_{occ} = W_{occ,0} + OccupiedHits - k_{1} \cdot FreeHits \tag{5.2} \]

\[ W_{free} = W_{free,0} + FreeHits - k_{2} \cdot OccupiedHits \tag{5.3} \]

where \(W_{occ,0}\) and \(W_{free,0}\) are the initial weights of evidence and \(k_1\) and \(k_2\) are configurable parameters.

The first term in Equations 5.2 and 5.3 is the initial weight of evidence. The initial weight of evidence defines the state of the system before the current sensor observation. These initial weights could be viewed as a Markovian model. A Markov chain is defined as a process where the future may be stochastic but no variables prior to the current state may influence the stochastic evolution of future states. Thus, all the information about the past state of the system is represented by these initial weights. The second term in the above equations is the sensor observation. The observation OccupiedHits strengthens the evidence of an obstacle and the observation FreeHits strengthens the evidence of free space. The third term in both of the equations above acts as a parameter to adjust the speed of response of the system. For example, in the case of moving obstacles, once the obstacle is clear, the third term in Equation 5.2 helps in fast recovery of free space. It also helps in clearing spurious ground noise. Similarly, in the case where a cell is occupied by a moving obstacle, the third term in Equation 5.3 helps in fast recovery of occupied space. Note however that for the case of fixed, distinguishable obstacles the last terms of these equations tend to cancel each other and hence do not have a major impact on the algorithm. After computing the weights of evidence for each cell, the weighted sum is computed as

\[ W_{sum} = W_{occ} - \alpha \cdot W_{free} \tag{5.4} \]

where \(\alpha\) represents the ratio of the evidence of a cell being occupied to the evidence of the cell being free. \(\alpha\) is a tunable parameter; its value for the current experimental results is selected as 1/6. Finally the weighted sum value of the cell is mapped to the traversability value. The mapping from \(W_{sum}\) to the traversability value is exponential, as shown in Figure 5-7.
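A minimal sketch of the per-cell update of Equations 5.2 through 5.4 (as reconstructed above) and the exponential mapping of Figure 5-7 is given below. The values of k1, k2 and the mapping constants are assumptions chosen for illustration, as are the structure and function names.

#include <math.h>

typedef struct {
    float wOcc;          /* accumulated weight of evidence: occupied       */
    float wFree;         /* accumulated weight of evidence: free           */
    int   occupiedHits;  /* hits registered in the cell this scan          */
    int   freeHits;      /* pass-throughs registered in the cell this scan */
} OdCell;

static const float K1 = 0.5f, K2 = 0.5f;   /* configurable parameters (assumed) */
static const float ALPHA = 1.0f / 6.0f;    /* ratio used in Equation 5.4        */

/* Fold the current scan's observations into the cell and return the
   traversability value (2 = obstacle ... 7 = free space). */
static unsigned char odUpdateCell(OdCell *c)
{
    c->wOcc  += c->occupiedHits - K1 * c->freeHits;      /* Equation 5.2 */
    c->wFree += c->freeHits     - K2 * c->occupiedHits;  /* Equation 5.3 */
    float wSum = c->wOcc - ALPHA * c->wFree;             /* Equation 5.4 */

    /* Exponential mapping of wSum onto 2..7 (Figure 5-7); the decay
       constant of 4.0 is an assumed tuning value. */
    float tv = 2.0f + 5.0f * expf(-fmaxf(wSum, 0.0f) / 4.0f);

    c->occupiedHits = 0;   /* observations have been consumed */
    c->freeHits = 0;
    return (unsigned char)(tv + 0.5f);
}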

The algorithm is very similar to the Bayes filter discussed in Chapter 2, since any algorithm that increments and decrements a state variable (in our case occupied or free) in response to sensor measurements can be interpreted as a Bayes filter. The main difference between the current algorithm and the Bayes filter is that the Bayes filter computes the probability of one state variable, occupied or free space; the other variable is just the negation of the computed variable. In this case both of the state variables are computed and then a weighted average of these variables is taken.

The OD identifies positive obstacles and renders no opinion regarding the smoothness or traversability of areas where no positive obstacle is reported. Hence it reports traversability values from 2 to 7. A cell with a value of 2 has a high probability of containing a positive obstacle while a cell with a value of 7 is free space. A traversability value higher than 7 is not assigned to a cell, since it is not known whether the free space is a smooth traversable path, rough terrain, or even a negative obstacle.

Terrain Evaluation

In the terrain evaluation (TE) algorithm the terrain in front of the vehicle is mapped with the laser. A Cartesian elevation map is built from the successive laser scans and the positioning system readings corresponding to these scans. From the 3-dimensional map, the terrain is classified based on its geometry. A set of classification features is generated by performing statistical analysis on the terrain map.

Two LADAR systems are used to map the terrain. These two LADARs will be identified as TerrainLADAR1 and TerrainLADAR2. The TerrainLADAR1 is mounted at an angle of 6° and the TerrainLADAR2 at an angle of 12°, facing forward towards the ground. Both of these LADARs are mounted at a height of 1.9 m above the vehicle ground level. The two LADARs populate two different maps of the terrain and each terrain map is classified individually. The range readings from the two different lasers are not mapped into a common terrain map, since the two lasers scan the same part of the terrain at different times and hence there would be a built-in GPS localization error between these two sets of readings.

The LADARs are set to a configuration of 100° angular range and 0.25° angular resolution. Figure 5-8 shows the schematic of the field of view of the two lasers. With this configuration and for nominal conditions (flat ground surface, vehicle level), the TerrainLADAR1 scans the ground at a distance of ~18 m ahead of the vehicle and ~43 m wide, and the TerrainLADAR2 scans the ground at a distance of ~9 m ahead of the vehicle and ~21.4 m wide.

Terrain Mapping

After each complete scan of 100°, the range data reported by the lasers is mapped into a 3-D point cloud. The data points are stored in the corresponding cells of the Traversability Grid as a linked list of 3-D Cartesian coordinates. The conversion from the range data to the 3-D point cloud is done in the steps explained below.

The range data reported by the laser is converted into Cartesian coordinates local to the sensor. The local sensor coordinate system has its origin coincident with the origin of the laser beam and its X-Y plane coincident with the laser scanning plane. The local sensor coordinates are computed from the range information as follows:

\[ x_l = range \cdot \sin\theta \qquad y_l = range \cdot \cos\theta \qquad z_l = 0.0 \tag{5.5} \]

where \(\theta\) is the angle of the laser beam with respect to the y-axis of the sensor coordinate system.

Next, the local sensor coordinate is converted into the vehicle coordinate system. The vehicle coordinate system has its X-axis in the direction of travel of the vehicle and its Z-axis pointing vertically down.

\[ x_v = x_l \cos\beta + x_{offset} \qquad y_v = y_l + y_{offset} \qquad z_v = x_l \sin\beta + z_{offset} \tag{5.6} \]

where \(\beta\) is the laser tilt angle with respect to the horizontal plane and \((x_{offset}, y_{offset}, z_{offset})\) is the position of the sensor origin in the vehicle coordinate system.

The data is then transformed into the global coordinate system attached to the vehicle, as well as a fixed global coordinate system, by taking into account the vehicle orientation. The transformation from the local to the global coordinate system attached to the vehicle is given as follows:

\[
\begin{bmatrix} x_G \\ y_G \\ z_G \\ 1 \end{bmatrix} =
\begin{bmatrix}
\cos\psi\cos\theta & \cos\psi\sin\theta\sin\phi - \sin\psi\cos\phi & \cos\psi\sin\theta\cos\phi + \sin\psi\sin\phi & x_o \\
\sin\psi\cos\theta & \sin\psi\sin\theta\sin\phi + \cos\psi\cos\phi & \sin\psi\sin\theta\cos\phi - \cos\psi\sin\phi & y_o \\
-\sin\theta & \cos\theta\sin\phi & \cos\theta\cos\phi & z_o \\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix} x_v \\ y_v \\ z_v \\ 1 \end{bmatrix} \tag{5.7}
\]

where \((x_G, y_G, z_G)\) are the global coordinates of the point, \(\psi\), \(\theta\), \(\phi\) are the yaw, pitch and roll respectively, and \((x_o, y_o, z_o)\) are the sensor coordinates in the global coordinate system.
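The transformation chain of Equations 5.5 through 5.7 can be sketched as a single routine. The code assumes the yaw-pitch-roll rotation order reconstructed in Equation 5.7 and a Z-down global frame; the type and function names are illustrative.

#include <math.h>

typedef struct { double x, y, z; } Point3;
typedef struct { double yaw, pitch, roll; } Orientation;

/* Map one laser range sample to global coordinates (Equations 5.5-5.7). */
static Point3 rangeToGlobal(double range, double beamAngle,     /* Eq. 5.5 */
                            double tiltAngle, Point3 offset,    /* Eq. 5.6 */
                            Orientation o, Point3 sensorGlobal) /* Eq. 5.7 */
{
    /* Equation 5.5: polar sample to the local sensor frame. */
    double xl = range * sin(beamAngle);
    double yl = range * cos(beamAngle);

    /* Equation 5.6: sensor frame to vehicle frame (tilt about the sensor y-axis). */
    double xv = xl * cos(tiltAngle) + offset.x;
    double yv = yl + offset.y;
    double zv = xl * sin(tiltAngle) + offset.z;

    /* Equation 5.7: vehicle frame to the global frame. */
    double cy = cos(o.yaw),   sy = sin(o.yaw);
    double cp = cos(o.pitch), sp = sin(o.pitch);
    double cr = cos(o.roll),  sr = sin(o.roll);

    Point3 g;
    g.x = cy*cp*xv + (cy*sp*sr - sy*cr)*yv + (cy*sp*cr + sy*sr)*zv + sensorGlobal.x;
    g.y = sy*cp*xv + (sy*sp*sr + cy*cr)*yv + (sy*sp*cr - cy*sr)*zv + sensorGlobal.y;
    g.z = -sp*xv   + cp*sr*yv              + cp*cr*zv              + sensorGlobal.z;
    return g;
}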

The fixed global coordinate system has a similar orientation to the global coordinate system attached to the vehicle, but its origin is at a fixed point, whereas the origin of the global coordinate system attached to the vehicle moves with the vehicle. The fixed origin is necessary to take the vehicle motion into account when building the point cloud. The coordinates of the data point in the fixed coordinate system are computed by adding the vehicle coordinates to the global coordinates obtained above. The data point is stored in the cell corresponding to the location of the point; the coordinate system attached to the vehicle gives the location of the point in the Traversability Grid.

The maximum number of data points that can be stored in one cell is limited by a configuration variable. In case the maximum number of data points in a cell is reached, the next new data point is compared with the already existing list of data points. If the coordinates of the new data point do not match any of the existing data points, the first data point stored in the linked list is replaced by the new data point. The new data point is assumed to match an already existing data point if it is contained in a 0.1 m cube centered on the old data point. After the mapping of the data points is complete, the next step is the classification of the Traversability Grid.

Terrain Classification

Each cell in the Traversability Grid is evaluated individually and classified for its traversability value. The following geometrical features are used for the classification:

1. The slope of the best fitting plane through the data points in each cell.
2. The variance of the elevation of the data points within the cell.
3. The weighted neighborhood analysis.
4. The negative obstacle algorithm.

The first three of these criteria are used for the classification of the laser data from both of the LADARs. The negative obstacle algorithm is implemented only for the data from the TerrainLADAR2.

Classification based on the slope of the best fitting plane

The slope feature helps in distinguishing discontinuous terrain or obstacle surfaces from a gradual slope. The classification of the terrain is based on the fact that the traversability of the vehicle decreases with an increase in the slope of the terrain.

The slope of the terrain is computed as the tangent of the angle between the X-Y (ground) plane and the computed plane of the terrain. The slope value is computed individually for each terrain patch corresponding to the respective cell in the Traversability Grid. Based on the data points in the cell, the terrain patch is approximated as a planar surface using the least squares error approximation.

To approximate the plane of the terrain patch, a minimum of three data points must be present in the cell. The equation for the best fitting plane, derived using the least squares solution technique, is given as:

\[ S_{optimum} = (G^{T} G)^{-1} G^{T} b \tag{5.8} \]

where \(S_{optimum}\) is the vector \((S_x, S_y, S_z)\) perpendicular to the best fitting plane, \(G\) is an \(n \times 3\) matrix given by

\[ G = \begin{bmatrix} x_1 & y_1 & z_1 \\ x_2 & y_2 & z_2 \\ \vdots & \vdots & \vdots \\ x_n & y_n & z_n \end{bmatrix} \]

and \(b\) is a vector of length \(n\) given by

\[ b = \begin{bmatrix} D_{01} \\ D_{02} \\ \vdots \\ D_{0n} \end{bmatrix} \]

Assuming each \(D_{0i}\) equal to 1, Equation 5.8 is used to find \(S_{optimum}\) for the data points within each cell. Chapter 5 of [Solanki 2003] provides a thorough proof of Equation 5.8 for finding the perpendicular to a best fitting plane given a set of 3-D data points. Once the vector perpendicular to the best fitting plane is known, the slope of the computed plane with respect to the X-Y (ground) plane is computed as follows:

\[ Slope = \frac{\sqrt{S_x^2 + S_y^2}}{S_z} \tag{5.9} \]

The computed slope value represents the tangent of the angle between the best fitting plane and the ground plane. This value is used to assign the traversability value. The assignment of the traversability value is heuristic and based on comparing the classification results with the actual terrain conditions. Table A-1 lists the mapping of the slope value to the traversability value used in the experiments for this research. Instead of showing the tangent of the angle, Table A-1 shows the value of the angle between the best fitting plane and the ground plane.
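Because b is a vector of ones, the least squares problem of Equation 5.8 reduces to solving the 3x3 normal equations (G'G)S = G'b. The following sketch solves them by Cramer's rule and applies Equation 5.9; the function name, the rejection of degenerate cells and the absolute value on S_z are assumptions made for illustration.

#include <math.h>

/* Fit S.x*x + S.y*y + S.z*z = 1 to a cell's points (Eq. 5.8) and return the
   slope of Eq. 5.9. Returns -1.0 for fewer than 3 points or a singular fit. */
static double cellSlope(const double (*p)[3], int n)
{
    if (n < 3)
        return -1.0;

    /* Accumulate the normal equations A = G'G and r = G'b with b_i = 1. */
    double A[3][3] = {{0.0}}, r[3] = {0.0};
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < 3; ++j) {
            r[j] += p[i][j];
            for (int k = 0; k < 3; ++k)
                A[j][k] += p[i][j] * p[i][k];
        }

    /* Solve the 3x3 system A*s = r by Cramer's rule. */
    double det = A[0][0]*(A[1][1]*A[2][2] - A[1][2]*A[2][1])
               - A[0][1]*(A[1][0]*A[2][2] - A[1][2]*A[2][0])
               + A[0][2]*(A[1][0]*A[2][1] - A[1][1]*A[2][0]);
    if (fabs(det) < 1e-9)
        return -1.0;

    double s[3];
    for (int c = 0; c < 3; ++c) {
        double M[3][3];
        for (int j = 0; j < 3; ++j)
            for (int k = 0; k < 3; ++k)
                M[j][k] = (k == c) ? r[j] : A[j][k];
        s[c] = (M[0][0]*(M[1][1]*M[2][2] - M[1][2]*M[2][1])
              - M[0][1]*(M[1][0]*M[2][2] - M[1][2]*M[2][0])
              + M[0][2]*(M[1][0]*M[2][1] - M[1][1]*M[2][0])) / det;
    }

    /* Equation 5.9; fabs guards against a downward-pointing normal. */
    return sqrt(s[0]*s[0] + s[1]*s[1]) / fabs(s[2]);
}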

Classification based on the variance

The variance is a measure of the dispersion of data. The dispersion of data is defined as the extent to which the data is scattered about the zone of central tendency. The variance is defined as the sum of the squares of the deviations from the mean value divided by the number of observations. For the terrain classification problem, the variance of the data in the Z direction is an important parameter. The variance of the data points in the Z direction gives an indication of the traversability of the terrain. A higher value of the variance indicates rougher, uneven and hence less traversable terrain, while a decrease in the variance indicates a smoother, traversable terrain condition. From the data points within each cell, the variance of the terrain patch corresponding to the respective cell is measured as follows:

\[ variance = \frac{\sum_i (Z_i - \mu)^2}{n} \tag{5.10} \]

where \(\mu\) is the mean height of the cell, given as

\[ \mu = \frac{\sum_i Z_i}{n} \tag{5.11} \]

\(Z_i\) is the elevation of the data point, and \(n\) is the number of data points. The variance value computed using Equation 5.10 is mapped to a traversability value which is assigned to the cell. The mapping from the variance value to the traversability value is heuristic and based on classification results obtained from driving through different terrain conditions. The mapping variables are stored as configuration parameters in the config file. The mapping values used in the experimental results of this dissertation are shown in Table A-2.

Weighted Neighborhood Analysis

As discussed in Chapter 2, a few research papers [Ye 2004] use the mean height as a criterion for evaluating the terrain. However, using absolute height as a parameter to evaluate terrain often results in misclassification of the terrain, especially in the case of uphill and downhill slopes. Instead of evaluating terrain based on the absolute height of each cell, the measure of terrain evaluation presented here compares the heights of neighboring cells. The comparison of the neighboring cells gives an indication of the discontinuity in the terrain between the two cells.

One of the difficult problems in neighborhood analysis is determining the extent of the neighboring cells to use for comparison. Figure 5-9 shows the application of neighborhood analysis in the present context. Assuming that the current position of the vehicle is on a traversable region, consider the mean height of the center cell of the grid as ideal (the vehicle is always in the center of the grid). Now, move out from the center cell and compare the mean height of each of the cells adjacent to the center cell with the height of the center cell. As the cells examined expand out from the center, compare the height of each cell to be evaluated with the neighboring cells that lie between the vehicle (i.e., the center cell) and the cell being evaluated. The idea here is that since these neighboring cells fall in between the vehicle position and the cell being evaluated, for the vehicle to travel through the cell it has to pass through these neighboring cells. Hence, if the terrain is discontinuous between the cell under consideration and the neighboring cells, then the cell is less likely to be traversable. For any cell in the grid there are three neighboring cells, as shown in Figure 5-9, which fall in between the cell and the center of the grid. These three neighboring cells are the cell in the adjacent row, the cell in the adjacent column and the diagonally adjacent cell. Since there is more than one cell adjacent to the cell being evaluated, an algorithm to decide on the importance of each neighboring cell is designed. The algorithm takes into consideration that the importance of each of the individual neighboring cells depends on the position of the cell under consideration in the grid. For example, for a cell which is in the diagonal direction of the grid, the diagonally neighboring cell will carry more weight than the neighboring row and column cells. Similarly, the cells which are towards the center row of the grid will put more weight on the neighboring column cell, and the cells towards the center column of the grid will put more weight on the neighboring row cell. Each neighboring cell is assigned a weight, which is computed based on the position of the cell in the grid.

For any cell in the grid, compute the unit vector in the direction from the cell to the center of the grid. The neighboring cells are the adjacent cells that are closer to the center; these are assigned weights depending on the direction of the vector. Consider a unit vector \(v = a\,i + b\,j\) which represents the direction from a cell to be evaluated to the center of the grid (vehicle position). The \(i\) component of the vector is in the direction of the neighboring row cell and the \(j\) component is in the direction of the neighboring column cell discussed above. The neighboring row, column and diagonal cells are assigned weights \(c_1\), \(c_2\) and \(c_3\) as follows:

\[ c_1 = \frac{a}{a + b + ab} \qquad c_2 = \frac{b}{a + b + ab} \qquad c_3 = \frac{ab}{a + b + ab} \tag{5.12} \]

From Equation 5.12 it can be seen that the weights are chosen based on the components of the unit vector in each of the directions. The weights are normalized so that their sum is always 1. A weighted neighborhood analysis is then done for the cell being evaluated; thus, depending on the height difference between the cell being evaluated and the neighboring cells, a traversability value is assigned to the cell. A sketch of this weight computation is given below.
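The following short routine follows the reconstruction of Equation 5.12 given above; the absolute values and the handling of the center cell are assumptions added for robustness, and the function name is illustrative.

#include <math.h>

/* Weights for the neighboring row, column and diagonal cells, given the unit
   vector v = a*i + b*j from the evaluated cell toward the grid center. */
static void neighborWeights(double a, double b,
                            double *c1, double *c2, double *c3)
{
    a = fabs(a);                  /* component toward the neighboring row cell    */
    b = fabs(b);                  /* component toward the neighboring column cell */
    double denom = a + b + a * b;
    if (denom < 1e-9) {           /* the center cell itself: nothing to compare   */
        *c1 = *c2 = *c3 = 0.0;
        return;
    }
    *c1 = a / denom;              /* neighboring row cell     */
    *c2 = b / denom;              /* neighboring column cell  */
    *c3 = (a * b) / denom;        /* diagonally adjacent cell */
}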

For example, consider the cell shown in Figure 5-9. The neighborhood cell analysis value is calculated as follows:

\[ Value(i,j) = c_1\,(h(i,j) - h(i+1,j)) + c_2\,(h(i,j) - h(i,j+1)) + c_3\,(h(i,j) - h(i+1,j+1)) \tag{5.13} \]

where \(h(i,j)\) is the height of the cell in row \(i\) and column \(j\). The above computation is shown for a cell in the first quadrant of the grid. Similarly, for cells lying in the other quadrants of the grid, the corresponding neighboring cells are selected using the scheme discussed above. The cells in the center row and the center column are treated as special cases; the neighborhood analysis of these cells is done by comparing them with only one cell, which is the neighboring cell directly in between the cell being evaluated and the center of the grid.

The neighborhood analysis value obtained for each cell from the above calculation is mapped to a traversability value. Again, in this case the mapping is heuristic and is saved as configuration parameters in the config file. Table A-3 shows the mapping of the weighted neighborhood analysis value to the traversability value used in the current experimental results.

Negative Obstacle Detection

In a rough outdoor environment, negative obstacles such as large holes or cliffs on the side of the road are very common. With the laser range sensor, the only information one can obtain is the distance of the object (terrain surface or obstacle) hit by the laser beam. With this information, the terrain map was built and algorithms were developed to classify terrain. However, in the case of voids or empty space, the only useful information that can be taken from the laser data is that the region between the laser and the laser range reading is free space. The negative obstacle algorithm makes use of this information to give an estimate of whether the free space is a negative obstacle.

The algorithm compares each range reading obtained from the laser to an expected range reading. The expected range reading is computed based on the geometry of the laser beam. The unit vector in the direction of the laser beam is known, since the laser is fixed with respect to the vehicle and the vehicle orientation is known. Expecting good terrain conditions, it is assumed that the region in front of the vehicle is level ground in the plane of the vehicle. From the geometry of the laser beam (i.e., the line vector in the direction of the laser beam) and the assumption of level ground (the X-Y plane), the expected range reading is computed by solving the problem of the intersection of a plane and a line.

Consider a unit vector in the direction of the laser beam. The unit vector is expressed in the laser coordinate system using Equation 5.5. The unit vector is then transformed to the vehicle orientation and subsequently to the global orientation using Equations 5.6 and 5.7. The sensor offset terms in Equation 5.6 are not considered, since it is only the direction of the vector which needs to be expressed in the global orientation frame.

The expected range distance in the direction of the unit vector is computed using the fact that the z-component of the expected range vector, expressed in the global orientation frame with the origin attached to the sensor, is equal to the height of the laser above the ground plane. The expected range is given as:

\[ d = \frac{h}{e_r \cdot \hat{k}} \tag{5.14} \]

where \(d\) is the range distance from the sensor origin to the point of intersection, \(h\) is the height of the sensor above the vehicle ground plane, \(e_r\) is the unit vector representing the direction of the laser beam in the global coordinate system, and \(\hat{k}\) is the unit vector along the (downward) global Z-axis.

The expected range reading is subtracted from the actual reading obtained from the laser and the difference is used as a factor to assign a traversability value. If the difference is less than a threshold value, the algorithm concludes that ground exists (positive obstacle or smooth flat terrain) and does not report anything. However, if the difference is above the threshold value, the possibility of a negative obstacle is assumed. The cell in the Traversability Grid which would otherwise register the expected reading (in the case of a flat ground plane) is assigned a traversability value. This cell would lie on an imaginary plane formed as an extension of the vehicle ground plane, similar to the one shown in Figure 5-10. The cell can easily be found since the range vector in the global coordinate system is already known from Equation 5.14 and it just needs to be transformed from the sensor origin to the vehicle origin (center of the Traversability Grid). The severity of assigning this cell as a negative obstacle increases with the increase in the difference between the actual and expected range distance readings. The algorithm assigns a traversability value between 2 and 7, since it seeks only negative obstacles and does not render any opinion on smoothness or good terrain conditions.

The algorithm is very sensitive to the tilt angle of the laser with respect to the ground plane. The algorithm cannot distinguish the severity of a negative obstacle if the change in the slope of the ground plane is greater than the tilt angle of the laser. The TerrainLADAR2 sensor is tilted at an angle of 12° towards the ground and the TerrainLADAR1 is tilted at an angle of 6° towards the ground. The TerrainLADAR2 sensor would be able to distinguish between a traversable path and a negative obstacle as long as the change in the slope of the traversable path is less than 12°; if this change is greater than 12°, there is a high probability of a false positive being registered. For the TerrainLADAR1 this allowable change in slope is limited to only 6°. Hence, the above algorithm is implemented only for the data from the TerrainLADAR2 sensor.
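A sketch of the expected-range test of Equation 5.14 is given below. The threshold value and the function names are assumptions; erZ denotes the z-component of the beam's unit vector in the Z-down global frame, which equals the dot product of e_r and the vertical unit vector in Equation 5.14.

#include <stdbool.h>

/* Equation 5.14: expected range assuming level ground in the vehicle plane.
   erZ is the (downward) z-component of the beam's unit vector in the global
   frame; h is the sensor height above the vehicle ground plane in metres. */
static double expectedRange(double erZ, double h)
{
    if (erZ < 1e-6)
        return -1.0;              /* beam does not point toward the ground */
    return h / erZ;
}

/* Flag a possible negative obstacle when the measured range overshoots the
   expected range by more than a threshold (0.5 m is an assumed value). */
static bool possibleNegativeObstacle(double measuredRange, double expected)
{
    const double threshold = 0.5;
    return expected > 0.0 && (measuredRange - expected) > threshold;
}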

The probability of registering a false positive also depends on the vehicle orientation. For example, in cases where the vehicle is on a horizontal ground plane and there is a downhill slope in front of the laser with a magnitude greater than the laser tilt angle, the algorithm would indicate a negative obstacle irrespective of whether the slope is a gradual, smooth downhill slope or a sudden discontinuity in the terrain. However, if the vehicle orientation is pitched up, i.e., the vehicle is on an uphill slope, the output from the algorithm would depend on the vehicle orientation angles, since the direction of the laser beam depends on the vehicle orientation angles.

The negative obstacle detection algorithm assigns traversability values only to cells which do not register any data points from the terrain mapping algorithm. If data points are present in a cell, the cell is not considered for negative obstacle evaluation.

Terrain Evaluation Output

The previous sections discussed the four algorithms used to evaluate the terrain characteristics. These algorithms are implemented separately on the data from each of the two sensors. Figure 5-5 showed the overall block diagram of the LTSS component. From the figure it can be seen that the terrain is evaluated by processing the data from each of the Terrain LADARs separately. The blocks terrain evaluation 1 and terrain evaluation 2 represent the terrain evaluation process for the two LADARs. A more elaborate picture of the terrain evaluation 1 block is shown in Figure 5-11. As discussed in the previous sections and as shown in the figure, the laser data is first mapped into a terrain model. This terrain map is then input to each of the terrain evaluation algorithms. The Traversability Grid output from the terrain evaluation 1 block is the fusion of the traversability values obtained from each of the algorithms.

The slope and the variance algorithms evaluate the cell based on the data within each single cell. Each of these algorithms works better than the other depending on the terrain condition being evaluated. The average of the slope and variance based traversability values gives an evaluation of the cell based on the data within the cell. The nearest neighborhood analysis evaluates the cell based on the discontinuity in the data between cells. The output of the terrain evaluation 1 block is the minimum of the above two values:

\[ TTV_1 = \min\left(\frac{TV_{slope} + TV_{variance}}{2},\; TV_{neighborhood}\right) \tag{5.15} \]

The traversability values from the terrain evaluation algorithm implemented on the TerrainLADAR2 are obtained using an equation similar to Equation 5.15, except that in the case of the TerrainLADAR2, if there is no data present in a particular cell and that cell has been evaluated by the negative obstacle detection algorithm, the cell is assigned the traversability value from the negative obstacle detection algorithm. All other cells which do not contain any data and which have not been evaluated by the negative obstacle detection algorithm are considered to be not in the field of view of the sensor, and hence a traversability value of 14 (unknown) is assigned to the cell.

A simple averaging algorithm combines the outputs of terrain evaluation 1 and terrain evaluation 2. The computed average is the traversability value of the cell based on the terrain evaluation:

\[ TTV = \frac{TTV_1 + TTV_2}{2} \tag{5.16} \]

where \(TTV\) is the traversability value of the cell based on terrain evaluation, \(TTV_1\) is the traversability value of the cell assigned by terrain evaluation 1, and \(TTV_2\) is the traversability value of the cell assigned by terrain evaluation 2.
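A sketch of this two-stage combination is shown below. The treatment of unknown cells in the final average is an assumption, as are the function names.

/* Combine the per-cell outputs of one terrain evaluation block
   (Equation 5.15): slope, variance and neighborhood traversability values. */
static unsigned char combineTerrainBlock(unsigned char slopeTv,
                                         unsigned char varianceTv,
                                         unsigned char neighborhoodTv)
{
    unsigned char withinCell = (unsigned char)((slopeTv + varianceTv) / 2);
    return (withinCell < neighborhoodTv) ? withinCell : neighborhoodTv;
}

/* Average the two Terrain LADAR grids (Equation 5.16); 14 marks an unknown
   cell, which is assumed here to defer to the other sensor's value. */
static unsigned char combineTerrainLadars(unsigned char ttv1, unsigned char ttv2)
{
    if (ttv1 == 14) return ttv2;
    if (ttv2 == 14) return ttv1;
    return (unsigned char)((ttv1 + ttv2) / 2);
}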

Advantages and Limitations of the OD and TE Sensor Algorithms

The experiments conducted with the above sensor algorithms reveal the following advantages and limitations:

1. The obstacle detection (OD) algorithm can be used only for positive obstacles; it gives no opinion on the smoothness of the terrain or on negative obstacles.
2. In off-road conditions the OD generates a lot of ground noise. Most of the noise is due to misclassification of an approaching uphill slope, or of the transition from going down a hill onto flat ground.
3. However, the OD algorithm is not as complex as the terrain evaluation (TE) and it does not have to create a 3-D point cloud. It is very reliable in identifying positive obstacles. The grid is updated about 35 times a second in the region bounded by the field of view of the sensor. In this region, moving obstacles which have passed are cleared from the grid, and if a moving obstacle shows up in the grid, the grid will be updated.
4. Since the OD algorithm does not depend on mapping the true coordinates of the point, but just checks whether the point belongs to a cell, the error in mapping the obstacle is very small compared to the TE algorithm.
5. The main concern with the TE algorithms is modeling the ground plane. Since the data is collected in successive scans, the ground plane of the vehicle changes; thus each time the points are registered with a different reference plane. In cases where the vehicle is on flat, smooth terrain this is not a problem. However, in cases of uneven terrain, it is very difficult to relate the data to a common ground plane. Although the points are registered in a fixed global frame, there is some error associated with the registration process, and experiments have shown that the magnitude of the error depends on the condition of the terrain.
6. Since the look-ahead distance of the Terrain LADARs is limited by the tilt angle of the laser, in the present case the TE algorithm is effective only for a range of up to 18 m. It does not provide any information on obstacles farther than this distance.
7. The TE algorithm maps a moving obstacle as part of the terrain and hence does not clear it from the grid after it has passed.
8. In spite of the above disadvantages of the TE, the algorithm actually maps the surroundings into a 3-D point cloud and characterizes the terrain based on slope, variance and discontinuities, and hence the classification is based on more detailed information about the surroundings as compared to the OD.

The conclusion that can be drawn from the discussion and actual implementation of the OD and TE sensors is that these sensors provide very good classification results in a limited range of environmental conditions. However, much uncertainty is associated with these two sensor implementations when using them in the real-world heterogeneous environment. The following section presents an uncertainty management tool which is used to fuse the outputs from these two algorithms.

Fusion of the Sensor Components

In the previous sections two different sensor algorithms, the OD and the TE, were developed and the advantages and disadvantages of each were discussed. To take advantage of each sensor algorithm and at the same time overcome some of its limitations, the outputs from the above sensors are fused together using a simple rule-based forward reasoning scheme. The uncertainties associated with the two sensors are combined using certainty factors [4]. The certainty factor (CF) formalism presents an approach to combine evidences supporting or contradicting a hypothesis. As opposed to Bayesian analysis, where only evidences supporting a hypothesis can be combined, the certainty factor formalism provides a mechanism to combine contradictory evidences. The certainty factor value for a particular hypothesis is between -1.0 and 1.0. A CF value of 1.0 represents complete confidence in the hypothesis while a CF value of -1.0 represents complete confidence against the hypothesis. In the present case, a CF value of 1.0 indicates complete confidence in the presence of an obstacle or a highly non-traversable region, while a CF value of -1.0 represents a highly traversable and smooth region. The value of 0.0 represents a cell which does not show any confidence either towards the presence of an obstacle or towards a desirable traversable path. The rule-based reasoning scheme assigns different CF values to each of the sensors based on the observed readings. Information such as the mean height of the cell is used in determining the confidence level of whether the sensor should be able to see the obstacle.

The traversability values obtained from each of the sensors are converted into the evidence of the presence of an obstacle or the presence of a traversable path. The confidence in the evidences presented by each of these sensors is represented as follows:

PAGE 75

CF_evidenceOD is the evidence of the presence of an obstacle detected by the OD algorithm. As discussed in the previous sections, since the OD sensor does not give any input on the traversable path, the value of this variable is between 0.0 (representing a traversability value of 7) and 1.0 (representing a traversability value of 2), depending on the presence of an obstacle.

CF_evidenceTE is the evidence of the traversability or non-traversability of the cell detected by the TE sensor. The value of this variable is between -1.0 (traversability value of 12) and 1.0 (traversability value of 2), depending on the traversability of the cell.

The pseudo code for the implementation of the fusion of these two sensor algorithms is presented below. CF_OD and CF_TE are the certainties associated with the respective sensors depending on which rule executes. CF_LTSS is the combined CF value. The following scheme is applied:

Case 1: The OD indicates the cell is OCCUPIED

IF TE = UNKNOWN THEN
    CF_OD = 1.0 * CF_evidenceOD
    CF_LTSS = CF_OD

IF TE = NON-TRAVERSABLE THEN
    CF_OD = 1.0 * CF_evidenceOD
    CF_TE = 1.0 * CF_evidenceTE
    CF_LTSS = CF_OD + CF_TE * (1 - CF_OD)

IF TE = TRAVERSABLE THEN

PAGE 76

    CF_OD = 0.9 * CF_evidenceOD
    CF_TE = 0.9 * CF_evidenceTE
    CF_LTSS = (CF_OD + CF_TE) / (1 - min(|CF_OD|, |CF_TE|))

Case 2: The OD indicates the cell is FREE

IF TE = UNKNOWN THEN
    CF_LTSS = CF_OD

IF TE = NON-TRAVERSABLE THEN
    IF (Mean Height <= 0.6 m) THEN
        CF_TE = 1.0 * CF_evidenceTE
    IF (Mean Height > 0.6 m && Mean Height < 0.8 m) THEN
        CF_TE = 0.8 * CF_evidenceTE
    IF (Mean Height >= 0.8 m) THEN
        CF_TE = 0.2 * CF_evidenceTE
    CF_LTSS = CF_TE

IF TE = TRAVERSABLE THEN
    CF_TE = 1.0 * CF_evidenceTE
    CF_LTSS = CF_TE

Case 3: The OD indicates the cell is UNKNOWN
    CF_TE = 1.0 * CF_evidenceTE
    CF_LTSS = CF_TE

The CF values are mapped back into the traversability value. As discussed earlier, -1.0 corresponds to a traversability value of 12, 0.0 corresponds to a value of 7, and 1.0 corresponds to a value of 2. Should one of the sensors fail to report, all the values in the grid for that sensor are marked as unknown, and the above scheme gives the output from the other sensor as the
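To make the scheme above concrete, the following is a minimal C sketch of the rule-based fusion for a single cell. It assumes the standard certainty factor combination rules for two pieces of evidence (CF1 + CF2 * (1 - CF1) when both are non-negative, and (CF1 + CF2) / (1 - min(|CF1|, |CF2|)) when their signs differ); the enum and function names are illustrative and are not taken from the LTSS source code.

#include <math.h>

typedef enum { OD_UNKNOWN, OD_FREE, OD_OCCUPIED } od_state_t;
typedef enum { TE_UNKNOWN, TE_TRAVERSABLE, TE_NONTRAVERSABLE } te_state_t;

/* Combine two certainty factors using the standard CF combination rules. */
static double cf_combine(double a, double b)
{
    if (a >= 0.0 && b >= 0.0) return a + b * (1.0 - a);
    if (a <= 0.0 && b <= 0.0) return a + b * (1.0 + a);
    return (a + b) / (1.0 - fmin(fabs(a), fabs(b)));
}

/* Fuse the OD and TE evidence for one grid cell into CF_LTSS. */
double fuse_cell(od_state_t od, te_state_t te,
                 double cf_evidence_od, double cf_evidence_te,
                 double mean_height_m)
{
    double cf_od, cf_te;

    if (od == OD_OCCUPIED) {
        if (te == TE_UNKNOWN)
            return 1.0 * cf_evidence_od;
        if (te == TE_NONTRAVERSABLE) {            /* both support an obstacle      */
            cf_od = 1.0 * cf_evidence_od;
            cf_te = 1.0 * cf_evidence_te;
            return cf_combine(cf_od, cf_te);
        }
        cf_od = 0.9 * cf_evidence_od;             /* TE says traversable: conflict */
        cf_te = 0.9 * cf_evidence_te;
        return cf_combine(cf_od, cf_te);
    }

    if (od == OD_FREE) {
        if (te == TE_UNKNOWN)
            return cf_evidence_od;                /* only the OD evidence is used  */
        if (te == TE_NONTRAVERSABLE) {
            /* The taller the reported obstruction, the more the OD laser should
               have seen it, so the TE evidence is progressively discounted.      */
            if (mean_height_m <= 0.6)      cf_te = 1.0 * cf_evidence_te;
            else if (mean_height_m < 0.8)  cf_te = 0.8 * cf_evidence_te;
            else                           cf_te = 0.2 * cf_evidence_te;
            return cf_te;
        }
        return 1.0 * cf_evidence_te;              /* TE says traversable           */
    }

    return 1.0 * cf_evidence_te;                  /* OD unknown: trust the TE      */
}

For example, with cf_evidence_od = 1.0 and cf_evidence_te = -1.0 the conflicting-evidence branch returns 0.0, i.e., a neutral cell; scaling the conflicting evidence by 0.9 also keeps the denominator of the mixed-sign combination rule away from zero.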

PAGE 77

fused output. Hence, the fusion process is modular in the sense that each sensor's output would still be valid as a final output should the other sensor fail.

Implementation of the LTSS as a JAUS component

The LTSS is implemented as a JAUS component. This section starts with a brief overview of the JAUS architecture, followed by the JAUS implementation on the autonomous test platform, the NaviGator. Finally, a detailed explanation of the implementation of the LTSS as an experimental JAUS component is given.

Joint Architecture for Unmanned Systems

The Joint Architecture for Unmanned Systems (JAUS) [JAUS] is an architecture defined for use in the research, development and acquisition of unmanned systems. The two main purposes for the development of JAUS are to support interoperability amongst heterogeneous unmanned systems originating from different developers and to support the reuse/insertion of technology. To ensure that the architecture is applicable to the development of the entire domain of unmanned systems, the following constraints are imposed on JAUS: platform independence, mission isolation, computer hardware independence and technology independence. JAUS is a component-based, message-passing architecture that specifies data formats and methods of communication among computing nodes. The JAUS system architecture is defined in a hierarchical structure. The system topology is shown in Figure 5-12. The different levels of the architecture are defined in the following terms:

System: A system comprises all the unmanned systems and human interfaces meant for a common application.

Subsystem: A subsystem is one or more unmanned systems that can be defined as a single localized entity within a system. The autonomous platform NaviGator, which has been developed at CIMAR, may be defined as a single JAUS subsystem.

PAGE 78

Node: A JAUS node defines a distinct processing capability within the subsystem. Each node runs its own node manager component to manage the flow and control of JAUS messages.

Component: A component provides a unique functional capability for the unmanned system. A JAUS component resides wholly within a JAUS node. More than one component may reside on a single node.

Instances: Instances provide a way to duplicate JAUS components.

All components are uniquely addressed using the subsystem, node, component and instance identifiers. JAUS defines a set of reusable components and the messages supporting these components. However, JAUS does not impose any regulations on the configuration of the system. JAUS also allows the development of experimental components for performing tasks which otherwise cannot be performed by the already defined JAUS components. The only absolutely necessary requirement that has to be satisfied for the implementation of JAUS is that all JAUS components communicate with each other only through JAUS messages.

JAUS System Architecture on the NaviGator

In the hierarchical structure of the JAUS system, the NaviGator is defined as a fully independent JAUS subsystem. The NaviGator system architecture is formulated using the existing JAUS-defined components wherever possible. Experimental JAUS components are developed for the tasks which did not have a defined JAUS component. A JAUS compliant messaging system is used to define all the communication between components. The sensor component developed in this research is an experimental JAUS component. Each of the rectangular blocks shown in Figure 5-13 is a JAUS component. From the autonomous functionality viewpoint, at the highest level, the NaviGator system architecture is categorized into four fundamental elements. These elements are:

PAGE 79

1. Planning element: The components that act as a repository for a priori data. These components perform off-line planning based on the a priori data.

2. Control element: The components that perform closed loop control in order to keep the vehicle on a specified path.

3. Perception element: The components that perform the sensing tasks required to locate obstacles and to evaluate the smoothness of terrain.

4. Intelligence element: The components that determine the best path segment based on the sensed information.

As stated in Chapter 4, the Traversability Grid is the common data structure used to represent the environment in all the above components. The components in the perception element represent the world as a Traversability Grid based on real-time perception, the components in the planning element represent the world around the robot as a Traversability Grid based on a priori information, the components in the intelligence element utilize these Traversability Grids to plan the best possible path, and the control element executes the planned path. The complete loop of perception, planning and control is repeated continuously at about 40 Hz. The following section explains in detail the implementation of the LTSS as a JAUS component, which comprises all the sensor algorithms discussed above.

Implementation of the LTSS as a JAUS component

The LTSS JAUS component is developed using the C programming language in the Linux operating system environment. The LTSS and all the other JAUS components on the NaviGator are implemented as finite state machines. At any point in time, each component can assume one of the seven states enumerated as Startup, Initialize, Standby, Ready, Emergency, Failure and Shutdown. Of these, the Emergency state is not used in the LTSS component; the Failure state is used to report any type of failure, such as failure to allocate dynamic memory or failure to create service connections; and the Shutdown state is called during shutdown of the component to end

PAGE 80

all the processes in a proper sequence. The important operations in the other states are in the following sequence:

Startup state: The component is checked into the system and obtains instance, node and subsystem identification numbers from the node manager. All the required data structures declared as global variables are initialized using dynamic memory allocation. These include three data structures of type Circular Buffer to store the grid information for each of the LADARs. The basic implementation of the circular buffer was discussed in Chapter 3. Since the obstacle detection algorithm and the terrain evaluation algorithm store variables of different data types, the circular buffer implementation for the two differs in the number and type of variables, but the basic functioning remains the same. The position and orientation information is obtained from the JAUS component GPOS, and the vehicle state information is obtained from the JAUS component VSS. This information is obtained using JAUS service connections. A JAUS service connection provides a mechanism to continuously obtain information at a fixed update rate from another component without the necessity of querying the component each time for the information. Before entering the Initialize state, the LTSS component creates the service connections to the GPOS and the VSS in the Startup state. Finally, all the config variables in the config file are loaded into the program.

Initialize state: This state makes sure that the GPOS and VSS service connections are active. In case these connections are not active, the component remains in the Initialize state. Even when in the Ready state, if the GPOS or VSS service connections go down, the component defaults back to the Initialize state.

PAGE 81

Ready state: This is the most important part of the component, and all the processing is done in this state. Once the component is set to run, with all the global variables initialized and the service connections active, it remains in the Ready state. It loops through the Ready state once every time it produces an output Traversability Grid. The following sequence of steps is performed each time:

1. Conversion of the vehicle position from the LLA (Latitude/Longitude/Altitude) data format to the vehicle position in a fixed global coordinate system using the UTM (Universal Transverse Mercator) conversion. This information is then transformed into the number of rows and columns moved by the vehicle since the previous update.

2. Update each of the circular buffers to account for the number of rows and columns moved by the vehicle. The circular buffer update method to account for the movement of the vehicle was discussed in Chapter 3.

3. The next step is to acquire the range data from each of the lasers. The data from each laser is acquired in a different thread. The coordination between the laser data acquisition thread and the main component thread is maintained using a mutex lock. Since the OD laser runs at a frequency of 36 Hz and the two TE lasers run at a frequency of 18 Hz, the OD laser loads a new set of range readings for every iteration of the Ready state, while the two terrain evaluation lasers alternately load a new set of readings on every iteration.

4. After the acquisition of the range data, the aforementioned algorithms pertaining to the mapping of data and assignment of the traversability values are executed in the following sequence: the Traversability Grid for the OD is updated with the new data; the two Traversability Grids for terrain evaluation 1 and terrain evaluation 2 are populated with the most recent data from the two terrain lasers respectively; next, the OD algorithm evaluates and assigns each cell a traversability value; similarly, the two terrain evaluation Traversability Grids are evaluated and the traversability values of these two grids are then combined into a single traversability value. The final step is the fusion of the OD and the TE grids.

5. The output Traversability Grid is passed on to the Smart Arbiter.

The algorithm repeats the steps in the Ready state as long as the service connections for the GPOS and VSS are active.
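The overall control flow can be pictured as the simple state machine sketched below in C. The state names follow the text (the Emergency and Standby states, which are not exercised by the LTSS, are omitted or left unelaborated), while the helper functions are placeholder stubs standing in for the operations described above; none of these names are taken from the actual LTSS source.

#include <stdbool.h>
#include <stdio.h>

typedef enum { STARTUP, INITIALIZE, STANDBY, READY, FAILURE, SHUTDOWN } ltss_state_t;

/* Placeholder stubs for the operations described in the text. */
static bool startup_ok(void)                     { return true; }  /* check-in, allocation, SCs */
static bool service_connections_active(void)     { return true; }  /* GPOS and VSS connections  */
static void shift_grids_for_vehicle_motion(void) { }               /* steps 1-2                 */
static void acquire_laser_scans(void)            { }               /* step 3 (mutex-guarded)    */
static void evaluate_and_fuse_grids(void)        { }               /* step 4                    */
static void send_grid_to_smart_arbiter(void)     { }               /* step 5                    */

static bool shutdown_requested(void)
{
    static int cycles = 0;
    return ++cycles > 5;        /* stop after a few cycles in this sketch */
}

int main(void)
{
    ltss_state_t state = STARTUP;

    while (state != SHUTDOWN) {
        switch (state) {
        case STARTUP:
            state = startup_ok() ? INITIALIZE : FAILURE;
            break;
        case INITIALIZE:
            /* Remain here until the GPOS and VSS service connections are active. */
            if (service_connections_active())
                state = READY;
            break;
        case READY:
            if (!service_connections_active()) { state = INITIALIZE; break; }
            shift_grids_for_vehicle_motion();   /* UTM conversion, circular buffer shift */
            acquire_laser_scans();              /* OD laser at 36 Hz, TE lasers at 18 Hz */
            evaluate_and_fuse_grids();          /* OD grid, TE grids, CF-based fusion    */
            send_grid_to_smart_arbiter();       /* output Traversability Grid            */
            break;
        case FAILURE:
            printf("LTSS failure reported\n");
            state = SHUTDOWN;
            break;
        default:
            break;
        }
        if (shutdown_requested())
            state = SHUTDOWN;
    }
    return 0;
}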

PAGE 82

Figure 5-1. Measurement range (top view, scan from right to left). A) Angular range 0-180 degrees. B) Angular range 0-100 degrees.

Figure 5-2. Laser sensor RS422 interface. Pin assignments between the CPU (USB to serial hub) and the laser:
    CPU signal       Pin   Pin   Laser signal
    Rx+               1     1    Rx-
    Rx-               2     2    Rx+
    Tx-               3     3    Tx+
    Tx+               4     4    Tx-
    GND               5     5    GND
    Not connected     6     6    Not connected
    Not connected     7     7    Jumper 1
    Not connected     8     8    Jumper 2
    Not connected     9     9    Not connected

PAGE 83

Figure 5-3. Sensor mounts on the NaviGator.

Figure 5-4. Sensor mount design (variable angle mount).

PAGE 84

Figure 5-5. Block diagram of the LTSS component.

PAGE 85

Figure 5-6. Obstacle detection LADAR.

PAGE 86

Figure 5-7. Traversability value mapping (weighted sum versus traversability value).

Figure 5-8. Schematic of terrain evaluation sensors (fields of view of TerrainLADAR1 and TerrainLADAR2; sensor coordinate system, global coordinate system attached to the vehicle, and fixed global coordinate system).

PAGE 87

Figure 5-9. Weighted neighborhood analysis.

Figure 5-10. Schematic working of the negative obstacle detection algorithm (laser beam and imaginary horizontal plane over a negative obstacle, e.g., a large pothole on the road or a cliff on the side of the road).

PAGE 88

Figure 5-11. Block diagram of terrain evaluation algorithms.

Figure 5-12. JAUS system topology.

PAGE 89

Figure 5-13. JAUS compliant system architecture.

PAGE 90

CHAPTER 6
EXPERIMENTAL RESULTS

The developed LADAR Traversability Smart Sensor (LTSS) component was tested on the autonomous platform, the NaviGator. In Chapter 3 the NaviGator and the positioning system available on the platform were discussed. The LTSS component is formed from the fusion of two different implementations of the LADAR sensors, the obstacle detection (OD) algorithm and the terrain evaluation (TE) algorithm. Outputs from these two algorithms were fused, resulting in the LTSS Traversability Grid. This chapter presents and analyzes the results obtained from the OD sensor, the TE sensor and finally the fusion of these two sensors. The tests in this chapter were conducted at the solar park facility of the University of Florida.

Traversability Grid Visualization

Throughout this chapter the Traversability Grid visualization tool is used to demonstrate the results of the sensor algorithms. The Traversability Grid is represented using the color code shown in Figure 6-1. As shown in the figure, a value of 2, which is highly non-traversable, is represented by the color red. The color green represents a traversability value of 12, which means highly favorable to traverse, and the color grey is for 7, which is a neutral value. The color shades for the intermediate values are shown in Figure 6-1. The color pink represents a value of 14, which is assigned to cells whose traversability value is unknown. As discussed in the previous chapters, the vehicle position is always in the center of the grid. In the Traversability Grid results that follow, the position and direction of the vehicle is always represented by an arrow. Similarly, the images showing the test set-up also indicate the vehicle position and direction with an arrow.
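A visualization along these lines can be produced with a small color lookup such as the C sketch below. The anchor colors (red for 2, grey for 7, green for 12, pink for 14) come from the description above; the specific RGB triplets and the linear blending used for the intermediate shades are assumptions made for the sketch, since the exact shades are only given graphically in Figure 6-1.

typedef struct { unsigned char r, g, b; } rgb_t;

/* Linear blend between two colors; t runs from 0.0 (color a) to 1.0 (color b). */
static rgb_t blend(rgb_t a, rgb_t b, double t)
{
    rgb_t c = {
        (unsigned char)(a.r + t * (b.r - a.r)),
        (unsigned char)(a.g + t * (b.g - a.g)),
        (unsigned char)(a.b + t * (b.b - a.b))
    };
    return c;
}

/* Map a traversability value (2..12, with 14 = unknown) to a display color. */
rgb_t traversability_color(int value)
{
    const rgb_t red   = {255,   0,   0};   /* value 2: highly non-traversable */
    const rgb_t grey  = {128, 128, 128};   /* value 7: neutral                */
    const rgb_t green = {  0, 255,   0};   /* value 12: highly traversable    */
    const rgb_t pink  = {255, 105, 180};   /* value 14: unknown               */

    if (value == 14) return pink;
    if (value <= 2)  return red;
    if (value >= 12) return green;
    if (value < 7)   return blend(red, grey, (value - 2) / 5.0);
    return blend(grey, green, (value - 7) / 5.0);
}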

PAGE 91

Obstacle Detection Sensor Results

Obstacle Mapping Results

The obstacle detection (OD) sensor detects positive obstacles with height above a threshold value. If the obstacle is shorter than the threshold height, which is 0.6 m in the present implementation, the sensor cannot detect the obstacle. Similarly, the algorithm cannot offer any opinion regarding negative obstacles or a good traversable path. Hence the algorithm can report traversability values in the range of 2 to 7 based on the confidence it has in the obstacle. A value of 7 represents free space, which means there is no positive obstacle reported by the OD sensor; however, the cell could contain a negative obstacle or could be rough, uneven terrain, which cannot be determined by this sensor.

The performance of the OD sensor is evaluated on the basis of two important factors: the accuracy in mapping the obstacle and the response time in detecting the obstacle. These performance assessments are evaluated at three different vehicle speeds: 10 mph, 16 mph and 22 mph. Figure 6-2 shows the experimental set-up for one set of readings. The results of the output from the OD sensor are shown in Figure 6-3 and the summary of the mapping results is presented in Table 6-1. The table shows the comparison between the actual physical distance measured between the barrels and the output from the sensor at vehicle speeds of 10 mph, 16 mph and 22 mph. As seen in Table 6-1, the obstacles are mapped accurately within an error of 0.5 m, which is equal to the grid resolution. Hence the maximum error is within 1 cell of the grid.

Another set of readings was taken by changing the position of the barrels and repeating the experiments at the three speeds. Figure 6-4 shows the experimental set-up for the second set of readings. The results are presented in Figure 6-5 and Table 6-2. Similar to the first set of readings, the error is within 1 grid cell, i.e., 0.5 m.

PAGE 92

From the above two sets of readings it can be seen that the error in the system is within 1 grid cell and the accuracy is independent of the lateral distance of the obstacle from the vehicle path. It was also seen that the error was repeated between the same barrels at all three speeds. Since the actual physical distance was measured with a measuring tape, this also introduces some error in the measurement of the distance.

Obstacle Detection Response Time

The response time of the sensor algorithm is computed by measuring the distance between the obstacle and the vehicle at the time when the obstacle is detected. This distance is measured for each of the barrels in the two sets of readings shown in Figure 6-2 and Figure 6-4 respectively. A comprehensive summary of the results is presented in Table 6-3. The table shows the distance range in which the obstacles were detected in the direction of the vehicle path. The results of reading 1 (represented by Figure 6-2) are presented in the movie links in Objects 6-1, 6-2 and 6-3 for the vehicle speeds of 10, 16 and 22 mph respectively.

Object 6-1. OD reading 1 at 10 mph [ODBarrelTest1_10mph.avi, 100892 KB].
Object 6-2. OD reading 1 at 16 mph [ODBarrelTest1_16mph.avi, 83265 KB].
Object 6-3. OD reading 1 at 22 mph [ODBarrelTest1_22mph.avi, 76214 KB].

Although the OD sensor detects obstacles above a threshold value, it was observed that the height of the obstacle played an important role in how fast the obstacle is detected. The experiment of detecting the obstacle was repeated with increased obstacle height. Figure 6-6 shows the experimental set-up with the increased height obtained by placing the barrels one on top of the other. The result is presented in the movies in Objects 6-4, 6-5 and 6-6 for the vehicle speeds of 10, 16 and 22 mph.

Object 6-4. OD with increased barrel height at 10 mph [ODBarrelTest2_10mph.avi, 92079 KB].
Object 6-5. OD with increased barrel height at 16 mph [ODBarrelTest2_16mph.avi, 83265 KB].

PAGE 93

Object 6-6. OD with increased barrel height at 22 mph [ODBarrelTest2_22mph.avi, 75392 KB].

The analysis of the results for obstacles with increased height is presented in Table 6-4. It can be seen from the results presented in Table 6-3 and Table 6-4 that the distance at which the obstacle is detected is very sensitive to the height of the obstacle, especially when the height of the obstacle is below 2 m. This is because even a small change in the vehicle pitch and roll angles causes a change in the angle of the laser beam; due to this change the laser beam is no longer horizontal to the ground. For example, a change in the pitch angle of 2 degrees would cause the laser beam to strike at a height difference of 1 m above the critical height at a distance of 30 m from the laser. At high speeds on a rough path (such as the environment shown in the above experiments) the roll and pitch changes and the rates of these changes are very high, and hence there is a difference in the performance based on the height of the obstacle.

The above experiments demonstrated the performance of the OD algorithm. The next section presents the results for the TE algorithm and the fusion of the two algorithms.

Fusion of the Obstacle Detection and Terrain Evaluation algorithms

The main task of the terrain evaluation (TE) algorithm is to detect a smooth path and distinguish it from the surroundings. Unlike the obstacle detection (OD) algorithm, which can report traversability values only in the range of 2 to 7, the TE algorithm can report values from 2 to 12. As discussed in Chapter 5, the traversability value is computed for each cell based on a set of features. To assess the performance of the TE algorithm, and subsequently the result of combining the outputs from the TE and OD sensors, the vehicle was driven on a small paved road within the solar park facility. The actual path is shown in the movie in Object 6-7.

Object 6-7. Test path [VideoForwardPath.avi, 335689 KB].

The vehicle speed was maintained at approximately 10 mph. The same path was driven three times, and each time the LTSS component was executed on this path in a different output mode.

PAGE 94

The output from the OD, TE and the fused LTSS grid is shown in the movies linked to Objects 6-8, 6-9 and 6-10 respectively.

Object 6-8. OD result for test path [ODFusionTest1_10mph.avi, 356136 KB].
Object 6-9. TE result for test path [TEFusionTest1_10mph.avi, 393154 KB].
Object 6-10. LTSS result for test path [LTSSTest1_10mph.avi, 337804 KB].

For a second set of experiments the same path was driven in the opposite direction. The actual path is shown in the movie linked to Object 6-11.

Object 6-11. Return test path [VideoReturnPath.avi, 239561 KB].

The output results from the OD algorithm, the TE algorithm and the fused LTSS grid are shown in the movies linked to Objects 6-12, 6-13 and 6-14 respectively.

Object 6-12. OD result for return path at 10 mph [ODFusionTest2_10mph.avi, 306897 KB].
Object 6-13. TE result for return path at 10 mph [TEFusionTest2_10mph.avi, 316651 KB].
Object 6-14. LTSS result for return path at 10 mph [LTSSTest2_10mph.avi, 239208 KB].

A couple of scenarios from the above two experiments are selected and discussed here.

Scene 1

Scene 1 is selected from the first of the above two experimental readings. Figure 6-7 shows an image of the environment with barrels, poles, trailers and some name boards. Figure 6-8 shows the outputs from the OD and TE algorithms. As seen in the results, the obstacles which can be clearly distinguished from the surroundings mainly due to their height (these include the barrels, name boards, poles and trailers) are mapped very well by the OD sensor. However, the rest of the region is shown as free space without any indication of a favorable traversable path. The TE algorithm makes a good attempt to distinguish the smooth path from the surrounding grass region and also shows the discontinuity at the edge of the path. The classification results from the TE algorithm are based on the absolute scale of traversability defined for each of the

PAGE 95

features: the slope, variance and the nearest neighbor. As can be seen, although the path is distinguishable, the traversability values for most of the region range between 6 and 12. This is because, even though there is a clearly distinguishable paved path, the region around the path is still considered to be drivable by the vehicle. The result after combining the two algorithms is depicted in Figure 6-9. As seen in the figure, the LTSS component takes advantage of both algorithms to represent clearly distinguishable obstacles and to follow a smooth path.

Scene 2

Scene 2 is also selected from the first experiment. Figure 6-10 shows a snapshot of the video representing the scene. The results from the individual sensor algorithms are shown in Figure 6-11 and the fused output is shown in Figure 6-12. Similar to scene 1, it can be seen that the fused output has the advantages of both sensor algorithms and makes a more complete representation of the environment than the output obtained from either one of the algorithms.

Scene 3

Scene 3 is selected from the second set of experiments (i.e., when the vehicle is on its way back). Figure 6-13 shows the environment. The results are presented in Figures 6-14 and 6-15. It can be seen from Figure 6-14 that the OD algorithm identifies and maps obstacles such as the dumpster, and the TE algorithm distinguishes the paved path from the surrounding region. The fused output shows a very good representation of the environment.

High Speed Test of the LTSS component

A part of the first experimental path was driven at a higher speed and the results were assessed. The vehicle reached a speed of approximately 22 mph. The vehicle was driven three times to obtain the outputs from the OD sensor, the TE sensor, and the fused LTSS Traversability Grid. Objects 6-15, 6-16 and 6-17 show the output from the OD, TE and the fused LTSS algorithm respectively.

PAGE 96

Object 6-15. Obstacle detection result at 22 mph [ODFusionTest_22mph.avi, 109706 KB].
Object 6-16. Terrain evaluation result at 22 mph [TEFusionTest_22mph.avi, 110646 KB].
Object 6-17. LTSS result at 22 mph [LTSSTest_22mph.avi, 135207 KB].

The main limitation to achieving better results at higher speeds is the laser update rate. The terrain sensors operate at 18 Hz, and at this rate each individual grid cell barely receives a single laser scan above speeds of 20 mph. The speed limitation based on the laser update rate is discussed in Chapter 7. Another important factor is the vehicle position and orientation. At higher speeds it is critical to correlate the laser data and the vehicle position fairly accurately to obtain a reasonable point cloud from the laser data. In spite of the above limitations, the developed LTSS component produces fairly good results up to speeds of 20 mph.
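As a quick check of this limit (it is derived formally as Equation 7.2 in Chapter 7): with an 18 Hz terrain laser and a 0.5 m grid resolution,

    v_max = 18 Hz * 0.5 m = 9 m/s, or roughly 20 mph,

while 22 mph is roughly 9.8 m/s, so at 22 mph the vehicle advances more than one grid cell between successive scans and some cells receive no terrain data at all.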

PAGE 97

Figure 6-1. Traversability Grid color code.

Figure 6-2. Obstacle detection reading 1 experimental set-up (barrels 1 through 4).

PAGE 98

Figure 6-3. Obstacle detection reading 1 output at varied speeds. A) 10 mph. B) 16 mph. C) 22 mph.

PAGE 99

Figure 6-4. Obstacle detection reading 2 experimental set-up (barrels 1 through 4).

PAGE 100

Figure 6-5. Obstacle detection reading 2 output at varied speeds. A) 10 mph. B) 16 mph. C) 22 mph.

PAGE 101

Figure 6-6. Obstacle detection response time with increased height (barrels 1 and 2).

Figure 6-7. Test environment showing scene 1 (poles 1-2 and barrels 1-2).

PAGE 102

Figure 6-8. Output results for scene 1. A) OD algorithm (boards 1-2, poles 1-2, trailers 1-2, barrels 1-2). B) TE algorithm (barrels 1-2 and path).

PAGE 103

Figure 6-9. Scene 1 LTSS component output environment representation (trailers 1-2, barrels 1-2, poles 1-2, boards 1-2).

PAGE 104

Figure 6-10. Test environment showing scene 2 (dumpster, trailer, trees 1-2).

PAGE 105

Figure 6-11. Output results for scene 2. A) OD algorithm. B) TE algorithm.

PAGE 106

Figure 6-12. Scene 2 LTSS component output environment representation.

PAGE 107

Figure 6-13. Test environment showing scene 3 (dumpster and obstacles 1-3).

PAGE 108

Figure 6-14. Output results for scene 3. A) OD algorithm. B) TE algorithm.

PAGE 109

Figure 6-15. Scene 3 LTSS component output environment representation.

PAGE 110

Table 6-1. Traversability Grid mapping of the obstacle detection algorithm for reading 1 (actual distance between barrels versus the sensor output at each vehicle speed).
    Distance measured between   Actual (m)   Output at 10 mph (m)   Output at 16 mph (m)   Output at 22 mph (m)
    Barrel 1 and 2              6            6                      6                      6
    Barrel 2 and 3              24           24                     24                     24
    Barrel 3 and 4              6            6.5                    6.5                    6.5

Table 6-2. Traversability Grid mapping of the obstacle detection algorithm for reading 2.
    Distance measured between   Actual (m)   Output at 10 mph (m)   Output at 16 mph (m)   Output at 22 mph (m)
    Barrel 1 and 2              12           12                     12                     12
    Barrel 2 and 3              18           18.5                   18.5                   18.5
    Barrel 3 and 4              12           12                     12                     12

Table 6-3. Response time, reading 1 (distance at which the obstacle is first detected, i.e., traversability value < 7, and distance at which the traversability value reaches 2).
    Speed of the vehicle (mph)   First detected (m)   Traversability value = 2 (m)
    10                           29-30                24-25
    16                           28-30                20-22
    22                           21-22                14-16

Table 6-4. Response time, reading 2, with increased obstacle height.
    Speed of the vehicle (mph)   First detected (m)   Traversability value = 2 (m)
    10                           30                   29-30
    16                           29-30                26-27
    22                           26-28                20-21

PAGE 111

CHAPTER 7
GENERALIZED SENSOR COMPONENT

The previous chapters concentrated on the implementation of the LTSS component on a specific vehicle. Most of the laser-based real-time terrain evaluation and obstacle detection implementations discussed in the literature are also vehicle specific and/or terrain specific (i.e., an obstacle detection sensor or a terrain evaluation sensor is developed for a specific vehicle and specific environmental conditions). To enable wider use of the developed sensor component, this chapter presents general guidelines for selecting the sensor related parameters needed to implement the proposed sensor component algorithms on different vehicles, or, conversely, for determining the limiting conditions of operation of the proposed algorithms given the sensor configuration parameters. Figure 7-1 shows a schematic of the overall implementation of the sensor component.

If the current implementation is examined from a broader view, it can be seen that the obstacle detection algorithm and the negative obstacle detection algorithm discussed in Chapter 5 are the limiting implementations of the terrain evaluation algorithms (slope, variance and neighborhood analysis). While the terrain evaluation algorithms evaluate the terrain characteristics in front of the vehicle from a point cloud of the terrain, the obstacle detection makes sure that there is nothing in front of the vehicle over which it cannot drive, and the negative obstacle detection makes sure that there is some surface on which to drive. There are a number of sensor related parameters that depend on the vehicle on which these sensors are implemented. Some of these sensor related parameters are:

1. Placement of each of the sensor hardware units on the vehicle.

2. Sensor tilt angles towards the ground.

3. Sensor field of view.

PAGE 112

4. Sensor resolution.

5. Update rate of the sensor.

These sensor parameters are functions of the vehicle specifications and the performance requirements of the vehicle. A list of these specifications would include:

1. Vehicle overall dimensions.

2. Turning radius of the vehicle.

3. Speed range of the vehicle.

The sensor parameters would also be affected by the terrain conditions, which could be analyzed in the following terms:

1. Roughness of the road.

2. Maximum expected curvature in the terrain.

3. Presence and magnitude of negative obstacles.

4. Minimum height of moving obstacles expected in the terrain.

The problem statement could thus be stated as: given the vehicle specifications and the performance requirements of the vehicle, devise a generalized method to define the actual hardware requirements of the laser sensors and the positioning of the sensors on the vehicle to implement the proposed algorithms.

General Parameters for Obstacle Detection sensor

For the obstacle detection (OD) sensor, the height at which the sensor is mounted is very important. Since this height acts as a threshold value, obstacles below this height will not be detected by the OD sensor; at the same time, placing the laser at a lower height would increase the chance of false hits due to ground noise. Ground noise is more prominent on paths with significant uphill and downhill sections. Hence the mounting height of the laser depends on the terrain conditions: if the terrain is relatively flat, choosing to mount the sensor at a lower height will help

PAGE 113

detect obstacles with lower heights; however, for paths with uphill and downhill slopes, the sensor will have to be mounted at a relatively greater height to avoid ground noise. The OD sensor is particularly important for clearing moving obstacles. If a moving obstacle is shorter than the mounting height of the OD sensor, it will not be possible to detect the obstacle and clear it once it has moved. Although the obstacle could be detected by the terrain evaluation sensors, the problem would be to clear this obstacle once it has moved, since the terrain evaluation algorithms do not clear an obstacle once it is detected at a particular location.

General Parameters for Terrain Evaluation sensor

One of the important vehicle dimensions is the height at which the terrain mapping laser can be mounted on the vehicle. Let h be the height at which the laser can be mounted on the vehicle. The other laser mounting parameter is the tilt angle at which the laser is mounted, which is governed by the terrain conditions. If θ is the tilt angle of the laser with respect to the vehicle ground plane, then this angle should be greater than the change in slope expected on the path; otherwise the laser beam readings will not hit the terrain surface. The above two parameters govern the look-ahead distance, d_l, of the sensor as follows:

    d_l = h / tan(θ)                                                (7.1)

The current implementation used two lasers for terrain evaluation: one with a smaller tilt angle to get a greater look-ahead distance, and the other with a larger tilt angle to be able to scan ground with higher slope changes, but at a distance much closer to the vehicle. The TerrainLADAR1 discussed in Chapter 5 is mounted at an angle of 6 degrees and the TerrainLADAR2 is mounted at an angle of 12 degrees (Figure 7-1). Both lasers are placed at a height of 1.9 m. The look-ahead distance for the TerrainLADAR1 is computed as:

PAGE 114

    d_l = 1.9 / tan(6 deg) = 18.07 m

Similarly, the look-ahead distance for the TerrainLADAR2 is:

    d_l = 1.9 / tan(12 deg) = 8.94 m

Figure 7-1 shows these three parameters for the TerrainLADAR1. The look-ahead distance is an important factor in deciding the speed of the vehicle; this distance determines how much time the vehicle has to avoid obstacles and to follow a smooth path. At the same time, the speed of the vehicle is also limited by the update rate of the laser and the desired grid resolution. To evaluate the grid, each cell in the grid should have a sufficient number of data points. To make sure that at least the data from one single laser scan are assigned to each cell, the following relation has to hold:

    v <= LaserUpdateRate * GridResolution                           (7.2)

where v is the speed of the vehicle. For example, in the case of the NaviGator, the update rate of the terrain laser sensors is 18 Hz and the implemented grid resolution is 0.5 m. Hence, to obtain laser data in each cell of the grid within the field of view of the laser, the speed of the vehicle has to be limited to:

    v <= LaserUpdateRate * GridResolution = 18 * 0.5 = 9 m/s

When the vehicle is driving on a straight road, the sensor field of view is in the direction of vehicle travel; however, when the vehicle is making a turn this is not the case. While making a turn, the sensor field of view is in a direction tangential to the circle defined by the curved path followed by the vehicle. For safe driving, one of the sensor design requirements is to specify a parameter which defines the minimum width of the terrain from either side of the centerline of

PAGE 115

the vehicle drive path that should be in the sensor field of view at any time. This design parameter, w_m, is shown in Figure 7-2. To satisfy this design condition, the vehicle drive path is limited to a minimum allowable radius of curvature, R. The expression for R can be written in terms of the parameters defining the sensor field of view. As shown in the figure, the sensor field of view is defined by the angle φ and the look-ahead distance d_l. Considering the triangle OAB in Figure 7-2, the angle b can be expressed as:

    b = 90 - φ/2                                                    (7.3)

Using the cosine rule for the triangle OAB, the following expression is obtained:

    (R - w_m/2)^2 = R^2 + (d_l * sec(φ/2))^2 - 2 * R * d_l * sec(φ/2) * cos(90 - φ/2)      (7.4)

Solving Equation 7.4, R is obtained as:

    R = (d_l^2 * sec^2(φ/2) - w_m^2/4) / (2 * d_l * sec(φ/2) * cos(90 - φ/2) - w_m)        (7.5)

Equation 7.5 expresses R in terms of the sensor parameters and the minimum width, which is a design requirement. For the terrain sensor implementation on the NaviGator, consider the TerrainLADAR1 sensor, which scans the terrain at a distance of 18 m in front of the vehicle with a 100 degree field of view. Let the minimum required width w_m be 28 m (i.e., the sensor should scan a distance of at least 14 m on each side of the path centerline). Using the above formula,

    R = (18^2 * sec^2(50 deg) - 28^2/4) / (2 * 18 * sec(50 deg) * cos(40 deg) - 28) = 39.47 m

PAGE 116

Thus the above condition of scanning the minimum width across the path centerline can be achieved only for a radius of curvature greater than or equal to the computed value.
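The short C program below reproduces the numbers used in this chapter from Equations 7.1, 7.2 and 7.5 (h = 1.9 m, tilt angles of 6 and 12 degrees, an 18 Hz terrain laser, 0.5 m grid resolution, a 100 degree field of view, d_l = 18 m and w_m = 28 m). It is only an illustrative sketch; the function and variable names are not taken from the LTSS source.

#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif
#define DEG2RAD(a) ((a) * M_PI / 180.0)

/* Equation 7.1: look-ahead distance of a terrain laser mounted at height h
   and tilted down by tilt_deg with respect to the vehicle ground plane. */
static double look_ahead(double h, double tilt_deg)
{
    return h / tan(DEG2RAD(tilt_deg));
}

/* Equation 7.2: maximum speed so that every grid cell in the field of view
   receives data from at least one laser scan. */
static double max_speed(double update_rate_hz, double grid_res_m)
{
    return update_rate_hz * grid_res_m;
}

/* Equation 7.5: minimum allowable radius of curvature for a sensor with
   field of view fov_deg and look-ahead d_l, given the required width w_m. */
static double min_turn_radius(double d_l, double fov_deg, double w_m)
{
    double half = DEG2RAD(fov_deg / 2.0);
    double sec  = 1.0 / cos(half);
    double num  = d_l * d_l * sec * sec - w_m * w_m / 4.0;
    double den  = 2.0 * d_l * sec * cos(DEG2RAD(90.0) - half) - w_m;
    return num / den;
}

int main(void)
{
    printf("d_l (6 deg tilt)  = %.2f m\n", look_ahead(1.9, 6.0));   /* ~18.07 m */
    printf("d_l (12 deg tilt) = %.2f m\n", look_ahead(1.9, 12.0));  /* ~8.94 m  */
    printf("v_max             = %.1f m/s\n", max_speed(18.0, 0.5)); /* 9.0 m/s  */
    printf("R_min             = %.2f m\n",
           min_turn_radius(18.0, 100.0, 28.0));                     /* ~39.5 m  */
    return 0;
}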

PAGE 117

Figure 7-1. Sensor configuration (terrain evaluation sensors, obstacle detection sensor, region of interest, mounting height h and look-ahead distance d_l).

Figure 7-2. Minimum radius of curvature (turn center O, vehicle position A, point B, angle b, radius R, look-ahead distance d_l and minimum width w_m).

PAGE 118

CHAPTER 8
CONCLUSION AND FUTURE WORK

The research presented an approach to represent the environment around an autonomous vehicle. Two different implementations of laser sensors to evaluate the surroundings were presented. The advantages and disadvantages of each were discussed, and a sensor fusion technique to combine the outputs of these two algorithms was presented.

The dissertation presented a novel technique, the weighted neighborhood analysis, and the results from this algorithm were fused with the slope and variance algorithms to give an estimate of the terrain surrounding the autonomous vehicle. The results obtained from the terrain evaluation algorithms were very promising.

Instead of using a binary classification of traversable or obstacle, a traversability scale was used to define the environment. The traversability scale allowed representing a wide range of terrain, and it was possible to distinguish between varying degrees of obstacleness. The traversability scale helped the planner to propose an optimal path of travel at every instant of time. The next step is to use a finer traversability scale: a traversability scale of 0 to 63 instead of the present 0 to 15 will give a better resolution of the environment and hence should be considered for implementation.

The terrain mapping algorithm discussed in the literature does not implement a time history of data points. The only way the latest data are accounted for in the present scheme is by replacing the old data points with the new data points once the cell has reached the maximum allowed number of data points. The problem with this scheme is that if there is a moving obstacle, or if the sensor registers an erroneous hit within a cell, there is no way to clear it unless the same region is scanned again by the sensor. A better way to approach this problem would be to time stamp the data points. The time stamped data points can be assigned weights, and if the vehicle gets stuck due to an erroneous data point, the confidence in the data point can be lowered with time.

PAGE 119

The vehicle orientation was accounted for in mapping the data points. The vehicle orientation also provides very valuable input about the terrain characteristics. The rate of change of the orientation angles provides a good estimate of the roughness of the path the vehicle is traveling on. It was also observed that the magnitude of error in the mapping is related to the rate of change of the orientation angles: on rougher terrain these rates are higher, and so are the errors in mapping the data. The final results of the sensor component could be improved by incorporating this information in the rule based scheme.

The grid resolution used for the current implementation was 0.5 m. Considering the size of the vehicle and the outdoor environment, this resolution gave good results. However, it would be worth trying a higher resolution grid, such as 0.25 m, and comparing the results. It is generally desirable to have a higher resolution grid in the area close to the vehicle, while as the distance from the vehicle increases a grid with a lower resolution could serve the purpose. This can be achieved by maintaining the data in two grids with different resolutions: a higher resolution grid with smaller overall dimensions close to the vehicle, and another lower resolution grid covering a larger area around the vehicle. A different way to provide higher resolution near the center of the grid (the vehicle position) while at the same time covering a larger area would be the implementation of a polar grid. A polar grid inherently gives a higher resolution in the region close to the vehicle, and as the distance increases the resolution of the grid decreases.

The certainty factors technique for uncertainty management has wide applications in the medical field. However, the use of certainty factors in sensor fusion is a novel approach. One of the biggest advantages of the scheme is its ability to combine contradictory evidence. In the future, the possibility of combining outputs from other sensors, such as monocular vision, using a rule-based implementation of certainty factors should be studied.

PAGE 120

The lasers used for mapping the terrain were 2-dimensional range scanners (i.e., they produced a single line scan image) and the map was built by coupling these line scans with the vehicle motion. Although commercial laser scanners with both a horizontal and a vertical field of view are available, they are highly expensive. A good alternative for generating a multi-line scanner is to provide a tilt mechanism for the single line scanners. The tilt motion produces multiple lines of scans at different angles. The mechanism can either be used to point the laser at a required angle, or line scans can be generated at a uniform spacing. To take advantage of the tilt motion, the laser scan update rate should be high enough to be able to scan multiple lines at the required speed.

PAGE 121

APPENDIX
TRAVERSABILITY VALUE MAPPING FOR TERRAIN EVALUATION ALGORITHMS

The tables below show the mapping of the criteria used to evaluate the terrain into traversability values. These tables were implemented as configuration parameters in a config file.

Table A-1. Mapping of slope values to traversability value.
    Slope (degrees)      Traversability value
    <= 10                12
    > 10 & <= 20         11
    > 20 & <= 30         10
    > 30 & <= 32          9
    > 32 & <= 35          8
    > 35 & <= 40          7
    > 40 & <= 50          6
    > 50 & <= 60          5
    > 60 & <= 80          4
    > 80 & <= 85          3
    > 85 & <= 90          2

PAGE 122

Table A-2. Mapping of the variance values into traversability values.
    Variance (m)             Traversability value
    <= 0.0002                12
    > 0.0002 & <= 0.0003     11
    > 0.0003 & <= 0.0004     10
    > 0.0004 & <= 0.0005      9
    > 0.0005 & <= 0.001       8
    > 0.001 & <= 0.003        7
    > 0.003 & <= 0.05         6
    > 0.05 & <= 0.1           5
    > 0.1 & <= 0.2            4
    > 0.2 & <= 0.4            3
    > 0.4 & <= 1.0            2

PAGE 123

Table A-3. Mapping of neighborhood values to traversability value.
    Neighborhood value (m)   Traversability value
    <= 0.08                  12
    > 0.08 & <= 0.16         11
    > 0.16 & <= 0.2          10
    > 0.2 & <= 0.25           9
    > 0.25 & <= 0.3           8
    > 0.3 & <= 0.35           7
    > 0.35 & <= 0.4           6
    > 0.4 & <= 0.5            5
    > 0.5 & <= 0.6            4
    > 0.6 & <= 0.8            3
    > 0.8 & <= 2.0            2
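Tables of this form reduce to a simple threshold lookup. The following C sketch shows one possible encoding of Table A-1; the numbers come directly from the table, but the type and function names are illustrative and are not taken from the LTSS configuration code.

#include <stddef.h>

/* One row of a mapping table: values up to and including `limit`
   map to `traversability`. */
struct tv_row { double limit; int traversability; };

/* Table A-1: slope (degrees) -> traversability value. */
static const struct tv_row slope_table[] = {
    {10, 12}, {20, 11}, {30, 10}, {32, 9}, {35, 8}, {40, 7},
    {50, 6},  {60, 5},  {80, 4},  {85, 3}, {90, 2}
};

/* Return the traversability value for `value` using a threshold table. */
static int lookup_traversability(const struct tv_row *table, size_t rows, double value)
{
    for (size_t i = 0; i < rows; i++)
        if (value <= table[i].limit)
            return table[i].traversability;
    return 2;   /* values beyond the last limit are treated as non-traversable */
}

/* Example: lookup_traversability(slope_table,
       sizeof slope_table / sizeof slope_table[0], 33.0) returns 8,
   matching the "> 32 & <= 35" row of Table A-1. */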

PAGE 124

LIST OF REFERENCES

Asada, M. (1988, November). Map building for a mobile robot from sensory data. IEEE Transactions on Systems, Man and Cybernetics, 20(6), 1326-1336.

Bhatavia, P., & Singh, S. (2002, May). Obstacle detection in smooth high curvature terrain. IEEE International Conference on Robotics and Automation (ICRA 2002), Washington, D.C., 3, 3062-3067.

Braid, D., Broggi, A., & Schmiedel, G. (2006, June). The TerraMax autonomous vehicle concludes the 2005 DARPA Grand Challenge. Intelligent Vehicles Symposium 2006, Tokyo, Japan, 534-539.

Carmer, D., & Peterson, L. (1996, February). Laser radar in robotics. Proceedings of the IEEE, 84(2), 299-320.

Davis, I., Kelly, A., Stentz, A., & Matthies, L. (1995, September). Terrain typing for real robots. Proceedings of IEEE Intelligent Vehicles, 400-405.

Dima, C., Vandapel, N., & Hebert, M. (2003, October). Sensor and classifier fusion for outdoor obstacle detection: an application of data fusion to autonomous off-road navigation. 32nd Applied Imagery Recognition Workshop (AIPR2003), IEEE Computer Society.

Dima, C., Vandapel, N., & Hebert, M. (2004, April). Classifier fusion for outdoor obstacle detection. IEEE International Conference on Robotics and Automation, 1, 665-671.

Foessel-Bunting, A. (2000, November). Radar sensor model for three-dimensional map building. Proc. SPIE, Mobile Robots XV and Telemanipulator and Telepresence Technologies VII, SPIE, 4195.

Foley, J., Dam, A., Feiner, S., & Hughes, J. (1990). Computer Graphics: Principles and Practice, Addison Wesley, Massachusetts.

Goodridge, S., Luo, R., & Kay, M. (1994, October). Multi-layered fuzzy behavior fusion for real-time control of systems with many sensors. IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, Las Vegas, NV, 272-299.

Gonzalez, A., & Dankel, D. (1993). The engineering of knowledge based systems: theory and practice, Prentice Hall.

Hancock, J., Langer, D., Hebert, M., Sullivan, R., Ingimarson, D., Hoffman, D., Mettenleiter, M., & Froehlich, C. (1998, May). Active laser radar for high-performance measurements. IEEE International Conference on Robotics and Automation, Leuven, Belgium, 2, 1465-1470.

Hancock, J., Hebert, M., & Thorpe, C. (1998, October). Laser intensity-based obstacle detection. IEEE International Conference on Intelligent Robots and Systems, Victoria, BC, Canada, 3, 1541-1546.

PAGE 125

Huber, D., & Hebert, M. (1999). A new approach to 3-D terrain mapping. IEEE International Conference on Intelligent Robots and Systems, Kyongju, South Korea, 2, 1121-1127.

JAUS. (2005). Joint architecture for unmanned systems reference architecture, version 3.2: JAUS Working Group. Retrieved from http://www.jauswg.org/ on February 24, 2007.

Kyriakopoulas, K., & Skounakis, N. (2003, September). Moving obstacle detection for a skid-steered vehicle endowed with a single 2-D laser scanner. IEEE International Conference on Robotics and Automation, 1, 7-12.

Kelly, A., & Stentz, A. (1998, May). Rough terrain autonomous mobility-part 2: an active vision, predictive control approach. Autonomous Robots, No. 5, 163-198.

Kweon, I. (1991). Modeling rugged terrain by mobile robots with multiple sensors. PhD Dissertation, Carnegie Mellon University.

Langer, D., Rosenblatt, J., & Hebert, M. (1994, December). A behavior-based system for off-road navigation. IEEE Journal of Robotics and Automation, 10(6), 776-782.

Matthies, L., & Grandjean, P. (1994, December). Stochastic performance modeling and evaluation of obstacle detectability with imaging range sensors. IEEE Transactions on Robotics and Automation, 10(6), 783-792.

Matthies, L., & Rankin, A. (2003, October). Negative obstacle detection by thermal signature. IEEE International Conference on Intelligent Robots and Systems, 1, 906-913.

Miller, I., Lupashin, S., Zych, N., Moran, P., Schimpf, B., Nathan, A., & Garcia, E. (2006, August). Cornell University's 2005 DARPA Grand Challenge entry. Journal of Field Robotics, 23(8), 625-652.

Murphy, R. (1998, April). Dempster-Shafer theory for sensor fusion in autonomous mobile robots. IEEE Transactions on Robotics and Automation, 14(2), 197-206.

Novick, D. (2002, May). Implementation of a sensor fusion based object detection component for an autonomous outdoor vehicle. PhD Dissertation, University of Florida.

Okui, T. (1999). Development of a multi-layered map management system utilizing the non-homogeneous Markov chain approach. PhD Dissertation, University of Florida.

Royer, E., Bom, J., Dhome, M., Thuilot, B., Lhuillier, M., & Marmoiton, F. (2005, August). Outdoor autonomous navigation using monocular vision. IEEE International Conference on Intelligent Robots and Systems, 1253-1258.

Shafer, G. (1976). A mathematical theory of evidence. Princeton, N.J.

PAGE 126

Stentz, A., Kelly, A., Rander, P., Herman, H., Amidi, O., Mandelbaum, R., Salgian, G., & Pedersen, J. (2003, July). Real-time, multiperspective perception for unmanned ground vehicles. Proceedings of AUVSI 2003.

Solanki, S. (2003). Implementation of laser range sensor and dielectric laser mirrors for 3D scanning of glove box environment. Master's Thesis, University of Florida.

Talukder, A., Manduchi, R., Rankin, A., & Matthies, L. (2002, June). Fast and reliable obstacle detection and segmentation for cross-country navigation. IEEE Intelligent Vehicle Symposium, 2, 610-618.

Takeda, N., Wantabe, M., & Kazunori, O. (1996, November). Moving obstacle detection using residual error of FOE estimation. IEEE International Conference on Intelligent Robots and Systems, Osaka, Japan, 3, 1642-1647.

Talukder, A., Manduchi, R., Castano, R., Owens, K., Matthies, L., Castano, A., & Hogg, R. (2002, October). Autonomous terrain characterization and modeling for dynamic control of unmanned vehicles. IEEE Conference on Intelligent Robots and Systems, Lausanne, Switzerland, 1, 708-713.

Thrun, S., Burgard, W., & Fox, D. (2005). Probabilistic robotics. The MIT Press, Cambridge, Massachusetts, London, England.

Thrun, S., Burgard, W., & Fox, D. (2000, April). A real-time algorithm for mobile robot mapping with applications to multi-robot and 3D mapping. IEEE Conference on Robotics and Automation, 1, 321-328.

Trepagnier, P., Nagel, J., Kinney, P., Koutsougeras, C., & Dooner, M. (2006, August). KAT-5: Robust systems for autonomous vehicle navigation in challenging and unknown terrain. Journal of Field Robotics, 23(8), 509-526.

Urmson, C., Ragusa, C., Ray, D., Anhalt, J., Bartz, D., Galatali, T., Gutierrez, A., Johnston, J., Harbaugh, S., Kato, H., Messner, W., Miller, N., Peterson, K., Smith, B., Snider, J., Spiker, S., Ziglar, J., Whittaker, W., Clark, M., Koon, P., Mosher, M., & Struble, J. (2006, August). A robust approach to high-speed navigation for unrehearsed desert terrain. Journal of Field Robotics, 23(8), 467-508.

Wolf, D., Sukhatme, G., Fox, D., & Burgard, W. (2005, April). Autonomous terrain mapping and classification using hidden Markov models. IEEE Conference on Robotics and Automation, 2026-2031.

Wu, H., Siegel, M., Stiefelhagen, R., & Yang, J. (2002, May). Sensor fusion using Dempster-Shafer theory. Proceedings of the 19th IEEE Instrumentation and Measurement Technology Conference, 1, 7-12.

PAGE 127

Xu, J., Wang, H., Guzman, J., Ng, T., Shen, J., & Chan, C. (2003). Isodisparity profile processing for real-time 3D obstacle detection. Proceedings of the IEEE 6th International Conference on Intelligent Transportation Systems, Shanghai, China, 1, 288-292.

Ye, C., & Borenstein, J. (2003, April). A new terrain mapping method for mobile robots obstacle negotiation. Proceedings of the Unmanned Ground Vehicle Technology Conference at the 2003 SPIE Aerospace Symposium, 5083, 52-62.

Ye, C., & Borenstein, J. (2004, April). T-transformation: a new traversability analysis method for terrain navigation. Proceedings of the Unmanned Ground Vehicle Technology Conference at the 2004 SPIE Defense and Security Symposium, Orlando, FL, 473-483.

PAGE 128

BIOGRAPHICAL SKETCH

Sanjay Solanki was born in India. He completed his bachelor's degree in mechanical engineering at Vishwakarma Institute of Technology, Pune. After his graduation he worked as a research and development engineer at Mahindra & Mahindra Ltd., an automobile company in India. After two years of industry experience, Sanjay decided to go back to school. He came to the United States in 2001 to pursue his master's degree in mechanical engineering at the University of Florida. For the next six years, the Center for Intelligent Machines and Robotics at the university was his home, where he first completed his master's degree in 2003 and then joined the PhD program. Sanjay has worked on various robotics projects under the guidance of his advisor, Dr. Carl Crane. Among the most prestigious projects, Sanjay was actively involved in the development of autonomous vehicles for the DARPA Grand Challenge competition in the years 2004 and 2005. To increase his skills in the robotics field, he added dual minors in computer science and electrical engineering. In May 2007, he received his doctoral degree. He plans to continue his work in the robotics industry.


Permanent Link: http://ufdc.ufl.edu/UFE0018321/00001

Material Information

Title: Development of Sensor Component for Terrain Evaluation and Obstacle Detection for an Unmanned Autonomous Vehicle
Physical Description: Mixed Material
Copyright Date: 2008

Record Information

Source Institution: University of Florida
Holding Location: University of Florida
Rights Management: All rights reserved by the source institution and holding location.
System ID: UFE0018321:00001

Full Text





DEVELOPMENT OF SENSOR COMPONENT FOR TERRAIN EVALUATION AND
OBSTACLE DETECTION FOR AN UNMANNED AUTONOMOUS VEHICLE




















By

SANJAY CHAMPALAL SOLANKI


A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2007
































Copyright 2007

by

SANJAY CHAMPALAL SOLANKI




































To my sister, Seema.









ACKNOWLEGEMENTS

I would like to thank my committee members: Dr. Carl Crane, Dr. Warren Dixon, Dr. John Schueller, Dr. Antonio Arroyo and Dr. Douglas Dankel. I am especially thankful to my advisor, Dr. Carl Crane, who has been a great support and inspiration throughout the entire course of my graduate studies. I would like to thank David Armstrong for his support as a project manager. I thank Dr. Mike Griffis for putting so much effort into building the NaviGator. I express my thanks to all the students at CIMAR, including Bob Touchton, Tom Galluzzo, Daniel Kent, Roberto Montane, Jaesang Lee, Shannon Ridgeway, Ji Hyun Yoon, Steve Velat and Greg Garcia, who have accompanied me on this challenging and exciting path. I would like to thank everybody at the Air Force Research Laboratory at Tyndall for their support.

Thanks to my wife Rita, my daughter Risha, my parents, and my sisters, who have always been there to help me.












TABLE OF CONTENTS


ACKNOWLEDGEMENTS

LIST OF TABLES

LIST OF FIGURES

LIST OF OBJECTS

ABSTRACT

CHAPTER

1 INTRODUCTION

    Motivation
    Background

2 LITERATURE REVIEW

    Sensors
    Monocular Vision
    Stereo Vision
    Thermal Imaging
    Laser
    Ultrasonic Transducers
    Radar
    Environment Representation
    Uncertainty Management
    Uncertainty Management in Multi-Sensor Fusion
    Certainty Factors
    Dempster-Shafer Theory
    Neural Networks
    Sensor Implementations in Autonomous Vehicles

3 RESEARCH GOALS

    Statement of Problem
    Research Requirements
    Autonomous Platform
    Mechanical Specifications
    Power System
    Computing Resources
    Localization
    Contributions of this Research

4 TRAVERSABILITY GRID APPROACH

    Traversability Grid Representation
    Traversability Grid Implementation
    Traversability Grid Propagation

5 SENSOR IMPLEMENTATION

    Sensor Hardware
    LADAR Sensor
    Sensor Interface
    Computing Resources and Operating System
    Sensor Mount
    Sensor Algorithms
    Obstacle Detection
    Terrain Evaluation
    Terrain Mapping
    Terrain Classification
    Classification based on the slope of the best fitting plane
    Classification based on the variance
    Weighted Neighborhood Analysis
    Negative Obstacle Detection
    Terrain Evaluation Output
    Advantages and Limitations of the OD and TE Sensor Algorithms
    Fusion of the Sensor Components
    Implementation of the LTSS as a JAUS component
    Joint Architecture for Unmanned Systems
    JAUS System Architecture on the NaviGator
    Implementation of the LTSS as a JAUS component

6 EXPERIMENTAL RESULTS

    Traversability Grid Visualization
    Obstacle Detection Sensor Results
    Obstacle Mapping Results
    Obstacle Detection Response Time
    Fusion of the Obstacle Detection and Terrain Evaluation algorithms
    Scene 1
    Scene 2
    Scene 3
    High Speed Test of the LTSS component

7 GENERALIZED SENSOR COMPONENT

    General Parameters for Obstacle Detection sensor
    General Parameters for Terrain Evaluation sensor

8 CONCLUSION AND FUTURE WORK

APPENDIX. TRAVERSABILITY VALUE MAPPING FOR TERRAIN EVALUATION ALGORITHMS

LIST OF REFERENCES

BIOGRAPHICAL SKETCH










LIST OF TABLES


Table

4-1 Update the circular buffer to account for the grid movement

4-2 Accessing the grid cell in the circular buffer

6-1 Traversability Grid mapping of obstacle detection algorithm for reading 1

6-2 Traversability Grid mapping of obstacle detection algorithm for reading 2

6-3 Response time reading 1

6-4 Response time reading 2 with increased obstacle height

A-1 Mapping of slope values to traversability value

A-2 Mapping of the variance values into traversability values

A-3 Mapping of neighborhood values to traversability value











LIST OF FIGURES


Figure

1-1 Important components of an autonomous vehicle

2-1 Time of flight measurement

3-1 Testing platform NaviGator

3-2 Computing resources

4-1 Traversability Grid representation

4-2 Grid movement

4-3 Circular buffer representation of the grid

4-4 Traversability Grid propagation

5-1 Measurement range (top view, scan from right to left). A) Angular range 0°-180°. B) Angular range 0°-100°

5-2 Laser sensor RS422 interface

5-3 Sensor mounts on the NaviGator

5-4 Sensor mount design

5-5 Block diagram of the LTSS component

5-6 Obstacle detection LADAR

5-7 Traversability value mapping

5-8 Schematic of terrain evaluation sensors

5-9 Weighted neighborhood analysis

5-10 Schematic working of negative obstacle detection algorithm

5-11 Block diagram of terrain evaluation algorithms

5-12 JAUS system topology

5-13 JAUS compliant system architecture

6-1 Traversability Grid color code

6-2 Obstacle detection reading 1 experimental set-up

6-3 Obstacle detection reading 1 output at varied speeds. A) 10 mph. B) 16 mph. C) 22 mph

6-4 Obstacle detection reading 2 experimental set-up

6-5 Obstacle detection reading 2 output at varied speeds. A) 10 mph. B) 16 mph. C) 22 mph

6-6 Obstacle detection response time with increased height

6-7 Test environment showing scene 1

6-8 Output results for scene 1. A) OD algorithm. B) TE algorithm

6-9 Scene 1 LTSS component output environment representation

6-10 Test environment showing scene 2

6-11 Output results for scene 2. A) OD algorithm. B) TE algorithm

6-12 Scene 2 LTSS component output environment representation

6-13 Test environment showing scene 3

6-14 Output results for scene 3. A) OD algorithm. B) TE algorithm

6-15 Scene 3 LTSS component output environment representation

7-1 Sensor configuration

7-2 Minimum radius of curvature










LIST OF OBJECTS


Object

6-1 OD reading 1 at 10 mph [ODBarrelTest1_10mph.avi, 100892 KB]

6-2 OD reading 1 at 16 mph [ODBarrelTest1_16mph.avi, 83265 KB]

6-3 OD reading 1 at 22 mph [ODBarrelTest1_22mph.avi, 76214 KB]

6-4 OD with increased barrel height at 10 mph [ODBarrelTest2_10mph.avi, 92079 KB]

6-5 OD with increased barrel height at 16 mph [ODBarrelTest2_16mph.avi, 83265 KB]

6-6 OD with increased barrel height at 22 mph [ODBarrelTest2_22mph.avi, 75392 KB]

6-7 Test Path [VideoForwardPath.avi, 335689 KB]

6-8 OD result for test path [ODFusionTest1_10mph.avi, 356136 KB]

6-9 TE result for test path [TEFusionTest1_10mph.avi, 393154 KB]

6-10 LTSS result for test path [LTSSTest1_10mph.avi, 337804 KB]

6-11 Return test path [VideoReturnPath.avi, 239561 KB]

6-12 OD result for return path at 10 mph [ODFusionTest2_10mph.avi, 306897 KB]

6-13 TE result for return path at 10 mph [TEFusionTest2_10mph.avi, 316651 KB]

6-14 LTSS result for return path at 10 mph [LTSSTest2_10mph.avi, 239208 KB]

6-15 Obstacle detection result at 22 mph [ODFusionTest_22mph.avi, 109706 KB]

6-16 Terrain evaluation result at 22 mph [TEFusionTest_22mph.avi, 110646 KB]

6-17 LTSS result at 22 mph [LTSSTest_22mph.avi, 135207 KB]









Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy

DEVELOPMENT OF SENSOR COMPONENT FOR TERRAIN EVALUATION AND
OBSTACLE DETECTION FOR AN UNMANNED AUTONOMOUS VEHICLE

By

SANJAY CHAMPALAL SOLANKI

May 2007

Chair: Carl Crane III
Cochair: Warren Dixon
Major: Mechanical Engineering

Applications of autonomous vehicles are found in diverse fields including

defense, agriculture, mining, and space exploration. For successful operation of an autonomous

vehicle, the vehicle has to dynamically interact with the environment around it. My dissertation

presents the development and implementation of a sensor component to represent the

surrounding environment around an autonomous vehicle. The environment is modeled as a grid

and each cell in the grid is assigned a traversability value. This grid, termed the Traversability Grid, forms an internal representation of the state of the environment in real time. The Traversability Grid aids the autonomous platform in driving on a good traversable path and avoiding non-traversable regions around it.

The real world environment is highly unstructured and dynamic. It is very difficult to

discretize the world into two classes, good and bad; hence the sensor component developed in

this dissertation uses a scale of traversability values to classify the region around the vehicle.

LADAR (Laser Detection and Ranging) sensors are used to sense the environment. The raw data

from the LADAR sensors are processed using two different algorithms: obstacle detection and

terrain evaluation. The obstacle detection algorithm uses a weighted sum of evidences method to









find if a region is occupied or free. The terrain evaluation algorithm maps the terrain in front of

the vehicle. Based on the geometry of the terrain, it assigns a traversability value to the region.

Features used to find the traversability include discontinuity in the terrain, slope of the terrain

based on the least squares method, and the variance in the point cloud.

Each of the two algorithms, obstacle detection and terrain evaluation, has certain advantages and limitations. Also, like any other sensor algorithm, these two algorithms have some uncertainties associated with their outputs. In my work these algorithms are fused together to complement each other's uncertainties and limitations, and the outputs of the two algorithms are combined using the certainty factors approach.









CHAPTER 1
INTRODUCTION

Motivation

The concept of unmanned autonomous vehicles is one of today's leading research areas.

An autonomous vehicle can be defined as a vehicle which would drive by itself with a limited

degree of human intervention or possibly no human intervention at all. So far these vehicles have

found applications in various fields such as military, planetary exploration, agriculture, and

mining. The Defense Advanced Research Projects Agency (DARPA) has taken a lead role in encouraging and pursuing autonomous vehicle technology. The large participation in and excitement associated with the DARPA Grand Challenge national events in 2004 and 2005 are an indication of the importance of autonomous vehicles in today's world.

As shown in Figure 1-1, the operation of an autonomous vehicle can be classified into four main components. Of these, the perception and vehicle control components interact with the real-world environment, while the world model and intelligence components store information and make appropriate decisions. Perception of the environment in real time is critical for the successful operation of an autonomous vehicle, and the research in this dissertation is focused on the perception element. My research is aimed at representing the real-world environment in a format that helps the robot find and traverse the best possible terrain and avoid obstacles as it moves towards its goal. The problem of perception can be categorized into two main parts: one is sensing the environment with various sensors, and the other is representing these data in a form the vehicle can use to maneuver.

To appreciate the difficulty of the problem, imagine driving on rough terrain. How many times would one have to ask the question, "Which would be the best way to go?" Would it be desirable to pass through the small ditch in order to avoid a big boulder, or would it be better to traverse the uneven terrain rather than going through the ditch? It is somewhat difficult even for humans to answer these questions; how much more difficult it is to represent this heterogeneous environment in a digital format for a robot to understand. The problem is not limited to this. The human perception system is very powerful, and humans are able to relate to things around them very well; the world is defined in terms of human perception. It is not so easy in the case of sensors. Most of the sensors available today can measure only a limited category of features. For example, one can capture color information with a camera but not depth, and the same is true of other commercially available sensors such as laser, infrared, and sonar. Although there are techniques like stereo vision where one can capture both color and depth, they are limited by factors such as environmental lighting conditions and noise. The certainty of any commercially available sensor varies with the environmental conditions. The work in this research is an effort to find an effective solution to the above problems.

Background

Within the Center for Intelligent Machines and Robotics (CIMAR) lab, research in the

area of obstacle detection using real-time sensors for autonomous vehicles started with

simulations presented by Takao Okui [Okui 99]. Okui presented various techniques for updating

local grid maps using simulated sensors. Some of the methods he used included probability

theory, Bayes theorem, histogram method, fuzzy logic, and non-homogeneous Markov chains.

This work helped the research community in comparing the advantages and disadvantages of

each of the above methods. David Novick [Novick 2002] took the research further by

incorporating the obstacle detection component on an actual vehicle. He pursued the non-

homogeneous Markov chains approach for updating grid maps. He did an excellent job in fusing the outputs of two non-homogeneous sensors, the laser range finder and the sonar sensor.










My work here aims to extend these previous efforts. My objective is to identify traversable regions in an off-road environment through which a car-like autonomous vehicle can travel at speeds approaching 30 mph. The next chapter contains a review of the sensors and algorithms already

presented by the research community to solve this problem.











Figure 1-1. Important components of an autonomous vehicle.










CHAPTER 2
LITERATURE REVIEW

Various sensors have been used to date for detecting obstacles and estimating terrain. This

chapter begins with a brief discussion on the operating principles of some of these sensors. This

is followed by a brief theory on the representation of the environment and managing the

uncertainties associated with sensing. The chapter concludes with a review of the literature that

has been presented and implemented in the area of real time perception of the environment for

autonomous vehicles.

Sensors

Sensors in general can be categorized as active or passive. Active sensors have an internal

energy source and emit energy into the environment while passive sensors do not emit energy

and depend on their surrounding environment as the source of energy. Both types of sensors are

widely used in autonomous vehicle applications. Sensing of the environment includes detecting

obstacles and characterizing terrain.

Monocular Vision

Object detection and classification using color has been discussed in the research community for a long time. The primary feature used for object detection in this case is the RGB color space. Some of the commonly used algorithms to detect obstacles or classify

traversable terrain are edge detection and optical flow. Implementations of monocular vision are

found in Davis [1995], Takeda [1996] and Royer [2005].

Stereo Vision

The slightly different perspectives from which our eyes perceive the world lead to different

retinal images, with relative displacements of objects (disparities) in the two monocular views of a scene. The size and direction of the disparities of an object serve as a measure of its relative










depth; absolute depth information can be obtained if the geometry of the imaging system is known. Stereo vision can be used to construct a three-dimensional image of the surroundings using the same principle as the human eye, provided the geometry and optics of the sensor are well

known. An implementation of the stereo vision system can be found in Wu [2002].

Thermal Imaging

Thermal cameras can detect energy at infrared frequencies. All objects emit infrared energy; the hotter the object, the more infrared energy it emits. Thermal cameras work better at night, when there is better contrast among the surrounding objects. During the day, objects absorb heat from the sunlight and tend to blend together. Algorithms such as edge detection are

used for classification of obstacles using thermal imaging. Matthies [2003] implements a thermal

camera for detecting negative obstacles.

Laser

Most laser sensors use a visible or infrared laser beam to project a spot of light onto a

target, whose distance is to be measured. The distance from the spot on the target back to the

light-detecting portion of the sensor is then measured in several ways [Carmer 1996]. The

general factors to consider when specifying a laser distance sensor include maximum range,

sensitivity, target reflectance and specularity, accuracy, resolution, and sample rate. The general

methods used to measure distance are optical triangulation and time of flight distance

measurement.

Optical triangulation is used to measure distance with an accuracy of a few microns to a few millimeters, over a range of a few millimeters to meters, at a rate of 100 to 60,000 times per second. A single-point optical triangulation system uses a laser light source, a lens and a linear light-sensitive sensor. The light source illuminates a point on an object, and an image of this light spot is formed on the sensor surface; as the object is moved, the image moves










along the sensor. By measuring the location of the light spot image, the distance of the object

from the instrument can be determined.

Laser time-of-flight instruments offer very long range distance measurement with a trade-off between accuracy and speed. Figure 2-1 shows the two different methods used to compute range distance with laser sensors. As shown in Figure 2-1, a short pulse of light is emitted and the delay until its reflection returns is timed very accurately. Since the speed of light is known, the distance to the reflecting object can be calculated; this is referred to as the time of flight measurement. The laser sensor used in this research works on the principle of time of flight. The second method to compute the range distance is to measure the phase difference between the emitted and reflected waves. An example of a laser sensor working on this principle can be found in Hancock [1998].
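As a simple illustration of the pulse timing method, the range follows directly from the measured round-trip delay of the pulse. The fragment below is a minimal Python sketch with an arbitrary illustrative delay value, not code from any particular sensor driver:

    SPEED_OF_LIGHT = 299_792_458.0          # m/s

    def time_of_flight_range(round_trip_s):
        """Range from a timed laser pulse: the pulse travels out and back,
        so the one-way distance is half the total path length."""
        return SPEED_OF_LIGHT * round_trip_s / 2.0

    # A pulse that returns after about 200 ns corresponds to roughly 30 m.
    r = time_of_flight_range(200e-9)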

Ultrasonic Transducers

An ultrasonic transducer works on the principle of time of flight measurement as well. A

short acoustic pulse is emitted from the sensor; if the pulse hits an object, the return echo is sensed. There are some inherent properties of this sensor which reduce its measurement accuracy, such as the large wavelength of ultrasonic energy compared to the surface roughness of an object, and its wide beam angle. An implementation of ultrasonic sensors

can be found in Novick [2002].

Radar

Radar emits electromagnetic radiation. As the signal propagates, objects reflect, refract and

absorb the radiation. Large reflecting objects will reflect a stronger signal. The signal strength

would be different for different materials. A lower signal strength is received for a large obstacle

with high absorptivity. Radars are generally used to detect large metallic obstacles at a distance.

Radar based sensor modeling can be found in Foessel-Bunting [1999].










Environment Representation

In order to use the sensors in autonomous robotic applications, the measurements from the

sensor are converted into state variables which form an internal representation of the

environment. The state variables define the state of the environment at the current instance of

time. The two methods generally used to model the environment in autonomous vehicle

applications are vector maps and raster maps.

The 2-dimensional vector maps are based on geometric primitives: points, lines, circles,

polygons etc. A set of environmental characteristics are tagged with these primitives. These

characteristics could be positive or negative obstacles, moving obstacles, obstacle segmentation

based on color, etc. Raster maps are built by tessellating the environment around the robot in a

2-D grid. Each cell in the 2-D grid can be classified based on the sensed environment

characteristics. The main advantage of a vector map is its compactness. For example only four

vertices are needed to define a rectangle of size m by n, while to represent the same rectangle in

a 2-D grid map an entire rectangular matrix of cells is required whose size will depend on the

size of the rectangle. Generating vector primitives requires highly accurate sensor data as

compared to generating 2-D raster maps. One of the examples of a raster map implementation

found in the literature is the Occupancy Grid map [Thrun et al. 2005]. Occupancy Grids

tessellate the surrounding continuous space around the robot into fine grained cells. 2-D

Occupancy Grids are more common. In 2-D grids, the information from the 3-D world is

represented as a 2-D slice. Each cell in the grid corresponds to a variable, which represents the

probability of occupancy of that location.

Uncertainty Management

The sensors discussed above measure a particular characteristic of the environment; hence

these sensors provide only partial information of the surroundings. Sensor measurements are










often corrupted by noise. Hence there is always some amount of uncertainty associated with the

sensor in its representation of the environment. The state estimation algorithms seek to recover

the state variables (i.e., the state of the environment) from sensor measurements. As can be inferred, most of these algorithms have to use an uncertainty management mechanism to come up with an estimate of the environment from the sensor readings. Some of these algorithms (probability theory, the histogram method, Bayes theorem, and fuzzy logic) are discussed in Novick [2003].

Thrun et al. [2005] discusses the implementation of the Bayes filter, which is a variant of Bayes' theorem used to update Occupancy Grids. The Bayes filter uses the log odds ratio representation instead of the probability representation of occupancy. The log odds representation, l(x), is given as

l(x) = \log \frac{p(x)}{1 - p(x)}                                   (2.1)


where

p(x) is the probability of occupancy.

The log odds ratio is recursively computed as the posterior value from the prior log odds ratio

and the current sensor observation as follows:


l_t = l_{t-1} + \log \frac{p(x \mid z_t)}{1 - p(x \mid z_t)} - l_0                  (2.2)

where:

l_t is the posterior log odds ratio,

l_{t-1} is the prior log odds ratio,

p(x | z_t) is the inverse measurement model, which specifies the distribution over the state variable x as a function of the measurement z_t, and










l_0 is a constant denoting the prior before any sensor readings.

As seen from Equation 2.2, the Bayes filter is additive and hence any algorithm that

increments and decrements a variable in response to measurements can be interpreted as a Bayes

filter in log odds form.
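The additive form of Equation 2.2 leads to a very simple per-cell update rule. The fragment below is an illustrative Python sketch, not the implementation used in this work; the inverse measurement model values fed to it are assumed to come from whatever sensor model is in use:

    import math

    def log_odds(p):
        """Log odds representation of a probability (Equation 2.1)."""
        return math.log(p / (1.0 - p))

    def update_cell(l_prev, p_occ_given_z, l_0=0.0):
        """Additive log odds update for one grid cell (Equation 2.2).

        l_prev        -- prior log odds of occupancy
        p_occ_given_z -- inverse measurement model p(x | z_t) for this reading
        l_0           -- prior log odds before any sensor readings
        """
        return l_prev + log_odds(p_occ_given_z) - l_0

    # Two "hit" readings followed by one "miss" for the same cell.
    l = 0.0
    for p in (0.7, 0.7, 0.3):
        l = update_cell(l, p)
    p_occupied = 1.0 / (1.0 + math.exp(-l))   # convert back to a probability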

Uncertainty Management in Multi-Sensor Fusion

Any one type of sensor measures a limited characteristic of the environment. Moreover, similar sensor hardware used in different sensor algorithms can obtain different information from the environment. The Bayes filter discussed above is not very useful in combining the outputs from different sensor algorithms. Consider, for example, an obstacle which can be detected by one sensor type but cannot be detected by another. These two sensor types would generate conflicting information, and the resulting occupancy grid map would depend on the evidence brought by both sensor systems; hence the results will be ill-defined. The information from these multiple sensors is combined by generating a sensor map for each sensor type and then integrating the maps using a suitable sensor fusion technique. Some of the techniques which can be used to fuse more than one sensor map into a single map, where each sensor map might provide different information about the environment, are discussed below.

Certainty Factors

The certainty factors [Gonzalez 1993] formalism assigns a belief value, called the CF value, to a hypothesis based on the evidence presented to support the hypothesis or the evidence presented to contradict it. The CF values are expressed as a set of rules having the

format:

IF evidence

THEN hypothesis (with a CF value)









A number of such rules are used, each to define the belief in the hypothesis based on the

evidence presented. If the evidence supports the hypothesis, the CF value is between 0 and 1. If

the evidence contradicts the hypothesis, then the CF value is between 0 and -1. The overall belief

in the hypothesis is computed by combining the individual CF values from each rule. The

following mathematical relationships are used to combine the CF values:

Let CF_1 and CF_2 be the two CF values to be combined. If both CF values support the hypothesis, the combined CF value is

CF_{combined} = CF_1 + CF_2 \times (1 - CF_1)                          (2.3)

If the two CF values contradict each other, then the combined CF value is

CF_{combined} = \frac{CF_1 + CF_2}{1 - \min(|CF_1|, |CF_2|)}           (2.4)


The resulting CF_combined is then used as the CF_1 value and combined with another CF value. For the sensor fusion process, consider a number of sensors, each of which evaluates the environment and presents evidence that either supports the presence of an obstacle or supports the presence of a good traversable path (a contradiction of the presence of an obstacle). The results from these individual

sensors can be combined using the certainty factors approach.
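A minimal sketch of this combination scheme is given below (illustrative Python, not the LTSS implementation described later). Equations 2.3 and 2.4 cover mutually supporting and conflicting evidence; the branch for two negative CF values uses the standard symmetric counterpart of Equation 2.3:

    def combine_cf(cf1, cf2):
        """Combine two certainty factors (Equations 2.3 and 2.4).

        Positive values support the obstacle hypothesis; negative values
        support a good traversable path.
        """
        if cf1 >= 0.0 and cf2 >= 0.0:            # both support the hypothesis
            return cf1 + cf2 * (1.0 - cf1)
        if cf1 < 0.0 and cf2 < 0.0:              # both contradict the hypothesis
            return cf1 + cf2 * (1.0 + cf1)
        # conflicting evidence (Equation 2.4)
        return (cf1 + cf2) / (1.0 - min(abs(cf1), abs(cf2)))

    # Example: strong and moderate support, then one contradicting reading.
    belief = combine_cf(0.7, 0.4)        # 0.82
    belief = combine_cf(belief, -0.3)    # conflicting evidence lowers the belief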

Dempster-Shafer Theory

The Dempster-Shafer theory [Shafer 1976] was developed through the efforts of Arthur

Dempster and his student, Glenn Shafer. It is based on a mathematical theory of evidence where

a value between 0 and 1 is assigned to some fact as its degree of support. The theory allows

belief values to be associated with sets of facts as well as individual facts.









For example, consider a case where there are two possible conclusions. The set of possible conclusions is \Theta = \{\theta_1, \theta_2\}. The frame of discernment is defined as the set of all possible subsets of conclusions. This set is \{\emptyset, \{\theta_1\}, \{\theta_2\}, \{\theta_1, \theta_2\}\}, where \emptyset represents the empty set.

A mass probability function assigns a numeric value from the range [0, 1] to every set in

the frame of discernment. The mass probabilities are assigned based on the support that the

evidence presents for each of the conclusions. The sum of these mass probabilities (i.e., the total

probability mass) is 1.

The belief, Bel, of a subset A is the sum of all the mass probabilities, m, assigned to the subsets B of A. For A = \{\theta_1, \theta_2\}:

Bel(\{\theta_1, \theta_2\}) = m(\{\theta_1\}) + m(\{\theta_2\}) + m(\{\theta_1, \theta_2\})                  (2.5)


The certainty associated with a particular subset A is defined by the belief interval [Bel(A), P^*(A)], where Bel(A) is the measure of total belief in A and its subsets and P^*(A) = 1 - Bel(\bar{A}) is a measure of the failure to doubt A.

Finally, to combine two pieces of evidence, the mass probabilities from each are combined using Dempster's combination rule. Consider mass probabilities m_1 (representing a preexisting certainty state associated with hypothesis subsets X) and m_2 (representing some new evidence associated with hypothesis subsets Y); these are combined into a mass probability m_3 (representing the certainty state associated with C, the intersection of X and Y) as follows:

m_3(C) = \frac{\sum_{X \cap Y = C} m_1(X) \times m_2(Y)}{\sum_{X \cap Y \neq \emptyset} m_1(X) \times m_2(Y)}                  (2.6)
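The sketch below applies Dempster's rule (Equation 2.6) to a two-conclusion frame of discernment. It is an illustrative Python fragment; the hypothesis labels and mass values are made up for the example:

    from itertools import product

    OBSTACLE = frozenset({"obstacle"})
    FREE = frozenset({"free"})
    EITHER = OBSTACLE | FREE                      # ignorance: {obstacle, free}

    def dempster_combine(m1, m2):
        """Combine two mass functions with Dempster's rule (Equation 2.6)."""
        combined, conflict = {}, 0.0
        for (x, mx), (y, my) in product(m1.items(), m2.items()):
            c = x & y
            if c:                                  # X intersect Y = C, non-empty
                combined[c] = combined.get(c, 0.0) + mx * my
            else:                                  # mass falling on the empty set
                conflict += mx * my
        return {c: v / (1.0 - conflict) for c, v in combined.items()}

    # One source mostly supports "obstacle"; the other is largely undecided.
    m_a = {OBSTACLE: 0.6, EITHER: 0.4}
    m_b = {OBSTACLE: 0.3, FREE: 0.3, EITHER: 0.4}
    m_c = dempster_combine(m_a, m_b)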

Neural Networks

An artificial neural network (ANN) is an information processing model that is inspired by the way biological nervous systems, such as the brain, process information. The ANN model is made









of strongly connected groups of simple processing elements called neurons. These neurons are

activated when their input is higher than some threshold value. When the neurons fire, they

communicate their value to their downstream connected neurons, which perform the same

function. The main advantage of ANNs is their ability to learn from training data. The learning process may be supervised (the backpropagation method) or unsupervised (the Kohonen network). The major limitation of neural networks is the problem of developing a general

representation from a limited amount of training data.

Sensor Implementations in Autonomous Vehicles

The previous section introduced the various types of sensors used in autonomous vehicles

for real time perception. A brief theory of some of the environment state representation

techniques and managing uncertainty associated with sensor results was discussed. In this section

some of the methods and algorithms used to implement the sensors are presented.

In most of the methods, terrain mapping is described as building a Cartesian elevation map

from range sensor data. However, using this method for mobile robots has certain drawbacks, such as elevation data that can be sparse and non-uniform (this is more apparent as the distance from the sensor increases, and because of the occlusion of regions due to obstacles). To deal with the above problems, [Kweon 1991] implements a locus method for building 3D occupancy maps

using successive range images obtained from a laser range finder. In this method the elevation z,

at a point (x,y) is found by computing the intersection of the terrain with a hypothesized vertical

line at (x,y). The basic idea of the algorithm is to compute this intersection in the image plane.

[Matthies 2003] describes how a thermal camera is used to detect negative obstacles. The

change in the temperature profile of the terrain due to uneven heating and cooling is accounted

for. Specifically during night time the interior parts of the negative obstacles tend to remain

warmer as compared to the surrounding terrain.









RGB space has been effectively used [Davis 1995] to classify terrain as vegetation or non-

vegetation. Two classification algorithms, the Fisher linear discriminant (FLD) and the

backpropagation neural network, are discussed. Consider an N-Dimensional space, where the

input data could belong to one of the two classes A or B. The FLD algorithm is a linear classifier

and generates a linear decision surface that maximizes the correct classifications of A and B.

In many cases the robots are required to work in a dynamic environment. A method for

detection of moving obstacles [Kyriakopoulos 2003] and estimating the velocity of the obstacle

is discussed and implemented. The paper assumes a polyhedral world i.e., the environment

around the robot is composed of linear segments. A single 2-D laser range sensor scans the

environment. The range measurements from the sensor are expressed in a world coordinate

frame after accounting for the vehicle position and orientation as follows:

x = x_R + r\, e(\theta_R + \varphi)                                    (2.7)

where

x represents the Cartesian coordinates of the point in the world coordinate system,

(x_R, \theta_R) is the position and orientation of the vehicle,

r is the measured range,

\varphi is the angle of the laser ray in the sensor coordinate system, and

e(\theta_R + \varphi) represents the unit vector in the direction of the laser ray

Each line segment i of an obstacle can be defined by a parametric model as

x = x_0^i + u\, e^i                                     (2.8)

where

x_0^i is a point on the segment,

u is a scalar parameter along the segment, and

e^i is the associated unit vector parallel to the segment i.









From Equations 2.7 and 2.8, the intersections of the line segments with the laser beam

can be computed. Additional constraints are considered such as the case where the laser ray

intersects more than one obstacle line segment for which the range value will correspond to the

nearest obstacle. The candidate moving obstacles are then selected on the basis of the obstacle

line segments obtained from the successive range reading sets.
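A small sketch of the geometry implied by Equations 2.7 and 2.8 is shown below: one laser beam is intersected with one obstacle segment, and the expected range for the beam would be the smallest such value over all segments. This is illustrative Python, not code from the cited work, and the variable names are assumptions:

    import numpy as np

    def ray_segment_range(x_r, theta_r, phi, p0, e, length):
        """Range at which the beam of Eq. 2.7 meets the segment of Eq. 2.8.

        x_r, theta_r -- vehicle position (2-vector) and heading
        phi          -- beam angle in the sensor frame
        p0, e        -- a point on the segment and the unit vector along it
        length       -- segment length
        Returns the range r, or None if the beam misses the segment.
        """
        d = np.array([np.cos(theta_r + phi), np.sin(theta_r + phi)])
        # Solve x_r + r*d = p0 + u*e for the unknowns r and u.
        A = np.column_stack((d, -e))
        try:
            r, u = np.linalg.solve(A, p0 - x_r)
        except np.linalg.LinAlgError:             # beam parallel to the segment
            return None
        return r if (r >= 0.0 and 0.0 <= u <= length) else None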

Map building algorithms have been developed [Huber 1999] using range sensors. The

authors discuss the face-based spin image algorithm to build a map from two laser range

scanners, each mounted on a different vehicle.

Kelly and Stentz [1998] developed an adaptive perception algorithm for off-road

autonomous navigation. The input to the algorithm is through range images from laser

rangefinders and stereo vision. The data from the sensors is processed only in a region of interest

that contains the most useful information. The 3D data of the terrain is stored in a 2-D ring-

buffer that accommodates vehicle motion through module arithmetic indexing. The terrain map

discussed eliminates the need of copying the terrain data as the vehicle moves.

Wolf [2005] presents an online algorithm that builds a 3-D map of the terrain using 2-D

laser range finder data. The terrain is classified as navigable or non-navigable using Hidden

Markov models.

A radar sensor model [Foessel-Bunting 1999] is developed to build three-dimensional

evidence grid maps. The grid is updated using the logarithmic representation of Bayes rule.

A perception module which processes range images to identify non-traversable regions of

the terrain is developed [Langer et al. 1994] for autonomous off-road navigation. The sensor

used is the ERIM laser which acquires 64 by 256 range images at 2 Hz. The data from the laser

range images are converted into Cartesian coordinates local to the vehicle. These data points are









then stored in a 2-dimensional grid map. Each cell of the grid map is evaluated for its

traversability. The features used to evaluate traversability are the height variation of the terrain

within the cell, the slope of the patch of terrain contained in the cell and the presence of

discontinuity of elevation in the cell. The algorithm only computes untraversable cells and

renders no opinion on smooth traversable terrain.
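The three per-cell features described above can be computed directly from the points that fall in a cell. The fragment below is an illustrative sketch (a least-squares plane fit stands in for the slope estimate), not the cited implementation:

    import numpy as np

    def cell_features(points):
        """Traversability features for one cell from its N x 3 point array:
        elevation variation, slope of the best-fit plane, and the largest
        elevation discontinuity between neighbouring sorted heights."""
        z = points[:, 2]
        height_variation = z.max() - z.min()

        # Least-squares fit of z = a*x + b*y + c; slope from the plane normal.
        A = np.column_stack((points[:, 0], points[:, 1], np.ones(len(points))))
        (a, b, _c), *_ = np.linalg.lstsq(A, z, rcond=None)
        slope_deg = np.degrees(np.arctan(np.hypot(a, b)))

        steps = np.diff(np.sort(z))
        discontinuity = float(steps.max()) if steps.size else 0.0
        return height_variation, slope_deg, discontinuity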

Bhatavia [2002] proposed a method to detect obstacles on an outdoor terrain which is

highly curved with natural rise and fall, but is assumed to be locally smooth. A cost-effective

laser scanner system was developed to replace the commercially available high cost two axis

lasers. A single axis laser scanner is mounted at an angle of 900 so as to scan a vertical line. The

laser is provided with a rotary motion about the second axis to sweep from left to right and

register a set of vertical line scans.

The data registered from each vertical scan are classified as an obstacle or free space using

a gradient-based algorithm. Each of these single-scan classifications is stored in a time history. The

nearest neighbor fusion algorithm generates the candidate obstacles by clustering the group of

scans collected in a time window. Experimental results demonstrate that the algorithm is capable

of detecting obstacles of size 15 cm in a hilly terrain with minimal false positives; however the

terrain has to be very smooth.

The work presented by Talukder [2002a] implements a different approach to classify

obstacles based on 3-D geometry for an off-road environment. A conical region is defined

around each point in 3-D space. If there is any point in the truncated portion of the cone, both the

base point of the cone and the point under consideration are defined as an obstacle. This

essentially means that for a point to be an obstacle, the height difference between the base point, which represents the cone, and the point under consideration should be greater than a threshold value,









and the angle between the line formed by the two points and the horizontal surface should also

be greater than a threshold value. The 3D points are projected onto an image plane and an

algorithm to implement the above strategy in the image plane is presented. In the above method,

obstacles are represented as pairs of points and this feature is further extended to segment

obstacles. All the pairs of points which represent obstacles are defined as linked nodes of an

undirected graph. Thus points connected in the chain will represent the same obstacle.

Extension of the above work is presented in Talukder [2002b]. Along with the geometrical

properties, color and texture are used to classify obstacles. The author discusses the concept of

obstacle negotiation by presenting techniques to predict the dynamic vehicle response to various

natural obstacles. The obstacles in a natural terrain such as green vegetation, dry vegetation, hard

rock and soil are composed of widely different material properties and yet can be of the same

size. Using the color and texture features, the obstacles can be classified based on their material

properties. The compressibility of the obstacles is modeled using a spring-damper system. The

vehicle is also modeled using a mass-spring system for each of its wheels. The vertical

acceleration of the vehicle is computed as it traverses the evaluated path using the obstacle and

vehicle models. This predicted acceleration profile is used for obstacle negotiation.

A novel filtering method for terrain mapping, called the Certainty Assisted Spatial (CAS)

filter, is proposed [Ye 2003]. In this method an elevation map and a certainty map are built from a forward-looking 2-D laser range finder. The above filter is compared with the conventionally available filters for effective filtering of erroneous measurements due to mixed pixels, missing

data, artifacts and noise.

An approach for traversability analysis of rough terrain [Ye 2004] utilizing an elevation

map is proposed. The elevation map is defined as E = \{z_{i,j}\}, where i and j are the row and column









of the cell. A cluster of neighboring cells with non-zero elevation values is defined as an

elevation-obstacle. A traversability map is created along with the elevation map to define the

traversability index. The traversability index for each cell is evaluated by computing the slope

and roughness of a square terrain patch with the cell under consideration as the center of the

patch. The size of the patch is equivalent to the number of cells required to accommodate the

vehicle in any orientation. Depending on the value of the traversability index, the originally

classified elevation-obstacles are re-classified.

In contrast to the general approach of converting the disparity image from stereovision into

3-D Cartesian coordinates to detect obstacles/traversable terrain, [Wu 2002] proposes to use the

raw disparity data for 3-D obstacle detection using stereovision. Two different maps are

generated: the obstacle map and the slope map. The obstacle map is 75 by 75 elements with each

element representing a 0.2 m by 0.2 m area and the slope map is 15 by 15 elements with each

element of size 1 m by 1 m. The obstacle map element is marked as one of four possible values;

undefined, traversable, negative obstacle or positive obstacle. The obstacle detection algorithm is

based on the horizontal isodisparity profiles generated by the stereo image at fixed intervals and

the reference lines for each of the isodisparity profiles.

To represent and classify the environment around the robot, a height map is built from

range data and fused with the video image [Asada 1988]. The height map obtained from the

range image is segmented into four regions: unexplored, occluded, traversable and obstacle. The

region where the height information is not available and falls outside the sensor view is tagged as

unexplored and the remainder of that region is marked as occluded. The points close to the

assumed ground plane and the neighboring points with low slope and curvature are classified as

traversable. The remaining regions are labeled as obstacles. The obstacle region is further









classified as natural or artificial using the height map and intensity image. The paper assumes

that artificial obstacles such as cars have planar surfaces which yield constant slope and low

curvature in the height map and linear features in the intensity image while natural obstacles

such as trees have fine structures and yield high curvatures in the height map and large variance in the intensity image. A physical simulator at a scale of 87:1 was built to perform the

experiments.

Most of the commercially available laser range sensors provide two kinds of information

for each laser beam reading: the distance of the object being scanned and the intensity of the

returned signal. [Hancock 1998] tries to exploit the second piece of information to detect

obstacles in a highway environment. Obstacle detection based on range readings as discussed in

most of the literature involves building 3-D Cartesian maps and hence involves heavy computation, making it unsuitable for high-speed environments. Intensity-based obstacle

detection for a forward looking laser works on the simple principle that vertical surfaces in front

of the vehicle return a stronger signal as compared to the signal returned by horizontal surfaces.

This is especially true for larger look-ahead distances on the order of 60 m, where the tilt angle of the laser is very small (about 1° in the paper). However, the major drawback of the above method is that the intensity varies significantly with the surface properties of the object.

Most autonomous vehicle development projects use off-the-shelf laser range sensors which

are not specifically developed for the application. With the advancement in autonomous vehicle

technology, commercially available sensors do not meet the specific requirements of range and

angular resolution for evaluating a wide range of obstacles and terrain at high speed. Carnegie Mellon University has developed a high-speed, high-accuracy LADAR [Hancock 1998] in joint

collaboration with K2T, Inc., and Zoller + Froehlich. The device consists of a two-axis










mechanical scanning system with a single point laser measurement system. The scanner provides

an unobstructed 360° horizontal field of view and a 70° vertical field of view. The resolution of the system is variable, with a maximum of 0.6° per pixel.

Multiple sensors have been used [Stentz 2003] to detect and avoid obstacles in natural

terrain. The perception system is categorized in two sensing modules. The appearance sensing

module includes color cameras and Forward Looking Infrared (FLIR) cameras, while the

geometric sensing module includes LADAR sensors and stereo vision. The geometric sensing

module detects geometric hazards from the range data and assigns traversability costs. The

traversability costs are further modulated by the compressibility of the sensed terrain. The

compressibility is assessed using the range texture, color and color texture to discriminate rigid

rocks from bushes.

Sensor data from color and IR cameras are combined with range data from a laser sensor in

[Dima 2003] and [Dima 2004] for obstacle detection in an off-road environment. The image

space is tessellated into a grid of rectangular cells. The range data are projected into the

tessellated grid. Features such as the mean and standard deviation of pixel color in each cell,

mean and standard deviation of infrared pixel values in each cell, texture information and the

range reading statistics are used as input to the classifier. Instead of combining all the features

into one vector and using only a simple low level data fusion, methods to use a pool of classifiers

on the subsets of these features and then represent the final output as a fusion of these classifiers are discussed. The classifier fusion algorithms presented in the paper are Committees of Experts, Stacked Generalization and AdaBoost with classifier selection. One of the limitations of the classifier fusion algorithms is the need for supervised learning and hence a large set of training

data.










Range imaging sensors are widely used in detecting obstacles with a variety of obstacle

detection algorithms. [Matthies 1994] proposes a model to evaluate the different obstacle

detection methods. The proposed method for evaluation is divided into two levels: the quality of

the raw range data from the sensors and the quality of the obstacle detection algorithm. The

quality of the range data is evaluated based on statistical performance models where both the

random errors caused by noise and the systematic errors due to artifacts are considered. To

evaluate the quality of the obstacle detection algorithm, the model predicts the quality of

detecting obstacles and the probability of false positives as a function of the size and distance of

the obstacle, the resolution of the sensor and the level of noise in the range data.

[Murphy 1998] discusses the evidential reasoning techniques to fuse sensor outputs.

Dempster-Shafer (DS) theory is implemented for sensor fusion at symbol level in the Sensor

Fusion Effects (SFX) architecture. The SFX consists of three distinct activities: configuration,

uncertainty management and exception handling. The uncertainty management activity collects

observations from the sensor and computes the total belief in the percept in three steps. First the

observations from each sensor are collected and fused at the pixel level in the collection step.

The features extracted from these observations are fused at the feature level in the preprocessing

step, resulting in a belief function over each sensor's unique frame of discernment. Finally the

fusion step combines these belief functions from different sensors into a total belief in the

percept. In the case of a sensor anomaly in any of the steps, exception handling is triggered. Two

components from the DS theory, the weight of conflict metric and the enlargement of the frame

of discernment, are used for the sensor fusion. The implementation of the above architecture is

demonstrated on a mobile robot which acts as a security guard. The outputs from a color camera,

a black and white camera, an infrared camera and an ultrasonic sensor are fused to guard a scene.










In an application different from autonomous vehicles, a sensor fusion architecture is

designed [Wu 2002] for context sensing. The authors have pointed out the difficulty that lies

with fusing different sensors due to the overlaps and conflicts in the outputs and the variations in

a sensor's performance with a change in situation. In the experimental example presented, a

user's focus of attention is tracked using multiple evidences from different sensors. The output

from each of the sensors, a camera and a microphone are viewed as evidences. Two different

methods, weighted probability linear combination and Dempster-Shafer theory of evidence, are

used to fuse the evidences obtained from the sensors. Since the confidence level of the sensors

varies with situation, the authors propose a new concept for a weighted Dempster-Shafer

evidence combining rule. The idea is that if we know how a sensor performs historically in

similar situations, we can use this information to decide how much confidence we have in the sensor's current estimate. The results from the combined estimations are compared against the

results obtained using only a single sensor and found to be better in most of the cases.

Rather than extracting sensor data to form an intermediate representation which can be

utilized in the planning, [Goodridge 1994] proposes a non-representation based sensor fusion

scheme to control robots. A mapping between the sensor data and control signals is developed

using Fuzzy sets. The PCFUZ fuzzy development system is developed and used on an indoor

mobile robot to conduct experiments involving goal seeking, obstacle avoidance, barrier

following and object docking behaviors. The sensors on board include sonar, vision and tactile

sensors. Fuzzy rules are created using the outputs from these sensors to produce a continuous

surface which defines the control signals. Different techniques such as summation, weighted

averaging, fuzzy multiplexing and hierarchical switching are discussed to implement multiple

behaviors such as the combination of obstacle avoidance and goal seeking.









Urmson [2006], Miller [2006], Trepagnier [2006] and Braid [2006] present some of the sensor implementations from the 2005 DARPA Grand Challenge competition. A brief discussion of these sensor implementations is presented below.

Urmson [2006] implements a set of five laser sensors and one radar sensor to detect

obstacles and to classify terrain for off-road navigation. Three of these lasers are used to evaluate

the terrain while the other two lasers and the radar detect obvious positive obstacles. Each of the

sensor processing algorithms generates a cost map. The cost map is a 2-D grid, where each cell

in the grid is associated with a cost value which represents the traversability of that cell. The

sensor fusion algorithm generates a composite map using a weighted average of each of the

sensor cost maps. The terrain evaluation algorithm operates on a single line scan of the laser

data instead of a point cloud formed by the successive line scans as the vehicle moves. This

approach reduces the effect of imperfect pose estimation. The terrain evaluation algorithm

operates by fitting a line to the vertical planar projection of points in vehicle-width segments.

The traversability cost is computed as a weighted maximum of the slope and the line fit residual.
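The composite-map step can be sketched as a weighted average over per-sensor cost grids. This is an illustrative fragment with made-up weights and cost values, not the cited system's code:

    import numpy as np

    def fuse_cost_maps(cost_maps, weights):
        """Composite cost map as the weighted average of per-sensor cost maps.

        cost_maps -- list of equally sized 2-D arrays of traversability costs
        weights   -- one scalar weight per sensor map
        """
        w = np.asarray(weights, dtype=float)
        stacked = np.stack(cost_maps)               # (n_sensors, rows, cols)
        return np.tensordot(w, stacked, axes=1) / w.sum()

    # Two illustrative 2 x 2 cost maps, the first trusted twice as much.
    composite = fuse_cost_maps(
        [np.array([[0.1, 0.9], [0.2, 0.4]]), np.array([[0.3, 0.7], [0.2, 0.6]])],
        weights=[2.0, 1.0])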

A grid-based terrain representation method, where each cell in the grid holds not only the elevation value of the cell but also the uncertainty associated with that value, is presented in [Miller 2006]. A set of three laser range sensors is used to scan the ground in front

of the vehicle. Two of these sensors are fixed on the vehicle, while one sensor is mounted on a

two axis gimbaled platform. The gimbaled platform allows the laser to be pointed in any desired

direction. The gimbaled laser is combined in a feedback loop with the path planner to gather data

along potential paths. The range data from all three sensors are fused into one map. The probability of the association of each range reading with a particular cell is computed. The probabilistic model takes into account the measurement errors in the lasers themselves and the error in










the positioning system. The laser measurements assigned to a particular cell are converted into

an estimate of the elevation distribution within that cell. The information is stored as an estimate

of the cell's elevation and the associated conditional covariance. Each new measurement is fused

with the previous measurement readings, hence there is no need to store the individual sensor

measurement readings and only the posterior probabilities of the estimated elevation and

variance are stored.

A set of two Sick LMS 291 laser scanners [Trepagnier 2006] are mounted on each side of

the front end of the vehicle. Each of these lasers scans the surroundings in a vertical plane and is mounted on an oscillating platform to provide a 30° oscillation angle. Range data from both the

lasers are fused into a common elevation grid map. The data is time stamped and hence obstacles

are kept in memory only for a certain amount of time. If the obstacles no longer exist after a

specified time, they are removed from the map. This helps in clearing moving obstacles, once

they have passed.

Team TerraMax [Braid 2006] selected an array of sensors including a forward looking

vision system, single-plane LADAR and multi-plane LADAR. The vision system is based on

multi stereoscopic vision and consists of three identical cameras mounted on a rigid bar in front

of the vehicle. The three cameras form three different baselines between them. Depending on the

speed of the vehicle, one of the three camera pairs is selected. The largest baseline is used at

higher speeds to obtain a deeper field of view; the medium baseline is used at medium speeds

and the shortest baseline at slower speeds. The multi-plane LADAR is an IBEO ALASCA 4-

plane scanner that is used for positive obstacle detection. Two of the planes scan towards the

ground and the other two scan towards the sky. The obstacle detection algorithm is based on









detecting the slope of the terrain. Obstacles are reported in terms of the proximity of the potential

collision to the proposed path.










[Figure content omitted: traces of the transmitted signal, the reflected signal, the transmitted pulse, the reflected pulse, and the resulting timed pulse.]

Figure 2-1. Time of flight measurement.









CHAPTER 3
RESEARCH GOALS

Statement of Problem

Given a vehicle capable of maneuvering autonomously and which can obtain its position

and orientation measured with respect to a Global Coordinate system, develop a sensor

component that will evaluate the terrain and detect obstacles to support autonomous navigation.

The sensor component has to be developed for a full-size autonomous vehicle that can traverse

paved and off-road terrain at maximum speeds approaching 30 mph. The information

should be presented in such a way that the vehicle can use it to avoid obstacles and rough terrain

and to seek out smooth terrain.

Research Requirements

Assuming the availability of an autonomous vehicle platform and a positioning system, the

steps required to develop a sensor component that meets the goals

stated in the problem statement are as follows:

1. Selection of appropriate sensor hardware, which is able to give the required information
of the surrounding environment.

2. Selection of proper computer resources to interface the sensor hardware.

3. Design of an environment representation model. The model should not only be able to
classify the region around the vehicle into traversable and non-traversable areas but
should also be able to represent a degree of traversability or non-traversability.

4. Development of algorithms to process the sensor data to detect obstacles and evaluate
terrain. Combine the individual algorithms into one fused output. The fusion process
should be able to manage the uncertainties and the limitations associated with the outputs
of the individual algorithms. The fused output should be expressed in the above
mentioned environment representation model.

5. The sensor component should be developed as a Joint Architecture for Unmanned
Systems (JAUS) component.

6. Conduct experiments on an autonomous vehicle platform to validate the results of the
developed sensor component.









The following sections discuss the autonomous platform and the positioning system used

to develop and implement the sensor component.

Autonomous Platform

The development and experimentation work for the sensor component research presented

in this dissertation has been done on the autonomous platform, "NaviGator" (Figure 3-1)

developed in the Center for Intelligent Machines and Robotics Laboratory at the University of

Florida. The platform was initially developed to participate in the 2005 DARPA Grand

Challenge competition. Some of the important specifications of the platform are discussed.

Mechanical Specifications

The NaviGATOR's base platform is a custom-built all-terrain vehicle. The frame is made

of mild steel roll bar with an open design. It has 9" Currie axles, Bilstein shocks, hydraulic

steering, and front and rear disk brakes with an emergency brake to the rear. It has a 150 HP

Transverse Honda engine/transaxle mounted longitudinally, with locked transaxle that drives

front and rear Detroit Locker differentials (4 wheel drive). The vehicle was chosen for its

versatility, mobility, openness, and ease of development.

Power System

The power system consists of two independent 140A, 28V alternator systems. Each

alternator drives a 2400W continuous, 4800W peak inverter and is backed up by 4 deep cell

batteries. Each alternator feeds one of two automatic transfer switches (ATS). The output of one

ATS drives the computers and electronics while the other drives the actuators and a 3/4 Ton

(approx. 1kW cooling) air conditioner. Should either alternator/battery system fail, the entire

load automatically switches to the other alternator/battery system. Total system power

requirement is approximately 2200W, so the power system is totally redundant.










Computing Resources

All the computing systems and electronics are housed in a NEMA 4 enclosure mounted in

the rear of the vehicle as shown in Figure 3-2. The computing system consists of single processor

computing nodes. The system uses a total of eight computers, each of them equipped with an

AMD 2 GHz processor and 512 MB RAM. Each node is targeted to perform a specific function

of the autonomous system. The system architecture, discussed in detail in Chapter 5, explains the

breakdown of the autonomous system functionality into these individual nodes. The sensor

component developed for this research resides on one computer.

Localization

The NaviGATOR determines its current location using a combination of GPS and inertial

navigation system sensor data. The processing and fusing of the navigation data is done by an

Inertial Navigation System from Smith's Aerospace. This system is named the North Finding

Module (NFM). The module maintains Kalman Filter estimates of the vehicle's global position

and orientation, as well as linear and angular velocities. It fuses internal accelerometer and

gyroscope data with data from an external NMEA GPS and external odometer. The GPS signal

provided to the NFM comes from one of the two onboard GPS systems. These include a

NavCom Technologies Starfire 2050 and a Garmin WAAS Enabled GPS 16. An onboard

computer simultaneously parses data from the two GPS units and routes the best-determined

signal to the NFM. This is done to maintain valid information to the NFM at times when only

one sensor is tracking GPS satellites. During valid tracking, the precision of the NavCom data is

better than the Garmin, and thus the system is biased to always use the NavCom when possible.

Contributions of this Research

The previous sections described the autonomous platform and the available computing

resources to be used for this research. The system explained above is the base foundation and










provides all the necessary resources for successful implementation of the research discussed in

this dissertation. The contributions of this research can be summarized as follows:

1. Development and implementation of an obstacle detection sensor system.

2. Development and implementation of a terrain evaluation sensor system.

3. Development and experimental evaluation of a new sensor fusion algorithm that combines
the information from the obstacle detection and terrain evaluation sensor systems.

4. Development of generalized results that determine an optimal sensor system design based
on specific vehicle parameters.





































Figure 3-1. Testing platform NaviGator.


Figure 3-2. Computing resources.









CHAPTER 4
TRAVERSABILITY GRID APPROACH

The surrounding environment of an outdoor autonomous vehicle is highly unstructured and

dynamic. The environment consists of natural obstacles which include positive obstacles such as

rocks and trees, and negative obstacles such as cliffs and ditches. There is a possibility of

numerous man-made obstacles such as buildings, fence posts, and other vehicles, as well as

many other types of objects. The terrain characteristics are also very important for safe

navigation. The robotic vehicle should avoid rough, uneven terrain as far as possible and at the

same time seek out smooth traversable regions. The representation model of the environment

should not only be able to distinguish clearly defined highly non traversable obstacles but should

also be able to represent the small differences in the terrain which would for example distinguish

a clearly defined paved or smooth dirt path from the surrounding region. This would help to keep

the vehicle on the road and at the same time avoid obstacles.

In Chapter 2, two broad classifications of environment representation techniques were

discussed; the vector based and the grid or raster based. The current research presents the

concept of the Traversability Grid which is a grid based representation of the environment.

Traversability Grid Representation

The Traversability Grid representation tessellates the region around the vehicle into a 2D

grid. Figure 4-1 shows the important parameters used to define the Traversability Grid. The grid

is always oriented in the North-East direction with the vehicle position in the center of the grid.

The grid is defined by the number of rows, number of columns and the resolution of the grid. It

consists of an odd number of rows and columns to create a center cell for the vehicle. In the

current implementation, the Traversability Grid is 121 rows by 121 columns with a half meter by

half meter resolution. This allows a sensor to report data at least 30 m ahead of the vehicle.










Each cell in the grid corresponds to a small region of the environment and is represented

by a traversability value. A traversability value of a cell defines the measure of traversability of

that cell. The range of traversability value that can be assigned to a cell is from 0 to 15. A real-

time sensor component can assign a traversability value between 2 and 12 based on the sensed

environmental characteristics. A traversability value of 7 is considered to be neutral; a value

above 7 represents a traversable region while a value below 7 represents a non traversable

region. As the region becomes more favorable to be traversable, the traversability value of that

cell gradually increases from 7 to 12. For example, consider a paved path surrounded by a flat

terrain which is not as smooth as the path. Since the flat terrain surrounding the paved path is

still traversable, it will be represented by a traversability value above 7; however, the paved path

would be represented by a traversability value higher than the surrounding flat terrain; thus,

although the vehicle could traverse the surrounding flat terrain, it would be more favorable to

stay on the paved road. Similarly, as the region starts becoming non-traversable, the

traversability value of the corresponding cell in the grid gradually decreases from 7 to 2. If

the vehicle has to drive through a non-traversable region, it would choose to drive in

the grid cells whose traversability values are larger. The sensor component outputs a traversability

value based on two factors:

1. The severity of a non-traversable region (size of the obstacle or roughness of the terrain)
or the good terrain characteristics of a traversable region (smoothness of the terrain).

2. The confidence in the evaluated characteristic: the obstacle might be highly non-
traversable, but how much confidence is there in the presence of the obstacle?

A value of 14 is assigned to a cell whose traversability value cannot be determined by the

sensor. The remaining values are reserved for a specific purpose; the value of 0 represents out of

bounds, 1 implies to use the same value as last time, 13 is reserved for failure or error and 15

denotes the vehicle location.
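
The reserved values described above can be captured as a small set of constants. The following is a minimal sketch in C (the implementation language used for this research); the constant names are illustrative and not taken from the actual LTSS source.

/* Minimal sketch of the traversability value convention described above.
 * The constant names are illustrative, not taken from the actual LTSS source. */
#define TRAV_OUT_OF_BOUNDS    0   /* cell lies outside the grid             */
#define TRAV_USE_LAST_VALUE   1   /* keep the value reported last time      */
#define TRAV_MIN_SENSED       2   /* most severe non-traversable evaluation */
#define TRAV_NEUTRAL          7   /* neutral: no opinion either way         */
#define TRAV_MAX_SENSED      12   /* most favorable traversable evaluation  */
#define TRAV_FAILURE         13   /* reserved for failure or error          */
#define TRAV_UNKNOWN         14   /* cell could not be evaluated            */
#define TRAV_VEHICLE         15   /* cell occupied by the vehicle itself    */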









Traversability Grid Implementation

The sensor algorithms, discussed in the next chapter, implement the Traversability Grid as

a dynamically allocated circular buffer. The circular buffer implementation of the Traversability

Grid is very efficient in updating the grid to the new position as the vehicle moves. The main

advantage of using a circular buffer in place of a 2-dimensional array is that for a 2-D array, as

the vehicle moves, data from the cells in the old grid must be copied

into the corresponding cell indices in the new grid for every new grid position. This expensive

copying operation can be avoided by using a circular buffer. Figure 4-2 shows the

change in the grid position due to the movement of the vehicle. As shown in the figure, all the

data corresponding to the overlapping cells in the two grids has to be copied from the old grid

position to the new grid position for a 2-D array representation.

The circular buffer implementation of the Traversability Grid avoids this computationally

expensive operation. The circular buffer stores the data in the grid as a 1D array of size equal to

the number of cells in the grid. The position of any cell corresponding to (row, column) in the 2D

grid is mapped into the 1D circular buffer as follows,

arrayPosition = (row × numberOfColumns) + column (4.1)

When the grid shifts to a new position, instead of copying data from individual cells, the

circular buffer defines a pointer to a cell in the grid, which keeps track of the cell corresponding

to (0, 0) in the new grid position.

The position of this pointer is defined by two variables, 'rowBegin' and 'columnBegin'.

Initially these variables are set to (0, 0). When the grid moves to a new position, the variables are

updated as shown in Table 4-1.









The updated (rowBegin, columnBegin) are the indices of a cell in the old grid, which

correspond to the indices (0, 0) in the new grid. Thus each time the grid moves to a new position

the above algorithm recursively keeps track of the start cell of the grid and physically all the data

remains in the same array location. The only other operation needed is to clear the data in the

cells which no longer overlap.

Since the actual (0, 0) position of the 2-D grid now corresponds to the (rowBegin,

columnBegin), a cell given by the indices (row, column) is accessed using the algorithm shown

in Table 4-2.

The above algorithms define the basic functioning of the circular buffer data structure.

Any number of variables of any data type can be stored in the grid using the circular buffer

implementation. Each of these variables would be stored as a 1-D array. At a minimum, one

such 1-D array is defined by a sensor component for storing the traversability values of the

cells. This array is of type unsigned char. Other arrays may be defined to store

information about the cells that is used internally by the sensor algorithm.

Traversability Grid Propagation

The Traversability Grid data structure is a common data structure representing the

environment among all the components of the autonomous platform. Figure 4-4 shows the

schematic diagram of the propagation of the Traversability Grid from the Smart Sensors to the

Reactive Driver. Each of the Smart Sensor component outputs its own Traversability Grid. These

grids are fused into a single Traversability Grid in the Smart Arbiter component. The Reactive

Driver uses the Traversability Grid obtained from the arbiter to dynamically compute the vehicle

speed and heading and accordingly alters its command to the Primitive Driver, while doing so

the Reactive Driver accounts for the traversability value of each cell in the grid and seeks to

follow the path with higher traversability values (reported by the sensors as favorable to be









traversable) and avoid the cells with lower traversability values (reported by the sensors as non

traversable).














[Figure content omitted: grid diagrams showing the rows, columns, resolution, North-East orientation, and vehicle position at the grid center; the rows and columns moved during grid movement; the circular buffer start position; and the data flow from the LADAR Traversability Smart Sensor through the Smart Arbiter and Reactive Driver to the Primitive Driver.]

Figure 4-1. Traversability Grid representation.

Figure 4-2. Grid movement.

Figure 4-3. Circular buffer representation of the grid.

Figure 4-4. Traversability Grid propagation.









Table 4-1. Update the circular buffer to account for the grid movement.
rowsMoved and columnsMoved are the number of rows and columns the grid moved.

numberOfRows and numberOfColumns define the size of the grid.

rowBegin = rowBegin + rowsMoved
while (rowBegin >= numberOfRows)

    rowBegin = rowBegin - numberOfRows


columnBegin = columnBegin + columnsMoved
while (columnBegin >= numberOfColumns)

    columnBegin = columnBegin - numberOfColumns


Table 4-2. Accessing the grid cell in the circular buffer.
row = row + rowBegin
while (row >= numberOfRows)

    row = row - numberOfRows


column = column + columnBegin
while (column >= numberOfColumns)

    column = column - numberOfColumns



The resulting (row, column) is then mapped into the 1D circular array using Equation 4.1.
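
A compact sketch of the circular buffer data structure described by Equation 4.1 and Tables 4-1 and 4-2 is given below. The structure and function names are assumptions made for illustration, and the sketch assumes non-negative row and column movements, as in Table 4-1; it is not the original implementation.

/* Illustrative sketch of the circular-buffer Traversability Grid described by
 * Equation 4.1 and Tables 4-1 and 4-2.  Names are assumptions; non-negative
 * movement values are assumed, as in Table 4-1. */
typedef struct
{
    int rows, columns;          /* grid size, e.g. 121 x 121               */
    int rowBegin, columnBegin;  /* buffer cell that maps to logical (0, 0) */
    unsigned char *value;       /* rows * columns traversability values    */
} TravGrid;

/* Table 4-1: shift the logical origin instead of copying cell data. */
static void gridMove(TravGrid *g, int rowsMoved, int columnsMoved)
{
    g->rowBegin    = (g->rowBegin    + rowsMoved)    % g->rows;
    g->columnBegin = (g->columnBegin + columnsMoved) % g->columns;
    /* Cells that no longer overlap the previous grid position would be cleared here. */
}

/* Table 4-2 and Equation 4.1: access logical (row, column) in the 1-D buffer. */
static unsigned char *gridCell(TravGrid *g, int row, int column)
{
    row    = (row    + g->rowBegin)    % g->rows;
    column = (column + g->columnBegin) % g->columns;
    return &g->value[row * g->columns + column];
}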









CHAPTER 5
SENSOR IMPLEMENTATION

The Smart Sensor component developed to meet the research goals of this dissertation is

named the LADAR (Laser Detection and Ranging) Traversability Smart Sensor (LTSS). The

LTSS component uses LADAR sensors for the real-time sensing of the environment. The terms

LADAR and laser sensor are used interchangeably and refer to the same device. This chapter

explains in detail the hardware and software implementation of the LTSS component.

Sensor Hardware

LADAR Sensor

The LADAR sensor model LMS291-SO5 from Sick Inc. was selected. The LMS291-SO5 is

an optical sensor that scans its surroundings two dimensionally with infrared laser beams, similar

to a laser radar. An infrared laser beam is generated by the scanner's internal diode. If the beam

strikes an object, the reflection is received by the scanner and the distance is calculated based on

the time of flight. The pulsed laser beam is deflected by an internal rotating mirror so that a fan-

shaped scan is made of the surrounding area. The laser scans an angular range of either 180° or

100°. Figure 5-1 shows the field of view for the two angular range configurations. The scanner

can be configured in three angular resolutions: 1°, 0.5° and 0.25°; however, the resolution of 0.25°

can only be achieved with the 100° range. Because of the smaller beam width, the laser is more

susceptible to false echoes due to small particles. The maximum range distance measured by the

system is 80 m with a measurement resolution of 10 mm. A higher range resolution of 1 mm is

available with a maximum range of 8 m. The frequency of the scan depends on the angular

resolution used. The scanner can operate at a maximum frequency of 72 Hz if the angular

resolution is set to 1°. However, for higher resolutions the operating frequency drops down to 36

Hz for the 0.5° resolution and 18 Hz for the 0.25° resolution.









The range distance measured by the sensor also depends on the reflectivity of the object.

For highly reflective surfaces the sensor can measure objects at greater distances than objects

with low reflectivity.

Sensor Interface

The sensor operating voltage is 24 V DC +/- 15%. The 24 V power supply is fed to the

sensor through the power supply system developed for the vehicle (discussed in Chapter 3). The

sensor can be interfaced to the computer using serial communications. The serial interface

RS232 or RS422 may be used for communication. To take advantage of the high frequency

LADAR data, the data has to be transferred at a baud rate of 500 kbaud. Hence the RS422 interface,

which allows data transfer at the higher baud rates, is selected. The RS422 interface is achieved

via a high speed USB to serial hub from Sealevel (part # 2403). Figure 5-2 shows the RS422

serial interface connections diagram.
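
For illustration, a hedged sketch of configuring such a port on Linux with the standard termios interface is shown below; the device path and function name are assumptions (the Sealevel hub typically enumerates as a /dev/ttyUSB* node), and the actual LTSS serial code may differ.

#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

/* Hedged sketch: open a Linux serial device at 500 kbaud, 8 data bits, no
 * parity, one stop bit.  The device path is an assumption. */
int openLadarPort(const char *device)
{
    int fd = open(device, O_RDWR | O_NOCTTY);
    if (fd < 0)
        return -1;

    struct termios tty;
    if (tcgetattr(fd, &tty) != 0) { close(fd); return -1; }

    cfmakeraw(&tty);                    /* raw binary telegrams     */
    cfsetispeed(&tty, B500000);         /* 500 kbaud input          */
    cfsetospeed(&tty, B500000);         /* 500 kbaud output         */
    tty.c_cflag &= ~(PARENB | CSTOPB);  /* no parity, one stop bit  */
    tty.c_cflag |= CS8 | CLOCAL | CREAD;

    if (tcsetattr(fd, TCSANOW, &tty) != 0) { close(fd); return -1; }
    return fd;
}

/* Example use (path is hypothetical): int fd = openLadarPort("/dev/ttyUSB0"); */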

Computing Resources and Operating System

The LTSS component was developed on a single computing node. The node is equipped

with an AMD 2 GHz processor, 512 MB RAM and a 1 GB compact flash solid state hard drive.

The software is developed and tested on a Linux operating system. The Linux version Fedora

Core 3 was used to develop and test the software. All the software was developed using the C

programming language. The GCC compiler, with the built-in libraries for math functions,

multithreading, socket communications and serial communications, was used for the software

development.

Sensor Mount

Figure 5-3 shows three LADARS mounted on the NaviGator. Two of these LADARS are

mounted on a specially designed sensor cage on the top of the front end of the vehicle. The

design of the mounts for these LADARS allows them to be mounted at different angular









configurations. Figure 5-4 shows the design of these mounts. The LADARS are mounted facing

towards the ground at different angles. These two LADARS continuously scan the ground in

front of the vehicle. The third LADAR is mounted at the bumper level parallel to the ground

plane.

Sensor Algorithms

So far, the concept of the representation of the environment as a Traversability Grid was

discussed in Chapter 4 and the previous section discussed the actual sensor hardware used to

sense the environment. Now, the mathematical approach and the software implementation of the

LTSS component is explained. As mentioned in the problem statement the goal of the developed

Smart Sensor component is to detect obstacles and evaluate the terrain surrounding the vehicle.

To achieve this goal, the LTSS component implements a number of different algorithms.

Figure 5-5 shows a block diagram overview of the LTSS component. The input to the

component is the raw range data from the lasers and the position and orientation of the vehicle

and its output is the Traversability Grid which represents the state of the environment at the

current time. The block diagram gives an overview of the different algorithms and flow of the

outputs of each of these algorithms to form the output Traversability Grid. As shown in the

Figure 5-5 the component consists of three distinct parts:

1. Obstacle Detection: The obstacle detection (OD) algorithm receives raw range data from
the LADAR at the bumper level and outputs a Traversability Grid.

2. Terrain Evaluation: The terrain evaluation (TE) algorithm receives raw range data from
the top two LADARS mounted on the sensor cage. Each of these two LADARS feed data
into an individual terrain evaluation grid, which are then combined.

3. Sensor Fusion: The sensor fusion algorithm combines the outputs from the OD algorithm
and the TE algorithm and produces a fused Traversability Grid. The fused Traversability
Grid represents the results of the LTSS component and acts as an input to the Smart
Arbiter like any other Smart Sensor component.










The following sections discuss in detail the different algorithms and the fusion process used to

combine these algorithms.

Obstacle Detection

The LADAR sensor used for the obstacle detection (OD) algorithm is mounted at bumper

level, scanning in a plane parallel to the ground at a height of 0.6 m. This height of 0.6 m may be

defined as a threshold value. An obstacle of height greater than the threshold value is a positive

obstacle and anything below the threshold value is free space. The LADAR is set to scan over an

angular range of 180° with a 0.5° resolution. Figure 5-1A shows the field of view of this

sensor.

The OD algorithm can be divided into two parts: the mapping of the laser range data into

the global coordinate system and the evaluation of each cell based on the mapped data.

The range data is mapped into 2D Cartesian coordinates in the Global coordinate system.

As mentioned before the global coordinate system is always oriented in the North-East direction

and its origin is the centerline of the vehicle at ground level below the rear axle (i.e., the

projection of the GPS antenna onto the ground). After each scan of 180° the range data from the

laser is converted into the Global coordinate system. This conversion is done in two steps. First

the data is converted from polar coordinates to Cartesian coordinates local to the sensor as

follows:

x_sensor = range · sin α
y_sensor = range · cos α                                              (5.1)


where

(x_sensor, y_sensor) is the data point in the sensor coordinate system.









The second transformation takes into account the vehicle orientation and the sensor offset

distance and transforms the data point from the sensor coordinate system to the global frame.

The OD algorithm is based on the weighted sum of evidences. The traversability value of a

cell is computed based on the weighted sum of the evidence of the cell being occupied or free.

The evidence of a cell being an obstacle or free space is derived based on the current sensor

observation and initial evidences. A sensor observation may be defined as an outcome of the

sensor measurement used to evaluate the state of the system.

The sensor observation is managed internally using the two variables, 'OccupiedHits' and

'FreeHits' for each cell. After each laser scan the range measurements are transformed into

observations. For each single coordinate generated from the range value, the cell to which this

coordinate belongs in the Traversability Grid is determined, followed by all of the intervening

cells between the determined cell and the sensor. Bresenham's line algorithm [Foley 1990] and

[Novick 2002] is used to determine the indices of the intervening cells. The 'OccupiedHits'

buffer is incremented by one for the cell which receives the hit and the 'FreeHits' buffer is

incremented by one for all the intervening cells. For cases where the received range value is

beyond the Traversability Grid map, the cell at the intersection of the line formed by the range

value and the sensor origin with the bounds of the grid map is found. For all the cells on this line

the 'FreeHits' is incremented by one.

The evidence of a cell being occupied or free is computed as

W_occ(t) = W_occ(t-1) + OccupiedHits - k1 · FreeHits                  (5.2)


W_free(t) = W_free(t-1) + FreeHits - k2 · OccupiedHits                (5.3)

where

k1 and k2 are configurable parameters.









The first term in Equations 5.2 and 5.3 is the initial weight of evidence. The initial weight of

evidence defines the state of the system before the current sensor observation. These initial

weights could be viewed as a Markovian model. A Markov Chain is defined as a process where

the future may be stochastic but no variables prior to the current state may influence the

stochastic evolution of future states. Thus, all the information of the past state of the system is

represented by these initial weights.

The second term in the above equations represents the sensor observations. The observation

OccupiedHits strengthens the evidence of an obstacle and the observation FreeHits strengthens

the evidence of free space.

The third term in both the equations above acts as a parameter to adjust the speed of

response of the system. For example in the case of moving obstacles, once the obstacle is clear,

the third term in Equation 5.2 helps in fast recovery of free space. It also helps in clearing

spurious ground noise. Similarly, in the case where a cell is occupied by a moving obstacle the

third term in Equation 5.3 helps in fast recovery of occupied space. Note however that for the

case of fixed distinguishable obstacles the last terms of these equations tend to cancel each other

and hence do not have a major impact on the algorithm.

After computing the weights of evidence for each cell, the weighted sum is computed as:

W_sum = W_occ - p · W_free                                            (5.4)


where

p represents the ratio of evidence of a cell being occupied to a cell being free.

p is a tunable parameter. The value of p for current experimental results is selected as 1/6.

Finally the weighted sum value of the cell is mapped to the Traversability Value. The mapping

from W_sum to the Traversability Value is exponential as shown in Figure 5-7.
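
A per-cell sketch of this update, written in C, is shown below. The structure, the reset of the hit counters, and the particular exponential curve are illustrative assumptions; only Equations 5.2 through 5.4 and the 2-to-7 output range follow directly from the text.

#include <math.h>

/* Illustrative per-cell update for the OD algorithm (Equations 5.2-5.4).
 * k1, k2 and p are the configurable parameters named in the text
 * (p = 1/6 in the reported experiments). */
typedef struct
{
    double wOccupied;    /* accumulated evidence of an obstacle          */
    double wFree;        /* accumulated evidence of free space           */
    int    occupiedHits; /* hits registered in the current scan          */
    int    freeHits;     /* pass-throughs registered in the current scan */
} OdCell;

static unsigned char odUpdateCell(OdCell *c, double k1, double k2, double p)
{
    /* Equations 5.2 and 5.3: fold the current observation into the evidence. */
    c->wOccupied += c->occupiedHits - k1 * c->freeHits;
    c->wFree     += c->freeHits     - k2 * c->occupiedHits;

    /* Equation 5.4: weighted sum of the two evidences. */
    double wSum = c->wOccupied - p * c->wFree;

    /* Map wSum onto traversability values 2 (obstacle) to 7 (free space);
     * an assumed exponential curve stands in for the mapping of Figure 5-7. */
    double s  = wSum > 0.0 ? wSum : 0.0;
    double tv = 2.0 + 5.0 * exp(-s);

    c->occupiedHits = c->freeHits = 0;   /* observations have been consumed */
    return (unsigned char)(tv + 0.5);
}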









The algorithm is very similar to the Bayes filter discussed in Chapter 2, since any

algorithm that increments and decrements a state variable (in our case occupied or free) in

response to sensor measurements can be interpreted as a Bayes filter. The main difference

between the current algorithm and the Bayes filter is that the Bayes filter computes the

probability of one state variable, occupied or free-space. The other variable is just the negation of

the computed variable. In this case both the state variables are computed and then a weighted

average of these variables is performed.

The OD identifies positive obstacles and renders no opinion regarding the smoothness or

traversability of areas where no positive obstacle is reported. Hence it reports Traversability

values from 2 to 7. A cell with a value of 2 has a high probability of a positive obstacle while a

cell with a value of 7 is free space. A traversability value higher than 7 is not assigned to a cell

since it is not known if the free space is a smooth traversable path or a rough terrain or even a

negative obstacle.

Terrain Evaluation

In the terrain evaluation (TE) algorithm, the terrain in front of the vehicle is mapped with

the laser. A Cartesian elevation map is built from the successive laser scans and the positioning

system readings corresponding to these scans. From the 3-dimensional map, the terrain is

classified based on the geometry of the terrain. A set of classification features is generated by

performing statistical analysis on the terrain map.

Two LADAR systems are used to map the terrain. These two LADARs will be identified

as TerrainLADAR1 and TerrainLADAR2. The TerrainLADAR1 is mounted at an angle of 6°

and the TerrainLADAR2 is mounted at an angle of 12° facing forward towards the ground. Both

these LADARs are mounted at a height of 1.9 m above the vehicle ground level. The two

LADARs populate two different maps of the terrain and each terrain map is classified










individually. The range readings from the two different lasers are not mapped into a common

terrain map, since the two lasers scan the same part of the terrain at different times and hence the

two sets of readings would have a built-in GPS localization error between them.

The LADARs are set to a configuration of 100° angular range and 0.25° angular

resolution. Figure 5-8 shows the schematic of the field of view of the two lasers. With this

configuration and for nominal conditions (flat ground surface, vehicle level), the

TerrainLADAR1 scans at a distance of ~18 m ahead of the vehicle and ~43 m wide and the

TerrainLADAR2 scans the ground at a distance of ~9 m ahead of the vehicle and ~21.4 m wide.

Terrain Mapping

After each complete scan of 100° the range data reported by the lasers are mapped into 3-D

point clouds. The data points are stored in the corresponding cells of the Traversability Grid as a

linked list of 3-D Cartesian coordinates. The conversion from the range data to the 3-D point

cloud is done in the following steps.

The range data reported by the laser is converted into Cartesian coordinates local to the

sensor. The local sensor coordinate system has its origin coincident with the origin of the laser

beam and the X-Y plane coincident with the laser scanning plane. The local sensor coordinates

are computed from the range information as follows:

x_s = range · sin α
y_s = range · cos α                                                   (5.5)
z_s = 0.0

where

α is the angle of the laser beam with respect to the y-axis of the sensor coordinate system.










Next, the local sensor coordinate is converted into the vehicle coordinate system. The

vehicle coordinate system has its X-axis in the direction of the vehicle, and Z-axis vertically

down.

x_v = x_s · cos θ + x_offset
y_v = y_s + y_offset                                                  (5.6)
z_v = x_s · sin θ + z_offset

where

θ is the laser tilt angle with respect to the horizontal plane and

(x_offset, y_offset, z_offset) is the position of the sensor origin in the vehicle coordinate system.

The data is then transformed into the global coordinate system attached to the vehicle as

well as a fixed global coordinate system by taking into account the vehicle orientation. The

transformation from the local to global coordinate system attached to the vehicle is given as

follows:

[x_G]   [ cosψ·cosθ   -sinψ·cosφ + cosψ·sinθ·sinφ    sinψ·sinφ + cosψ·sinθ·cosφ   x_0 ] [x_v]
[y_G] = [ sinψ·cosθ    cosψ·cosφ + sinψ·sinθ·sinφ   -cosψ·sinφ + sinψ·sinθ·cosφ   y_0 ] [y_v]
[z_G]   [ -sinθ        cosθ·sinφ                      cosθ·cosφ                    z_0 ] [z_v]
[ 1 ]   [  0           0                              0                             1  ] [ 1 ]


(5.7)

where

(x_G, y_G, z_G) are the global coordinates of the point,

(ψ, θ, φ) are the yaw, pitch and roll respectively and

(x_0, y_0, z_0) are the sensor coordinates in the global coordinate system.
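
The chain of Equations 5.5 through 5.7 can be summarized in a single routine. The sketch below is illustrative only; the helper structure and argument names are assumptions, and the actual implementation may factor the transforms differently.

#include <math.h>

/* Sketch of Equations 5.5-5.7: convert one laser range reading into global
 * coordinates.  The helper structure and argument names are assumptions. */
typedef struct { double x, y, z; } Point3;

Point3 rangeToGlobal(double range, double alpha,  /* beam angle in the scan plane */
                     double tilt,                 /* laser tilt angle (theta)     */
                     Point3 offset,               /* sensor offset, vehicle frame */
                     double yaw, double pitch, double roll,
                     Point3 sensorOrigin)         /* sensor origin, global frame  */
{
    /* Equation 5.5: polar range reading to local sensor coordinates. */
    double xs = range * sin(alpha);
    double ys = range * cos(alpha);

    /* Equation 5.6: local sensor coordinates to vehicle coordinates. */
    double xv = xs * cos(tilt) + offset.x;
    double yv = ys + offset.y;
    double zv = xs * sin(tilt) + offset.z;

    /* Equation 5.7: vehicle coordinates to the global frame. */
    double cy = cos(yaw),   sy = sin(yaw);
    double cp = cos(pitch), sp = sin(pitch);
    double cr = cos(roll),  sr = sin(roll);

    Point3 g;
    g.x = cy*cp*xv + (-sy*cr + cy*sp*sr)*yv + ( sy*sr + cy*sp*cr)*zv + sensorOrigin.x;
    g.y = sy*cp*xv + ( cy*cr + sy*sp*sr)*yv + (-cy*sr + sy*sp*cr)*zv + sensorOrigin.y;
    g.z = -sp*xv   + cp*sr*yv               + cp*cr*zv              + sensorOrigin.z;
    return g;
}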









The fixed global coordinate system has the same orientation as the global coordinate system

attached to the vehicle, but the origin is at a fixed point whereas the origin of the global

coordinate system attached to the vehicle moves with the vehicle. The fixed origin is necessary

to take into account the vehicle motion to build the point cloud. The coordinates of the data point

in the fixed coordinate system are computed by adding the vehicle coordinates to the global

coordinates obtained above. The data point is stored in the cell corresponding to the location of

the point. The coordinate system attached to the vehicle gives the location of the point in the

Traversability Grid.

The maximum number of data points that can be stored in one cell is limited by a

configuration variable. In case the maximum number of data points in a cell is reached, the next

new data point is compared with the already existing list of data points. If the coordinates of the

new data point do not match any of the existing data points, the first data point stored in

the linked list is replaced by the new data point. The new data point is assumed to match an

already existing data point if it is contained in a 0.1 m cube centered on the old data point. After

the mapping of the data points is complete the next step is the classification of the Traversability

Grid.
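
A sketch of this bounded per-cell point storage is shown below. It uses a fixed array in place of the linked list described above, and the capacity constant is an assumed configuration value; only the 0.1 m matching cube and the replace-the-oldest policy come from the text.

#include <math.h>

/* Sketch of the bounded per-cell point storage described above. */
#define MAX_CELL_POINTS 32
#define MATCH_HALF_SIDE 0.05    /* half of the 0.1 m cube side */

typedef struct { double x, y, z; } CellPoint;

typedef struct
{
    CellPoint points[MAX_CELL_POINTS];
    int count;   /* points currently stored          */
    int oldest;  /* index of the oldest stored point */
} CellPointList;

static int pointsMatch(const CellPoint *a, const CellPoint *b)
{
    return fabs(a->x - b->x) <= MATCH_HALF_SIDE &&
           fabs(a->y - b->y) <= MATCH_HALF_SIDE &&
           fabs(a->z - b->z) <= MATCH_HALF_SIDE;
}

static void cellAddPoint(CellPointList *c, CellPoint p)
{
    if (c->count < MAX_CELL_POINTS) {        /* room left: append the point   */
        c->points[c->count++] = p;
        return;
    }
    for (int i = 0; i < c->count; i++)       /* full: discard near-duplicates */
        if (pointsMatch(&c->points[i], &p))
            return;
    c->points[c->oldest] = p;                /* otherwise replace the oldest  */
    c->oldest = (c->oldest + 1) % MAX_CELL_POINTS;
}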

Terrain Classification

Each cell in the Traversability Grid is evaluated individually and classified for its

traversability value. The following geometrical features are used for the classification:

1. The slope of the best fitting plane through the data points in each cell.

2. The variance of the elevation of the data points within the cell.

3. The weighted neighborhood analysis.

4. Negative obstacle algorithm.










The first three of these criteria are used for the classification of the laser data from both the

LADARs. The negative obstacle algorithm is implemented only for the data from the

TerrainLADAR2.

Classification based on the slope of the best fitting plane

The slope feature helps in distinguishing discontinuous terrain or obstacle surfaces from a

gradual slope. The classification of the terrain is based on the fact that the traversability of the

vehicle decreases with the increase in the slope of the terrain.

The slope of the terrain is computed as the tangent of the angle between the X-Y (ground)

plane and the computed plane of the terrain. The slope value is computed individually for each

terrain patch corresponding to the respective cell in the Traversability Grid. Based on the data

points in the cell, the terrain patch is approximated as a planar surface using the least squares

error approximation.

To approximate the plane of the terrain patch a minimum of three data points must be

present in the cell. The equation for the best fitting plane, derived using the least squares solution

technique, is given as:


S_normal = (Gᵀ G)⁻¹ Gᵀ b                                              (5.8)

where:

S_normal is the vector (S_x, S_y, S_z) perpendicular to the best fitting plane,

G is an n × 3 matrix given by:

    [ x_1  y_1  z_1 ]
G = [ x_2  y_2  z_2 ]
    [  :    :    :  ]
    [ x_n  y_n  z_n ]

b is a vector of length n given by:

    [ -D ]
b = [  :  ]
    [ -D ]



Assuming D equal to 1, Equation 5.8 is used to find S_normal for the data points within

each cell. Chapter 5 of [Solanki 2003] provides a thorough proof of Equation 5.8 for finding

the perpendicular to a best fitting plane given a set of 3-D data points. Once the vector

perpendicular to the best fitting plane is known, the slope of the computed plane with respect to

the X-Y (ground) plane is computed as follows:



Slope = √(S_x² + S_y²) / |S_z|                                        (5.9)


The above computed slope value represents the tangent of the angle between the best

fitting plane and the ground plane. This value is used to assign the traversability value. The

assignment of the traversability value is heuristic and based on comparing the classification

results with the actual terrain conditions. Table A-1 lists the mapping of the slope value to the

traversability value used in the experiments for this research. Instead of showing the tangent of

the angle, Table A-1 shows the value of the angle between the best fitting plane and the

ground plane.
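
The following sketch shows one way to compute this slope feature in C, solving the 3 x 3 normal equations of Equation 5.8 with Cramer's rule and applying Equation 5.9. The numerical details (tolerances and degenerate-case handling) are illustrative assumptions.

#include <math.h>

/* Sketch of the slope feature (Equations 5.8-5.9): fit the plane
 * Sx*x + Sy*y + Sz*z + 1 = 0 to the cell's points by least squares and
 * return the tangent of its angle with the ground plane. */
static double det3(const double m[3][3])
{
    return m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
         - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
         + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]);
}

double cellSlope(const double *x, const double *y, const double *z, int n)
{
    if (n < 3)
        return -1.0;                          /* not enough points to fit a plane */

    double A[3][3] = {{0.0}}, rhs[3] = {0.0};
    for (int i = 0; i < n; i++) {
        double p[3] = { x[i], y[i], z[i] };
        for (int r = 0; r < 3; r++) {
            for (int c = 0; c < 3; c++)
                A[r][c] += p[r] * p[c];       /* accumulate G'G              */
            rhs[r] -= p[r];                   /* G'b with b = (-1, ..., -1)  */
        }
    }

    double d = det3(A);
    if (fabs(d) < 1e-12)
        return -1.0;                          /* degenerate configuration    */

    double S[3];
    for (int c = 0; c < 3; c++) {             /* Cramer's rule, Equation 5.8 */
        double M[3][3];
        for (int r = 0; r < 3; r++)
            for (int k = 0; k < 3; k++)
                M[r][k] = (k == c) ? rhs[r] : A[r][k];
        S[c] = det3(M) / d;
    }

    if (fabs(S[2]) < 1e-12)
        return 1e9;                           /* near-vertical plane         */
    return sqrt(S[0]*S[0] + S[1]*S[1]) / fabs(S[2]);   /* Equation 5.9 */
}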

Classification based on the variance

The variance is a measure of the dispersion of data. The dispersion of data is defined as the

extent to which the data is scattered about the zone of central tendency. The variance is defined

as the sum of the squares of the deviations from the mean value divided by the number of

observations. For the terrain classification problem, the variance of the data in the Z direction is

an important parameter. The variance of the data points in the Z direction gives an indication of









the traversability of the terrain. A higher value of the variance indicates a rougher, uneven and

hence less traversable terrain while decrease in the variance indicates a smoother, traversable

terrain condition. From the data points within each cell, the variance of the terrain patch

corresponding to the respective cell is measured as follows:


variance = (1/n) Σᵢ (Z_i − μ)²                                        (5.10)


where μ is the mean height of the cell given as:


μ = (1/n) Σᵢ Z_i                                                      (5.11)


Z_i is the elevation of a data point, and

n is the number of data points.

The variance value computed using Equation 5.10 is mapped to a traversability value

which is assigned to the cell. The mapping from the variance value to the traversability value is

heuristic and based on classification results obtained from driving through different terrain

conditions. The mapping variables are stored as configuration parameters in the config file. The

mapping values used in the experimental results of this dissertation are shown in Table A-2.
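
A minimal sketch of the per-cell elevation statistics of Equations 5.10 and 5.11 is given below; the function name is illustrative.

/* Minimal sketch of Equations 5.10-5.11: the mean elevation and elevation
 * variance of the points stored in a cell. */
double cellElevationVariance(const double *z, int n)
{
    if (n < 2)
        return 0.0;

    double mean = 0.0;
    for (int i = 0; i < n; i++)
        mean += z[i];
    mean /= n;                                /* Equation 5.11 */

    double var = 0.0;
    for (int i = 0; i < n; i++)
        var += (z[i] - mean) * (z[i] - mean);
    return var / n;                           /* Equation 5.10 */
}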

Weighted Neighborhood Analysis

As discussed in Chapter 2, a few research papers [Ye 2004] use the mean height as a

criterion for evaluating the terrain. However, using absolute height as a parameter to evaluate

terrain often results in misclassification of the terrain especially in case of uphill and downhill

slopes. Instead of evaluating terrain based on the absolute height of each cell, a measure of

terrain evaluation presented here is to compare the height of neighboring cells. The comparison

of the neighboring cells gives an indication of the discontinuity in the terrain between the two

cells.









One of the difficult problems in neighborhood analysis is determining the extent of

neighboring cells to use for comparison. Figure 5-9 shows the application of neighborhood

analysis in the present context. Assuming that the current position of the vehicle is on a

traversable region, consider the mean height of the center cell of the grid as ideal (the vehicle is

always in the center of the grid). Now, move out from the center cell and compare the mean

height of each of the cells adjacent to the center cell with the height of the center cell. As the

cells examined expand from the center, compare the height of each cell to be evaluated with the

neighboring cells that are between the vehicle (i.e., the center cell) and the cell being evaluated.

The idea here is that since these neighboring cells fall in between the vehicle position and the

cell being evaluated, for the vehicle to travel through the cell, it has to pass through these

neighboring cells. Hence if the terrain is discontinuous between the cell under consideration and

the neighboring cells, then the cell is less likely to be traversable. For any cell in the grid there

are three neighboring cells as shown in Figure 5-9, which fall in between the cell and the center

of the grid. These three neighboring cells are the cell in the adjacent row, the cell in the adjacent

column and the diagonally adjacent cell. Since more than one cell is adjacent to

the cell being evaluated, an algorithm is designed to decide on the importance of each neighboring

cell. The algorithm takes into consideration that the importance of each of the individual

neighboring cells depends on the position of the cell under consideration in the grid. For

example, for a cell which is in the diagonal direction of the grid, the diagonally neighboring cell

will have more weight than the neighboring row and column cells. Similarly, the cells which

are towards the center row of the grid will put more weight on the neighboring column cell

and the cells towards the center column of the grid will put more weight on the neighboring









row cell. Each neighboring cell is assigned a weight, which is computed based on the position of

the cell in the grid.

For any cell in the grid, compute the unit vector in the direction from the cell to the center

of the grid. The neighboring cells are the adjacent cells that are closer to the center. These are

assigned weights depending on the direction of the vector. Consider a unit vector, v = a·i + b·j,

which represents the direction from a cell to be evaluated to the center of the grid (vehicle

position). The i component of the vector is in the direction of the neighboring row cell and the j

component is in the direction of the neighboring column cell discussed above. The neighboring

row, column and diagonal cells are assigned weights c1, c2, and c3 as follows:

c1 = a / (a + b + √(a·b))

c2 = b / (a + b + √(a·b))                                             (5.12)

c3 = √(a·b) / (a + b + √(a·b))


From the Equations 5.12, it can be seen that the weights are chosen based on the

components of the unit vector in each of the directions. These weights are normalized so that the

sum of the weights is always 1.

A weighted neighborhood analysis is then done for the cell being evaluated. Thus,

depending on the height difference between the cell being evaluated and the neighboring cells, a

traversability value is assigned to the cell. For example, consider the cell shown in Figure 5-9.

The neighborhood cell analysis value is calculated as follows:










Value(i, j) = c1·[h(i, j) − h(i−1, j)] + c2·[h(i, j) − h(i−1, j−1)] + c3·[h(i, j) − h(i, j−1)]

(5.13)

where

h(i, j) is the height of the cell in row i and column j.

The above computation is shown for a cell in the first quadrant of the grid. Similarly for the cells

lying in the different quadrants of the grid, the corresponding neighboring cells are selected

using the scheme discussed above. The cells in the center row and the center column are treated

as special cases and the neighborhood analysis of these cells is done by comparing them with

only one cell, which is the neighboring cell directly in between the cell being evaluated and the

center of the grid.

The neighborhood analysis value obtained for each cell from the above calculations is

mapped to a traversability value. Again in this case the mapping is heuristic and is saved as

configuration parameters in the config file. Table A-3 shows the mapping of the Weighted

Neighborhood analysis value to the traversability value used in the current experimental results.
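
The sketch below illustrates the neighborhood value computation for a cell in the first quadrant, using the weights of Equation 5.12 as reconstructed above and the neighbor pairings of Equation 5.13 as printed. The use of absolute height differences and the helper accessor h(i, j) are assumptions made for illustration.

#include <math.h>

/* Sketch of the weighted neighborhood analysis for a first-quadrant cell. */
double neighborhoodValue(double (*h)(int row, int column), int i, int j,
                         int centerRow, int centerColumn)
{
    /* Unit vector from the cell toward the grid center (vehicle position). */
    double di  = (double)(centerRow - i);
    double dj  = (double)(centerColumn - j);
    double len = sqrt(di * di + dj * dj);
    if (len < 1e-9)
        return 0.0;                     /* the vehicle cell itself */
    double a = fabs(di) / len;          /* component toward the row neighbor    */
    double b = fabs(dj) / len;          /* component toward the column neighbor */

    /* Equation 5.12: weights normalized so that c1 + c2 + c3 = 1. */
    double denom = a + b + sqrt(a * b);
    double c1 = a / denom;
    double c2 = b / denom;
    double c3 = sqrt(a * b) / denom;

    /* Equation 5.13, written for a cell in the first quadrant of the grid. */
    return c1 * fabs(h(i, j) - h(i - 1, j))
         + c2 * fabs(h(i, j) - h(i - 1, j - 1))
         + c3 * fabs(h(i, j) - h(i, j - 1));
}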

Negative Obstacle Detection

In a rough outdoor environment, negative obstacles such as big holes or cliffs on the side

of the road are very common. With the laser range sensor, the only information one can obtain is

the distance of the object (terrain surface or obstacle) hit by the laser beam. With this

information, the terrain map was built and algorithms were developed to classify terrain.

However, in the case of voids or empty space, the only useful information that can be extracted from the

laser data is that the region in between the laser and the laser range reading is free space. The

negative obstacle algorithm makes use of this information to estimate whether the free space is a

negative obstacle.









The algorithm compares each range reading obtained from the laser to the expected range

reading. The expected range reading is computed based on the geometry of the laser beam. The

unit vector in the direction of the laser beam is known since the laser is fixed with respect to the

vehicle and the vehicle orientation is known. Expecting good terrain condition, it is assumed that

the region in front of the vehicle is a level ground in the plane of the vehicle. From the geometry

of the laser beam (i.e., the line vector in the direction of the laser beam) and the assumption of a

level ground (X-Y plane) the expected range reading is computed by solving the problem of the

intersection of a plane and a line.

Consider a unit vector in the direction of the laser beam. The unit vector is expressed in the

Laser coordinate system using Equation 5.5. The unit vector is then transformed to the vehicle

orientation and subsequently to the global orientation using Equations 5.6 and 5.7. The sensor

offset terms in Equation 5.6 are not considered, since it is only the direction of the vector which

needs to be expressed in the global orientation frame.

The expected range distance in the direction of the unit vector is computed using the

information that the z-component of the expected range vector expressed in the global

orientation frame with the origin attached to the sensor is equal to the height of the laser above

the ground plane. The expected range is given as:

d = h / (ê_r · k̂)
(5.14)

where

d is the range distance from the sensor origin to the point of intersection,

h is the height of the sensor above the vehicle ground plane,

ê_r is the unit vector representing the direction of the laser beam in the global coordinate

system, and k̂ is the unit vector along the downward vertical (Z) axis.
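
A small sketch of the expected-range test built around Equation 5.14 is shown below. It assumes the beam direction has already been rotated into the global orientation frame with Z pointing down; the threshold parameter and names are illustrative.

#include <math.h>

/* Sketch of the negative obstacle test built around Equation 5.14. */
typedef struct { double x, y, z; } Vec3;

/* Expected range to level ground for beam direction e and sensor height h. */
double expectedRange(Vec3 e, double h)
{
    if (e.z <= 1e-6)
        return -1.0;          /* beam never intersects the ground plane */
    return h / e.z;           /* Equation 5.14: d = h / (e . k)         */
}

/* Nonzero when the measured range overshoots the expected range by more than
 * the threshold, suggesting a possible negative obstacle. */
int possibleNegativeObstacle(double measuredRange, Vec3 e, double h,
                             double threshold)
{
    double d = expectedRange(e, h);
    return d > 0.0 && (measuredRange - d) > threshold;
}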









The expected range reading is subtracted from the actual reading obtained from the laser

and the difference is used as a factor to assign traversability value. If the difference is less than a

threshold value, the algorithm concludes that ground exists (positive obstacle or smooth flat

terrain) and does not report anything. However, if the difference is above the threshold value, the

possibility of a negative obstacle is assumed. The cell in the Traversability Grid which would

otherwise register the expected reading (in case of flat ground plane) is assigned a traversability

value. This cell would lie on an imaginary plane formed as an extension of the vehicle ground

plane similar to the one shown in Figure 5-10. The cell can easily be found since the range vector

in the Global coordinate system is already known from the Equation 5.14 and it just needs to be

transformed from the sensor origin to the vehicle origin (center of the Traversability Grid).

The severity of assigning this cell as a negative obstacle increases with the increase in the

difference between the actual and expected range distance readings. The algorithm assigns a

traversability value between 2 and 7, since it seeks only negative obstacles and does not render

any opinion on smoothness or good terrain conditions.

The algorithm is very sensitive to the tilt angle of the laser with respect to the ground

plane. The algorithm cannot distinguish between the severities of negative obstacle if the change

in the slope of the ground plane is greater than the tilt angle of the laser. The TerrainLADAR2

sensor is tilted at an angle of 12° towards the ground and the TerrainLADAR1 is tilted at an

angle of 6° towards the ground. The TerrainLADAR2 sensor would be able to distinguish between a

traversable path and a negative obstacle as long as the change in the slope of the traversable path is

less than 12°; if this change is greater than 12°, there is a high probability of a false positive

being registered. For the TerrainLADAR1 this allowable change in slope is limited to only 6°.

Hence, the above algorithm is implemented only for the data from the TerrainLADAR2 sensor.










The probability of registering a false positive also depends on the vehicle orientation. For

example, in cases where the vehicle is on a horizontal ground plane and there is a downhill slope

in front of the laser, with magnitude greater than the laser tilt angle, the algorithm would indicate

a negative obstacle irrespective of whether the slope is a gradual, smooth downhill slope or it is a

sudden discontinuity in the terrain. However, if the vehicle orientation is pitched up i.e., the

vehicle is on an uphill slope, the output from the algorithm would depend on the vehicle

orientation angles, since the direction of the laser beam depends on the vehicle orientation

angles.

The negative obstacle detection algorithm assigns traversability values only to the cells

which do not register any data points from the terrain mapping algorithm. If there are data points

present in the cell, the cell is not evaluated for a negative obstacle.

Terrain Evaluation Output

The previous sections discussed the four algorithms used to evaluate the terrain

characteristics. These algorithms are implemented separately on the data from each of the two sensors.

Figure 5-5 showed the overall block diagram of the LTSS component. From the figure it can be

seen that the terrain is evaluated by processing the data from each of the Terrain LADARs

separately. The blocks terrain evaluation 1 and terrain evaluation 2 represent the terrain

evaluation process for the two LADARs. A more elaborate picture of the terrain evaluation 1

block is shown in the Figure 5-11. As discussed in the previous sections and as shown in the

figure, the laser data is first mapped into a terrain model. This terrain map is then input to each of

the terrain evaluation algorithms. The Traversability Grid output from the terrain evaluation 1

block is the fusion of the traversability values obtained from each of the algorithms.

The slope and the variance algorithms evaluate the cell based on the data across each

single cell. Each of these algorithms would work better than the other depending on the terrain









condition being evaluated. The average of the slope and variance based traversability values

gives an evaluation of the cell based on the data within the cell. The nearest neighborhood

analysis evaluates the cell based on the discontinuity in the data between cells. The output of the

terrain evaluation 1 block is the minimum of the above two values:

TTV1 = min( (TV_slope + TV_variance) / 2 , TV_neighborhood )          (5.15)

The traversability values from the terrain evaluation algorithm implemented on the

TerrainLADAR2 are obtained using an equation similar to Equation 5.15. Except in the case of

TerrainLADAR2, if there is no data present in a particular cell and that cell has been evaluated

by the negative obstacle detection algorithm, the cell would be assigned a traversability value

from the negative obstacle detection algorithm. All the other cells which do not contain any data

and which have not been evaluated by the negative obstacle detection algorithm are considered

to be not in the field of view of the sensor and hence a traversability value of 14 (unknown) is

assigned to the cell.

A simple averaging algorithm combines the outputs of the terrain evaluation 1 and terrain

evaluation 2. The computed average is the traversability value of the cell based on the terrain

evaluation:

TTV = (TTV1 + TTV2) / 2                                               (5.16)

where,

TTV is the traversability value of the cell based on terrain evaluation,

TTV1 is the traversability value of the cell assigned by the terrain evaluation 1,

TTV2 is the traversability value of the cell assigned by the terrain evaluation 2.
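
The combination of the terrain evaluation outputs can be sketched as follows, using the reconstructed Equation 5.15 for each terrain LADAR and Equation 5.16 for the final average. The rounding and the omission of the unknown-value (14) handling are simplifications made for illustration.

/* Sketch of the combination of the terrain evaluation outputs. */
static unsigned char terrainEvalValue(unsigned char tvSlope,
                                      unsigned char tvVariance,
                                      unsigned char tvNeighborhood)
{
    unsigned char withinCell = (unsigned char)((tvSlope + tvVariance + 1) / 2);
    return withinCell < tvNeighborhood ? withinCell : tvNeighborhood;  /* Eq. 5.15 */
}

static unsigned char terrainFusedValue(unsigned char ttv1, unsigned char ttv2)
{
    return (unsigned char)((ttv1 + ttv2 + 1) / 2);                     /* Eq. 5.16 */
}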

Advantages and Limitations of the OD and TE Sensor Algorithms

The experiments conducted with the above sensor algorithms reveal the following

advantages and limitations:









1. The obstacle detection (OD) algorithm can be used only for positive obstacles; it gives no
opinion on the smoothness of the terrain and negative obstacles.

2. In off-road conditions the OD generates a lot of ground noise. Most of the noise is due to
misclassification of an approaching uphill slope or due to going down a hill transitioning
into flat ground.

3. However the OD algorithm is not as complex as the terrain evaluation (TE) and it does
not have to create a 3-D point cloud. It is very reliable in identifying positive obstacles.
The grid is updated about 35 times a second in the region bounded by the field of view of
the sensor. In this region moving obstacles which have passed are cleared in the grid and
if a moving obstacle shows up in the grid, the grid will be updated.

4. Since the OD algorithm does not depend on mapping the true coordinate of the point, but
just checks to see if the point belongs to a cell, the error in mapping the obstacle is very
small compared to the TE algorithm.

5. The main concern with the TE algorithms is modeling the ground plane. Since the data is
collected in successive scans, the ground plane of the vehicle changes. Thus each time the
points are registered with a different reference plane. In cases where the vehicle is on flat
smooth terrain this is not a problem. However, in cases of uneven terrain, it is very
difficult to relate the data in a common ground plane. Although the points are registered
in a fixed global frame there is some error associated with the registration process and
experiments have shown the magnitude of the error depends on the condition of the
terrain.

6. Since the look-ahead distance of the Terrain LADARs is limited by the tilt angle of the
laser, in the present case the TE algorithm is effective for a range of only up to 18 m. It does
not provide any information about obstacles farther than this distance.

7. The TE algorithm maps moving obstacles as part of the terrain and hence does not clear
them from the grid after they have passed.

8. In spite of the above disadvantages of the TE, the algorithm actually maps the
surrounding into a 3D point cloud and characterizes the terrain based on slope, variance
and discontinuities, and hence the classification is based on more detailed information of
the surrounding as compared to the OD.

The conclusion that can be drawn from the discussion and actual implementation of the

OD and TE sensors is that these sensors provide very good classification results in a limited

range of environmental conditions. However, much uncertainty is associated with these two

sensor implementations when using them in the real-world heterogeneous environment. The










following section presents an uncertainty management tool, which is used to fuse the outputs

from these two algorithms.

Fusion of the Sensor Components

In the previous sections two different sensor algorithms, the OD and the TE, were

developed and the advantages and disadvantages of each were discussed. To take advantage of

each sensor algorithm and at the same time overcome some of its limitations, the outputs from

the above sensors are fused together using a simple rule-based forward reasoning scheme. The

uncertainties associated with the two sensors are combined using certainty factors [4]. The

certainty factor (CF) formalism presents an approach to combine evidences supporting or

contradicting a hypothesis. As opposed to the Bayesian analysis where only those evidences

supporting a hypothesis can be combined, the certainty factor formalism provides a mechanism

to combine contradictory evidences. The certainty factor value for a particular hypothesis is

between -1.0 and 1.0. A CF value of 1.0 represents complete confidence in the hypothesis

while a CF value of -1.0 represents complete confidence against the hypothesis. In the present

case, a CF value of 1.0 indicates complete confidence in the presence of an obstacle or highly

non-traversable region while a CF value of -1.0 represents a highly traversable and smooth region.

The value of 0.0 represents a cell which does not show any confidence either towards the

presence of an obstacle or towards a desirable traversable path. The rule based reasoning scheme

assigns different CF values to each of the sensors based on the observed readings. Information

such as the mean height of the cell is used in determining the confidence level, i.e., whether the sensor

should have been able to see the obstacle.

The traversability values obtained from each of the sensors are converted into evidence

of the presence of an obstacle or of a traversable path. The confidence in the evidence

presented by each of these sensors is represented as follows.










CF_evidenceOD is the evidence of the presence of an obstacle detected by the OD algorithm. As discussed in the previous sections, since the OD sensor does not give any input on the traversable path, the value of this variable is between 0.0 (representing a traversability value of 7) and 1.0 (representing a traversability value of 2), depending on the presence of an obstacle.

CF_evidenceTE is the evidence of the traversability or non-traversability of the cell detected by the TE sensor. The value of this variable is between -1.0 (traversability value of 12) and 1.0 (traversability value of 2), depending on the traversability of the cell.
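The exact conversion is not reproduced here; as a rough illustration only, assuming a linear relation between the end points stated above, the evidence values could be computed as below (C, with hypothetical function names):

    /* Hedged sketch: linear conversion from traversability values to
     * evidence CFs, assuming linearity between the stated end points. */
    double od_evidence(int travOD)        /* 7 -> 0.0, 2 -> 1.0 */
    {
        return (7.0 - (double)travOD) / 5.0;
    }

    double te_evidence(int travTE)        /* 12 -> -1.0, 7 -> 0.0, 2 -> 1.0 */
    {
        return (7.0 - (double)travTE) / 5.0;
    }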

The pseudo code for the implementation of the fusion of these two sensor algorithms is

presented below.

CF_OD and CF_TE are the certainties associated with the respective sensors, depending on which rule executes. CF_LTSS is the combined CF value.

The following scheme is applied:

Case 1: The OD indicates the Cell is OCCUPIED

IF TE = UNKNOWN
THEN
    CF_OD = 1.0 x CF_evidenceOD
    CF_LTSS = CF_OD

IF TE = NON-TRAVERSABLE
THEN
    CF_OD = 1.0 x CF_evidenceOD
    CF_TE = 1.0 x CF_evidenceTE
    CF_LTSS = CF_OD + CF_TE x (1 - CF_OD)

IF TE = TRAVERSABLE
THEN
    CF_OD = 0.9 x CF_evidenceOD
    CF_TE = 0.9 x CF_evidenceTE
    CF_LTSS = (CF_OD + CF_TE) / (1 - min(|CF_OD|, |CF_TE|))

Case 2: The OD indicates the Cell is FREE

IF TE = UNKNOWN
THEN
    CF_LTSS = CF_OD

IF TE = NON-TRAVERSABLE
THEN
    IF (Mean Height <= 0.6 m)
        THEN CF_TE = 1.0 x CF_evidenceTE
    IF (Mean Height > 0.6 m && Mean Height < 0.8 m)
        THEN CF_TE = 0.8 x CF_evidenceTE
    IF (Mean Height >= 0.8 m)
        THEN CF_TE = 0.2 x CF_evidenceTE

    CF_LTSS = CF_TE

IF TE = TRAVERSABLE
THEN
    CF_TE = 1.0 x CF_evidenceTE
    CF_LTSS = CF_TE

Case 3: The OD indicates the Cell is UNKNOWN

    CF_TE = 1.0 x CF_evidenceTE
    CF_LTSS = CF_TE
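The rules above can be collected into a single per-cell function. The following C sketch is illustrative only: the enumerations, names and structure are assumptions rather than the actual LTSS source, and the division by 1 - min(|CF_OD|, |CF_TE|) in the conflicting-evidence branch is the standard certainty factor rule for evidence of opposite sign, assumed here to match the scheme above.

    #include <math.h>

    typedef enum { OD_FREE, OD_OCCUPIED, OD_UNKNOWN } OdState;
    typedef enum { TE_TRAVERSABLE, TE_NON_TRAVERSABLE, TE_UNKNOWN } TeState;

    /* Hedged sketch of the per-cell fusion rules listed above. */
    double fuse_cell(OdState od, TeState te,
                     double cfEvOD, double cfEvTE, double meanHeight)
    {
        double cfOD, cfTE;

        if (od == OD_OCCUPIED) {
            if (te == TE_UNKNOWN)
                return 1.0 * cfEvOD;
            if (te == TE_NON_TRAVERSABLE) {            /* agreeing evidence */
                cfOD = 1.0 * cfEvOD;
                cfTE = 1.0 * cfEvTE;
                return cfOD + cfTE * (1.0 - cfOD);
            }
            cfOD = 0.9 * cfEvOD;                       /* conflicting evidence */
            cfTE = 0.9 * cfEvTE;
            return (cfOD + cfTE) / (1.0 - fmin(fabs(cfOD), fabs(cfTE)));
        }

        if (od == OD_FREE) {
            if (te == TE_UNKNOWN)
                return cfEvOD;                         /* 0.0 for a free cell */
            if (te == TE_NON_TRAVERSABLE) {            /* scale TE by whether the OD
                                                          laser should have seen it */
                if (meanHeight <= 0.6)      cfTE = 1.0 * cfEvTE;
                else if (meanHeight < 0.8)  cfTE = 0.8 * cfEvTE;
                else                        cfTE = 0.2 * cfEvTE;
                return cfTE;
            }
            return 1.0 * cfEvTE;                       /* TE traversable */
        }

        return 1.0 * cfEvTE;                           /* OD unknown: TE alone */
    }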

The CF values are mapped back into traversability values. As discussed earlier, -1.0 corresponds to a value of 12, 0.0 corresponds to a value of 7 and 1.0 corresponds to a value of 2. Should one of the sensors fail to report, all the values in the grid for that sensor are

marked as unknown, and the above scheme would give the output from the other sensor as the









fused output. Hence, the fusion process is modular in the sense that either sensor's output would still be valid as a final output should the other sensor fail.
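A minimal sketch of this reverse mapping, assuming a linear scale through the three stated anchor points and simple rounding (both assumptions, in C):

    /* Hedged sketch: map a combined CF back to a traversability value,
     * assuming -1.0 -> 12, 0.0 -> 7 and 1.0 -> 2 with linear spacing.  */
    int cf_to_traversability(double cf)
    {
        double value = 7.0 - 5.0 * cf;
        if (value < 2.0)  value = 2.0;
        if (value > 12.0) value = 12.0;
        return (int)(value + 0.5);        /* round to the nearest grid value */
    }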

Implementation of the LTSS as a JAUS component

The LTSS is implemented as a JAUS component. This section starts with a brief overview

of the JAUS architecture followed by the JAUS implementation on the autonomous test

platform, NaviGator. Finally a detailed explanation of the implementation of the LTSS as an

experimental JAUS component is given.

Joint Architecture for Unmanned Systems

The Joint Architecture for Unmanned Systems (JAUS) [JAUS] is an architecture defined

for use in the research, development and acquisition of unmanned systems. The two main

purposes for the development of JAUS are to support interoperability amongst heterogeneous

unmanned systems originating from different developers and to support the reuse/insertion of

technology. To ensure that the architecture is applicable to the development of the entire domain of unmanned systems, the following constraints are imposed on JAUS: platform independence, mission isolation, computer hardware independence and technology independence. JAUS is a

component-based, message-passing architecture that specifies data formats and methods of communication among computing nodes. The JAUS system architecture is defined in a hierarchical structure. The system topology is shown in Figure 5-12. The different levels of

the architecture are defined in the following terms:

System: A system comprises all the unmanned systems and human interfaces meant for a common application.

Subsystem: A subsystem is one or more unmanned systems that can be defined as a single localized entity within a system. The autonomous platform NaviGator, which has been developed at CIMAR, may be defined as a single JAUS subsystem.









Node: A JAUS node defines a distinct processing capability within the subsystem. Each

node runs its own node manager component to manage the flow and control of JAUS

messages.

Component: A component provides a unique functional capability for the unmanned

system. A JAUS component resides wholly within a JAUS node. More than one component may reside on a single node.

Instances: Instances provide a way to duplicate JAUS components. All components are

uniquely addressed using the subsystem, node, component and instance identifiers.
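As a simple illustration of this four-level addressing (not the JAUS wire format; the type and field names are assumptions), a component address can be viewed as a tuple of the four identifiers:

    /* Illustrative only: the four-level JAUS address described above. */
    typedef struct {
        unsigned char subsystem;   /* e.g., the NaviGator platform   */
        unsigned char node;        /* processing node within it      */
        unsigned char component;   /* e.g., the LTSS component       */
        unsigned char instance;    /* duplicate-component identifier */
    } JausAddress;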

JAUS defines a set of reusable components and the messages supporting these

components. However, JAUS does not impose any regulations on the configuration of the

system. JAUS also allows the development of experimental components for performing tasks

which otherwise cannot be performed by the already defined JAUS components. The only strict requirement for a JAUS implementation is that all JAUS components communicate with each other only through JAUS messages.

JAUS System Architecture on the NaviGator

In the hierarchical structure of the JAUS system, the NaviGator is defined as a fully

independent JAUS subsystem. The NaviGator system architecture is formulated using the existing JAUS-defined components wherever possible. Experimental JAUS components are developed for tasks for which no JAUS component is already defined. A JAUS-compliant messaging system is used for all communication between components. The sensor component

developed in this research is an experimental JAUS component.

Each of the rectangular blocks shown in Figure 5-13 is a JAUS component. From the autonomous functionality viewpoint, at the highest level, the NaviGator system architecture is categorized into four fundamental elements. These elements are:









1. Planning element: The components that act as a repository for a priori data. These
components perform off-line planning based on the a priori data.

2. Control element: The components that perform closed loop control in order to keep the
vehicle on a specified path.

3. Perception element: The components that perform the sensing tasks required to locate
obstacles and to evaluate the smoothness of terrain.

4. Intelligence element: The components that determine the best path segment based on the
sensed information.

As stated in Chapter 4, the Traversability Grid is the common data structure used to

represent the environment in all the above components. The components in the perception

element represent the world as a Traversability Grid based on real-time perception; the components in the planning element represent the world around the robot as a Traversability Grid based on a priori information; the components in the intelligence element utilize these Traversability Grids to plan the best possible path; and the control element executes the planned path. The complete loop of perception, planning and control is repeated continuously at about 40 Hz. The following section explains in detail the implementation of the LTSS as a JAUS component, which comprises all the sensor algorithms discussed above.

Implementation of the LTSS as a JAUS component

The LTSS JAUS component is developed using the C programming language in the Linux operating system environment. The LTSS and all the other JAUS components on the NaviGator are implemented as finite state machines. At any point in time, each component can assume one of seven states: Startup, Initialize, Standby, Ready, Emergency, Failure and Shutdown. Of these, the Emergency state is not used in the LTSS component; the Failure state is used to report any type of failure, such as failure to allocate dynamic memory or failure to create service connections; and the Shutdown state is called during shutdown of the component to end









all the processes in a proper sequence. The important operations in the other states are performed in the following sequence:

Startup state: The component is checked into the system and obtains instance, node and

subsystem identification numbers from the node manager. All the required data structures

declared as global variables are initialized using dynamic memory allocation. These

include three data structures of type Circular Buffer to store the grid information for each of the LADARs. The basic implementation of the circular buffer was discussed in Chapter 3. Since the obstacle detection algorithm and the terrain evaluation algorithm store variables of different data types, the circular buffer implementations for the two differ in the number and type of variables, but the basic functioning remains the same.

The position and orientation information is obtained from the JAUS component, GPOS

and the vehicle state information is obtained from the JAUS component, VSS. This

information is obtained using JAUS service connections. A JAUS service connection provides a mechanism to continuously obtain information at a fixed update rate from another component without the need to query the component each time. Before entering the Initialize state, the LTSS component creates the service connections to the GPOS and the VSS in the Startup state. Finally, all the configuration variables in the config file are loaded into the program.
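As an illustration of how such a buffer can be organized (a hedged sketch only: the names, dimension and single float payload are assumptions, and the actual OD and TE buffers store different sets of per-cell variables), wrap-around indexing lets the grid scroll with the vehicle without copying cell data:

    #define GRID_DIM 121                  /* hypothetical grid dimension */

    typedef struct {
        float cell[GRID_DIM][GRID_DIM];
        int   rowOffset;                  /* array index of the grid origin */
        int   colOffset;
    } CircularGrid;

    static int wrap(int index)
    {
        index %= GRID_DIM;
        return (index < 0) ? index + GRID_DIM : index;
    }

    float *grid_cell(CircularGrid *g, int row, int col)
    {
        return &g->cell[wrap(g->rowOffset + row)][wrap(g->colOffset + col)];
    }

Shifting the grid as the vehicle moves then amounts to adding the row and column deltas to the two offsets and clearing the cells that scroll into view.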

Initialize state: This state makes sure that the GPOS and VSS service connections are active. If these connections are not active, the component remains in the Initialize state. Even when in the Ready state, if the GPOS or VSS service connections are down, the component defaults back to the Initialize state.










Ready state: This is the most important part of the component, and all the processing is done in this state. Once the component is set to run, with all the global variables initialized and the service connections active, it remains in the Ready state. It loops through the Ready state once for every output Traversability Grid it produces. The following sequence of steps is performed each time:

1. Conversion of the vehicle position from the LLA (Latitude/Longitude/Altitude) data format to the vehicle position in a fixed global coordinate system using the UTM (Universal Transverse Mercator) conversion. This information is then transformed into the number of rows and columns moved by the vehicle since the previous update (a brief sketch of this bookkeeping is given after this list).

2. Update each of the Circular buffers to account for the number of rows and columns
moved by the vehicle. The circular buffer update method to account for the
movement of the vehicle was discussed in Chapter 3.

3. The next step is to acquire the range data from each of the lasers. The data from each laser is acquired in a separate thread. The coordination between the laser data acquisition threads and the main component thread is maintained using a mutex lock. Since the OD laser runs at a frequency of 36 Hz and the two TE lasers run at a frequency of 18 Hz, the OD laser loads a new set of range readings for every iteration of the Ready state, while the two terrain evaluation lasers alternate in loading a new set of readings each iteration.

4. After the acquisition of the range data, the aforementioned algorithms pertaining to the mapping of data and the assignment of traversability values are executed in the following sequence: the Traversability Grid for the OD is updated with the new data, and the two Traversability Grids for terrain evaluation 1 and terrain evaluation 2 are populated with the most recent data from the two terrain lasers respectively. Next, the OD algorithm evaluates and assigns each cell a traversability value; similarly, the two terrain evaluation Traversability Grids are evaluated, and the traversability values of these two grids are then combined into a single traversability value. The final step is the fusion of the OD and the TE grids.

5. The output Traversability Grid is passed on to the Smart Arbiter. The algorithm
repeats the steps in the Ready state as long as the service connections for the GPOS
and VSS are active.
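A brief sketch of the bookkeeping in steps 1 and 2 (the names are hypothetical; only the 0.5 m cell resolution and the rows/columns update are taken from the text):

    #include <math.h>

    #define CELL_RES 0.5                  /* grid resolution in metres */

    /* Convert the change in UTM position since the previous update into
     * whole rows and columns of the grid, keeping the fractional part
     * so that no motion is lost between updates.                        */
    void grid_motion(double utmNorth, double utmEast,
                     double *prevNorth, double *prevEast,
                     int *dRows, int *dCols)
    {
        *dRows = (int)floor((utmNorth - *prevNorth) / CELL_RES);
        *dCols = (int)floor((utmEast  - *prevEast)  / CELL_RES);

        *prevNorth += *dRows * CELL_RES;
        *prevEast  += *dCols * CELL_RES;

        /* each of the three circular buffers is then shifted by
         * (*dRows, *dCols), clearing the rows and columns that scroll in */
    }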






















Figure 5-1. Measurement range (top view, scan from right to left). A) Angular range 0°-180°. B) Angular range 0°-100°.


Laser                           CPU (USB to Serial Hub)
Signal Designation   Pin No     Pin No   Signal Designation
Rx-                    1          1      Rx+
Rx+                    2          2      Rx-
Tx+                    3          3      Tx-
Tx-                    4          4      Tx+
GND                    5          5      GND
Not Connected          6          6      Not Connected
Jumper 1               7          7      Not Connected
Jumper 2               8          8      Not Connected
Not Connected          9          9      Not Connected

Figure 5-2. Laser sensor RS422 interface.










Figure 5-3. Sensor mounts on the NaviGator (terrain evaluation and obstacle detection LADARs).














Figure 5-4. Sensor mount design (variable angle mount).






































Figure 5-5. Block diagram of the LTSS component









Figure 5-6. Obstacle detection LADAR (global coordinate axes: North, East).












Figure 5-7. Traversability value mapping (traversability value versus weighted sum).


Figure 5-8. Schematic of terrain evaluation sensors, showing the sensor coordinate systems, the fields of view of Terrain LADAR 1 and Terrain LADAR 2, the global coordinate system attached to the vehicle and the fixed global coordinate system.


































Figure 5-9. Weighted neighborhood analysis


Figure 5-10. Schematic working of the negative obstacle detection algorithm (labels: laser beam, imaginary horizontal plane, negative obstacle such as a large pothole on the road or a cliff on the side of the road).


Figure 5-11. Block diagram of the terrain evaluation algorithm (terrain map, slope evaluation, variance evaluation and nearest neighbor evaluation for cell (i,j)).

Figure 5-12. JAUS system topology.





















Figure 5-13. JAUS compliant system architecture (NaviGator component block diagram), showing the planning, control, perception and intelligence elements.









































CHAPTER 6
EXPERIMENTAL RESULTS

The developed LADAR Traversability Smart Sensor (LTSS) component was tested on the

autonomous platform, NaviGator. In Chapter 3 the NaviGator and the positioning system

available on the platform were discussed. The LTSS component is formed from the fusion of two

different implementations of the LADAR sensors, the obstacle detection (OD) algorithm and the

terrain evaluation (TE) algorithm. Outputs from these two algorithms were fused resulting in the

LTSS Traversability Grid. This chapter presents and analyzes the results obtained from the OD

sensor, the TE sensor and finally the fusion of these two sensors. The tests in this chapter are

conducted at the solar park facility of the University of Florida.

Traversability Grid Visualization

Throughout this chapter the Traversability Grid visualization tool is used to demonstrate

the results of the sensor algorithms. The Traversability Grid is represented using the color code

shown in Figure 6-1. As shown in the figure, a value of 2, which is highly non-traversable, is represented by the color red. The color green represents a traversability value of 12, which means highly favorable to traverse, and the color grey is for 7, which is a neutral value. The color shades for the intermediate values are shown in Figure 6-1. The color pink represents a value of 14,

which is assigned to cells whose traversability value is unknown. As discussed in the previous

chapters the vehicle position is always in the center of the grid. In the Traversability Grid results

that follow, the position and direction of the vehicle are always represented by an arrow. Similarly,

the images showing the test set-up also represent the vehicle position and direction as shown by

an arrow.









Obstacle Detection Sensor Results


Obstacle Mapping Results

The obstacle detection (OD) sensor detects positive obstacles of height above a threshold

value. If the obstacle is smaller than the threshold height, which is 0.6 m in the present

implementation, the sensor cannot detect the obstacle. Similarly, the algorithm cannot offer any

opinion regarding negative obstacles or a good traversable path. Hence the algorithm can report traversability values in the range of 2 to 7, based on the confidence it has in the obstacle. A value of 7 represents free space, which means there is no positive obstacle reported by the OD sensor; however, the cell could contain a negative obstacle or could be rough, uneven terrain, which

cannot be determined by this sensor.

The performance of the OD sensor is evaluated on the basis of two important factors: the

accuracy in mapping the obstacle and the response time in detecting the obstacle. These

performance assessments are evaluated at 3 different speeds of the vehicle: 10 mph, 16 mph and

22 mph. Figure 6-2 shows the experimental set-up for one set of readings. The results of the

output from the OD sensor are shown in Figure 6-3 and the summary of the mapping results is

presented in Table 6-1. The table shows the comparison between the actual physical distance

measured between the barrels and the output from the sensor at vehicle speeds of 10 mph, 16

mph and 22 mph. As seen in Table 6-1 the obstacles are mapped accurately within an error of 0.5

m which is equal to the grid resolution. Hence the maximum error is within 1 cell of the grid.

Another set of readings was taken by changing the position of the barrels and repeating

the experiments at the three speeds. Figure 6-4 shows the experimental set-up for the second set

of readings. The results are presented in Figure 6-5 and Table 6-2. Similar to the first set of

readings the error is within 1 grid cell i.e., 0.5 m.









From the above two sets of readings it can be seen that the error in the system is within 1 grid cell and that the accuracy is independent of the lateral distance of the obstacle from the vehicle path. It was also seen that the error was repeated between the same barrels at all three speeds. Since the actual physical distance was measured with a measuring tape, the measurement itself also contributes some error to the distance.

Obstacle Detection Response Time

The response time of the sensor algorithm is computed by measuring the distance

between the obstacle and the vehicle, at the time when the obstacle is detected. This distance is

measured for each of the barrels in the two sets of readings shown in Figure 6-2 and Figure 6-4

respectively. A comprehensive summary of the results is presented in Table 6-3. The table shows

the distance range in which the obstacles were detected in the direction of the vehicle path. The

results of reading 1 (represented by Figure 6-2) are presented in the movies linked to Objects 6-1, 6-2 and 6-3 for vehicle speeds of 10, 16 and 22 mph respectively.

Object 6-1. OD reading 1 at 10 mph [ODBarrelTest1_10mph.avi, 100892 KB].

Object 6-2. OD reading 1 at 16 mph [ODBarrelTest1_16mph.avi, 83265 KB].

Object 6-3. OD reading 1 at 22 mph [ODBarrelTest1_22mph.avi, 76214 KB].

Although the OD sensor detects obstacles above a threshold value, it was observed that

the height of the obstacle played an important role in how fast the obstacle is detected. The

experiment of detecting the obstacle is repeated with an increased obstacle height. Figure 6-6 shows the experimental set-up, in which the height was increased by placing one barrel on top of another. The results are presented in the movies in Objects 6-4, 6-5 and 6-6 for vehicle speeds of 10, 16 and 22 mph.

Object 6-4. OD with increased barrel height at 10 mph [ODBarrelTest2_10mph.avi, 92079 KB].

Object 6-5. OD with increased barrel height at 16 mph [ODBarrelTest2_16mph.avi, 83265 KB].









Object 6-6. OD with increased barrel height at 22 mph [ODBarrelTest2_22mph.avi, 75392 KB].

The analysis of the results for obstacles with increased height is presented in Table 6-4. It can be seen from the results presented in Table 6-3 and Table 6-4 that the distance at which the obstacle is detected from the vehicle is very sensitive to the height of the obstacle, especially when the height of the obstacle is below 2 m. This is because even a small change in the vehicle pitch and roll angles causes a change in the angle of the laser beam. Due to this change, the laser beam is no longer horizontal to the ground. For example, a change in the pitch angle of 2° would cause the laser beam to strike roughly 1 m above the critical height at a distance of 30 m from the laser (30 m x tan 2° is approximately 1.05 m). At high speeds on a rough path (such as the environment shown in the above experiments) the roll and pitch changes and the rates of these changes are very high, and hence there is a difference in performance based on the height of the obstacle.

The above experiments demonstrated the performance of the OD algorithm. The next

section presents the results for the TE algorithm and the fusion of the two algorithms.

Fusion of the Obstacle Detection and Terrain Evaluation algorithms

The main task of the terrain evaluation (TE) algorithm is to detect a smooth path and

distinguish it from the surroundings. Unlike the obstacle detection (OD) algorithm which can

report traversability values only in the range of 2 to 7, the TE algorithm can report values from 2

to 12. As discussed in Chapter 5, the traversability value is computed for each cell based on a set

of features. To assess the performance of the TE algorithm and subsequently the result of

combining the outputs from the TE and OD sensors, the vehicle is driven on a small paved road

within the solar park facility. The actual path is shown in the movie in Object 6-7.

Object 6-7. Test path [VideoForwardPath.avi, 335689 KB].

The vehicle speed was maintained at approximately 10 mph. The same path was driven three times, and each time the LTSS component was executed on this path in a different output mode.









The output from the OD, the TE and the fused LTSS grid is shown in the movies linked to Objects 6-8, 6-9 and 6-10 respectively.

Object 6-8. OD result for test path [ODFusionTest1_10mph.avi, 356136 KB].

Object 6-9. TE result for test path [TEFusionTest1_10mph.avi, 393154 KB].

Object 6-10. LTSS result for test path [LTSSTest1_10mph.avi, 337804 KB].

For a second set of experiments the same path was driven in the opposite direction. The actual path is shown in the movie linked to Object 6-11.

Object 6-11. Return test path [VideoReturnPath.avi, 239561 KB].

The outputs from the OD algorithm, the TE algorithm and the fused LTSS grid are shown in the movies linked to Objects 6-12, 6-13 and 6-14 respectively.

Object 6-12. OD result for return path at 10 mph [ODFusionTest2_10mph.avi, 306897 KB].

Object 6-13. TE result for return path at 10 mph [TEFusionTest2_10mph.avi, 316651 KB].

Object 6-14. LTSS result for return path at 10 mph [LTSSTest2_10mph.avi, 239208 KB].

A couple of scenarios from the above two experiments are selected and discussed here.

Scene 1

Scene 1 is selected from the first of the above two experimental readings. Figure 6-7

shows the image of the environment with barrels, poles, trailers and some name boards. Figure 6-

8 shows the outputs from the OD and TE algorithms. As seen in the results the obstacles which

can be clearly distinguished from the surroundings mainly due to their height (these include the

barrels, name boards, poles and trailers) are mapped very well by the OD sensor. However, the

rest of the region is shown as free space without any indication of a favorable traversable path.

The TE algorithm makes a good attempt to distinguish the smooth path from the surrounding

grass region and also shows the discontinuity at the edge of the path. The classification results

from the TE algorithm are based on the absolute scale of traversability defined for each of the









features: the slope, the variance and the nearest neighbor. As can be seen, although the path is distinguishable, the traversability values for most of the region range between 6 and 12. This is because, even though there is a clearly distinguishable paved path, the region around the path is still considered drivable by the vehicle. The result after combining the two algorithms is depicted in Figure 6-9. As seen in the figure, the LTSS component takes advantage of both algorithms to represent clearly distinguishable obstacles and to follow a smooth path.

Scene 2

Scene 2 is also selected from the first experiment. Figure 6-10 shows a snapshot of the video representing the scene. The results from the individual sensor algorithms are shown in Figure 6-11 and the fused output is shown in Figure 6-12. Similar to scene 1, it can be seen that the fused output combines the advantages of both sensor algorithms and gives a more complete representation of the environment than the output obtained from either one of the algorithms.

Scene 3

Scene 3 is selected from the second set of experiments (i.e., when the vehicle is on its

way back). Figure 6-13 shows the environment. The results are presented in Figures 6-14 and 6-

15. It can be seen from Figure 6-14 that the OD algorithm identifies and maps obstacles such as

the dumpster and the TE algorithm distinguishes the paved path from the surrounding region.

The fused output shows a very good representation of the environment.

High Speed Test of the LTSS component

A part of the first experimental path was driven at a higher speed and the results were

assessed. The vehicle reached a speed of approximately 22 mph. The vehicle was driven three times to obtain the outputs from the OD sensor, the TE sensor and the fused LTSS Traversability Grid. Objects 6-15, 6-16 and 6-17 show the output from the OD, the TE and the fused LTSS algorithm respectively.









Object 6-15. Obstacle detection result at 22 mph [ODFusionTest_22mph.avi, 109706 KB].

Object 6-16. Terrain evaluation result at 22 mph [TEFusionTest_22mph.avi, 110646 KB].

Object 6-17. LTSS result at 22 mph [LTSSTest_22mph.avi, 135207 KB].

The main limitation to achieving better results at higher speeds is the laser update rate. The terrain sensors operate at 18 Hz, and at this rate each individual grid cell barely receives more than a single laser scan at speeds above 20 mph (at 20 mph, roughly 8.9 m/s, the vehicle advances about 0.5 m, one grid cell, between successive scans). The speed limitation imposed by the laser update rate is discussed in Chapter 7. Another important factor is the vehicle position and orientation. At higher speeds it is critical to correlate the laser data and the vehicle position fairly accurately to obtain a reasonable point cloud from the laser data. In spite of the above limitations, the developed LTSS component produces fairly good results up to speeds of 20 mph.





























Figure 6-1. Traversability Grid color code.

Figure 6-2. Obstacle detection reading 1 experimental set-up (barrels 1 to 4).










Figure 6-3. Obstacle detection reading 1 output at varied speeds. A) 10 mph. B) 16 mph. C) 22 mph.


















Figure 6-4. Obstacle detection reading 2 experimental set-up (barrels 1 to 4).











Figure 6-5. Obstacle detection reading 2 output at varied speeds. A) 10 mph. B) 16 mph. C) 22 mph.

